Sample records for Hallam automated pipeline

  1. Automated Monitoring of Pipeline Rights-of-Way

    NASA Technical Reports Server (NTRS)

    Frost, Chard Ritchie

    2010-01-01

    NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.

  2. ARTIP: Automated Radio Telescope Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and proceeds through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
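
    A minimal sketch of the staged design described above, assuming a simple registry in which each calibration or imaging step can run alone or in sequence; the stage names mirror the abstract, and the underlying CASA task calls are elided as comments. This is illustrative, not ARTIP's actual code.

    ```python
    STAGES = {}

    def stage(name):
        """Register a pipeline stage so it can run alone or in sequence."""
        def register(fn):
            STAGES[name] = fn
            return fn
        return register

    @stage("flux_cal")
    def flux_cal(ms):
        print(f"flux calibration of {ms}")       # CASA flux-scale tasks go here

    @stage("bandpass_cal")
    def bandpass_cal(ms):
        print(f"bandpass calibration of {ms}")   # CASA bandpass task goes here

    @stage("phase_cal")
    def phase_cal(ms):
        print(f"phase calibration of {ms}")      # CASA gain/phase task goes here

    @stage("imaging")
    def imaging(ms):
        print(f"imaging of {ms}")                # continuum/spectral-line imaging

    def run(ms, only=None):
        for name in ([only] if only else STAGES):
            STAGES[name](ms)

    run("target.ms")                  # whole pipeline, in registration order
    run("target.ms", only="imaging")  # any stage can also be run independently
    ```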

  3. Data Validation Package, June 2016 Groundwater Sampling at the Hallam, Nebraska, Decommissioned Reactor Site, August 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Surovchak, Scott; Miller, Michele

    The 2008 Long-Term Surveillance Plan [LTSP] for the Decommissioned Hallam Nuclear Power Facility, Hallam, Nebraska (http://www.lm.doe.gov/Hallam/Documents.aspx) requires groundwater monitoring once every 2 years. Seventeen monitoring wells at the Hallam site were sampled during this event as specified in the plan. Planned monitoring locations are shown in Attachment 1, Sampling and Analysis Work Order. Water levels were measured at all sampled wells and at two additional wells (6A and 6B) prior to the start of sampling. Additionally, water levels of each sampled well were measured at the beginning of sampling. See Attachment 2, Trip Report, for additional details. Sampling and analysis were conducted as specified in Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and-analysis-plan-us-department-energy-office-legacy-management-sites). Gross alpha and gross beta are the only parameters that were detected at statistically significant concentrations. Time/concentration graphs of the gross alpha and gross beta data are included in Attachment 3, Data Presentation. The gross alpha and gross beta activity concentrations observed are consistent with values previously observed and are attributed to naturally occurring radionuclides (e.g., uranium and uranium decay chain products) in the groundwater.

  4. A Fully-Automated Subcortical and Ventricular Shape Generation Pipeline Preserving Smoothness and Anatomical Topology

    PubMed Central

    Tang, Xiaoying; Luo, Yuan; Chen, Zhibin; Huang, Nianwei; Johnson, Hans J.; Paulsen, Jane S.; Miller, Michael I.

    2018-01-01

    In this paper, we present a fully-automated subcortical and ventricular shape generation pipeline that acts on structural magnetic resonance images (MRIs) of the human brain. Principally, the proposed pipeline consists of three steps: (1) automated structure segmentation using the diffeomorphic multi-atlas likelihood-fusion algorithm; (2) study-specific shape template creation based on the Delaunay triangulation; (3) deformation-based shape filtering using the large deformation diffeomorphic metric mapping for surfaces. The proposed pipeline is shown to provide high accuracy, sufficient smoothness, and accurate anatomical topology. Two datasets focused upon Huntington's disease (HD) were used for evaluating the performance of the proposed pipeline. The first of these contains a total of 16 MRI scans, each with a gold standard available, on which the proposed pipeline's outputs were observed to be highly accurate and smooth when compared with the gold standard. Visual examinations and outlier analyses on the second dataset, which contains a total of 1,445 MRI scans, revealed 100% success rates for the putamen, the thalamus, the globus pallidus, the amygdala, and the lateral ventricle in both hemispheres and rates no smaller than 97% for the bilateral hippocampus and caudate. Another independent dataset, consisting of 15 atlas images and 20 testing images, was also used to quantitatively evaluate the proposed pipeline, with high accuracy having been obtained. In short, the proposed pipeline is herein demonstrated to be effective, both quantitatively and qualitatively, using a large collection of MRI scans. PMID:29867332
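
    As a toy illustration of step (2) only, the snippet below builds a Delaunay triangulation over a random 3-D point cloud with SciPy; the real pipeline triangulates segmented subcortical surfaces, and steps (1) and (3) are not sketched here.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    points = rng.standard_normal((200, 3))   # stand-in for surface vertices

    tri = Delaunay(points)                   # Delaunay tetrahedralization in 3-D
    print(tri.simplices.shape)               # (n_simplices, 4) vertex indices

    # The boundary faces of these simplices could seed a template mesh, which
    # LDDMM-based filtering (step 3) would then smooth, preserving topology.
    ```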

  5. A Fully-Automated Subcortical and Ventricular Shape Generation Pipeline Preserving Smoothness and Anatomical Topology.

    PubMed

    Tang, Xiaoying; Luo, Yuan; Chen, Zhibin; Huang, Nianwei; Johnson, Hans J; Paulsen, Jane S; Miller, Michael I

    2018-01-01

    In this paper, we present a fully-automated subcortical and ventricular shape generation pipeline that acts on structural magnetic resonance images (MRIs) of the human brain. Principally, the proposed pipeline consists of three steps: (1) automated structure segmentation using the diffeomorphic multi-atlas likelihood-fusion algorithm; (2) study-specific shape template creation based on the Delaunay triangulation; (3) deformation-based shape filtering using the large deformation diffeomorphic metric mapping for surfaces. The proposed pipeline is shown to provide high accuracy, sufficient smoothness, and accurate anatomical topology. Two datasets focused upon Huntington's disease (HD) were used for evaluating the performance of the proposed pipeline. The first of these contains a total of 16 MRI scans, each with a gold standard available, on which the proposed pipeline's outputs were observed to be highly accurate and smooth when compared with the gold standard. Visual examinations and outlier analyses on the second dataset, which contains a total of 1,445 MRI scans, revealed 100% success rates for the putamen, the thalamus, the globus pallidus, the amygdala, and the lateral ventricle in both hemispheres and rates no smaller than 97% for the bilateral hippocampus and caudate. Another independent dataset, consisting of 15 atlas images and 20 testing images, was also used to quantitatively evaluate the proposed pipeline, with high accuracy having been obtained. In short, the proposed pipeline is herein demonstrated to be effective, both quantitatively and qualitatively, using a large collection of MRI scans.

  6. MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.

    PubMed

    Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei

    2016-03-15

    MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs), as well as small molecules. Moreover, multi-threading and support for Sun Grid Engine (SGE) are implemented to significantly boost computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online.
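
    A hedged sketch of the multi-threading aspect only: many complexes are scored in parallel with a thread pool. The function `score_complex` is a hypothetical stand-in for the MIEC feature extraction and SVM prediction the pipeline actually performs.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def score_complex(pdb_path: str) -> float:
        # placeholder: compute MIEC features and apply the trained SVM model here
        return 0.1 * len(pdb_path)           # dummy score

    complexes = ["complex1.pdb", "complex2.pdb", "complex3.pdb"]
    with ThreadPoolExecutor(max_workers=4) as pool:
        scores = list(pool.map(score_complex, complexes))
    print(dict(zip(complexes, scores)))
    ```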

  7. Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.

    PubMed

    Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N

    2018-05-28

    The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age.

  8. SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Collioud, A.; Charlot, P.

    2018-02-01

    We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
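
    The published regression STRIP algorithm is not reproduced here, but the toy comparison below captures the underlying idea: fit linear and quadratic models to a component's positions over epochs and compare residuals to judge whether the trajectory is linear. All numbers are invented.

    ```python
    import numpy as np

    epochs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # observation epochs (yr)
    ra_mas = np.array([0.00, 0.11, 0.19, 0.33, 0.41])  # toy RA offsets (mas)

    for degree in (1, 2):                              # linear vs. non-linear
        coeffs = np.polyfit(epochs, ra_mas, degree)
        rms = np.sqrt(np.mean((ra_mas - np.polyval(coeffs, epochs)) ** 2))
        print(f"degree {degree}: RMS residual = {rms:.4f} mas")

    # If the linear model suffices, its slope is the proper-motion estimate.
    print(f"proper motion ~ {np.polyfit(epochs, ra_mas, 1)[0]:.3f} mas/yr")
    ```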

  9. UBO Detector - A cluster-based, fully automated pipeline for extracting white matter hyperintensities.

    PubMed

    Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei

    2018-07-01

    We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) WMH volumes that were highly correlated (r² > 0.9) and increased steadily over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of the multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline.
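
    A minimal sketch of the k-NN decision step, with hypothetical cluster features (normalized intensity, cluster volume, distance to the ventricles); UBO Detector's actual feature set and training data are not shown in this record.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # rows: [normalized intensity, volume (mm^3), distance to ventricle (mm)]
    X_train = np.array([[0.90, 120, 2.0], [0.80, 90, 3.1],    # labelled WMH
                        [0.30, 15, 11.0], [0.40, 22, 9.5]])   # labelled non-WMH
    y_train = np.array([1, 1, 0, 0])

    knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
    candidates = np.array([[0.85, 100, 2.5], [0.35, 18, 10.0]])
    print(knn.predict(candidates))   # [1 0] -> first is WMH, second is not
    ```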

  10. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-02-26

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
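
    The snippet below gives a rough feel for the kind of colour/size extraction a tool like BIVcolor performs, using OpenCV colour thresholding and contour areas; the HSV band, minimum area, and file name are invented, and BIVcolor's actual algorithm is not described in this record.

    ```python
    import cv2
    import numpy as np

    img = cv2.imread("vine_row.png")                  # hypothetical field image
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (100, 50, 20), (130, 255, 255))  # dark-berry band

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    berries = [c for c in contours if cv2.contourArea(c) > 50]  # drop speckle
    areas = [cv2.contourArea(c) for c in berries]
    print(len(berries), "berries, mean size (px^2):",
          np.mean(areas) if areas else 0.0)
    ```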

  11. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  12. Hal: an automated pipeline for phylogenetic analyses of genomic data.

    PubMed

    Robbertse, Barbara; Yoder, Ryan J; Boyd, Alex; Reeves, John; Spatafora, Joseph W

    2011-02-07

    The rapid increase in genomic and genome-scale data is resulting in unprecedented levels of discrete sequence data available for phylogenetic analyses. Major analytical impasses exist, however, prior to analyzing these data with existing phylogenetic software. Obstacles include the management of large data sets without standardized naming conventions, identification and filtering of orthologous clusters of proteins or genes, and the assembly of alignments of orthologous sequence data into individual and concatenated super alignments. Here we report the production of an automated pipeline, Hal, that produces multiple alignments and trees from genomic data. These alignments can be produced by a choice of four alignment programs and analyzed by a variety of phylogenetic programs. In short, the Hal pipeline connects the programs BLASTP, MCL, user-specified alignment programs, GBlocks, ProtTest and user-specified phylogenetic programs to produce species trees. The script is available at SourceForge (http://sourceforge.net/projects/bio-hal/). The results from an example analysis of Kingdom Fungi are briefly discussed.
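
    The orchestration pattern can be pictured as a chain of subprocess calls, each stage consuming the previous stage's files; the command-line flags and file names below are illustrative, not Hal's exact invocations.

    ```python
    import subprocess

    def run(cmd):
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)    # abort the pipeline on any failure

    # all-vs-all protein similarity search
    run(["blastp", "-query", "proteomes.faa", "-db", "allvall",
         "-outfmt", "6", "-out", "hits.tsv"])
    # cluster hits into putative ortholog groups (after converting to ABC format)
    run(["mcl", "hits.abc", "--abc", "-o", "clusters.mcl"])
    # per-cluster alignment, GBlocks trimming, ProtTest model selection and
    # tree inference would each follow the same subprocess pattern.
    ```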

  13. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    PubMed

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. • Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high-throughput DTI data.
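
    The automated atlas-based VOI extraction can be pictured as below: given an FA map registered to the ICBM DTI-81 atlas, report the mean FA per atlas label. File names are placeholders, and the study's actual DTI-TK/AFNI tooling is replaced with plain NiBabel/NumPy for brevity.

    ```python
    import numpy as np
    import nibabel as nib

    fa = nib.load("subject_FA_in_atlas_space.nii.gz").get_fdata()
    labels = nib.load("ICBM_DTI81_labels.nii.gz").get_fdata().astype(int)

    for roi in np.unique(labels):
        if roi == 0:                         # label 0 = background by convention
            continue
        print(f"ROI {roi}: mean FA = {fa[labels == roi].mean():.3f}")
    ```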

  14. Development of an Automated Imaging Pipeline for the Analysis of the Zebrafish Larval Kidney

    PubMed Central

    Westhoff, Jens H.; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L.; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen

    2013-01-01

    The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems. PMID:24324758

  15. Development of an automated imaging pipeline for the analysis of the zebrafish larval kidney.

    PubMed

    Westhoff, Jens H; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen

    2013-01-01

    The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems.

  16. ENGINEERING AND CONSTRUCTING THE HALLAM NUCLEAR POWER FACILITY REACTOR STRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlmeister, J E; Haberer, W V; Casey, D F

    1960-12-15

    The Hallam Nuclear Power Facility reactor structure, including the cavity liner, is described, and the design philosophy and special design requirements which were developed during the preliminary and final engineering phases of the project are explained. The structure was designed for 600 deg F inlet and 1000 deg F outlet operating sodium temperatures and fabricated of austenitic and ferritic stainless steels. Support for the reactor core components and adequate containment for biological safeguards were readily provided even though quite conservative design philosophy was used. The calculated operating characteristics, including heat generation, temperature distributions and stress levels for full-power operation, are summarized. Shop fabrication and field installation experiences are also briefly related. Results of this project have established that the sodium graphite reactor permits practical and economical fabrication and field erection procedures; considerably higher operating design temperatures are believed possible without radical design changes. Also, larger reactor structures can be similarly constructed for higher capacity (300 to 1000 Mwe) nuclear power plants. (auth)

  17. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S; Lo, P; Hoffman, J

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT-wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional reconstruction methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across
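
    The batch design reduces to a parameter sweep: every dose level is crossed with every reconstruction method and submitted as a job. The function and file names below are placeholders for the simulation and reconstruction modules described in the abstract.

    ```python
    import itertools

    dose_fractions = [1.00, 0.50, 0.25, 0.10]    # fraction of the acquired dose
    recon_methods = ["wFBP", "iterative_CD"]     # FreeCT-wFBP / model-based CD

    def submit(raw_file, dose, method):
        # in the real pipeline: write a configuration file, enqueue on the cluster
        print(f"queued: {raw_file} dose={dose:.2f} recon={method}")

    for dose, method in itertools.product(dose_fractions, recon_methods):
        submit("case001_projections.raw", dose, method)
    ```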

  18. CloVR-ITS: Automated internal transcribed spacer amplicon sequence analysis pipeline for the characterization of fungal microbiota

    PubMed Central

    2013-01-01

    Background Besides the development of comprehensive tools for high-throughput 16S ribosomal RNA amplicon sequence analysis, there exists a growing need for protocols emphasizing alternative phylogenetic markers such as those representing eukaryotic organisms. Results Here we introduce CloVR-ITS, an automated pipeline for comparative analysis of internal transcribed spacer (ITS) pyrosequences amplified from metagenomic DNA isolates and representing fungal species. This pipeline performs a variety of steps similar to those commonly used for 16S rRNA amplicon sequence analysis, including preprocessing for quality, chimera detection, clustering of sequences into operational taxonomic units (OTUs), taxonomic assignment (at class, order, family, genus, and species levels) and statistical analysis of sample groups of interest based on user-provided information. Using ITS amplicon pyrosequencing data from a previous human gastric fluid study, we demonstrate the utility of CloVR-ITS for fungal microbiota analysis and provide runtime and cost examples, including analysis of extremely large datasets on the cloud. We show that the largest fractions of reads from the stomach fluid samples were assigned to Dothideomycetes, Saccharomycetes, Agaricomycetes and Sordariomycetes but that all samples were dominated by sequences that could not be taxonomically classified. Representatives of the Candida genus were identified in all samples, most notably C. quercitrusa, while sequence reads assigned to the Aspergillus genus were only identified in a subset of samples. CloVR-ITS is made available as a pre-installed, automated, and portable software pipeline for cloud-friendly execution as part of the CloVR virtual machine package (http://clovr.org). Conclusion The CloVR-ITS pipeline provides fungal microbiota analysis that can be complementary to bacterial 16S rRNA and total metagenome sequence analysis allowing for more comprehensive studies of environmental and host-associated microbial

  19. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    NASA Astrophysics Data System (ADS)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García, T.

    2018-04-01

    APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best parameter values for the photometry, such as the mask size for the aperture, the size of the window for extracting a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
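
    The core operation being automated is ordinary aperture photometry; a toy version with a synthetic star and an invented aperture/annulus geometry is shown below.

    ```python
    import numpy as np

    img = np.ones((64, 64))                 # flat sky of 1 count/pixel
    img[30:34, 30:34] += 50.0               # crude synthetic star (+800 counts)
    y, x = np.mgrid[:64, :64]
    r = np.hypot(x - 31.5, y - 31.5)        # distance from the star centroid

    aperture = r < 4                        # aperture mask (the "mask size")
    annulus = (r >= 8) & (r < 12)           # background-estimation region
    sky = np.median(img[annulus])
    flux = img[aperture].sum() - sky * aperture.sum()
    print(f"background-subtracted flux: {flux:.1f} counts")   # ~800
    ```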

  20. A Fully Automated Pipeline for Classification Tasks with an Application to Remote Sensing

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning is currently in the spotlight owing to its victories in major competitions, which has pushed 'shallow' machine learning methods, the relatively simple and convenient algorithms commonly used by industrial engineers, into the background despite their advantages, such as the small amount of time and training data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without special expertise, an expensive computing environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process: feature selection, feature weighting, and selection of the most suitable classifier with optimized hyperparameters. The configuration uses particle swarm optimization, a well-known metaheuristic that is generally fast and effective, which enables us not only to optimize (hyper)parameters but also to determine the appropriate features and classifier for the problem, choices that have conventionally been made a priori from domain knowledge or handled with naive methods such as grid search. In experiments with the MNIST and CIFAR-10 datasets, common computer-vision benchmarks for character recognition and object recognition respectively, our automated learning approach achieves high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance remains robust almost without modification even on a remote sensing object recognition problem, which suggests that our approach can contribute to general classification problems.
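
    A highly simplified particle swarm over a single hyperparameter, with a synthetic objective standing in for cross-validated accuracy; the paper's configuration also searches over features and classifiers, which is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def score(c):                        # toy stand-in for CV accuracy (peak ~3.16)
        return -(np.log10(c) - 0.5) ** 2

    pos = rng.uniform(0.01, 100.0, 8)    # particle positions (e.g. SVM C values)
    vel = np.zeros_like(pos)
    best_p = pos.copy()                  # each particle's best position so far
    best_g = pos[np.argmax([score(p) for p in pos])]   # swarm's best position

    for _ in range(30):
        r1, r2 = rng.random(8), rng.random(8)
        vel = 0.7 * vel + 1.5 * r1 * (best_p - pos) + 1.5 * r2 * (best_g - pos)
        pos = np.clip(pos + vel, 0.01, 100.0)
        better = [score(p) > score(b) for p, b in zip(pos, best_p)]
        best_p[better] = pos[better]
        best_g = best_p[np.argmax([score(p) for p in best_p])]

    print(f"best hyperparameter found: {best_g:.2f}")
    ```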

  21. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    NASA Astrophysics Data System (ADS)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope communicates the source concerned and the other nodes follow up with multi-band observations, taking advantage of their location in strategically separated time zones. Ultimately we wish to investigate the applicability of shock-in-jet and geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.
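
    The alerting rule can be sketched as a simple threshold test on the differential light curve: if the newest measurement departs from the running baseline by more than a few standard deviations, the partner node is notified for multi-band follow-up. The threshold and magnitudes below are illustrative.

    ```python
    import numpy as np

    def is_variable(mags, n_sigma=3.0):
        mags = np.asarray(mags)
        baseline = mags[:-1].mean()              # history excluding newest point
        scatter = mags[:-1].std(ddof=1)
        return abs(mags[-1] - baseline) > n_sigma * scatter

    light_curve = [14.21, 14.19, 14.22, 14.20, 14.05]   # blazar brightens
    if is_variable(light_curve):
        print("alert partner node: begin multi-band follow-up")
    ```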

  22. ALMA Pipeline: Current Status

    NASA Astrophysics Data System (ADS)

    Shinnaga, H.; Humphreys, E.; Indebetouw, R.; Villard, E.; Kern, J.; Davis, L.; Miura, R. E.; Nakazato, T.; Sugimoto, K.; Kosugi, G.; Akiyama, E.; Muders, D.; Wyrowski, F.; Williams, S.; Lightfoot, J.; Kent, B.; Momjian, E.; Hunter, T.; ALMA Pipeline Team

    2015-12-01

    The ALMA Pipeline is the automated data reduction tool that runs on ALMA data. The current version produces science-quality data products for standard interferometric observing modes, up to the calibration stage. The ALMA Pipeline comprises (1) heuristics, in the form of Python scripts, that select the best processing parameters, and (2) contexts, which are kept for book-keeping of the data processing. The ALMA Pipeline produces a "weblog" of detailed plots that lets users judge how each calibration step was handled. The ALMA Interferometric Pipeline was conditionally accepted in March 2014 after processing Cycle 0 and Cycle 1 data sets. From Cycle 2, the ALMA Pipeline is used for ALMA data reduction and quality assurance for projects whose observing modes it supports. Pipeline tasks are based on CASA version 4.2.2, and the first public pipeline release, called CASA 4.2.2-pipe, has been available since October 2014. ALMA data can be reduced with both CASA tasks and pipeline tasks using CASA version 4.2.2-pipe.

  23. Main Pipelines Corrosion Monitoring Device

    NASA Astrophysics Data System (ADS)

    Anatoliy, Bazhenov; Galina, Bondareva; Natalia, Grivennaya; Sergey, Malygin; Mikhail, Goryainov

    2017-01-01

    The aim of this article is to substantiate a technical solution for monitoring corrosion changes in oil and gas pipelines using an electromagnetic NDT method. Pipeline wall thinning under operating conditions can lead to perforations and leakage of the transported product outside the pipeline, in most cases endangering human life and the environment. Monitoring corrosion changes in the pipeline's inner wall under operating conditions is complicated because pipelines are mainly made of structural steels whose conductive and magnetic properties impede passage of the test signal through the entire thickness of the object under study. The proposed solution monitors internal corrosion changes in pipes under operating conditions in order to increase pipeline safety by automatically predicting when corrosion will reach threshold pre-crash values.
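
    The "automated prediction of reaching threshold pre-crash values" amounts to extrapolating the wall-thickness trend; a linear toy version follows, with invented measurements.

    ```python
    import numpy as np

    years = np.array([0.0, 1.0, 2.0, 3.0])
    thickness_mm = np.array([12.0, 11.6, 11.1, 10.7])   # corroding pipe wall
    threshold_mm = 8.0                                   # pre-crash limit

    slope, intercept = np.polyfit(years, thickness_mm, 1)
    t_cross = (threshold_mm - intercept) / slope
    print(f"threshold reached in ~{t_cross:.1f} years; schedule intervention")
    ```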

  24. CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline.

    PubMed

    Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V; Mahurkar, Anup; Fricke, W Florian

    2017-04-27

    The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. CloVR-Comparative runs reference-free multiple whole-genome alignments to determine unique, shared and core coding sequences (CDSs) and single nucleotide polymorphisms (SNPs). Output includes short summary reports and detailed text-based results files, graphical visualizations (phylogenetic trees, circular figures), and a database file linked to the Sybil comparative genome browser. Data upload and download, pipeline configuration and monitoring, and access to Sybil are managed through the CloVR-Comparative web interface. CloVR-Comparative and Sybil are distributed as part of the CloVR virtual appliance, which runs on local computers or the Amazon EC2 cloud. Representative datasets (e.g. 40 draft and complete Escherichia coli genomes) are processed in <36 h on a local desktop or at a cost of <$20 on EC2. CloVR-Comparative allows anybody with Internet access to run comparative genomics projects, while eliminating the need for on-site computational resources and expertise.
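
    The unique/shared/core CDS bookkeeping reduces to set algebra over per-genome gene sets; in the real pipeline these sets come from the reference-free whole-genome alignments, not hand-written lists like the ones below.

    ```python
    genomes = {
        "strainA": {"geneA", "geneB", "geneC"},
        "strainB": {"geneA", "geneB", "geneE"},
        "strainC": {"geneA", "geneC", "geneD"},
    }

    core = set.intersection(*genomes.values())   # CDSs present in every genome
    pan = set.union(*genomes.values())           # all CDSs observed
    unique = {
        name: genes - set.union(*(g for n, g in genomes.items() if n != name))
        for name, genes in genomes.items()
    }
    print("core:", core)                             # {'geneA'}
    print("unique to strainB:", unique["strainB"])   # {'geneE'}
    ```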

  25. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    PubMed

    O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.

  26. Transforming Microbial Genotyping: A Robotic Pipeline for Genotyping Bacterial Strains

    PubMed Central

    Velayudhan, Vimalkumar; Murphy, Ronan A.; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost. PMID:23144721

  27. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    PubMed

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
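
    Two of the named steps, band-pass filtering and average re-referencing, can be sketched generically with SciPy on synthetic data; HAPPE itself includes many more stages (artifact rejection, quality reporting), so this is only an analogy of two steps, not HAPPE's implementation.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                                  # sampling rate (Hz)
    eeg = np.random.default_rng(0).standard_normal((32, 5000))  # channels x samples

    b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)      # 1-40 Hz band-pass
    filtered = filtfilt(b, a, eeg, axis=1)                      # zero-phase filter

    rereferenced = filtered - filtered.mean(axis=0)             # average reference
    print(rereferenced.shape)
    ```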

  28. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    PubMed Central

    Gabard-Durnam, Laurel J.; Mendez Leal, Adriana S.; Wilkinson, Carol L.; Levin, April R.

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and

  29. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    PubMed

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software incompatibility, complicated configuration, and lack of access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations, all via a one-line command, therefore allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and provides quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
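
    The consensus step can be pictured as normalizing every caller's output to (chrom, pos, ref, alt) tuples and keeping majority-supported calls; SeqMule's actual normalization and intersection machinery is more involved, and the variants below are fabricated.

    ```python
    from collections import Counter

    caller_calls = {
        "callerA": {("chr1", 1001, "A", "G"), ("chr2", 5002, "T", "C")},
        "callerB": {("chr1", 1001, "A", "G"), ("chr3", 42, "G", "T")},
        "callerC": {("chr1", 1001, "A", "G"), ("chr2", 5002, "T", "C")},
    }

    support = Counter(v for calls in caller_calls.values() for v in calls)
    consensus = sorted(v for v, n in support.items() if n >= 2)  # 2-of-3 majority
    print(consensus)
    ```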

  30. The ALMA Science Pipeline: Current Status

    NASA Astrophysics Data System (ADS)

    Humphreys, Elizabeth; Miura, Rie; Brogan, Crystal L.; Hibbard, John; Hunter, Todd R.; Indebetouw, Remy

    2016-09-01

    The ALMA Science Pipeline is being developed for the automated calibration and imaging of ALMA interferometric and single-dish data. The calibration Pipeline for interferometric data was accepted for use by ALMA Science Operations in 2014, and the Pipeline for end-to-end processing of single-dish data in 2015. However, work is ongoing to expand the use cases for which the Pipeline can be used, e.g., higher-frequency and lower signal-to-noise datasets, and new observing modes. A current focus is the commissioning of science target imaging for interferometric data. For the Single Dish Pipeline, the line-finding algorithm used in baseline subtraction and the baseline-flagging heuristics have been greatly improved since the prototype used for data from the previous cycle. These algorithms, unique to the Pipeline, produce better results than standard manual processing in many cases. In this poster, we report on the current status of the Pipeline capabilities and present initial results from the Imaging Pipeline and the smart line-finding and flagging algorithm used in the Single Dish Pipeline. The Pipeline is released as part of CASA (the Common Astronomy Software Applications package).
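
    Line finding during baseline subtraction can be illustrated with generic iterative sigma-clipping: fit a baseline to unmasked channels, flag channels with large residuals as line candidates, and refit. This is a textbook technique, not the Pipeline's actual heuristic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    spectrum = rng.normal(0.0, 0.1, 512)
    spectrum[200:215] += 1.5                      # synthetic emission line

    mask = np.ones(spectrum.size, dtype=bool)     # True = baseline channel
    channels = np.arange(spectrum.size)
    for _ in range(5):                            # iterate fit-and-clip
        coeffs = np.polyfit(channels[mask], spectrum[mask], 1)
        resid = spectrum - np.polyval(coeffs, channels)
        mask = resid < 3.0 * resid[mask].std()

    print("line channels:", channels[~mask])      # ~channels 200-214 flagged
    ```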

  31. The HTS barcode checker pipeline, a tool for automated detection of illegally traded species from high-throughput sequencing data.

    PubMed

    Lammers, Youri; Peelen, Tamara; Vos, Rutger A; Gravendeel, Barbara

    2014-02-06

    Mixtures of internationally traded organic substances can contain parts of species protected by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). These mixtures often raise the suspicion of border control and customs offices, which can lead to confiscation, for example in the case of Traditional Chinese medicines (TCMs). High-throughput sequencing of DNA barcoding markers obtained from such samples provides insight into species constituents of mixtures, but manual cross-referencing of results against the CITES appendices is labor intensive. Matching DNA barcodes against NCBI GenBank using BLAST may yield misleading results both as false positives, due to incorrectly annotated sequences, and false negatives, due to spurious taxonomic re-assignment. Incongruence between the taxonomies of CITES and NCBI GenBank can result in erroneous estimates of illegal trade. The HTS barcode checker pipeline is an application for automated processing of sets of 'next generation' barcode sequences to determine whether these contain DNA barcodes obtained from species listed on the CITES appendices. This analytical pipeline builds upon and extends existing open-source applications for BLAST matching against the NCBI GenBank reference database and for taxonomic name reconciliation. In a single operation, reads are converted into taxonomic identifications matched with names on the CITES appendices. By inclusion of a blacklist and additional names databases, the HTS barcode checker pipeline prevents false positives and resolves taxonomic heterogeneity. The HTS barcode checker pipeline can detect and correctly identify DNA barcodes of CITES-protected species from reads obtained from TCM samples in just a few minutes. The pipeline facilitates and improves molecular monitoring of trade in endangered species, and can aid in safeguarding these species from extinction in the wild. The HTS barcode checker pipeline is available at https://github.com/naturalis/HTS-barcode-checker.
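
    The cross-referencing step amounts to a table lookup with a blacklist filter, as sketched below; the appendix table, blacklist, and BLAST hits are all placeholders.

    ```python
    cites_appendix = {"Panthera tigris": "I", "Aloe ferox": "II"}   # toy table
    blacklist = {"AB123456"}          # accessions with unreliable annotation

    blast_hits = [("read1", "AB123456", "Panthera tigris"),
                  ("read2", "KX999999", "Panthera tigris"),
                  ("read3", "KY111111", "Homo sapiens")]

    for read, accession, taxon in blast_hits:
        if accession in blacklist:
            continue                   # skip known mis-annotated references
        if taxon in cites_appendix:
            print(f"{read}: {taxon} is on CITES Appendix {cites_appendix[taxon]}")
    ```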

  32. The HTS barcode checker pipeline, a tool for automated detection of illegally traded species from high-throughput sequencing data

    PubMed Central

    2014-01-01

    Background Mixtures of internationally traded organic substances can contain parts of species protected by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). These mixtures often raise the suspicion of border control and customs offices, which can lead to confiscation, for example in the case of Traditional Chinese medicines (TCMs). High-throughput sequencing of DNA barcoding markers obtained from such samples provides insight into species constituents of mixtures, but manual cross-referencing of results against the CITES appendices is labor intensive. Matching DNA barcodes against NCBI GenBank using BLAST may yield misleading results both as false positives, due to incorrectly annotated sequences, and false negatives, due to spurious taxonomic re-assignment. Incongruence between the taxonomies of CITES and NCBI GenBank can result in erroneous estimates of illegal trade. Results The HTS barcode checker pipeline is an application for automated processing of sets of 'next generation' barcode sequences to determine whether these contain DNA barcodes obtained from species listed on the CITES appendices. This analytical pipeline builds upon and extends existing open-source applications for BLAST matching against the NCBI GenBank reference database and for taxonomic name reconciliation. In a single operation, reads are converted into taxonomic identifications matched with names on the CITES appendices. By inclusion of a blacklist and additional names databases, the HTS barcode checker pipeline prevents false positives and resolves taxonomic heterogeneity. Conclusions The HTS barcode checker pipeline can detect and correctly identify DNA barcodes of CITES-protected species from reads obtained from TCM samples in just a few minutes. The pipeline facilitates and improves molecular monitoring of trade in endangered species, and can aid in safeguarding these species from extinction in the wild. The HTS barcode checker pipeline is available at https://github.com/naturalis/HTS-barcode-checker.

  33. ToTem: a tool for variant calling pipeline optimization.

    PubMed

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
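
    Conceptually, the benchmarking loop evaluates each candidate setting with cross-validation and ranks the settings by the resulting metric; the toy below swaps the variant-calling pipeline for a scikit-learn classifier so the sketch stays self-contained.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, random_state=0)

    for c in (0.01, 0.1, 1.0, 10.0):          # candidate "pipeline settings"
        f1 = cross_val_score(LogisticRegression(C=c, max_iter=1000),
                             X, y, cv=5, scoring="f1").mean()
        print(f"setting C={c}: cross-validated F-measure = {f1:.3f}")
    ```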

  14. Data processing pipeline for Herschel HIFI

    NASA Astrophysics Data System (ADS)

    Shipman, R. F.; Beaulieu, S. F.; Teyssier, D.; Morris, P.; Rengel, M.; McCoey, C.; Edwards, K.; Kester, D.; Lorenzani, A.; Coeur-Joly, O.; Melchior, M.; Xie, J.; Sanchez, E.; Zaal, P.; Avruch, I.; Borys, C.; Braine, J.; Comito, C.; Delforge, B.; Herpin, F.; Hoac, A.; Kwon, W.; Lord, S. D.; Marston, A.; Mueller, M.; Olberg, M.; Ossenkopf, V.; Puga, E.; Akyilmaz-Yabaci, M.

    2017-12-01

    Context. The HIFI instrument on the Herschel Space Observatory performed over 9100 astronomical observations, almost 900 of which were calibration observations in the course of the nearly four-year Herschel mission. The data from each observation had to be converted from raw telemetry into calibrated products and were included in the Herschel Science Archive. Aims: The HIFI pipeline was designed to provide robust conversion from raw telemetry into calibrated data throughout all phases of the HIFI mission. Pre-launch laboratory testing was supported, as were routine mission operations. Methods: A modular software design allowed components to be easily added, removed, amended and/or extended as the understanding of the HIFI data developed during and after mission operations. Results: The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment as well as within an interactive environment. The same software can be used by the general astronomical community to reprocess any standard HIFI observation. The pipeline also recorded the consistency of processing results and provided automated quality reports. Many pipeline modules had been in use since the HIFI pre-launch instrument-level testing. Conclusions: Processing in steps facilitated data analysis to discover and address instrument artefacts and uncertainties. The availability of the same pipeline components from pre-launch throughout the mission made for well-understood, tested, and stable processing. A smooth transition from one phase to the next significantly enhanced processing reliability and robustness. Herschel was an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
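
    The modular design described here amounts to a harness running an editable list of interchangeable steps. The actual HIFI pipeline ran inside the Herschel processing environment; this Python fragment is purely a schematic of the plug-in architecture:

        # Schematic of a modular pipeline: steps can be added, removed, or
        # replaced without touching the harness. Illustrative only.
        def subtract_baseline(data):
            return {**data, "baseline_removed": True}

        def calibrate_intensity(data):
            return {**data, "calibrated": True}

        PIPELINE = [subtract_baseline, calibrate_intensity]   # amend or extend freely

        def run(data, steps=PIPELINE):
            for step in steps:
                data = step(data)
            return data

        print(run({"raw": [1, 2, 3]}))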

  15. Solvepol: A Reduction Pipeline for Imaging Polarimetry Data

    NASA Astrophysics Data System (ADS)

    Ramírez, Edgar A.; Magalhães, Antônio M.; Davidson, James W., Jr.; Pereyra, Antonio; Rubinho, Marcelo

    2017-05-01

    We present a new, fully automated data pipeline, Solvepol, designed to reduce and analyze polarimetric data. It has been optimized for imaging data from IAGPOL, the calcite Savart prism plate-based polarimeter of the Instituto de Astronomía, Geofísica e Ciências Atmosféricas (IAG) of the University of São Paulo (USP). Solvepol is also the basis of a reduction pipeline for the wide-field optical polarimeter that will execute SOUTH POL, a survey of the polarized southern sky. Solvepol was written in the Interactive Data Language (IDL) and is based on the Image Reduction and Analysis Facility (IRAF) task PCCDPACK, developed by our polarimetry group. We present and discuss reduced data from standard stars and other fields and compare these results with those obtained in the IRAF environment. Our analysis shows that Solvepol, in addition to being a fully automated pipeline, produces results consistent with those reduced by PCCDPACK and reported in the literature.

  16. Immersion probe arrays for rapid pipeline weld inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebsack, S.; Heckhauser, H.

    In 1992, F.H. Gottfeld, Herne, Germany, a member of the SGS Group (Societe Generale de Surveillance), and Krautkramer Branson, Köln, undertook production of a rapid automated ultrasonic testing (UT) system to inspect manually and machine welded pipeline girth welds. The result of the project is a system called MIPA, or multiple immersion probe array. The advantages of using UT to detect certain weld defects have been recognized for many years; however, for some applications the time required for UT has been a limiting factor. Where time has not been a factor, automated ultrasonic technology has provided a reliable solution to many inspection problems across a broad industrial base. The recent past has seen the entrance of automated ultrasonic technology into the harsh and demanding environment of pipelay operations. However, the use of these systems has been focused on automated welding processes. Their effectiveness for manual pipeline welding inspection is contested, due to the infinite variability of joint alignment and shape that is unavoidable even when highly skilled welders are used.

  17. The ORAC-DR data reduction pipeline

    NASA Astrophysics Data System (ADS)

    Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.

    2008-03-01

    The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.

  18. PRIMO: An Interactive Homology Modeling Pipeline.

    PubMed

    Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.

  19. PRIMO: An Interactive Homology Modeling Pipeline

    PubMed Central

    Glenister, Michael

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192

  20. AGAPE (Automated Genome Analysis PipelinE) for Pan-Genome Analysis of Saccharomyces cerevisiae

    PubMed Central

    Song, Giltae; Dickins, Benjamin J. A.; Demeter, Janos; Engel, Stacia; Dunn, Barbara; Cherry, J. Michael

    2015-01-01

    The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we sequenced or re-sequenced, at high quality and using advanced sequencing technology, the genomes of 25 strains that are commonly used in the yeast research community. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community. PMID:25781462

  1. TriAnnot: A Versatile and High Performance Pipeline for the Automated Annotation of Plant Genomes

    PubMed Central

    Leroy, Philippe; Guilhot, Nicolas; Sakai, Hiroaki; Bernard, Aurélien; Choulet, Frédéric; Theil, Sébastien; Reboux, Sébastien; Amano, Naoki; Flutre, Timothée; Pelegrin, Céline; Ohyanagi, Hajime; Seidel, Michael; Giacomoni, Franck; Reichstadt, Mathieu; Alaux, Michael; Gicquello, Emmanuelle; Legeai, Fabrice; Cerutti, Lorenzo; Numa, Hisataka; Tanaka, Tsuyoshi; Mayer, Klaus; Itoh, Takeshi; Quesneville, Hadi; Feuillet, Catherine

    2012-01-01

    In support of the international effort to obtain a reference sequence of the bread wheat genome and to provide plant communities dealing with large and complex genomes with a versatile, easy-to-use online automated tool for annotation, we have developed the TriAnnot pipeline. Its modular architecture allows for the annotation and masking of transposable elements, the structural and functional annotation of protein-coding genes with an evidence-based quality indexing, and the identification of conserved non-coding sequences and molecular markers. The TriAnnot pipeline is parallelized on a 712 CPU computing cluster that can run a 1-Gb sequence annotation in less than 5 days. It is accessible through a web interface for small scale analyses or through a server for large scale annotations. The performance of TriAnnot was evaluated in terms of sensitivity, specificity, and general fitness using curated reference sequence sets from rice and wheat. In less than 8 h, TriAnnot was able to predict more than 83% of the 3,748 CDS from rice chromosome 1 with a fitness of 67.4%. On a set of 12 reference Mb-sized contigs from wheat chromosome 3B, TriAnnot predicted and annotated 93.3% of the genes, among which 54% were perfectly identified in accordance with the reference annotation. It also allowed the curation of 12 genes based on new biological evidence, increasing the percentage of perfect gene prediction to 63%. TriAnnot systematically showed a higher fitness than other annotation pipelines that are not tailored to wheat. As it is easily adaptable to the annotation of other plant genomes, TriAnnot should become a useful resource for the annotation of large and complex genomes in the future. PMID:22645565

  2. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    PubMed

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.
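
    The geometric side of such normalization can be pictured as resampling an image through a smooth displacement field. A minimal NumPy/SciPy sketch of that idea, a stand-in for RAIN's volume-invariant B-spline model rather than the RAIN code itself:

        # Resample a synthetic "gel" image through a smooth displacement field
        # using cubic spline interpolation; illustrative of warp correction only.
        import numpy as np
        from scipy.ndimage import map_coordinates

        gel = np.random.rand(64, 64)                      # synthetic gel image
        yy, xx = np.mgrid[0:64, 0:64].astype(float)

        # Smooth, low-order displacement field (here a gentle quadratic bow).
        dx = 0.002 * (yy - 32) ** 2
        dy = np.zeros_like(dx)

        warped = map_coordinates(gel, [yy + dy, xx + dx], order=3, mode="nearest")
        print(warped.shape)                               # (64, 64)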

  3. An Aperture Photometry Pipeline for K2 Data

    NASA Astrophysics Data System (ADS)

    Buzasi, Derek L.; Carboneau, Lindsey; Lezcano, Andy; Vydra, Ekaterina

    2016-01-01

    As part of an ongoing research program with undergraduate students at Florida Gulf Coast University, we have constructed an aperture photometry pipeline for K2 data. The pipeline performs dynamic automated aperture mask definition for all targets in the K2 fields, followed by aperture photometry and detrending. Our pipeline is currently used to support a number of projects, including studies of stellar rotation and activity, red giant asteroseismology, gyrochronology, and exoplanet searches. In addition, output is used to support an undergraduate class on exoplanets aimed at a student audience of both majors and non-majors. The pipeline is designed for both batch and single-target use, and is easily extensible to data from other missions, and pipeline output is available to the community. This paper will describe our pipeline and its capabilities and illustrate the quality of the results, drawing on all of the applications for which it is currently used.
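
    The basic operation underneath such a pipeline, summing background-subtracted flux inside a mask, fits in a few lines. A minimal NumPy sketch (illustrative; the FGCU pipeline adds dynamic mask definition and detrending on top of this):

        # Aperture photometry on a target pixel stamp with a crude sky estimate.
        import numpy as np

        def aperture_flux(stamp, cx, cy, radius):
            yy, xx = np.indices(stamp.shape)
            mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
            background = np.median(stamp[~mask])      # sky level from outside pixels
            return float(np.sum(stamp[mask] - background))

        stamp = np.random.poisson(100, (11, 11)).astype(float)
        stamp[5, 5] += 5000.0                         # synthetic star
        print(aperture_flux(stamp, cx=5, cy=5, radius=3))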

  4. MToolBox: a highly automated pipeline for heteroplasmy annotation and prioritization analysis of human mitochondrial variants in high-throughput sequencing

    PubMed Central

    Diroma, Maria Angela; Santorsola, Mariangela; Guttà, Cristiano; Gasparre, Giuseppe; Picardi, Ernesto; Pesole, Graziano; Attimonelli, Marcella

    2014-01-01

    Motivation: The increasing availability of mitochondria-targeted and off-target sequencing data in whole-exome and whole-genome sequencing studies (WXS and WGS) has raised the demand for effective pipelines to accurately measure heteroplasmy and to easily recognize the most functionally important mitochondrial variants among a huge number of candidates. To this purpose, we developed MToolBox, a highly automated pipeline to reconstruct and analyze human mitochondrial DNA from high-throughput sequencing data. Results: MToolBox implements an effective computational strategy for mitochondrial genome assembly and haplogroup assignment, also including a prioritization analysis of detected variants. MToolBox provides a Variant Call Format file featuring, for the first time, allele-specific heteroplasmy and annotation files with prioritized variants. MToolBox was tested on simulated samples and applied on 1000 Genomes WXS datasets. Availability and implementation: MToolBox package is available at https://sourceforge.net/projects/mtoolbox/. Contact: marcella.attimonelli@uniba.it Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028726
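
    At its core, allele-specific heteroplasmy is the fraction of reads supporting the alternate base at a given mitochondrial position. A minimal sketch with hypothetical pileup counts, not MToolBox code:

        # Heteroplasmy as the alternate-allele read fraction at one mtDNA position.
        def heteroplasmy_fraction(base_counts, alt):
            """base_counts: dict base -> read count at one position."""
            total = sum(base_counts.values())
            return base_counts.get(alt, 0) / total if total else 0.0

        counts = {"A": 180, "G": 20}                # hypothetical pileup
        print(heteroplasmy_fraction(counts, "G"))   # 0.1 -> 10% heteroplasmy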

  5. ASTROPOP: ASTROnomical Polarimetry and Photometry pipeline

    NASA Astrophysics Data System (ADS)

    Campagnolo, Julio C. N.

    2018-05-01

    ASTROPOP reduces almost any CCD photometry and image polarimetry data. For photometry reduction, the code performs source finding, aperture and PSF photometry, astrometry calibration using different automated and non-automated methods, and automated source identification and magnitude calibration based on online and local catalogs. For polarimetry, the code resolves linear and circular Stokes parameters produced by image beam splitter or polarizer polarimeters. In addition to the modular functions, ready-to-use pipelines based on configuration files and header keys are also provided with the code. ASTROPOP was initially developed to reduce data from the IAGPOL polarimeter installed at Observatório Pico dos Dias (Brazil).
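
    For dual-beam instruments of this kind, the normalized flux difference between the two beams modulates with half-wave-plate angle as z(θ) = q·cos 4θ + u·sin 4θ, so q and u follow from a least-squares fit. A minimal NumPy sketch with synthetic fluxes, illustrative of the technique rather than the ASTROPOP implementation:

        # Fit linear Stokes q, u from dual-beam fluxes at four waveplate angles.
        import numpy as np

        theta = np.deg2rad([0.0, 22.5, 45.0, 67.5])
        i_ord = np.array([105.0, 102.0, 95.0, 98.0])   # synthetic ordinary beam
        i_ext = np.array([95.0, 98.0, 105.0, 102.0])   # synthetic extraordinary beam
        z = (i_ord - i_ext) / (i_ord + i_ext)          # normalized differences

        design = np.column_stack([np.cos(4 * theta), np.sin(4 * theta)])
        (q, u), *_ = np.linalg.lstsq(design, z, rcond=None)
        p = np.hypot(q, u)                             # degree of linear polarization
        print(q, u, p)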

  6. Aerial image databases for pipeline rights-of-way management

    NASA Astrophysics Data System (ADS)

    Jadkowski, Mark A.

    1996-03-01

    Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with fewer people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.

  7. Automated protein structure modeling in CASP9 by I-TASSER pipeline combined with QUARK-based ab initio folding and FG-MD-based structure refinement

    PubMed Central

    Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang

    2011-01-01

    I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and FG-MD, were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from the PDB structures to guide molecular dynamics simulation and improve the local structure of predicted models, including hydrogen-bonding networks, torsion angles and steric clashes. Despite considerable progress in both the template-based and template-free structure modeling, significant improvements on protein target classification, domain parsing, model selection, and ab initio folding of beta-proteins are still needed to further improve the I-TASSER pipeline. PMID:22069036

  8. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    PubMed

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionarily robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific, evolutionarily robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.
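
    The scale of this design space is easy to make concrete: the number of synonymous DNA sequences for a protein is the product of the codon degeneracies of its residues. A short Python illustration with a hypothetical 10-residue fragment:

        # Count synonymous coding sequences via standard genetic-code degeneracy.
        from math import prod

        CODON_DEGENERACY = {  # codons per amino acid in the standard code
            "A": 4, "R": 6, "N": 2, "D": 2, "C": 2, "Q": 2, "E": 2, "G": 4,
            "H": 2, "I": 3, "L": 6, "K": 2, "M": 1, "F": 2, "P": 4, "S": 6,
            "T": 4, "W": 1, "Y": 2, "V": 4,
        }

        peptide = "MSKGEELFTG"  # hypothetical 10-residue fragment
        print(prod(CODON_DEGENERACY[aa] for aa in peptide))  # 36864 sequences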

  9. Bioinformatic pipelines in Python with Leaf

    PubMed Central

    2013-01-01

    Background: An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of a rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce the pipeline formality on top of a dynamic development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results: Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user’s Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions: Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
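
    Leaf's own graph syntax is not reproduced here, but the underlying idea, declaring an analysis as a dependency graph over ordinary Python functions so that each step runs only once its inputs exist, can be sketched as follows (illustrative only):

        # A pipeline as an explicit dependency graph over plain functions.
        def load():        return [3, 1, 2]
        def sort_data(x):  return sorted(x)
        def summarize(x):  return {"n": len(x), "max": x[-1]}

        GRAPH = {load: [], sort_data: [load], summarize: [sort_data]}  # node -> deps

        def run(node, cache=None):
            cache = {} if cache is None else cache
            if node not in cache:
                args = [run(dep, cache) for dep in GRAPH[node]]
                cache[node] = node(*args)   # session persistence would serialize this
            return cache[node]

        print(run(summarize))   # {'n': 3, 'max': 3}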

  10. DEWS (DEep White matter hyperintensity Segmentation framework): A fully automated pipeline for detecting small deep white matter hyperintensities in migraineurs.

    PubMed

    Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin

    2018-01-01

    Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy in detecting small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
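
    The two reported metrics follow directly from cluster-level counts. A minimal sketch with hypothetical counts chosen to land near the values quoted above:

        # PPV and TPR from true-positive, false-positive, and false-negative counts.
        def ppv_tpr(tp, fp, fn):
            ppv = tp / (tp + fp) if tp + fp else 0.0   # positive predictive value
            tpr = tp / (tp + fn) if tp + fn else 0.0   # true positive rate (sensitivity)
            return ppv, tpr

        # Hypothetical counts for detected WMH clusters vs. a manual reference:
        print(ppv_tpr(tp=68, fp=3, fn=29))   # ~ (0.96, 0.70)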

  11. Automated pipeline to analyze non-contact infrared images of the paraventricular nucleus specific leptin receptor knock-out mouse model

    NASA Astrophysics Data System (ADS)

    Diaz Martinez, Myriam; Ghamari-Langroudi, Masoud; Gifford, Aliya; Cone, Roger; Welch, E. B.

    2015-03-01

    Evidence of leptin resistance is indicated by elevated leptin levels together with other hallmarks of obesity such as a defect in energy homeostasis. As obesity is an increasing epidemic in the US, the investigation of mechanisms by which leptin resistance has a pathophysiological impact on energy is an intensive field of research. However, the manner in which leptin resistance contributes to the dysregulation of energy, specifically thermoregulation, is not known. The aim of this study was to investigate whether the leptin receptor expressed in paraventricular nucleus (PVN) neurons plays a role in thermoregulation at different temperatures. Non-contact infrared (NCIR) thermometry was employed to measure surface body temperature (SBT) of nonanesthetized mice with a specific deletion of the leptin receptor in the PVN after exposure to room (25 °C) and cold (4 °C) temperatures. Dorsal side infrared images of wild-type (LepR wt/wt; sim1-Cre), heterozygous (LepR flox/wt; sim1-Cre) and knock-out (LepR flox/flox; sim1-Cre) mice were collected. Images were input to an automated post-processing pipeline developed in MATLAB to calculate average and maximum SBTs. Linear regression was used to evaluate the relationship between sex, cold exposure and leptin genotype with SBT measurements. Findings indicate that average SBT has a negative relationship to the LepR flox/flox; sim1-Cre genotype, the female sex and cold exposure. However, max SBT is affected by the LepR flox/flox; sim1-Cre genotype and the female sex. In conclusion, these data suggest that leptin within the PVN may have a neuroendocrine role in thermoregulation and that NCIR thermometry combined with an automated image-processing pipeline is a promising approach to determine SBT in non-anesthetized mice.
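
    The core of such post-processing is segmenting the animal's pixels in a calibrated frame and taking the mean and maximum. A minimal NumPy sketch on synthetic data (the study's pipeline itself was implemented in MATLAB):

        # Average and maximum surface body temperature over a thresholded ROI.
        import numpy as np

        frame = 22.0 + 2.0 * np.random.rand(240, 320)             # scene, deg C
        frame[100:140, 150:200] = 35.0 + np.random.rand(40, 50)   # synthetic mouse

        mouse_mask = frame > 30.0          # simple threshold segmentation
        avg_sbt = frame[mouse_mask].mean()
        max_sbt = frame[mouse_mask].max()
        print(round(float(avg_sbt), 1), round(float(max_sbt), 1))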

  12. MG-Digger: An Automated Pipeline to Search for Giant Virus-Related Sequences in Metagenomes

    PubMed Central

    Verneau, Jonathan; Levasseur, Anthony; Raoult, Didier; La Scola, Bernard; Colson, Philippe

    2016-01-01

    The number of metagenomic studies conducted each year is growing dramatically. Storage and analysis of such big data is difficult and time-consuming. Interestingly, analysis shows that environmental and human metagenomes include a significant amount of non-annotated sequences, representing a ‘dark matter.’ We established a bioinformatics pipeline that automatically detects metagenome reads matching query sequences from a given set and applied this tool to the detection of sequences matching large and giant DNA viral members of the proposed order Megavirales or virophages. A total of 1,045 environmental and human metagenomes (≈ 1 Terabase) were collected, processed, and stored on our bioinformatics server. In addition, nucleotide and protein sequences from 93 Megavirales representatives, including 19 giant viruses of amoeba, and 5 virophages, were collected. The pipeline, entitled MG-Digger, consists of scripts written in the Python language. Metagenomes previously found to contain megavirus-like sequences were tested as controls. MG-Digger was able to annotate hundreds of metagenome sequences as best matching those of giant viruses. These sequences were most often found to be similar to phycodnavirus or mimivirus sequences, but included reads related to recently available pandoraviruses, Pithovirus sibericum, and faustoviruses. Compared to other tools, MG-Digger combined stand-alone use on Linux or Windows operating systems through a user-friendly interface, implementation of ready-to-use customized metagenome databases and query sequence databases, adjustable parameters for BLAST searches, and creation of output files containing selected reads with best match identification. Compared to Metavir 2, a reference tool in viral metagenome analysis, MG-Digger detected 8% more true positive Megavirales-related reads in a control metagenome. The present work shows that massive, automated and recurrent analyses of metagenomes are effective in improving knowledge about
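
    The generic operation that a tool like this automates at scale, filtering tabular BLAST hits by identity and e-value and keeping the best hit per read, can be sketched in a few lines. Field layout follows the standard 12-column tabular format; the thresholds and hit names are illustrative, not MG-Digger's defaults:

        # Keep the best BLAST hit per read, subject to identity/e-value cutoffs.
        import csv, io

        blast_tsv = (
            "read1\tMimivirus_L425\t91.2\t150\t13\t0\t1\t150\t200\t349\t1e-50\t180\n"
            "read1\tPhage_X\t75.0\t150\t37\t0\t1\t150\t10\t159\t1e-5\t60\n"
        )

        best = {}
        for row in csv.reader(io.StringIO(blast_tsv), delimiter="\t"):
            query, subject = row[0], row[1]
            identity, evalue, bits = float(row[2]), float(row[10]), float(row[11])
            if identity >= 80.0 and evalue <= 1e-10:
                if query not in best or bits > best[query][1]:
                    best[query] = (subject, bits)
        print(best)   # {'read1': ('Mimivirus_L425', 180.0)}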

  13. Automated Laser Ultrasonic Testing (ALUT) of Hybrid Arc Welds for Pipeline Construction, #272

    DOT National Transportation Integrated Search

    2009-12-22

    One challenge in developing new gas reserves is the high cost of pipeline construction. Welding costs are a major component of overall construction costs. Industry continues to seek advanced pipeline welding technologies to improve productivity and s...

  14. A computational genomics pipeline for prokaryotic sequencing projects.

    PubMed

    Kislyuk, Andrey O; Katz, Lee S; Agrawal, Sonia; Hagen, Matthew S; Conley, Andrew B; Jayaraman, Pushkala; Nelakuditi, Viswateja; Humphrey, Jay C; Sammons, Scott A; Govil, Dhwani; Mair, Raydel D; Tatti, Kathleen M; Tondella, Maria L; Harcourt, Brian H; Mayer, Leonard W; Jordan, I King

    2010-08-01

    New sequencing technologies have accelerated research on prokaryotic genomes and have made genome sequencing operations outside major genome sequencing centers routine. However, no off-the-shelf solution exists for the combined assembly, gene prediction, genome annotation and data presentation necessary to interpret sequencing data. The resulting requirement to invest significant resources into custom informatics support for genome sequencing projects remains a major impediment to the accessibility of high-throughput sequence data. We present a self-contained, automated high-throughput open source genome sequencing and computational genomics pipeline suitable for prokaryotic sequencing projects. The pipeline has been used at the Georgia Institute of Technology and the Centers for Disease Control and Prevention for the analysis of Neisseria meningitidis and Bordetella bronchiseptica genomes. The pipeline is capable of enhanced or manually assisted reference-based assembly using multiple assemblers and modes; gene predictor combining; and functional annotation of genes and gene products. Because every component of the pipeline is executed on a local machine with no need to access resources over the Internet, the pipeline is suitable for projects of a sensitive nature. Annotation of virulence-related features makes the pipeline particularly useful for projects working with pathogenic prokaryotes. The pipeline is licensed under the open-source GNU General Public License and available at the Georgia Tech Neisseria Base (http://nbase.biology.gatech.edu/). The pipeline is implemented with a combination of Perl, Bourne Shell and MySQL and is compatible with Linux and other Unix systems.

  15. The LCOGT Science Archive and Data Pipeline

    NASA Astrophysics Data System (ADS)

    Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.

    2013-01-01

    Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.

  16. Mathematical simulation for compensation capacities area of pipeline routes in ship systems

    NASA Astrophysics Data System (ADS)

    Ngo, G. V.; Sakhno, K. N.

    2018-05-01

    In this paper, the authors considered the problem of enhancing the manufacturability of ship-system pipelines at the design stage. An analysis of arrangements and possibilities for compensating deviations of pipeline routes has been carried out. The task was to produce the “fit pipe” together with the rest of the pipes in the route. It was proposed to compensate for deviations by moving the pipeline route during pipe installation and to calculate the maximum values of these displacements along the analyzed path. Theoretical bases for compensating deviations of pipeline routes using rotations of pairs of parallel pipe sections are developed. Mathematical and graphical simulations of the compensation capacities of pipeline routes with various configurations are completed. This work lays the groundwork for an automated program that will determine the compensatory capacity of a pipeline route and assign the necessary allowances.
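
    The geometry behind such compensation areas can be illustrated simply: rotating a straight section of length L by an angle a about its joint displaces the free end by 2·L·sin(a/2), and sweeping the permissible angles traces out the compensation region. A small NumPy sketch with hypothetical dimensions:

        # Free-end displacement of a rotated pipe section; values are illustrative.
        import numpy as np

        L = 2.0                                        # section length, m
        angles = np.deg2rad(np.linspace(-5, 5, 11))    # permissible rotations
        x = L * np.cos(angles)                         # free-end positions
        y = L * np.sin(angles)
        displacement = 2 * L * np.sin(np.abs(angles) / 2)
        print(round(float(displacement.max()), 3))     # ~0.175 m of compensation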

  17. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

    PubMed Central

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-01-01

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system. PMID:29462855

  18. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    PubMed

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.
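
    The angle measurement at the heart of both versions of this system reduces to converting the LED target's centroid offset into a deflection angle over the sighting distance. A minimal Python sketch with illustrative numbers (the published system was implemented in MATLAB):

        # Bore-path deflection angle from the target centroid offset.
        import math

        pixel_offset = 12.0          # centroid offset from image centre, pixels
        mm_per_pixel = 0.5           # target-plane scale from calibration
        distance_mm = 48_000.0       # sighting distance (48 m, as in the field test)

        lateral_mm = pixel_offset * mm_per_pixel
        deflection = math.degrees(math.atan2(lateral_mm, distance_mm))
        print(lateral_mm, round(deflection, 5))   # 6.0 mm, ~0.00716 degrees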

  19. Complex solution of problem of all-season construction of roads and pipelines on universal composite pontoon units

    NASA Astrophysics Data System (ADS)

    Ryabkov, A. V.; Stafeeva, N. A.; Ivanov, V. A.; Zakuraev, A. F.

    2018-05-01

    A complex has been designed, consisting of a universal floating pontoon road on whose body pipelines can be laid automatically, all year round and in any weather, in Siberia and the Far North. A new method is proposed for the construction of pipelines on pontoon modules made of composite materials. Pontoons made of composite materials for bedding pipelines, with track-forming guides for automated wheeled transport and a pipelayer, are designed. The proposed system eliminates the need to build a road along the route and ensures the buoyancy and smooth movement of the self-propelled automated stacker, shaped like a “centipede”, which has a number of significant advantages in the construction and operation of the entire complex in swampy and waterlogged areas without overburden work.

  20. Automated Sanger Analysis Pipeline (ASAP): A Tool for Rapidly Analyzing Sanger Sequencing Data with Minimum User Interference.

    PubMed

    Singh, Aditya; Bhatia, Prateek

    2016-12-01

    Sanger sequencing platforms, such as Applied Biosystems instruments, generate chromatogram files. Generally, for one region of a sequence, we use both forward and reverse primers to sequence that area; in that way, we have two sequences that need to be aligned and a consensus generated before mutation detection studies. This work is cumbersome and takes time, especially if the gene is large with many exons. Hence, we devised a rapid automated command system to filter, build, and align consensus sequences and also optionally extract exonic regions, translate them in all frames, and perform an amino acid alignment, starting from raw sequence data, within a very short time. Used to its full capabilities, ASAP is able to read "*.ab1" chromatogram files through a command-line interface, convert them to the FASTQ format, trim the low-quality regions, reverse-complement the reverse sequence, create a consensus sequence, extract the exonic regions using a reference exonic sequence, translate the sequence in all frames, and align the nucleic acid and amino acid sequences to reference nucleic acid and amino acid sequences, respectively. All files are created and can be used for further analysis. ASAP is available as a Python 3.x executable at https://github.com/aditya-88/ASAP. The version described in this paper is 0.28.
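
    The first few of these steps map naturally onto Biopython, which can parse ABI chromatograms (including base-call qualities) and apply Mott-style quality trimming via its "abi-trim" format. A minimal sketch, assuming Biopython is installed and a chromatogram file exists at the hypothetical path shown; this is not the ASAP source:

        # Read a chromatogram, convert to FASTQ, quality-trim, and orient the
        # reverse read to forward sense; file name is hypothetical.
        from Bio import SeqIO

        record = SeqIO.read("sample_R.ab1", "abi")        # parse the chromatogram
        SeqIO.write(record, "sample_R.fastq", "fastq")    # convert to FASTQ

        trimmed = SeqIO.read("sample_R.ab1", "abi-trim")  # Mott-style quality trimming
        rev = trimmed.reverse_complement(id=True, description=True)
        print(len(record), len(trimmed), rev.id)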

  1. Mary, a Pipeline to Aid Discovery of Optical Transients

    NASA Astrophysics Data System (ADS)

    Andreoni, I.; Jacobs, C.; Hegarty, S.; Pritchard, T.; Cooke, J.; Ryder, S.

    2017-09-01

    The ability to quickly detect transient sources in optical images and trigger multi-wavelength follow up is key for the discovery of fast transients. These include events rare and difficult to detect such as kilonovae, supernova shock breakout, and `orphan' Gamma-ray Burst afterglows. We present the Mary pipeline, a (mostly) automated tool to discover transients during high-cadenced observations with the Dark Energy Camera at Cerro Tololo Inter-American Observatory (CTIO). The observations are part of the `Deeper Wider Faster' programme, a multi-facility, multi-wavelength programme designed to discover fast transients, including counterparts to Fast Radio Bursts and gravitational waves. Our tests of the Mary pipeline on Dark Energy Camera images return a false positive rate of 2.2% and a missed fraction of 3.4% obtained in less than 2 min, which proves the pipeline to be suitable for rapid and high-quality transient searches. The pipeline can be adapted to search for transients in data obtained with imagers other than Dark Energy Camera.

  2. Pipeline scada upgrade uses satellite terminal system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conrad, W.; Skovrinski, J.R.

    In the recent automation of its supervisory control and data acquisition (scada) system, Transwestern Pipeline Co. has become the first to use very small aperture satellite terminals (VSATs) for scada. A subsidiary of Enron Interstate Pipeline, Houston, Transwestern moves natural gas through a 4,400-mile system from West Texas, New Mexico, and Oklahoma to southern California markets. Transwestern's modernization, begun in November 1985, addressed problems associated with its aging control equipment, which had been installed when the compressor stations were built in 1960. Over the years a combination of three different systems had been added. All were cumbersome to maintain and utilized outdated technology. Problems with reliability, high maintenance time, and difficulty in getting new parts were determining factors in Transwestern's decision to modernize its scada system. In addition, the pipeline was anticipating moving its control center from Roswell, N.M., to Houston and believed it would be impossible to marry the old system with the new computer equipment in Houston.

  3. StructRNAfinder: an automated pipeline and web server for RNA families prediction.

    PubMed

    Arias-Carrasco, Raúl; Vásquez-Morán, Yessenia; Nakaya, Helder I; Maracaja-Coutinho, Vinicius

    2018-02-17

    The function of many noncoding RNAs (ncRNAs) depends upon their secondary structures. Over the last decades, several methodologies have been developed to predict such structures or to use them to functionally annotate RNAs into RNA families. However, to fully perform this analysis, researchers must utilize multiple tools, which requires the constant parsing and processing of several intermediate files. This makes the large-scale prediction and annotation of RNAs a daunting task even to researchers with good computational or bioinformatics skills. We present an automated pipeline named StructRNAfinder that predicts and annotates RNA families in transcript or genome sequences. This single tool not only displays the sequence/structural consensus alignments for each RNA family, according to the Rfam database, but also provides a taxonomic overview for each assigned functional RNA. Moreover, we implemented a user-friendly web service that allows researchers to upload their own nucleotide sequences in order to perform the whole analysis. Finally, we provided a stand-alone version of StructRNAfinder to be used in large-scale projects. The tool was developed under the GNU General Public License (GPLv3) and is freely available at http://structrnafinder.integrativebioinformatics.me. The main advantage of StructRNAfinder lies in its large-scale processing and integration of the data obtained by each tool and database employed along the workflow; several output files are generated and displayed in user-friendly reports, useful for downstream analyses and data exploration.

  4. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    PubMed

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  5. The Minimal Preprocessing Pipelines for the Human Connectome Project

    PubMed Central

    Glasser, Matthew F.; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark

    2013-01-01

    The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinates spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP’s acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements for the pipelines. PMID:23668970

  6. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI

    PubMed Central

    Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667

  7. A computational genomics pipeline for prokaryotic sequencing projects

    PubMed Central

    Kislyuk, Andrey O.; Katz, Lee S.; Agrawal, Sonia; Hagen, Matthew S.; Conley, Andrew B.; Jayaraman, Pushkala; Nelakuditi, Viswateja; Humphrey, Jay C.; Sammons, Scott A.; Govil, Dhwani; Mair, Raydel D.; Tatti, Kathleen M.; Tondella, Maria L.; Harcourt, Brian H.; Mayer, Leonard W.; Jordan, I. King

    2010-01-01

    Motivation: New sequencing technologies have accelerated research on prokaryotic genomes and have made genome sequencing operations outside major genome sequencing centers routine. However, no off-the-shelf solution exists for the combined assembly, gene prediction, genome annotation and data presentation necessary to interpret sequencing data. The resulting requirement to invest significant resources into custom informatics support for genome sequencing projects remains a major impediment to the accessibility of high-throughput sequence data. Results: We present a self-contained, automated high-throughput open source genome sequencing and computational genomics pipeline suitable for prokaryotic sequencing projects. The pipeline has been used at the Georgia Institute of Technology and the Centers for Disease Control and Prevention for the analysis of Neisseria meningitidis and Bordetella bronchiseptica genomes. The pipeline is capable of enhanced or manually assisted reference-based assembly using multiple assemblers and modes; gene predictor combining; and functional annotation of genes and gene products. Because every component of the pipeline is executed on a local machine with no need to access resources over the Internet, the pipeline is suitable for projects of a sensitive nature. Annotation of virulence-related features makes the pipeline particularly useful for projects working with pathogenic prokaryotes. Availability and implementation: The pipeline is licensed under the open-source GNU General Public License and available at the Georgia Tech Neisseria Base (http://nbase.biology.gatech.edu/). The pipeline is implemented with a combination of Perl, Bourne Shell and MySQL and is compatible with Linux and other Unix systems. Contact: king.jordan@biology.gatech.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20519285

  8. Extending the Fermi-LAT data processing pipeline to the grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimmer, S.; Arrabito, L.; Glanzman, T.

    2015-05-12

    The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite, and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. Additionally, it receives heavy use in performing production Monte Carlo tasks.

  9. Open source pipeline for ESPaDOnS reduction and analysis

    NASA Astrophysics Data System (ADS)

    Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan

    2012-09-01

    OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".

  10. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    PubMed

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.

  11. The PREP pipeline: standardized preprocessing for large-scale EEG analysis

    PubMed Central

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A.

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode. PMID:26150785

  12. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

    The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
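
    As one concrete example of the kind of interplate systematic error such a pipeline must remove, consider a positional (per-well) bias that repeats across plates. The sketch below divides out a per-well pattern estimated across many plates; this is a generic HTS normalization used for illustration, not the authors' algorithm:

    ```python
    import numpy as np

    def correct_positional_bias(plates: np.ndarray) -> np.ndarray:
        """plates: (n_plates, n_rows, n_cols) array of raw well signals."""
        well_bias = np.median(plates, axis=0)     # systematic per-well pattern
        well_bias /= np.median(well_bias)         # scale so a typical well is 1.0
        return plates / well_bias                 # divide the pattern out

    # Example: 10 plates with an edge effect that doubles column 0.
    rng = np.random.default_rng(0)
    plates = rng.normal(100.0, 5.0, size=(10, 8, 12))
    plates[:, :, 0] *= 2.0
    print(correct_positional_bias(plates)[:, :, 0].mean())  # back near ~100
    ```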

  13. Slaying Hydra: A Python-Based Reduction Pipeline for the Hydra Multi-Object Spectrograph

    NASA Astrophysics Data System (ADS)

    Seifert, Richard; Mann, Andrew

    2018-01-01

    We present a Python-based data reduction pipeline for the Hydra Multi-Object Spectrograph on the WIYN 3.5 m telescope, an instrument which enables simultaneous spectroscopy of up to 93 targets. The reduction steps carried out include flat-fielding, dynamic fiber tracing, wavelength calibration, optimal fiber extraction, and sky subtraction. The pipeline also supports the use of sky lines to correct for zero-point offsets between fibers. To account for the moving parts on the instrument and telescope, fiber positions and wavelength solutions are derived in real-time for each dataset. The end result is a one-dimensional spectrum for each target fiber. Quick and fully automated, the pipeline enables on-the-fly reduction while observing, and has been known to outperform the IRAF pipeline by more accurately reproducing known RVs. While Hydra has many configurations in both high- and low-resolution, the pipeline was developed and tested with only one high-resolution mode. In the future we plan to expand the pipeline to work in most commonly used modes.

  14. 76 FR 6516 - Pipeline Safety: Agency Information Collection Activities: Notice of Request for Extension of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-04

    ... condition that is a hazard to life, property or the environment. Affected Public: Operators of pipeline... automated, electronic, mechanical, or other technological collection techniques. A comment to OMB is most...

  15. Structural health monitoring of pipelines rehabilitated with lining technology

    NASA Astrophysics Data System (ADS)

    Farhidzadeh, Alireza; Dehghan-Niri, Ehsan; Salamone, Salvatore

    2014-03-01

    Damage detection in pipeline systems is a tedious and time-consuming job due to digging requirements, limited accessibility, interference with other facilities, and the extremely widespread distribution of pipelines in metropolitan areas. Therefore, a real-time and automated monitoring system can greatly reduce labor, time, and expenditure. This paper presents the results of an experimental study aimed at monitoring the performance of full-scale pipe lining systems, subjected to static and dynamic (seismic) loading, using the Acoustic Emission (AE) technique and Guided Ultrasonic Waves (GUWs). In particular, two damage mechanisms are investigated: 1) delamination between pipeline and liner as the early indicator of damage, and 2) onset of nonlinearity and incipient failure of the liner as a critical damage state.

  16. BioBlend: automating pipeline analyses within Galaxy and CloudMan.

    PubMed

    Sloggett, Clare; Goonasekera, Nuwan; Afgan, Enis

    2013-07-01

    We present BioBlend, a unified API in a high-level language (python) that wraps the functionality of Galaxy and CloudMan APIs. BioBlend makes it easy for bioinformaticians to automate end-to-end large data analysis, from scratch, in a way that is highly accessible to collaborators, by allowing them to both provide the required infrastructure and automate complex analyses over large datasets within the familiar Galaxy environment. http://bioblend.readthedocs.org/. Automated installation of BioBlend is available via PyPI (e.g. pip install bioblend). Alternatively, the source code is available from the GitHub repository (https://github.com/afgane/bioblend) under the MIT open source license. The library has been tested and is working on Linux, Macintosh and Windows-based systems.
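
    A minimal usage sketch follows. The server URL and API key are placeholders, and the calls shown reflect BioBlend's documented Galaxy client to the best of our knowledge; consult the documentation linked above for the authoritative interface.

    ```python
    from bioblend.galaxy import GalaxyInstance

    # Placeholders: point these at your own Galaxy server and API key.
    gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

    histories = gi.histories.get_histories()   # list your Galaxy histories
    workflows = gi.workflows.get_workflows()   # list workflows you can run
    print([h["name"] for h in histories])
    print([w["name"] for w in workflows])
    ```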

  17. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    PubMed Central

    Gouret, Philippe; Vitiello, Vérane; Balandraud, Nathalie; Gilles, André; Pontarotti, Pierre; Danchin, Etienne GJ

    2005-01-01

    Background Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting the position and structure of genes and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous software tools, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still necessitate an important contribution from biologists for supervising and controlling the results at various steps. Results Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows, for example, key decisions to be made, intermediate results to be checked, or the dataset to be refined). The quality of the results produced by FIGENIX is comparable to that obtained by expert biologists, with a drastic gain in terms of time costs and avoidance of errors due to human manipulation of data. Conclusion The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could be easily adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, annotation of regulatory elements and other genomic features of interest. PMID:16083500

  18. CERES: A Set of Automated Routines for Echelle Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Espinoza, Néstor

    2017-03-01

    We present the Collection of Elemental Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction, and analysis of spectra acquired with different instruments, allowing homogeneous and standardized results to be obtained. This modular code includes tools for handling the different steps of the processing: CCD image reductions; identification and tracing of the echelle orders; optimal and rectangular extraction; computation of the wavelength solution; estimation of radial velocities; and rough and fast estimation of the atmospheric parameters. To date, CERES has been used to develop automated pipelines for 13 different spectrographs, namely CORALIE, FEROS, HARPS, ESPaDOnS, FIES, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS, and APO/ARCES, but the routines can be easily used to deal with data coming from other spectrographs. We show the high precision in radial velocity that CERES achieves for some of these instruments, and we briefly summarize some results that have already been obtained using the CERES pipelines.

  19. Toward practical 3D radiography of pipeline girth welds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wassink, Casper, E-mail: casper.wassink@applusrtd.com; Hol, Martijn, E-mail: martijn.hol@applusrtd.com; Flikweert, Arjan, E-mail: martijn.hol@applusrtd.com

    2015-03-31

    Digital radiography has made its way into in-the-field girth weld testing. With recent generations of detectors and x-ray tubes it is possible to reach the image quality desired in standards as well as the speed of inspection desired to be competitive with film radiography and automated ultrasonic testing. This paper will show the application of these technologies in the RTD Rayscan system. The method for achieving an image quality that complies with or even exceeds prevailing industrial standards will be presented, as well as the application on pipeline girth welds with CRA layers. A next step in development will be to also achieve a measurement of weld flaw height to allow for performing an Engineering Critical Assessment on the weld. This will allow for similar acceptance limits as currently used with Automated Ultrasonic Testing of pipeline girth welds. Although a sufficient sizing accuracy was already demonstrated and qualified in the TomoCAR system, testing in some applications is restricted to time limits. The paper will present some experiments that were performed to achieve flaw height approximation within these time limits.

  20. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    PubMed

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuity of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time-prohibitive, given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
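
    As an illustration of a single vessel-enhancement step of the kind such a pipeline chains together, the sketch below applies the multi-scale Frangi vesselness filter from scikit-image. It is a stand-in for the authors' full filter pipeline, and the sigma range is an assumed vessel-radius prior:

    ```python
    import numpy as np
    from skimage.filters import frangi

    def enhance_vessels(volume: np.ndarray) -> np.ndarray:
        """volume: 2D slice or 3D angiographic image, bright vessels on dark."""
        v = (volume - volume.min()) / (np.ptp(volume) + 1e-12)  # rescale to [0, 1]
        # Multi-scale tubularity response; sigmas span expected vessel radii (px).
        return frangi(v, sigmas=np.arange(1, 6), black_ridges=False)
    ```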

  1. Automated pulmonary lobar ventilation measurements using volume-matched thoracic CT and MRI

    NASA Astrophysics Data System (ADS)

    Guo, F.; Svenningsen, S.; Bluemke, E.; Rajchl, M.; Yuan, J.; Fenster, A.; Parraga, G.

    2015-03-01

    Objectives: To develop and evaluate an automated registration and segmentation pipeline for regional lobar pulmonary structure-function measurements, using volume-matched thoracic CT and MRI in order to guide therapy. Methods: Ten subjects underwent pulmonary function tests and volume-matched 1H and 3He MRI and thoracic CT during a single 2-hr visit. CT was registered to 1H MRI using an affine method that incorporated block-matching and this was followed by a deformable step using free-form deformation. The resultant deformation field was used to deform the associated CT lobe mask that was generated using commercial software. 3He-1H image registration used the same two-step registration method and 3He ventilation was segmented using hierarchical k-means clustering. Whole lung and lobar 3He ventilation and ventilation defect percent (VDP) were generated by mapping ventilation defects to CT-defined whole lung and lobe volumes. Target CT-3He registration accuracy was evaluated using region-, surface distance- and volume-based metrics. Automated whole lung and lobar VDP was compared with semi-automated and manual results using paired t-tests. Results: The proposed pipeline yielded regional spatial agreement of 88.0+/-0.9% and surface distance error of 3.9+/-0.5 mm. Automated and manual whole lung and lobar ventilation and VDP were not significantly different and they were significantly correlated (r = 0.77, p < 0.0001). Conclusion: The proposed automated pipeline can be used to generate regional pulmonary structural-functional maps with high accuracy and robustness, providing an important tool for image-guided pulmonary interventions.
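
    The ventilation-segmentation step can be sketched compactly: cluster the 3He signal within the lung volume and treat the lowest-intensity cluster as defect. The snippet below uses a single k-means pass in place of the hierarchical k-means the authors describe; all names are illustrative:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def ventilation_defect_percent(he3: np.ndarray, lung_mask: np.ndarray, k: int = 4):
        """he3: 3He ventilation image; lung_mask: CT/1H-derived lung volume."""
        vals = he3[lung_mask > 0].reshape(-1, 1).astype(float)
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(vals)
        means = [vals[labels == i].mean() for i in range(k)]
        defect = int(np.argmin(means))                   # lowest-signal cluster
        return 100.0 * float(np.mean(labels == defect))  # defect % of lung volume
    ```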

  2. 77 FR 70543 - Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-26

    .... PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline..., and safety policies for natural gas pipelines and for hazardous liquid pipelines. Both committees were...: Notice of advisory committee meeting. SUMMARY: This notice announces a public meeting of the Gas Pipeline...

  3. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and impacts.
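
    The automated first pass over free-text incident records can be as simple as keyword matching, with peer review downstream. A toy sketch, with illustrative trigger patterns rather than the study's actual criteria:

    ```python
    import re

    # Illustrative trigger patterns; the real study used richer criteria plus
    # external natural-hazard databases and manual peer review.
    NATECH_PATTERNS = {
        "flood":      r"\b(flood\w*|washout|scour)\b",
        "earthquake": r"\b(earthquake|seismic)\b",
        "landslide":  r"\b(landslide|slope failure|subsidence)\b",
        "lightning":  r"\blightning\b",
    }

    def flag_natech(narrative: str) -> list:
        """Return the natural-hazard categories mentioned in an incident report."""
        text = narrative.lower()
        return [hazard for hazard, pat in NATECH_PATTERNS.items() if re.search(pat, text)]

    print(flag_natech("Line ruptured after floodwaters undermined the crossing."))
    # -> ['flood']
    ```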

  4. Tapping into the Hexagon spy imagery database: A new automated pipeline for geomorphic change detection

    NASA Astrophysics Data System (ADS)

    Maurer, Joshua; Rupper, Summer

    2015-10-01

    Declassified historical imagery from the Hexagon spy satellite database has near-global coverage, yet remains a largely untapped resource for geomorphic change studies. Unavailable satellite ephemeris data make DEM (digital elevation model) extraction difficult in terms of time and accuracy. A new fully-automated pipeline for DEM extraction and image orthorectification is presented which yields accurate results and greatly increases efficiency over traditional photogrammetric methods, making the Hexagon image database much more appealing and accessible. A 1980 Hexagon DEM is extracted and geomorphic change computed for the Thistle Creek Landslide region in the Wasatch Range of North America to demonstrate an application of the new method. Surface elevation changes resulting from the landslide show an average elevation decrease of 14.4 ± 4.3 m in the source area, an increase of 17.6 ± 4.7 m in the deposition area, and a decrease of 30.2 ± 5.1 m resulting from a new roadcut. Two additional applications of the method include volume estimates of material excavated during the Mount St. Helens volcanic eruption and the volume of net ice loss over a 34-year period for glaciers in the Bhutanese Himalayas. These results show the value of Hexagon imagery in detecting and quantifying historical geomorphic change, especially in regions where other data sources are limited.
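
    The volume arithmetic behind numbers like these is straightforward: summed elevation change times cell area, with per-pixel DEM uncertainty propagated to the total. A sketch under the (optimistic) assumption of independent per-cell errors, with an illustrative 100 x 100-cell region and 10 m cells:

    ```python
    import numpy as np

    def volume_change(dz: np.ndarray, cell_area_m2: float, sigma_dz_m: float):
        """dz: per-cell elevation differences (m) inside the region of interest."""
        n = dz.size
        dv = dz.sum() * cell_area_m2            # net volume change, m^3
        # 1-sigma volume uncertainty assuming independent per-cell errors;
        # spatially correlated DEM error would make this larger.
        sigma_dv = sigma_dz_m * np.sqrt(n) * cell_area_m2
        return dv, sigma_dv

    # e.g. a 100 x 100-cell source area losing 14.4 m on average, 10 m cells:
    dz = np.full((100, 100), -14.4)
    print(volume_change(dz, cell_area_m2=100.0, sigma_dz_m=4.3))
    ```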

  5. 78 FR 70623 - Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. [[Page...

  6. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery

    NASA Astrophysics Data System (ADS)

    Shean, David E.; Alexandrov, Oleg; Moratto, Zachary M.; Smith, Benjamin E.; Joughin, Ian R.; Porter, Claire; Morin, Paul

    2016-06-01

    We adapted the automated, open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline a processing workflow for ˜0.5 m ground sample distance (GSD) DigitalGlobe WorldView-1 and WorldView-2 along-track stereo image data, with an overview of ASP capabilities, an evaluation of ASP correlator options, benchmark test results, and two case studies of DEM accuracy. Output DEM products are posted at ˜2 m with direct geolocation accuracy of <5.0 m CE90/LE90. An automated iterative closest-point (ICP) co-registration tool reduces absolute vertical and horizontal error to <0.5 m where appropriate ground-control data are available, with observed standard deviation of ˜0.1-0.5 m for overlapping, co-registered DEMs (n = 14, 17). While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We are leveraging these resources to produce dense time series and regional mosaics for the Earth's polar regions.

  7. Chimenea and other tools: Automated imaging of multi-epoch radio-synthesis data with CASA

    NASA Astrophysics Data System (ADS)

    Staley, T. D.; Anderson, G. E.

    2015-11-01

    In preparing the way for the Square Kilometre Array and its pathfinders, there is a pressing need to begin probing the transient sky in a fully robotic fashion using the current generation of radio telescopes. Effective exploitation of such surveys requires a largely automated data-reduction process. This paper introduces an end-to-end automated reduction pipeline, AMIsurvey, used for calibrating and imaging data from the Arcminute Microkelvin Imager Large Array. AMIsurvey makes use of several component libraries which have been packaged separately for open-source release. The most scientifically significant of these is chimenea, which implements a telescope-agnostic algorithm for automated imaging of pre-calibrated multi-epoch radio-synthesis data, of the sort typically acquired for transient surveys or follow-up. The algorithm aims to improve upon standard imaging pipelines by utilizing iterative RMS-estimation and automated source-detection to avoid so called 'Clean-bias', and makes use of CASA subroutines for the underlying image-synthesis operations. At a lower level, AMIsurvey relies upon two libraries, drive-ami and drive-casa, built to allow use of mature radio-astronomy software packages from within Python scripts. While targeted at automated imaging, the drive-casa interface can also be used to automate interaction with any of the CASA subroutines from a generic Python process. Additionally, these packages may be of wider technical interest beyond radio-astronomy, since they demonstrate use of the Python library pexpect to emulate terminal interaction with an external process. This approach allows for rapid development of a Python interface to any legacy or externally-maintained pipeline which accepts command-line input, without requiring alterations to the original code.
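
    The pexpect pattern the authors mention is worth seeing in miniature: spawn the external tool, wait for its prompt, send a command, and capture what comes back. The sketch below drives a Python interpreter as a stand-in for a legacy command-line pipeline:

    ```python
    import pexpect

    # Spawn a Python interpreter as a stand-in for a legacy CLI tool.
    child = pexpect.spawn("python -i", encoding="utf-8", timeout=30)
    child.expect(">>> ")                # block until the tool prints its prompt
    child.sendline("print(6 * 7)")      # send a command as if typed by a user
    child.expect(">>> ")                # wait for the next prompt
    print(child.before)                 # everything printed since the last expect
    child.sendline("exit()")
    child.close()
    ```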

  8. A De-Novo Genome Analysis Pipeline (DeNoGAP) for large-scale comparative prokaryotic genomics studies.

    PubMed

    Thakur, Shalabh; Guttman, David S

    2016-06-30

    Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de-novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes, since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly, since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package includes a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems once the necessary external programs are installed. DeNoGAP is freely available at https://sourceforge.net/projects/denogap/.

  9. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis.

    PubMed

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often result in a local maximum.
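
    A minimal sketch of the coordinate-descent strategy over a pipeline's parameter grid, with `score` standing in for a segmentation-quality metric such as mean Dice against hand-labeled micrographs; as the authors caution, this greedy search can stall on a local performance maximum:

    ```python
    def coordinate_descent(score, grid, start, sweeps=5):
        """Greedy one-parameter-at-a-time search over a discrete parameter grid.

        score: callable mapping a param dict to a quality value
        grid:  {param_name: [candidate values]}
        start: {param_name: initial value}
        """
        best, best_score = dict(start), score(start)
        for _ in range(sweeps):
            improved = False
            for param, values in grid.items():
                for v in values:
                    trial = {**best, param: v}
                    s = score(trial)
                    if s > best_score:              # keep the best value per axis
                        best, best_score, improved = trial, s, True
            if not improved:                        # stuck: possibly a local maximum
                break
        return best, best_score
    ```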

  10. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    PubMed Central

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often result in a local maximum. PMID:23766941

  11. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.

    PubMed

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-05-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT

    PubMed Central

    Schenk, Andreas D.; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-01-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library & Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. PMID:23500887

  13. Design and Implementation of Data Reduction Pipelines for the Keck Observatory Archive

    NASA Astrophysics Data System (ADS)

    Gelino, C. R.; Berriman, G. B.; Kong, M.; Laity, A. C.; Swain, M. A.; Campbell, R.; Goodrich, R. W.; Holt, J.; Lyke, J.; Mader, J. A.; Tran, H. D.; Barlow, T.

    2015-09-01

    The Keck Observatory Archive (KOA), a collaboration between the NASA Exoplanet Science Institute and the W. M. Keck Observatory, serves science and calibration data for all active and inactive instruments from the twin Keck Telescopes located near the summit of Mauna Kea, Hawaii. In addition to the raw data, we produce and provide quick look reduced data for four instruments (HIRES, LWS, NIRC2, and OSIRIS) so that KOA users can more easily assess the scientific content and the quality of the data, which can often be difficult with raw data. The reduced products derive from both publicly available data reduction packages (when available) and KOA-created reduction scripts. The automation of publicly available data reduction packages has the benefit of providing a good quality product without the additional time and expense of creating a new reduction package, and is easily applied to bulk processing needs. The downside is that the pipeline is not always able to create an ideal product, particularly for spectra, because the processing options for one type of target (e.g., point sources) may not be appropriate for other types of targets (e.g., extended galaxies and nebulae). In this poster we present the design and implementation of the current pipelines used at KOA and discuss our strategies for handling data for which the nature of the targets and the observers' scientific goals and data-taking procedures are unknown. We also discuss our plans for implementing automated pipelines for the remaining six instruments.

  14. The LCOGT Observation Portal, Data Pipeline and Science Archive

    NASA Astrophysics Data System (ADS)

    Lister, Tim; LCOGT Science Archive Team

    2014-01-01

    Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable LCOGT's diverse community of scientific and educational users to request observations on the LCOGT Network, follow their progress, and access their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and to long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline, and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC), and show some of the new features the Science Archive provides.

  15. Towards the automated reduction and calibration of SCUBA data from the James Clerk Maxwell Telescope

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N. E.; Robson, E. I.

    2002-10-01

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used to investigate instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data, with particular emphasis on `jiggle-map' observations of compact sources. We demonstrate the validity of our automated approach at both 850 and 450 μm, and apply it to several of the JCMT secondary flux calibrators. We determine light curves for the variable sources IRC +10216 and OH 231.8. This automation is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on the United Kingdom Infrared Telescope (UKIRT) and the JCMT.

  16. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    PubMed

    Tripathi, Kumar Parijat; Evangelista, Daniela; Zuccaro, Antonio; Guarracino, Mario Rosario

    2015-01-01

    RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery) tools. It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features, and information related to protein-protein interactions. It clusters the transcripts based on functional annotations and generates a tabular report for functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify the non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software tool for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs the functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. The web application is freely available.

  17. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  18. From sequencer to supercomputer: an automatic pipeline for managing and processing next generation sequencing data.

    PubMed

    Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun

    2012-01-01

    Next Generation Sequencing (NGS) is highly resource-intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network-attached storage device expandable up to 40 TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.

  19. Data Validation in the Kepler Science Operations Center Pipeline

    NASA Technical Reports Server (NTRS)

    Wu, Hayley; Twicken, Joseph D.; Tenenbaum, Peter; Clarke, Bruce D.; Li, Jie; Quintana, Elisa V.; Allen, Christopher; Chandrasekaran, Hema; Jenkins, Jon M.; Caldwell, Douglas A.

    2010-01-01

    We present an overview of the Data Validation (DV) software component and its context within the Kepler Science Operations Center (SOC) pipeline and overall Kepler Science mission. The SOC pipeline performs a transiting planet search on the corrected light curves for over 150,000 targets across the focal plane array. We discuss the DV strategy for automated validation of Threshold Crossing Events (TCEs) generated in the transiting planet search. For each TCE, a transiting planet model is fitted to the target light curve. A multiple planet search is conducted by repeating the transiting planet search on the residual light curve after the model flux has been removed; if an additional detection occurs, a planet model is fitted to the new TCE. A suite of automated tests are performed after all planet candidates have been identified. We describe a centroid motion test to determine the significance of the motion of the target photocenter during transit and to estimate the coordinates of the transit source within the photometric aperture; a series of eclipsing binary discrimination tests on the parameters of the planet model fits to all transits and the sequences of odd and even transits; and a statistical bootstrap to assess the likelihood that the TCE would have been generated purely by chance given the target light curve with all transits removed. Keywords: photometry, data validation, Kepler, Earth-size planets
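
    One of the eclipsing-binary discrimination tests is easy to sketch: compare the weighted-mean depths of odd- and even-numbered transits, since a significant depth difference can indicate a blended eclipsing binary at twice the detected period. The function below is an illustrative simplification, not the SOC implementation:

    ```python
    import numpy as np

    def odd_even_statistic(depths: np.ndarray, errs: np.ndarray) -> float:
        """depths, errs: per-transit depth estimates and 1-sigma uncertainties,
        ordered by transit number. Returns a z-like depth-difference statistic."""
        d_odd = np.average(depths[0::2], weights=1 / errs[0::2] ** 2)
        d_even = np.average(depths[1::2], weights=1 / errs[1::2] ** 2)
        sigma = np.sqrt(1 / np.sum(1 / errs[0::2] ** 2) +
                        1 / np.sum(1 / errs[1::2] ** 2))
        return float((d_odd - d_even) / sigma)   # ~N(0,1) if all transits are alike
    ```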

  20. 76 FR 70953 - Pipeline Safety: Safety of Gas Transmission Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket ID PHMSA-2011-0023] RIN 2137-AE72 Pipeline Safety: Safety of Gas Transmission Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Advance notice of...

  1. CFHT data processing and calibration ESPaDOnS pipeline: Upena and OPERA (optical spectropolarimetry)

    NASA Astrophysics Data System (ADS)

    Martioli, Eder; Teeple, D.; Manset, Nadine

    2011-03-01

    CFHT is responsible for processing raw ESPaDOnS images, removing instrument-related artifacts, and delivering science-ready data to the PIs. Here we describe the Upena pipeline, the software used to reduce the echelle spectro-polarimetric data obtained with the ESPaDOnS instrument. Upena is an automated pipeline that performs calibration and reduction of raw images. Upena can perform both real-time reduction on an image-by-image basis and a complete reduction after the observing night. Upena produces polarization and intensity spectra in FITS format. The pipeline is designed to perform parallel computing for improved speed, which ensures that the final products are delivered to the PIs before noon HST after each night of observations. We also present the OPERA project, an open-source pipeline to reduce ESPaDOnS data that will be developed as a collaborative effort between CFHT and the scientific community. OPERA will match the core capabilities of Upena and in addition will be open-source, flexible and extensible.

  2. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics.

    PubMed

    Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek

    2013-01-01

    Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.

  3. Low-cost lightweight airborne laser-based sensors for pipeline leak detection and reporting

    NASA Astrophysics Data System (ADS)

    Frish, Michael B.; Wainner, Richard T.; Laderer, Matthew C.; Allen, Mark G.; Rutherford, James; Wehnert, Paul; Dey, Sean; Gilchrist, John; Corbi, Ron; Picciaia, Daniele; Andreussi, Paolo; Furry, David

    2013-05-01

    Laser sensing enables aerial detection of natural gas pipeline leaks without need to fly through a hazardous gas plume. This paper describes adaptations of commercial laser-based methane sensing technology that provide relatively low-cost lightweight and battery-powered aerial leak sensors. The underlying technology is near-infrared Standoff Tunable Diode Laser Absorption Spectroscopy (sTDLAS). In one configuration, currently in commercial operation for pipeline surveillance, sTDLAS is combined with automated data reduction, alerting, navigation, and video imagery, integrated into a single-engine single-pilot light fixed-wing aircraft or helicopter platform. In a novel configuration for mapping landfill methane emissions, a miniaturized ultra-lightweight sTDLAS sensor flies aboard a small quad-rotor unmanned aerial vehicle (UAV).
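
    The measurement physics reduces to a Beer-Lambert inversion: the fractional dip in returned laser power fixes the path-integrated methane concentration. In the sketch below the absorption coefficient is a round illustrative number, not a spectroscopic reference value:

    ```python
    import numpy as np

    def methane_column_ppm_m(frac_absorption: float,
                             alpha_per_ppm_m: float = 2.0e-6) -> float:
        """Invert I/I0 = exp(-alpha * C*L) for the path-integrated column C*L."""
        return -np.log(1.0 - frac_absorption) / alpha_per_ppm_m

    # A 1% dip in returned laser power maps to ~5000 ppm*m along the beam path.
    print(round(methane_column_ppm_m(0.01)))   # -> 5025
    ```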

  4. 76 FR 73570 - Pipeline Safety: Miscellaneous Changes to Pipeline Safety Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    ... pipeline facilities to facilitate the removal of liquids and other materials from the gas stream. These... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Changes to Pipeline Safety Regulations AGENCY: Pipeline and Hazardous Materials Safety Administration...

  5. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  6. An automated design and fabrication pipeline for improving the strength of 3D printed artifacts under tensile loading

    NASA Astrophysics Data System (ADS)

    Al, Can Mert; Yaman, Ulas

    2018-05-01

    Within the scope of this study, an alternative automated method to the conventional design and fabrication pipeline of 3D printers is developed using an integrated CAD/CAE/CAM approach. It increases the load-carrying capacity of the parts by constructing heterogeneous infill structures. Traditional CAM software for Additive Manufacturing machinery starts with a design model in STL file format, which only includes data about the outer boundary in triangular mesh form. Depending on the given infill percentage, the algorithm running behind constructs the interior of the artifact using homogeneous infill structures. As opposed to current CAM software, the proposed method provides a way to construct heterogeneous infill structures with respect to the von Mises stress field obtained from a finite element analysis. Throughout the work, Rhinoceros3D is used for the design of the parts along with Grasshopper3D, an algorithmic design tool for Rhinoceros3D. In addition, finite element analyses are performed using Karamba3D, a plug-in for Grasshopper3D. According to the results of the tensile tests, the method offers an improvement in load-carrying capacity of about 50% compared to traditional slicing algorithms for 3D printing.
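
    The core idea, mapping the stress field to a spatially varying infill density, can be sketched in a few lines. The bounds and the linear mapping below are illustrative choices, not the authors' exact rule:

    ```python
    import numpy as np

    def infill_density(von_mises: np.ndarray, d_min: float = 0.15,
                       d_max: float = 0.85) -> np.ndarray:
        """Map per-cell von Mises stress to an infill fraction in [d_min, d_max]."""
        s = (von_mises - von_mises.min()) / (np.ptp(von_mises) + 1e-12)
        return d_min + (d_max - d_min) * s      # densify where stress is high

    stress_mpa = np.array([5.0, 40.0, 120.0])   # sampled from the FE analysis
    print(infill_density(stress_mpa).round(2))  # -> [0.15 0.36 0.85]
    ```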

  7. PipelineDog: a simple and flexible graphic pipeline construction and maintenance tool.

    PubMed

    Zhou, Anbo; Zhang, Yeting; Sun, Yazhou; Xing, Jinchuan

    2018-05-01

    Analysis pipelines are an essential part of bioinformatics research, and ad hoc pipelines are frequently created by researchers for prototyping and proof-of-concept purposes. However, most existing pipeline management systems or workflow engines are too complex for rapid prototyping or for learning the pipeline concept. A lightweight, user-friendly and flexible solution is thus desirable. In this study, we developed a new pipeline construction and maintenance tool, PipelineDog. This is a web-based integrated development environment with a modern web graphical user interface. It offers cross-platform compatibility, project management capabilities, code formatting and error checking functions, and an online repository. It uses an easy-to-read/write script system that encourages code reuse. With the online repository, it also encourages the sharing of pipelines, which enhances analysis reproducibility and accountability. For most users, PipelineDog requires no software installation. Overall, this web application provides a way to rapidly create and easily manage pipelines. The PipelineDog web app is freely available at http://web.pipeline.dog. The command line version is available at http://www.npmjs.com/package/pipelinedog and the online repository at http://repo.pipeline.dog. ysun@kean.edu or xing@biology.rutgers.edu or ysun@diagnoa.com. Supplementary data are available at Bioinformatics online.

  8. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  9. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  10. Advancements in automated tissue segmentation pipeline for contrast-enhanced CT scans of adult and pediatric patients

    NASA Astrophysics Data System (ADS)

    Somasundaram, Elanchezhian; Kaufman, Robert; Brady, Samuel

    2017-03-01

    The development of a random forests machine learning technique is presented for fully-automated neck, chest, abdomen, and pelvis tissue segmentation of CT images using the Trainable WEKA (Waikato Environment for Knowledge Analysis) Segmentation (TWS) plugin of FIJI (ImageJ, NIH). The use of a single classifier model to segment six tissue classes (lung, fat, muscle, solid organ, blood/contrast agent, bone) in the CT images is studied. An automated, unbiased scheme to sample pixels from the training images and generate a balanced training dataset over the seven classes is also developed. Two independent training datasets are generated from a pool of 4 adult (>55 kg) and 3 pediatric patients (<=55 kg) with 7 manually contoured slices for each patient. Classifier training investigated 28 image filters comprising a total of 272 features. Highly correlated and insignificant features are eliminated using Correlated Feature Subset (CFS) selection with Best First Search (BFS) algorithms in WEKA. The 2 training models (from the 2 training datasets) had 74 and 71 input training features, respectively. The study also investigated the effect of varying the number of trees (25, 50, 100, and 200) in the random forest algorithm. The performance of the 2 classifier models is evaluated on inter-patient intra-slice, intra-patient inter-slice and inter-patient inter-slice test datasets. The Dice similarity coefficients (DSC) and confusion matrices are used to understand the performance of the classifiers across the tissue segments. The effect of the number of features in the training input on the performance of the classifiers for tissue classes with less-than-optimal DSC values is also studied. The average DSC values for the two training models on the inter-patient intra-slice test data are: 0.98, 0.89, 0.87, 0.79, 0.68, and 0.84, for lung, fat, muscle, solid organ, blood/contrast agent, and bone, respectively. The study demonstrated that a robust segmentation accuracy for lung, muscle and
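
    The two core ingredients, a random forest trained on per-pixel feature vectors and evaluation with the Dice similarity coefficient, can be sketched as follows; the feature values are synthetic stand-ins for the image-filter features computed in WEKA/FIJI.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(5000, 74))     # 74 selected features per pixel
    y_train = rng.integers(0, 6, size=5000)   # 6 tissue classes

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    def dice(pred, truth, label):
        """Dice similarity coefficient for one tissue label."""
        p, t = (pred == label), (truth == label)
        denom = p.sum() + t.sum()
        return 2.0 * np.logical_and(p, t).sum() / denom if denom else 1.0

    X_test = rng.normal(size=(1000, 74))
    y_test = rng.integers(0, 6, size=1000)
    y_pred = clf.predict(X_test)
    for label in range(6):
        print(f"class {label}: DSC = {dice(y_pred, y_test, label):.3f}")
    ```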

  11. United States petroleum pipelines: An empirical analysis of pipeline sizing

    NASA Astrophysics Data System (ADS)

    Coburn, L. L.

    1980-12-01

    The undersizing theory hypothesizes that integrated oil companies have a strong economic incentive to size the petroleum pipelines they own and ship over so that some of the demand must utilize higher-cost alternatives. The DOJ theory posits that excess or monopoly profits are earned due to the natural monopoly characteristics of petroleum pipelines and the existence of market power for some pipelines in either the upstream or downstream market. The theory holds that independent petroleum pipelines owned by companies not otherwise affiliated with the petroleum industry (independent pipelines) do not have these incentives, and all the efficiencies of pipeline transportation are passed on to the ultimate consumer. Integrated oil companies, on the other hand, keep these cost efficiencies for themselves in the form of excess profits.

  12. Automated Reduction and Calibration of SCUBA Archive Data Using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N.; Robson, E. I.; Tilanus, R. P. J.; Holland, W. S.

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used to investigate instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data, with particular emphasis on the pointing observations. This is made possible by ORAC-DR, a flexible and extensible data reduction pipeline that is used on both UKIRT and the JCMT.

  13. A midas plugin to enable construction of reproducible web-based image processing pipelines

    PubMed Central

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016

  14. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    PubMed

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  15. 75 FR 13342 - Pipeline Safety: Workshop on Distribution Pipeline Construction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-19

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... natural gas distribution construction. Natural gas distribution pipelines are subject to a unique subset... distribution pipeline construction practices. This workshop will focus solely on natural gas distribution...

  16. Accelerating root system phenotyping of seedlings through a computer-assisted processing pipeline.

    PubMed

    Dupuy, Lionel X; Wright, Gladys; Thompson, Jacqueline A; Taylor, Anna; Dekeyser, Sebastien; White, Christopher P; Thomas, William T B; Nightingale, Mark; Hammond, John P; Graham, Neil S; Thomas, Catherine L; Broadley, Martin R; White, Philip J

    2017-01-01

    There are numerous systems and techniques to measure the growth of plant roots. However, phenotyping large numbers of plant roots for breeding and genetic analyses remains challenging. One major difficulty is to achieve high throughput and resolution at a reasonable cost per plant sample. Here we describe a cost-effective root phenotyping pipeline, on which we perform time and accuracy benchmarking to identify bottlenecks in such pipelines and strategies for their acceleration. Our root phenotyping pipeline was assembled with custom software and low-cost material and equipment. Results show that sample preparation and the handling of samples during screening are the most time-consuming tasks in root phenotyping. Algorithms can be used to speed up the extraction of root traits from image data, but when applied to large numbers of images, there is a trade-off between data processing time and the errors contained in the database. Scaling up root phenotyping to large numbers of genotypes will require not only automation of sample preparation and sample handling, but also efficient algorithms for error detection for more reliable replacement of manual interventions.

  17. Auto-rickshaw: an automated crystal structure determination platform as an efficient tool for the validation of an X-ray diffraction experiment.

    PubMed

    Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A

    2005-04-01

    The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.

  18. CANEapp: a user-friendly application for automated next generation transcriptomic data analysis.

    PubMed

    Velmeshev, Dmitry; Lally, Patrick; Magistri, Marco; Faghihi, Mohammad Ali

    2016-01-13

    Next generation sequencing (NGS) technologies are indispensable for molecular biology research, but data analysis represents the bottleneck in their application. Users need to be familiar with computer terminal commands, the Linux environment, and various software tools and scripts. Analysis workflows have to be optimized and experimentally validated to extract biologically meaningful data. Moreover, as larger datasets are being generated, their analysis requires use of high-performance servers. To address these needs, we developed CANEapp (application for Comprehensive automated Analysis of Next-generation sequencing Experiments), a unique suite that combines a Graphical User Interface (GUI) and an automated server-side analysis pipeline that is platform-independent, making it suitable for any server architecture. The GUI runs on a PC or Mac and seamlessly connects to the server to provide full GUI control of RNA-sequencing (RNA-seq) project analysis. The server-side analysis pipeline contains a framework that is implemented on a Linux server through completely automated installation of software components and reference files. Analysis with CANEapp is also fully automated and performs differential gene expression analysis and novel noncoding RNA discovery through alternative workflows (Cuffdiff and R packages edgeR and DESeq2). We compared CANEapp to other similar tools, and it significantly improves on previous developments. We experimentally validated CANEapp's performance by applying it to data derived from different experimental paradigms and confirming the results with quantitative real-time PCR (qRT-PCR). CANEapp adapts to any server architecture by effectively using available resources and thus handles large amounts of data efficiently. CANEapp performance has been experimentally validated on various biological datasets. CANEapp is available free of charge at http://psychiatry.med.miami.edu/research/laboratory-of-translational-rna-genomics/CANE-app . We

  19. 75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), Department of... Gas Pipeline Safety Act of 1968, Public Law 90-481, delegated to DOT the authority to develop...

  20. 77 FR 61825 - Pipeline Safety: Notice of Public Meeting on Pipeline Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... program performance measures for gas distribution, gas transmission, and hazardous liquids pipelines. The... distribution pipelines (49 CFR 192.1007(e)), gas transmission pipelines (49 CFR 192.945) and hazardous liquids...

  1. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include balancing automated and manual processing, bridging legacy data management infrastructure with
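
    A format/data QA/QC pass of this kind can be sketched as a set of column and plausibility-range checks. The required columns and ranges below are illustrative assumptions, not the AmeriFlux FP-In specification.

    ```python
    import pandas as pd

    REQUIRED = ["TIMESTAMP_START", "TIMESTAMP_END", "FC", "TA"]
    RANGES = {"FC": (-50, 50), "TA": (-60, 60)}   # umol m-2 s-1, deg C

    def qaqc_report(df):
        """Return a list of format/range issues found in a submission."""
        issues = [f"missing column: {c}" for c in REQUIRED if c not in df.columns]
        for col, (lo, hi) in RANGES.items():
            if col in df.columns:
                bad = (df[col] < lo) | (df[col] > hi)
                if bad.any():
                    issues.append(f"{col}: {int(bad.sum())} values outside [{lo}, {hi}]")
        return issues or ["no issues found"]

    # Toy half-hourly submission with one implausible air temperature.
    demo = pd.DataFrame({
        "TIMESTAMP_START": [202401010000, 202401010030],
        "TIMESTAMP_END":   [202401010030, 202401010100],
        "FC": [-2.1, -1.8],
        "TA": [3.5, 99.0],
    })
    for line in qaqc_report(demo):
        print(line)
    ```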

  2. Rapid, Vehicle-Based Identification of Location and Magnitude of Urban Natural Gas Pipeline Leaks.

    PubMed

    von Fischer, Joseph C; Cooley, Daniel; Chamberlain, Sam; Gaylord, Adam; Griebenow, Claire J; Hamburg, Steven P; Salo, Jessica; Schumacher, Russ; Theobald, David; Ham, Jay

    2017-04-04

    Information about the location and magnitudes of natural gas (NG) leaks from urban distribution pipelines is important for minimizing greenhouse gas emissions and optimizing investment in pipeline management. To enable rapid collection of such data, we developed a relatively simple method using high-precision methane analyzers in Google Street View cars. Our data indicate that this automated leak survey system can document patterns in leak location and magnitude within and among cities, even without wind data. We found that urban areas with prevalent corrosion-prone distribution lines (Boston, MA, Staten Island, NY, and Syracuse, NY), leaked approximately 25-fold more methane than cities with more modern pipeline materials (Burlington, VT, and Indianapolis, IN). Although this mobile monitoring method produces conservative estimates of leak rates and leak counts, it can still help prioritize both leak repairs and replacement of leak-prone sections of distribution lines, thus minimizing methane emissions over short and long terms.
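
    A heavily simplified version of such a survey analysis flags readings that exceed a running background estimate; the window length and 0.10 ppm excess threshold below are illustrative choices, not the study's calibrated method.

    ```python
    import numpy as np
    import pandas as pd

    def flag_leak_indications(ch4_ppm, window=120, excess_ppm=0.10):
        """Flag samples exceeding a running-median background by a margin."""
        s = pd.Series(ch4_ppm)
        background = s.rolling(window, center=True, min_periods=1).median()
        return (s - background) > excess_ppm

    rng = np.random.default_rng(1)
    ch4 = 1.9 + rng.normal(0.0, 0.01, 1000)   # ambient background (~1.9 ppm)
    ch4[400:420] += 0.5                        # injected synthetic leak plume
    flags = flag_leak_indications(ch4)
    print(f"{int(flags.sum())} samples flagged as leak indications")
    ```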

  3. 76 FR 303 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 195 [Docket ID PHMSA-2010-0229] RIN 2137-AE66 Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of...

  4. Development and Applications of Pipeline Steel in Long-Distance Gas Pipeline of China

    NASA Astrophysics Data System (ADS)

    Chunyong, Huo; Yang, Li; Lingkang, Ji

    In past decades, with the wide adoption of microalloying and Thermal Mechanical Control Processing (TMCP) technology, a good match of strength, toughness, plasticity and weldability in pipeline steel has been achieved, so that oil and gas pipelines have developed rapidly in China to meet the strong domestic demand for energy. In this paper, the development history of pipeline steel and gas pipelines in China is briefly reviewed. The microstructure characteristics and mechanical performance of the pipeline steels used in some representative Chinese gas pipelines built at different stages are summarized. Through an analysis of the evolution of the pipeline service environment, some prospective trends in the application of pipeline steel in China are also presented.

  5. A completely automated processing pipeline for lung and lung lobe segmentation and its application to the LIDC-IDRI data base

    NASA Astrophysics Data System (ADS)

    Blaffert, Thomas; Wiemker, Rafael; Barschdorf, Hans; Kabus, Sven; Klinder, Tobias; Lorenz, Cristian; Schadewaldt, Nicole; Dharaiya, Ekta

    2010-03-01

    Automated segmentation of lung lobes in thoracic CT images has relevance for various diagnostic purposes like localization of tumors within the lung or quantification of emphysema. Since emphysema is a known risk factor for lung cancer, both purposes are even related to each other. The main steps of the segmentation pipeline described in this paper are the lung detector and the lung segmentation based on a watershed algorithm, and the lung lobe segmentation based on mesh model adaptation. The segmentation procedure was applied to data sets of the data base of the Image Database Resource Initiative (IDRI) that currently contains over 500 thoracic CT scans with delineated lung nodule annotations. We visually assessed the reliability of the single segmentation steps, with a success rate of 98% for the lung detection and 90% for lung delineation. For about 20% of the cases we found the lobe segmentation not to be anatomically plausible. A modeling confidence measure is introduced that gives a quantitative indication of the segmentation quality. For a demonstration of the segmentation method we studied the correlation between emphysema score and malignancy on a per-lobe basis.

  6. Grape RNA-Seq analysis pipeline environment

    PubMed Central

    Knowles, David G.; Röder, Maik; Merkel, Angelika; Guigó, Roderic

    2013-01-01

    Motivation: The avalanche of data arriving since the development of NGS technologies has prompted the need for developing fast, accurate and easily automated bioinformatic tools capable of dealing with massive datasets. Among the most productive applications of NGS technologies is the sequencing of cellular RNA, known as RNA-Seq. Although RNA-Seq provides a dynamic range similar or superior to that of microarrays at similar or lower cost, the lack of standard and user-friendly pipelines is a bottleneck preventing RNA-Seq from becoming the standard for transcriptome analysis. Results: In this work we present a pipeline for processing and analyzing RNA-Seq data, which we have named Grape (Grape RNA-Seq Analysis Pipeline Environment). Grape supports raw sequencing reads produced by a variety of technologies, either in FASTA or FASTQ format, or as prealigned reads in SAM/BAM format. A minimal Grape configuration consists of the file location of the raw sequencing reads, the genome of the species and the corresponding gene and transcript annotation. Grape first runs a set of quality control steps, and then aligns the reads to the genome, a step that is omitted for prealigned read formats. Grape next estimates gene and transcript expression levels, calculates exon inclusion levels and identifies novel transcripts. Grape can be run on a single computer or in parallel on a computer cluster. It is distributed with specific mapping and quantification tools, but given its modular design, any tool supporting popular data interchange formats can be integrated. Availability: Grape can be obtained from the Bioinformatics and Genomics website at: http://big.crg.cat/services/grape. Contact: david.gonzalez@crg.eu or roderic.guigo@crg.eu PMID:23329413

  7. Automated Inspection of Power Line Corridors to Measure Vegetation Undercut Using Uav-Based Images

    NASA Astrophysics Data System (ADS)

    Maurer, M.; Hofer, M.; Fraundorfer, F.; Bischof, H.

    2017-08-01

    Power line corridor inspection is a time-consuming task that is performed mostly manually. As UAV development has made huge progress in recent years and photogrammetric computer vision systems have become well established, it is time to further automate inspection tasks. In this paper we present an automated processing pipeline to inspect vegetation undercuts of power line corridors. For this, the area of inspection is reconstructed, geo-referenced and semantically segmented, and inter-class distance measurements are calculated. The presented pipeline automatically selects the proper 3D reconstruction method for wiry objects (power lines) on the one hand and solid objects (the surroundings) on the other. The automated selection is realized by performing pixel-wise semantic segmentation of the input images using a Fully Convolutional Neural Network. Because the semantic 3D reconstructions are geo-referenced, documentation of the areas where maintenance work has to be performed is inherently included in the distance measurements and can be extracted easily. We evaluate the influence of the semantic segmentation on the 3D reconstruction and show that the automated semantic separation of wiry and dense objects in the 3D reconstruction routine improves the quality of the vegetation undercut inspection. We show that the semantic segmentation generalizes to datasets acquired using different acquisition routines and to different seasons.

  8. Gathering pipeline methane emissions in Fayetteville shale pipelines and scoping guidelines for future pipeline measurement campaigns

    DOE PAGES

    Zimmerle, Daniel J.; Pickering, Cody K.; Bell, Clay S.; ...

    2017-11-24

    Gathering pipelines, which transport gas from well pads to downstream processing, are a sector of the natural gas supply chain for which little measured methane emissions data are available. This study performed leak detection and measurement on 96 km of gathering pipeline and the associated 56 pigging facilities and 39 block valves. The study found one underground leak accounting for 83% (4.0 kg CH4/hr) of total measured emissions. Methane emissions for the 4684 km of gathering pipeline in the study area were estimated at 402 kg CH4/hr [95 to 1065 kg CH4/hr, 95% CI], or 1% [0.2% to 2.6%] of all methane emissions measured during a prior aircraft study of the same area. Emissions estimated by this study fall within the uncertainty range of emissions estimated using emission factors from EPA's 2015 Greenhouse Gas Inventory and study activity estimates. While EPA's current inventory is based upon emission factors from distribution mains measured in the 1990s, this study indicates that using emission factors from more recent distribution studies could significantly underestimate emissions from gathering pipelines. To guide broader studies of pipeline emissions, we also estimate the fraction of the pipeline length within a basin that must be measured to constrain the uncertainty of pipeline emissions estimates to within 1% of total basin emissions. The study provides both substantial insight into the mix of emission sources and guidance for future gathering pipeline studies, but since measurements were made in a single basin, the results are not sufficiently representative to provide methane emission factors at the regional or national level.

  9. Gathering pipeline methane emissions in Fayetteville shale pipelines and scoping guidelines for future pipeline measurement campaigns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimmerle, Daniel J.; Pickering, Cody K.; Bell, Clay S.

    Gathering pipelines, which transport gas from well pads to downstream processing, are a sector of the natural gas supply chain for which little measured methane emissions data are available. This study performed leak detection and measurement on 96 km of gathering pipeline and the associated 56 pigging facilities and 39 block valves. The study found one underground leak accounting for 83% (4.0 kg CH4/hr) of total measured emissions. Methane emissions for the 4684 km of gathering pipeline in the study area were estimated at 402 kg CH4/hr [95 to 1065 kg CH4/hr, 95% CI], or 1% [0.2% to 2.6%] of all methane emissions measured during a prior aircraft study of the same area. Emissions estimated by this study fall within the uncertainty range of emissions estimated using emission factors from EPA's 2015 Greenhouse Gas Inventory and study activity estimates. While EPA's current inventory is based upon emission factors from distribution mains measured in the 1990s, this study indicates that using emission factors from more recent distribution studies could significantly underestimate emissions from gathering pipelines. To guide broader studies of pipeline emissions, we also estimate the fraction of the pipeline length within a basin that must be measured to constrain the uncertainty of pipeline emissions estimates to within 1% of total basin emissions. The study provides both substantial insight into the mix of emission sources and guidance for future gathering pipeline studies, but since measurements were made in a single basin, the results are not sufficiently representative to provide methane emission factors at the regional or national level.

  10. Metric Evaluation Pipeline for 3d Modeling of Urban Scenes

    NASA Astrophysics Data System (ADS)

    Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high-resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger-scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline, developed as publicly available open source software, to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
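
    Completeness and correctness for point clouds are commonly computed from nearest-neighbor distances between the model and the ground truth. The sketch below shows one such formulation, with an illustrative 1 m threshold.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def completeness_correctness(model_pts, truth_pts, threshold=1.0):
        """Fraction of each cloud lying within `threshold` of the other."""
        d_truth_to_model, _ = cKDTree(model_pts).query(truth_pts)
        d_model_to_truth, _ = cKDTree(truth_pts).query(model_pts)
        completeness = float(np.mean(d_truth_to_model < threshold))
        correctness = float(np.mean(d_model_to_truth < threshold))
        return completeness, correctness

    rng = np.random.default_rng(2)
    truth = rng.uniform(0, 100, size=(5000, 3))           # lidar-like ground truth
    model = truth + rng.normal(0, 0.5, size=truth.shape)  # noisy reconstruction
    print(completeness_correctness(model, truth))
    ```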

  11. 77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...

  12. 75 FR 5244 - Pipeline Safety: Integrity Management Program for Gas Distribution Pipelines; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...

  13. Molgenis-impute: imputation pipeline in a box.

    PubMed

    Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A

    2015-08-19

    Genotype imputation is an important procedure in current genomic analysis such as genome-wide association studies, meta-analyses and fine mapping. Although high-quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best-practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the setup and running of all the steps of the imputation process. These steps include genome build liftover (liftovering), genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute in different locations and have imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing the flexibility to adapt or limiting the options of the underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional
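
    The chromosomal chunking step can be sketched as splitting a chromosome into fixed windows and emitting one imputation task per window. The tool name and flags below are hypothetical stand-ins, not the MOLGENIS-impute interface.

    ```python
    CHUNK_BP = 5_000_000
    CHR_LEN = {"20": 63_025_520}   # GRCh37 chromosome 20 length, for illustration

    def chunk_commands(chrom):
        """Yield one (hypothetical) imputation command per genomic window."""
        start = 1
        while start <= CHR_LEN[chrom]:
            end = min(start + CHUNK_BP - 1, CHR_LEN[chrom])
            yield ["impute_tool",               # hypothetical binary and flags
                   "--chr", chrom,
                   "--int", str(start), str(end),
                   "--out", f"chr{chrom}_{start}_{end}.gen"]
            start = end + 1

    for cmd in chunk_commands("20"):
        print(" ".join(cmd))   # a scheduler would submit these in parallel
    ```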

  14. The automated data processing architecture for the GPI Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  15. Seqping: gene prediction pipeline for plant genomes using self-training gene models and transcriptomic data.

    PubMed

    Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie

    2017-01-27

    Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed with various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, which uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using the GlimmerHMM, SNAP, and AUGUSTUS pipelines, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's plantae dataset. Our evaluation shows that Seqping was able to generate better gene predictions compared to three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that the Seqping pipeline predictions are more accurate than gene predictions using the other three approaches with the default or available HMMs.

  16. Automated Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Gangal, M. D.; Isenberg, L.; Lewis, E. V.

    1985-01-01

    Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.

  17. Automated tissue segmentation of MR brain images in the presence of white matter lesions.

    PubMed

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Lladó, Xavier

    2017-01-01

    Over the last few years, the increasing interest in brain tissue volume measurements in clinical settings has led to the development of a wide number of automated tissue segmentation methods. However, white matter (WM) lesions are known to reduce the performance of automated tissue segmentation methods; the lesions must therefore be manually annotated and refilled before segmentation, which is tedious and time-consuming. Here, we propose a new, fully automated T1-w/FLAIR tissue segmentation approach designed to deal with images in the presence of WM lesions. This approach integrates a robust partial volume tissue segmentation with WM outlier rejection and filling, combining intensity with probabilistic and morphological prior maps. We evaluate the performance of this method on the MRBrainS13 tissue segmentation challenge database, which contains images with vascular WM lesions, and also on a set of Multiple Sclerosis (MS) patient images. On both databases, we compare the performance of our method with other state-of-the-art techniques. On the MRBrainS13 data, the presented approach was, at the time of submission, the best-ranked unsupervised intensity-model method of the challenge (7th position) and clearly outperformed the other unsupervised pipelines such as FAST and SPM12. On MS data, the differences in tissue segmentation between the images segmented with our method and the same images where manual expert annotations were used to refill lesions on T1-w images before segmentation were lower than or similar to those of the best state-of-the-art pipeline incorporating automated lesion segmentation and filling. Our results show that the proposed pipeline achieved very competitive results on both vascular and MS lesions. A public version of this approach is available to download for the neuro-imaging community. Copyright © 2016 Elsevier B.V. All rights reserved.
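
    The lesion-filling idea, replacing lesion voxels with intensities drawn from normal-appearing white matter, can be sketched generically as follows; this illustrates the common technique, not the paper's exact procedure.

    ```python
    import numpy as np

    def fill_lesions(t1, lesion_mask, wm_mask, seed=0):
        """Replace lesion voxels with intensities sampled from healthy WM."""
        rng = np.random.default_rng(seed)
        filled = t1.copy()
        wm_values = t1[wm_mask & ~lesion_mask]   # normal-appearing WM intensities
        filled[lesion_mask] = rng.choice(wm_values, size=int(lesion_mask.sum()))
        return filled

    # Toy volume: a small cube of "lesion" voxels inside a WM mask.
    rng = np.random.default_rng(3)
    t1 = rng.normal(100.0, 10.0, size=(64, 64, 64))
    lesion = np.zeros(t1.shape, dtype=bool)
    lesion[30:34, 30:34, 30:34] = True
    wm = np.ones(t1.shape, dtype=bool)
    t1_filled = fill_lesions(t1, lesion, wm)
    ```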

  18. 78 FR 41991 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice; Issuance of Advisory... Gas and Hazardous Liquid Pipeline Systems. Subject: Potential for Damage to Pipeline Facilities Caused...

  19. 78 FR 41496 - Pipeline Safety: Meetings of the Gas and Liquid Pipeline Advisory Committees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0156] Pipeline Safety: Meetings of the Gas and Liquid Pipeline Advisory Committees AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of advisory committee...

  20. MetAMOS: a modular and open source metagenomic assembly and analysis pipeline

    PubMed Central

    2013-01-01

    We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open-reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing computational cost. MetAMOS can be downloaded from: https://github.com/treangen/MetAMOS. PMID:23320958

  1. Characterization and Validation of Transiting Planets in the TESS SPOC Pipeline

    NASA Astrophysics Data System (ADS)

    Twicken, Joseph D.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Wohler, Bill

    2018-06-01

    Light curves for Transiting Exoplanet Survey Satellite (TESS) target stars will be extracted and searched for transiting planet signatures in the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. Targets for which the transiting planet detection threshold is exceeded will be processed in the Data Validation (DV) component of the Pipeline. The primary functions of DV are to (1) characterize planets identified in the transiting planet search, (2) search for additional transiting planet signatures in light curves after modeled transit signatures have been removed, and (3) perform a comprehensive suite of diagnostic tests to aid in discrimination between true transiting planets and false positive detections. DV data products include extensive reports by target, one-page summaries by planet candidate, and tabulated transit model fit and diagnostic test results. DV products may be employed by humans and automated systems to vet planet candidates identified in the Pipeline. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. We describe the Data Validation component of the SPOC Pipeline. The diagnostic tests exploit the flux (i.e., light curve) and pixel time series associated with each target to support the determination of the origin of each purported transiting planet signature. We also highlight the differences between the DV components for Kepler and TESS. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
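
    The transit-search step itself can be illustrated with a box least squares periodogram from astropy, run here on synthetic data with an injected transit; this is a toy example, not the SPOC/DV implementation.

    ```python
    import numpy as np
    from astropy.timeseries import BoxLeastSquares

    # Synthetic light curve spanning roughly one TESS sector (~27 days)
    # with a 0.2% deep transit injected every 3.7 days.
    rng = np.random.default_rng(4)
    t = np.linspace(0.0, 27.4, 2000)
    flux = 1.0 + rng.normal(0.0, 5e-4, t.size)
    flux[(t % 3.7) < 0.1] -= 0.002

    bls = BoxLeastSquares(t, flux)
    periodogram = bls.autopower(0.1)   # assumed transit duration: 0.1 d
    best = int(np.argmax(periodogram.power))
    print(f"best period: {periodogram.period[best]:.3f} d, "
          f"depth: {periodogram.depth[best]:.5f}")
    ```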

  2. 2dx_automator: implementation of a semiautomatic high-throughput high-resolution cryo-electron crystallography pipeline.

    PubMed

    Scherer, Sebastian; Kowal, Julia; Chami, Mohamed; Dandey, Venkata; Arheit, Marcel; Ringler, Philippe; Stahlberg, Henning

    2014-05-01

    The introduction of direct electron detectors (DED) to cryo-electron microscopy has tremendously increased the signal-to-noise ratio (SNR) and quality of the recorded images. We discuss the optimal use of DEDs for cryo-electron crystallography, introduce a new automatic image processing pipeline, and demonstrate the vast improvement in resolution achieved by the use of both together, especially for highly tilted samples. The new processing pipeline (now included in the software package 2dx) exploits the high SNR and frame readout frequency of DEDs to automatically correct for beam-induced sample movement, and reliably processes individual crystal images without human interaction as data are being acquired. A new graphical user interface (GUI) condenses all information required for quality assessment in one window, allowing the imaging conditions to be verified and adjusted during the data collection session. With this new pipeline an automatically generated unit cell projection map of each recorded 2D crystal is available less than 5 min after the image was recorded. The entire processing procedure yielded a three-dimensional reconstruction of the 2D-crystallized ion-channel membrane protein MloK1 with a much-improved resolution of 5 Å in-plane and 7 Å in the z-direction, within 2 days of data acquisition and simultaneous processing. The results obtained are superior to those delivered by the conventional photographic-film-based methodology on the same sample, and demonstrate the importance of drift correction. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
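
    The heart of beam-induced movement correction is estimating the shift between successive frames and summing the aligned frames. The sketch below uses FFT phase correlation as a generic illustration, not the 2dx implementation.

    ```python
    import numpy as np

    def phase_shift(ref, frame):
        """Integer (dy, dx) to roll `frame` by so that it aligns onto `ref`."""
        cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        return tuple(int(d if d <= s // 2 else d - s)
                     for d, s in zip((dy, dx), corr.shape))

    def align_and_sum(frames):
        """Drift-correct a movie by aligning every frame to the first."""
        total = frames[0].astype(float).copy()
        for frame in frames[1:]:
            dy, dx = phase_shift(frames[0], frame)
            total += np.roll(frame, (dy, dx), axis=(0, 1))
        return total

    # Demo: frames are shifted copies of one base image.
    rng = np.random.default_rng(6)
    base = rng.normal(size=(64, 64))
    frames = [np.roll(base, s, axis=(0, 1)) for s in [(0, 0), (3, -2), (-5, 1)]]
    print("aligned sum matches 3x base:", np.allclose(align_and_sum(frames), 3 * base))
    ```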

  3. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    PubMed

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  4. PANDA: a pipeline toolbox for analyzing brain diffusion images

    PubMed Central

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named “Pipeline for Analyzing braiN Diffusion imAges” (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies. PMID:23439846

  5. PepLine: a software pipeline for high-throughput direct mapping of tandem mass spectrometry data on genomic sequences.

    PubMed

    Ferro, Myriam; Tardif, Marianne; Reguer, Erwan; Cahuzac, Romain; Bruley, Christophe; Vermat, Thierry; Nugues, Estelle; Vigouroux, Marielle; Vandenbrouck, Yves; Garin, Jérôme; Viari, Alain

    2008-05-01

    PepLine is fully automated software that maps MS/MS fragmentation spectra of tryptic peptides to genomic DNA sequences. The approach is based on Peptide Sequence Tags (PSTs) obtained from partial interpretation of QTOF MS/MS spectra (first module). PSTs are then mapped onto the six-frame translations of genomic sequences (second module), giving hits. Hits are then clustered to detect potential coding regions (third module). Our work aimed at optimizing the algorithms of each component to allow the whole pipeline to proceed in a fully automated manner using raw nucleic acid sequences (i.e., genomes that have not been "reduced" to a database of ORFs or putative exon sequences). The whole pipeline was tested on controlled MS/MS spectra sets from standard proteins and from Arabidopsis thaliana chloroplast envelope samples. Our results demonstrate that PepLine competed with protein database search software and was fast enough to potentially tackle large datasets and/or large genomes. We also illustrate the potential of this approach for the detection of the intron/exon structure of genes.
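
    The mapping stage rests on six-frame translation of the genomic sequence followed by a scan for each PST. A toy sketch using Biopython, with a made-up sequence and tag:

    ```python
    from Bio.Seq import Seq

    def six_frame_hits(dna, pst):
        """Yield (strand, frame, protein_offset) for each PST occurrence."""
        for strand, seq in (("+", Seq(dna)), ("-", Seq(dna).reverse_complement())):
            for frame in range(3):
                sub = seq[frame:]
                sub = sub[:len(sub) - len(sub) % 3]   # trim any partial codon
                protein = str(sub.translate())
                pos = protein.find(pst)
                while pos != -1:
                    yield (strand, frame, pos)
                    pos = protein.find(pst, pos + 1)

    genome = "ATGGCTGCTAAGGGTTTCTGA" * 3       # toy nucleotide sequence
    for hit in six_frame_hits(genome, "AAK"):  # toy three-residue tag
        print(hit)
    ```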

  6. 76 FR 29333 - Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-20

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials... for natural gas pipelines and for hazardous liquid pipelines. Both committees were established under...

  7. 76 FR 43743 - Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-21

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0127] Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials...

  8. WASS: An open-source pipeline for 3D stereo reconstruction of ocean waves

    NASA Astrophysics Data System (ADS)

    Bergamasco, Filippo; Torsello, Andrea; Sclavo, Mauro; Barbariol, Francesco; Benetazzo, Alvise

    2017-10-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community and industry. Indeed, recent advances in both computer vision algorithms and computer processing power now allow the study of the spatio-temporal wave field with unprecedented accuracy, especially at small scales. Although simple in theory, many details are difficult for a practitioner to master, so the implementation of a sea-wave 3D reconstruction pipeline is generally considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the reconstruction process from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS (http://www.dais.unive.it/wass), an open-source stereo processing pipeline for the 3D reconstruction of sea waves. Our tool completely automates all the steps required to estimate dense point clouds from stereo images. Namely, it computes the extrinsic parameters of the stereo rig so that no delicate calibration has to be performed in the field. It implements a fast 3D dense stereo reconstruction procedure based on the well-established OpenCV library and, lastly, it includes a set of filtering techniques for both the disparity map and the produced point cloud to remove the vast majority of the erroneous points that naturally arise when analyzing the optically complex nature of the water surface. In this paper, we describe the architecture of WASS and the internal algorithms involved. The pipeline workflow is shown step by step and demonstrated on real datasets acquired at sea.
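
    The dense stereo step at the center of such a pipeline can be sketched with OpenCV's semi-global block matcher. The synthetic image pair below stands in for rectified sea-surface images; WASS adds calibration, sea-plane estimation, and point-cloud filtering around this step.

    ```python
    import cv2
    import numpy as np

    # Synthetic rectified pair: the right image is the left one shifted
    # horizontally by 8 px, i.e. a constant true disparity of 8.
    rng = np.random.default_rng(5)
    left = rng.uniform(0, 255, (240, 320)).astype(np.uint8)
    right = np.roll(left, -8, axis=1)

    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,   # must be divisible by 16
        blockSize=9,
    )
    # compute() returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0
    print("median disparity:", float(np.median(disparity[:, 64:])))
    ```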

  9. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result-quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprising seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark dataset that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID
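
    Prior knowledge can be encoded as a fuzzy membership function that assigns each extracted object a degree of plausibility. The trapezoidal function and size range below are illustrative.

    ```python
    import numpy as np

    def trapezoid(x, a, b, c, d):
        """Fuzzy membership: 0 below a, rising to 1 on [b, c], falling to 0 at d."""
        x = np.asarray(x, dtype=float)
        rising = np.clip((x - a) / (b - a), 0.0, 1.0)
        falling = np.clip((d - x) / (d - c), 0.0, 1.0)
        return np.minimum(rising, falling)

    # Prior: detected nuclei are expected to be about 8-15 um across, with
    # anything outside 5-20 um increasingly implausible (toy numbers).
    sizes_um = np.array([3.0, 7.0, 10.0, 14.0, 19.0, 30.0])
    plausibility = trapezoid(sizes_um, 5, 8, 15, 20)
    print(dict(zip(sizes_um.tolist(), plausibility.round(2).tolist())))
    ```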

  10. 77 FR 16471 - Pipeline Safety: Implementation of the National Registry of Pipeline and Liquefied Natural Gas...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-21

    ... Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Register (75 FR 72878) titled: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting...

  11. Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.

    PubMed

    Arganda-Carreras, Ignacio; Andrey, Philippe

    2017-01-01

    With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial and free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology for rationally addressing the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists better understand image analysis in the context of their research and allow them to interact efficiently with image processing specialists.

  12. 75 FR 45591 - Pipeline Safety: Notice of Technical Pipeline Safety Advisory Committee Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-03

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Committee Meetings AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION... safety standards, risk assessments, and safety policies for natural gas pipelines and for hazardous...

  13. Multinode reconfigurable pipeline computer

    NASA Technical Reports Server (NTRS)

    Nosenchuck, Daniel M. (Inventor); Littman, Michael G. (Inventor)

    1989-01-01

    A multinode parallel-processing computer is made up of a plurality of interconnected, large-capacity nodes, each including a reconfigurable pipeline of functional units such as Integer Arithmetic Logic Processors, Floating Point Arithmetic Processors, Special Purpose Processors, etc. The reconfigurable pipeline of each node is connected to a multiplane memory by a Memory-ALU switch NETwork (MASNET). The reconfigurable pipeline includes three (3) basic substructures formed from functional units which have been found sufficient to perform the bulk of all calculations. The MASNET controls the flow of signals from the memory planes to the reconfigurable pipeline and vice versa. The nodes are connectable together by an internode data router (hyperspace router) so as to form a hypercube configuration. The capability of the nodes to conditionally configure the pipeline at each tick of the clock, without requiring a pipeline flush, permits many powerful algorithms to be implemented directly.

  14. The developing human connectome project: A minimal processing pipeline for neonatal cortical surface reconstruction.

    PubMed

    Makropoulos, Antonios; Robinson, Emma C; Schuh, Andreas; Wright, Robert; Fitzgibbon, Sean; Bozek, Jelena; Counsell, Serena J; Steinweg, Johannes; Vecchiato, Katy; Passerat-Palmbach, Jonathan; Lenz, Gregor; Mortari, Filippo; Tenev, Tencho; Duff, Eugene P; Bastiani, Matteo; Cordero-Grande, Lucilio; Hughes, Emer; Tusor, Nora; Tournier, Jacques-Donald; Hutter, Jana; Price, Anthony N; Teixeira, Rui Pedro A G; Murgasova, Maria; Victor, Suresh; Kelly, Christopher; Rutherford, Mary A; Smith, Stephen M; Edwards, A David; Hajnal, Joseph V; Jenkinson, Mark; Rueckert, Daniel

    2018-06-01

    The Developing Human Connectome Project (dHCP) seeks to create the first 4-dimensional connectome of early life. Understanding this connectome in detail may provide insights into normal as well as abnormal patterns of brain development. Following established best practices adopted by the WU-MINN Human Connectome Project (HCP), and pioneered by FreeSurfer, the project utilises cortical surface-based processing pipelines. In this paper, we propose a fully automated processing pipeline for the structural Magnetic Resonance Imaging (MRI) of the developing neonatal brain. This proposed pipeline consists of a refined framework for cortical and sub-cortical volume segmentation, cortical surface extraction, and cortical surface inflation, which has been specifically designed to address considerable differences between adult and neonatal brains, as imaged using MRI. Using the proposed pipeline our results demonstrate that images collected from 465 subjects ranging from 28 to 45 weeks post-menstrual age (PMA) can be processed fully automatically; generating cortical surface models that are topologically correct, and correspond well with manual evaluations of tissue boundaries in 85% of cases. Results improve on state-of-the-art neonatal tissue segmentation models and significant errors were found in only 2% of cases, where these corresponded to subjects with high motion. Downstream, these surfaces will enhance comparisons of functional and diffusion MRI datasets, supporting the modelling of emerging patterns of brain connectivity. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. RGAugury: a pipeline for genome-wide prediction of resistance gene analogs (RGAs) in plants.

    PubMed

    Li, Pingchuan; Quan, Xiande; Jia, Gaofeng; Xiao, Jin; Cloutier, Sylvie; You, Frank M

    2016-11-02

    Resistance gene analogs (RGAs), such as NBS-encoding proteins, receptor-like protein kinases (RLKs) and receptor-like proteins (RLPs), are potential R-genes that contain specific conserved domains and motifs. Thus, RGAs can be predicted based on their conserved structural features using bioinformatics tools. Computer programs have been developed for the identification of individual domains and motifs from the protein sequences of RGAs, but none offers a systematic assessment of the different types of RGAs. A user-friendly and efficient pipeline is needed for large-scale genome-wide RGA prediction in the growing number of sequenced plant genomes. An integrative pipeline, named RGAugury, was developed to automate RGA prediction. The pipeline first identifies RGA-related protein domains and motifs, namely nucleotide binding site (NB-ARC), leucine-rich repeat (LRR), transmembrane (TM), serine/threonine and tyrosine kinase (STTK), lysin motif (LysM), coiled-coil (CC) and Toll/Interleukin-1 receptor (TIR). RGA candidates are identified and classified into four major families based on the presence of combinations of these domains and motifs: NBS-encoding, TM-CC, and membrane-associated RLP and RLK. All time-consuming analyses in the pipeline are parallelized to improve performance. The pipeline was evaluated using the well-annotated Arabidopsis genome: 98.5%, 85.2%, and 100% of the reported NBS-encoding genes, membrane-associated RLPs, and RLKs were validated, respectively. The pipeline was also successfully applied to predict RGAs for 50 sequenced plant genomes. A user-friendly web interface was implemented to ease command-line operations, facilitate visualization, and simplify result management for multiple datasets. RGAugury is an efficient, integrative bioinformatics tool for large-scale genome-wide identification of RGAs. It is freely available at Bitbucket: https://bitbucket.org/yaanlpc/rgaugury
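
    The family-assignment step described above can be pictured as a simple decision over the set of detected domains. The rules below are a simplification inferred from the abstract, not RGAugury's actual decision table.

    # Schematic RGA family assignment from detected domains/motifs (assumed rules).
    def classify_rga(domains: set) -> str:
        if "NB-ARC" in domains:
            return "NBS-encoding"      # any NB-ARC-containing candidate
        if {"TM", "LRR", "STTK"} <= domains:
            return "RLK"               # membrane-associated receptor-like kinase
        if {"TM", "LRR"} <= domains:
            return "RLP"               # receptor-like protein, no kinase domain
        if {"TM", "CC"} <= domains:
            return "TM-CC"
        return "unclassified"

    print(classify_rga({"TIR", "NB-ARC", "LRR"}))   # -> NBS-encoding
    print(classify_rga({"TM", "LRR", "STTK"}))      # -> RLK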

  16. Pipeline repair development in support of the Oman to India gas pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abadie, W.; Carlson, W.

    1995-12-01

    This paper provides a summary of development which has been conducted to date for the ultra deep, diverless pipeline repair system for the proposed Oman to India Gas Pipeline. The work has addressed critical development areas involving testing and/or prototype development of tools and procedures required to perform a diverless pipeline repair in water depths of up to 3,525 m.

  17. 49 CFR 195.210 - Pipeline location.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Pipeline location. 195.210 Section 195.210 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY... PIPELINE Construction § 195.210 Pipeline location. (a) Pipeline right-of-way must be selected to avoid, as...

  18. Recent development in low-constraint fracture toughness testing for structural integrity assessment of pipelines

    NASA Astrophysics Data System (ADS)

    Kang, Jidong; Gianetto, James A.; Tyson, William R.

    2018-03-01

    Fracture toughness measurement is an integral part of structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading, where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines that allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edge-notched tension specimens, SENT or SE(T). We focus our review on test procedure development and automation, round-robin test results, and some common concerns such as the effect of the crack tip, crack-size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.

  19. 77 FR 2126 - Pipeline Safety: Implementation of the National Registry of Pipeline and Liquefied Natural Gas...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-13

    ... Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements.'' The final rule...

  20. 75 FR 72877 - Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... liquid pipelines, and liquefied natural gas (LNG) facilities. These revisions will enhance PHMSA's... of natural gas pipelines, hazardous liquid pipelines, and LNG facilities. Specifically, PHMSA... commodity transported, and type of commodity transported. 8. Modify hazardous liquid operator telephonic...

  1. 78 FR 42889 - Pipeline Safety: Reminder of Requirements for Utility LP-Gas and LPG Pipeline Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket No. PHMSA-2013-0097] Pipeline Safety: Reminder of Requirements for Utility LP-Gas and LPG Pipeline Systems AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION...

  2. Method and system for pipeline communication

    DOEpatents

    Richardson, John G. [Idaho Falls, ID]

    2008-01-29

    A pipeline communication system and method includes a pipeline having a surface extending along at least a portion of the length of the pipeline. A conductive bus is formed on and extends along a portion of the surface of the pipeline. The conductive bus includes a first conductive trace and a second conductive trace, with the first and second conductive traces being adapted to conformally couple with a pipeline at the surface extending along at least a portion of the length of the pipeline. A transmitter for sending information along the conductive bus on the pipeline is coupled thereto, and a receiver for receiving the information from the conductive bus on the pipeline is also coupled to the conductive bus.

  3. An integrated pipeline to create and experience compelling scenarios in virtual reality

    NASA Astrophysics Data System (ADS)

    Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina

    2011-03-01

    One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort required for modeling, element integration, and the software development needed to properly display and interact with the content on the available systems. Still today, most virtual reality applications are tedious to create, and they are hard-wired to the specific display and interaction system available to the developers when creating the application. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modifying information from the virtual environment. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on-the-fly as they are

  4. 76 FR 75894 - Information Collection Activities: Pipelines and Pipeline Rights-of-Way; Submitted for Office of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... pipelines `` * * * for the transportation of oil, natural gas, sulphur, or other minerals, or under such...) Submit repair report 3 1008(f) Submit report of pipeline failure analysis...... 30 1008(g) Submit plan of.... BSEE-2011-0002; OMB Control Number 1010-0050] Information Collection Activities: Pipelines and Pipeline...

  5. 77 FR 16052 - Information Collection Activities: Pipelines and Pipeline Rights-of-Way; Submitted for Office of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-19

    ... submerged lands of the OCS for pipelines ``* * * for the transportation of oil, natural gas, sulphur, or... ensure that the pipeline, as constructed, will provide for safe transportation of oil and gas and other...-0002; OMB Control Number 1014-0016] Information Collection Activities: Pipelines and Pipeline Rights...

  6. Hydrocarbons pipeline transportation risk assessment

    NASA Astrophysics Data System (ADS)

    Zanin, A. V.; Milke, A. A.; Kvasov, I. N.

    2018-04-01

    This paper addresses risk assessment for pipeline transportation of hydrocarbons in Arctic conditions. The quality characteristics of a pipeline in this environment were assessed. To achieve the stated objective, a mathematical model of the pipeline was designed and visualized using the software product SOLIDWORKS. The results obtained from the mathematical model made it possible to define the optimal characteristics for a pipeline designed to rest on the Arctic sea bottom. In the course of the research, the risks of avalanche collapse of the pipe were examined, internal longitudinal and circumferential loads acting on the pipeline were analyzed, and the hydrodynamic force of water impact was taken into consideration. The calculations can contribute to the further development of pipeline transport under the harsh climate conditions of the Russian Federation's Arctic shelf territory.

  7. Pipeline surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-03-01

    Each week, pilots of Michigan Wisconsin Pipe Line Co.'s aerial patrol fly over the entire 12,000 mi of the company's pipeline system to discover possible gas leaks and prevent accidental encroachment of pipeline right-of-way. (Leaks are uncommon, but as many as 10 potential trespassing incidents can occur along a route in a week.) Following the pipeline is facilitated by the warmth emitted by the line, which affects the plant life directly above it. Gas leaks may be indicated by a patch of brown vegetation or by sediment brought to water surfaces by escaping gas bubbles. The 1- and 2-engine patrol planes, which receive special permission from federal authorities to fly at low altitudes, undergo 100-h safety checks and frequent overhauls.

  8. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  9. Commanding Constellations (Pipeline Architecture)

    NASA Technical Reports Server (NTRS)

    Ray, Tim; Condron, Jeff

    2003-01-01

    Providing ground command software for constellations of spacecraft is a challenging problem. Reliable command delivery requires a feedback loop; for a constellation there will likely be an independent feedback loop for each constellation member. Each command must be sent via the proper Ground Station, which may change from one contact to the next (and may be different for different members). Dynamic configuration of the ground command software is usually required (e.g. directives to configure each member's feedback loop and assign the appropriate Ground Station). For testing purposes, there must be a way to insert command data at any level in the protocol stack. The Pipeline architecture described in this paper can support all these capabilities with a sequence of software modules (the pipeline), and a single self-identifying message format (for all types of command data and configuration directives). The Pipeline architecture is quite simple, yet it can solve some complex problems. The resulting solutions are conceptually simple, and therefore, reliable. They are also modular, and therefore, easy to distribute and extend. We first used the Pipeline architecture to design a CCSDS (Consultative Committee for Space Data Systems) Ground Telecommand system (to command one spacecraft at a time with a fixed Ground Station interface). This pipeline was later extended to include gateways to any of several Ground Stations. The resulting pipeline was then extended to handle a small constellation of spacecraft. The use of the Pipeline architecture allowed us to easily handle the increasing complexity. This paper will describe the Pipeline architecture, show how it was used to solve each of the above commanding situations, and how it can easily be extended to handle larger constellations.
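
    A minimal sketch of the Pipeline idea described above: a chain of modules passing a single self-identifying message format, where each stage handles the message kinds it understands and forwards the rest. Module names and the message schema are invented for illustration; the paper does not specify them.

    # Toy pipeline of modules with one self-identifying message format.
    from dataclasses import dataclass, field

    @dataclass
    class Message:
        kind: str                 # self-identifying type, e.g. "command" or "config"
        payload: dict = field(default_factory=dict)

    class Module:
        """A pipeline stage: handles messages it understands, forwards the rest."""
        def __init__(self, downstream=None):
            self.downstream = downstream
        def send(self, msg):
            if not self.handle(msg) and self.downstream:
                self.downstream.send(msg)
        def handle(self, msg):
            return False          # base module forwards everything

    class GroundStationGateway(Module):
        def handle(self, msg):
            if msg.kind == "command":
                print(f"uplinking {msg.payload['cmd']} via {msg.payload['station']}")
                return True
            return False

    class Configurator(Module):
        def handle(self, msg):
            if msg.kind == "config":
                print(f"reconfiguring pipeline: {msg.payload}")
                return True
            return False

    # Assemble a two-stage pipeline and push configuration and command data.
    pipeline = Configurator(downstream=GroundStationGateway())
    pipeline.send(Message("config", {"member": "sat-1", "station": "GS-A"}))
    pipeline.send(Message("command", {"cmd": "NOOP", "station": "GS-A"}))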

  10. Mapping of Brain Activity by Automated Volume Analysis of Immediate Early Genes.

    PubMed

    Renier, Nicolas; Adams, Eliza L; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E; Kadiri, Lolahon; Umadevi Venkataraju, Kannan; Zhou, Yu; Wang, Victoria X; Tang, Cheuk Y; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-06-16

    Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization, and quantification of the activity of all neurons across the entire brain, which has not, to date, been achieved in the mammalian brain. We introduce a pipeline for high-speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Last, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Mapping of brain activity by automated volume analysis of immediate early genes

    PubMed Central

    Renier, Nicolas; Adams, Eliza L.; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E.; Kadiri, Lolahon; Venkataraju, Kannan Umadevi; Zhou, Yu; Wang, Victoria X.; Tang, Cheuk Y.; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-01-01

    Summary Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization and quantification of the activity of all neurons across the entire brain, which has not to date been achieved in the mammalian brain. We introduce a pipeline for high speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to Haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Lastly, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. PMID:27238021

  12. Streamlining workflow and automation to accelerate laboratory scale protein production.

    PubMed

    Konczal, Jennifer; Gray, Christopher H

    2017-05-01

    Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies including crystallography, NMR, ITC and other reagent intensive techniques. It is common for these teams to find themselves a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Automated extraction of radiation dose information for CT examinations.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan; Maidment, Andrew D A; Kim, Woojin; Boonn, William W

    2010-11-01

    Exposure to radiation as a result of medical imaging is currently in the spotlight, receiving attention from Congress as well as the lay press. Although scanner manufacturers are moving toward including effective dose information in the Digital Imaging and Communications in Medicine headers of imaging studies, there is a vast repository of retrospective CT data at every imaging center that stores dose information in an image-based dose sheet. As such, it is difficult for imaging centers to participate in the ACR's Dose Index Registry. The authors have designed an automated extraction system to query their PACS archive and parse CT examinations to extract the dose information stored in each dose sheet. First, an open-source optical character recognition program processes each dose sheet and converts the information to American Standard Code for Information Interchange (ASCII) text. Each text file is parsed, and radiation dose information is extracted and stored in a database which can be queried using an existing pathology and radiology enterprise search tool. Using this automated extraction pipeline, it is possible to perform dose analysis on the >800,000 CT examinations in the PACS archive and generate dose reports for all of these patients. It is also possible to more effectively educate technologists, radiologists, and referring physicians about exposure to radiation from CT by generating report cards for interpreted and performed studies. The automated extraction pipeline enables compliance with the ACR's reporting guidelines and greater awareness of radiation dose to patients, thus resulting in improved patient care and management. Copyright © 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
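
    A hedged sketch of the extraction idea in this record: OCR the image-based dose sheet, then parse dose quantities with regular expressions. The regex targets CTDIvol/DLP fields commonly printed on CT dose sheets; the authors' actual OCR engine, file names, and parser logic may differ.

    # OCR a dose sheet image and regex-extract dose quantities (illustrative).
    import re
    import pytesseract          # wraps the open-source Tesseract OCR engine
    from PIL import Image

    text = pytesseract.image_to_string(Image.open("dose_sheet.png"))  # hypothetical file

    # Matches lines like "CTDIvol 12.34 mGy" or "DLP: 456.7 mGy-cm".
    pattern = re.compile(r"(CTDIvol|DLP)\s*[:=]?\s*(\d+(?:\.\d+)?)", re.IGNORECASE)
    doses = {name.upper(): float(value) for name, value in pattern.findall(text)}
    print(doses)   # e.g. {'CTDIVOL': 12.34, 'DLP': 456.7}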

  14. The X-shooter pipeline

    NASA Astrophysics Data System (ADS)

    Goldoni, P.

    2011-03-01

    The X-shooter data reduction pipeline is an integral part of the X-shooter project, it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium with contributions from France, The Netherlands and ESO and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first function is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for a quick-look control, and in Garching, for a more thorough evaluation. The second function is to allow an optimized data reduction for a scientific user. In the following I will first outline the main steps of data reduction with the pipeline then I will briefly show two examples of optimization of the results for science reduction.

  15. Quantification of common carotid artery and descending aorta vessel wall thickness from MR vessel wall imaging using a fully automated processing pipeline.

    PubMed

    Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J

    2017-01-01

    To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify the wall thickness for both the common carotid artery (CCA) and descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. The vessel wall segmentation was achieved by respectively fitting a 3D cylindrical B-spline surface to the boundaries of the lumen and outer wall. The tube fitting was based on edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough Transform (HT) was developed to estimate the lumen centerline and radii of the target vessel. Using the outputs of the HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initiate the adaptation procedure of the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analysis including Bland-Altman analysis, t-tests, and sample size calculation was performed for the purpose of algorithm evaluation. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for the CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for the DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for both the CCA (P = 0.19) and DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients. A reliable and reproducible pipeline for fully
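
    A simplified stand-in for the automated initialization step described above: use a circular Hough transform to locate the vessel lumen in an axial slice, which can then seed a deformable tube model. Parameter values and the input file are illustrative assumptions, not the paper's settings.

    # Circular Hough transform to find a lumen candidate (illustrative values).
    import cv2
    import numpy as np

    slice_img = cv2.imread("axial_slice.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
    blurred = cv2.medianBlur(slice_img, 5)       # suppress noise before detection

    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
        param1=100, param2=30, minRadius=5, maxRadius=30,
    )
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)
        print(f"lumen candidate: centre=({x},{y}), radius={r}px")  # seeds the tube model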

  16. The pipeline system for Octave and Matlab (PSOM): a lightweight scripting framework and execution engine for scientific workflows

    PubMed Central

    Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P.; Zijdenbos, Alex P.; Evans, Alan C.

    2012-01-01

    The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources. PMID:22493575

  17. The pipeline system for Octave and Matlab (PSOM): a lightweight scripting framework and execution engine for scientific workflows.

    PubMed

    Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P; Zijdenbos, Alex P; Evans, Alan C

    2012-01-01

    The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources.
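
    PSOM itself is Octave/Matlab; the toy below is a Python analogue of the execution-engine concepts named in the abstract: each job is a plain data structure listing its command, inputs and outputs, the engine infers ordering from file names, and jobs whose outputs already exist are skipped on restart. Job names and file names are invented; real PSOM adds parallel execution and full provenance logging.

    # Toy dependency-driven job runner in the spirit of PSOM (not its API).
    import os

    jobs = {
        "motion_correct": {"cmd": "echo mc",  "in": ["raw.nii"], "out": ["mc.nii"]},
        "smooth":         {"cmd": "echo sm",  "in": ["mc.nii"],  "out": ["sm.nii"]},
        "glm":            {"cmd": "echo glm", "in": ["sm.nii"],  "out": ["stats.nii"]},
    }
    all_outputs = {f for job in jobs.values() for f in job["out"]}

    def ready(job, produced):
        # An input is satisfied if a finished job produced it, or if no job
        # produces it at all (an external input such as the raw data).
        return all(f in produced or f not in all_outputs for f in job["in"])

    produced, pending = set(), dict(jobs)
    while pending:
        runnable = [name for name, job in pending.items() if ready(job, produced)]
        if not runnable:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
        for name in runnable:
            job = pending.pop(name)
            if all(os.path.exists(f) for f in job["out"]):
                print(f"skip {name} (outputs already exist)")   # restart behaviour
            else:
                print(f"run  {name}: {job['cmd']}")
            produced.update(job["out"])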

  18. About U.S. Natural Gas Pipelines

    EIA Publications

    2007-01-01

    This information product provides the interested reader with a broad and non-technical overview of how the U.S. natural gas pipeline network operates, along with some insights into the many individual pipeline systems that make up the network. While the focus of the presentation is the transportation of natural gas over the interstate and intrastate pipeline systems, information on subjects related to pipeline development, such as system design and pipeline expansion, is also included.

  19. Automated reliability assessment for spectroscopic redshift measurements

    NASA Astrophysics Data System (ADS)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10^6) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aims. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods. We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results. As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions. Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for
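
    A schematic of the unsupervised step described above: summarize each redshift posterior PDF by a few descriptors, then partition the descriptors into clusters that serve as candidate reliability labels. The feature choices, mock posteriors, and cluster count are illustrative; the paper's actual feature set is richer.

    # Cluster redshift-PDF descriptors as proxy reliability labels (toy data).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    z_grid = np.linspace(0.0, 2.0, 500)
    dz = z_grid[1] - z_grid[0]

    def pdf_features(pdf):
        """Descriptors of a redshift PDF: peak height, dispersion, entropy."""
        pdf = pdf / (pdf.sum() * dz)                       # normalize to unit area
        mean = (z_grid * pdf).sum() * dz
        sigma = np.sqrt((((z_grid - mean) ** 2) * pdf).sum() * dz)
        entropy = -(pdf * np.log(pdf + 1e-12)).sum() * dz
        return [pdf.max(), sigma, entropy]

    # Mock posteriors: narrow unimodal (reliable-looking) vs broad (unreliable).
    pdfs = [np.exp(-0.5 * ((z_grid - mu) / s) ** 2)
            for mu, s in zip(rng.uniform(0.2, 1.8, 100), rng.choice([0.01, 0.3], 100))]
    X = np.array([pdf_features(p) for p in pdfs])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(labels))   # sizes of the two candidate reliability clusters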

  20. Alaska oil pipeline in retrospect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, D.R.

    Caribou have not adjusted as well as moose to the presence of the trans-Alaska pipeline. Research has shown that caribou have altered their movement patterns and range use in relation to the pipeline corridor. Cows with calves show pronounced avoidance of the pipeline, road, and oil field. Traffic and human activity appear more directly responsible for avoidance behavior than does the physical presence of the pipeline, road, and facilities. Animals along the haul road are especially vulnerable to poaching because of the open terrain and the fact that many became tame during the peak of construction activity. Poaching, especially of furbearers, has increased as pipeline-related traffic has decreased.

  1. 75 FR 4134 - Pipeline Safety: Leak Detection on Hazardous Liquid Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... http://dms.dot.gov . General information about the PHMSA Office of Pipeline Safety (OPS) can be... of leak detection by tracking product movement is essential to an understanding of line balance... pipelines, the line balance technique for leak detection can often be performed with manual calculations...

  2. 77 FR 6857 - Pipeline Safety: Notice of Public Meetings on Improving Pipeline Leak Detection System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... installed to lessen the volume of natural gas and hazardous liquid released during catastrophic pipeline... p.m. Panel 3: Considerations for Natural Gas Pipeline Leak Detection Systems 3:30 p.m. Break 3:45 p...

  3. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline

    PubMed Central

    2014-01-01

    Background Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911

  4. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    PubMed

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate functional significance of a majority of genome sequences and their long-range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve the reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools offer an expert mode in which the user decides on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long-insert-size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected onto the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes, compared to the previous 70%. Unknown sites (N) were reduced from 17.3% to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in
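
    The record above reports assembly improvement in terms of N50. For reference, the sketch below is the standard N50 computation on a list of scaffold lengths (toy numbers, not the Musa data).

    # N50: length L such that scaffolds of length >= L cover half the assembly.
    def n50(lengths):
        total = sum(lengths)
        running = 0
        for length in sorted(lengths, reverse=True):
            running += length
            if running * 2 >= total:
                return length

    print(n50([3_000_000, 2_000_000, 1_000_000, 500_000]))  # -> 2000000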

  5. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record.

    PubMed

    Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2012-06-01

    Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
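
    A minimal sketch of the high-precision configuration implied above: run two OCR engines on the same field and accept a value only when they agree, deferring disagreements to human abstraction. The engine wrappers here are hypothetical stand-ins (the study used the commercial Nuance and LEADTOOLS engines).

    # Consensus of two OCR engines on one form field (hypothetical wrappers).
    def engine_a(image):      # stand-in for OCR engine #1
        return "20/40"

    def engine_b(image):      # stand-in for OCR engine #2
        return "20/40"

    def recognize_field(image):
        a, b = engine_a(image), engine_b(image)
        if a == b and a.strip():
            return a          # consensus: favors positive predictive value
        return None           # disagreement: defer to a human abstractor

    print(recognize_field(object()))   # -> "20/40"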

  6. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record

    PubMed Central

    Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2011-01-01

    Background Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871

  7. Automated analysis of immunohistochemistry images identifies candidate location biomarkers for cancers.

    PubMed

    Kumar, Aparna; Rao, Arvind; Bhavani, Santosh; Newberg, Justin Y; Murphy, Robert F

    2014-12-23

    Molecular biomarkers are changes measured in biological samples that reflect disease states. Such markers can help clinicians identify types of cancer or stages of progression, and they can guide in tailoring specific therapies. Many efforts to identify biomarkers consider genes that mutate between normal and cancerous tissues or changes in protein or RNA expression levels. Here we define location biomarkers, proteins that undergo changes in subcellular location that are indicative of disease. To discover such biomarkers, we have developed an automated pipeline to compare the subcellular location of proteins between two sets of immunohistochemistry images. We used the pipeline to compare images of healthy and tumor tissue from the Human Protein Atlas, ranking hundreds of proteins in breast, liver, prostate, and bladder based on how much their location was estimated to have changed. The performance of the system was evaluated by determining whether proteins previously known to change location in tumors were ranked highly. We present a number of candidate location biomarkers for each tissue, and identify biochemical pathways that are enriched in proteins that change location. The analysis technology is anticipated to be useful not only for discovering new location biomarkers but also for enabling automated analysis of biomarker distributions as an aid to determining diagnosis.
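
    A toy version of the ranking step described above: summarize each protein's subcellular location by a feature vector per image set (healthy vs. tumor) and rank proteins by the distance between the two sets' mean features. The feature dimensions and values are fabricated for illustration; the paper's actual location features and statistics differ.

    # Rank proteins by estimated change in subcellular location features.
    import numpy as np

    def location_change_score(normal_feats, tumor_feats):
        """Distance between mean location-feature vectors of two image sets."""
        return float(np.linalg.norm(normal_feats.mean(axis=0) - tumor_feats.mean(axis=0)))

    rng = np.random.default_rng(3)
    proteins = {
        "PROT_A": (rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))),  # shifts
        "PROT_B": (rng.normal(0, 1, (20, 5)), rng.normal(0, 1, (20, 5))),  # unchanged
    }
    ranking = sorted(proteins, key=lambda p: location_change_score(*proteins[p]),
                     reverse=True)
    print(ranking)   # candidate location biomarkers come first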

  8. Automated X-ray image analysis for cargo security: Critical review and future promise.

    PubMed

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  9. 77 FR 45417 - Pipeline Safety: Inspection and Protection of Pipeline Facilities After Railway Accidents

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Accidents AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ...

  10. Fast, Automated, Photo realistic, 3D Modeling of Building Interiors

    DTIC Science & Technology

    2016-09-12

    In this project, we developed two algorithmic pipelines for GPS-denied indoor mobile 3D mapping using an ambulatory backpack system. By mounting scanning equipment on a backpack system, a human operator can traverse the interior of a building to produce a high-quality 3D reconstruction. In each of our...

  11. 76 FR 28326 - Pipeline Safety: National Pipeline Mapping System Data Submissions and Submission Dates for Gas...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR 191... Reports AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Issuance of... Pipeline and Hazardous Materials Safety Administration (PHMSA) published a final rule on November 26, 2010...

  12. Freight pipelines: Current status and anticipated future use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-07-01

    This report is issued by the Task Committee on Freight Pipelines, Pipeline Division, ASCE. Freight pipelines of various types (including slurry pipeline, pneumatic pipeline, and capsule pipeline) have been used throughout the world for over a century for transporting solid and sometimes even package products. Recent advancements in pipeline technology, aided by advanced computer control systems and trenchless technologies, have greatly facilitated the transportation of solids by pipelines. Today, in many situations, freight pipelines are not only the most economical and practical means for transporting solids, they are also the most reliable, safest and most environmentally friendly transportation mode. Increased use of underground pipelines to transport freight is anticipated in the future, especially as the technology continues to improve and surface transportation modes such as highways become more congested. This paper describes the state of the art and expected future uses of various types of freight pipelines. Obstacles hindering the development and use of the most advanced freight pipeline systems, such as the pneumatic capsule pipeline for interstate transport of freight, are discussed.

  13. SERPent: Automated reduction and RFI-mitigation software for e-MERLIN

    NASA Astrophysics Data System (ADS)

    Peck, Luke W.; Fenech, Danielle M.

    2013-08-01

    The Scripted E-merlin Rfi-mitigation PipelinE for iNTerferometry (SERPent) is an automated reduction and RFI-mitigation procedure utilising the SumThreshold methodology (Offringa et al., 2010a), originally developed for the LOFAR pipeline. SERPent is written in the Parseltongue language enabling interaction with the Astronomical Image Processing Software (AIPS) program. Moreover, SERPent is a simple 'out of the box' Python script, which is easy to set up and is free of compilers. In addition to the flagging of RFI-affected visibilities, the script also flags antenna zero-amplitude dropouts and Lovell telescope phase calibrator stationary scans inherent to the e-MERLIN system. Both the flagging and computational performances of SERPent are presented here, for e-MERLIN commissioning datasets for both L-band (1.3-1.8 GHz) and C-band (4-8 GHz) observations. RFI typically amounts to <20%-25% for the more problematic L-band observations and <5% for the generally RFI-quieter C-band. The level of RFI detection and flagging is more accurate and delicate than visual manual flagging, with the output immediately ready for AIPS calibration. SERPent is fully parallelised and has been tested on a range of computing systems. The current flagging rate is 110 GB day^-1 on a 'high-end' computer (16 CPUs, 100 GB memory), which amounts to ~6.9 GB CPU^-1 day^-1, with an expected increase in performance when e-MERLIN has completed its commissioning. The refining of automated reduction and calibration procedures is essential for the e-MERLIN legacy projects and future interferometers such as the SKA and the associated pathfinders (MeerKAT and ASKAP), where the vast data sizes (>TB) make traditional astronomer interactions unfeasible.
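
    A compact, one-dimensional rendition of the SumThreshold idea SERPent builds on (Offringa et al., 2010a): progressively larger windows of samples are flagged when their sum exceeds a threshold that decreases with window size. The constants chi1 and rho and the mock data are illustrative; real implementations iterate over the full time-frequency plane and combine this with surface fitting.

    # Simplified SumThreshold flagging over a 1D visibility series.
    import numpy as np

    def sumthreshold(data, chi1, rho=1.5, max_window=16):
        flags = np.zeros(len(data), dtype=bool)
        m = 1
        while m <= max_window:
            chi = chi1 / rho ** np.log2(m)   # threshold shrinks as windows grow
            # Clip already-flagged samples to the threshold so a single strong
            # spike cannot drag its neighbours' window sums upward.
            work = np.where(flags, chi, data)
            for start in range(len(data) - m + 1):
                if work[start:start + m].sum() > chi * m:
                    flags[start:start + m] = True
            m *= 2
        return flags

    rng = np.random.default_rng(1)
    vis = np.abs(rng.normal(0.0, 1.0, 256))
    vis[100:104] += 12.0                      # injected RFI burst
    print(np.flatnonzero(sumthreshold(vis, chi1=5.0)))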

  14. Autoreject: Automated artifact rejection for MEG and EEG data.

    PubMed

    Jas, Mainak; Engemann, Denis A; Bekhti, Yousra; Raimondo, Federico; Gramfort, Alexandre

    2017-10-01

    We present an automated algorithm for unified rejection and repair of bad trials in magnetoencephalography (MEG) and electroencephalography (EEG) signals. Our method capitalizes on cross-validation in conjunction with a robust evaluation metric to estimate the optimal peak-to-peak threshold - a quantity commonly used for identifying bad trials in M/EEG. This approach is then extended to a more sophisticated algorithm which estimates this threshold for each sensor yielding trial-wise bad sensors. Depending on the number of bad sensors, the trial is then repaired by interpolation or by excluding it from subsequent analysis. All steps of the algorithm are fully automated thus lending itself to the name Autoreject. In order to assess the practical significance of the algorithm, we conducted extensive validation and comparisons with state-of-the-art methods on four public datasets containing MEG and EEG recordings from more than 200 subjects. The comparisons include purely qualitative efforts as well as quantitatively benchmarking against human supervised and semi-automated preprocessing pipelines. The algorithm allowed us to automate the preprocessing of MEG data from the Human Connectome Project (HCP) going up to the computation of the evoked responses. The automated nature of our method minimizes the burden of human inspection, hence supporting scalability and reliability demanded by data analysis in modern neuroscience. Copyright © 2017 Elsevier Inc. All rights reserved.
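
    A bare-bones version of the global-threshold search Autoreject automates: candidate peak-to-peak thresholds are scored by how well the mean of the retained training trials matches a robust (median) estimate from validation trials. Array shapes, the candidate grid, and the scoring metric follow the idea described in the abstract, not the package's exact code.

    # Cross-validated peak-to-peak rejection threshold (conceptual sketch).
    import numpy as np

    rng = np.random.default_rng(2)
    trials = rng.normal(0.0, 1e-6, (80, 32, 100))        # trials x sensors x samples
    trials[:8] += rng.normal(0.0, 5e-5, (8, 32, 100))    # contaminate a few trials

    def cv_error(threshold, train, valid):
        ptp = np.ptp(train, axis=2).max(axis=1)          # worst-sensor ptp per trial
        kept = train[ptp < threshold]
        if len(kept) == 0:
            return np.inf
        # RMSE between mean of kept trials and the robust validation target.
        return np.sqrt(((kept.mean(axis=0) - np.median(valid, axis=0)) ** 2).mean())

    train, valid = trials[:60], trials[60:]
    candidates = np.linspace(2e-6, 2e-4, 50)
    best = min(candidates, key=lambda t: cv_error(t, train, valid))
    print(f"estimated peak-to-peak rejection threshold: {best:.2e}")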

  15. An automated microphysiological assay for toxicity evaluation.

    PubMed

    Eggert, S; Alexander, F A; Wiest, J

    2015-08-01

    Screening a newly developed drug, food additive or cosmetic ingredient for toxicity is a critical preliminary step before it can move forward in the development pipeline. Due to the sometimes dire consequences when a harmful agent is overlooked, toxicologists work under strict guidelines to effectively catalogue and classify new chemical agents. Conventional assays involve long experimental hours and many manual steps that increase the probability of user error; errors that can potentially manifest as inaccurate toxicology results. Automated assays can overcome many potential mistakes that arise due to human error. In the presented work, we created and validated a novel, automated platform for a microphysiological assay that can examine cellular attributes with sensors measuring changes in cellular metabolic rate, oxygen consumption, and vitality mediated by exposure to a potentially toxic agent. The system was validated with low buffer culture medium with varied conductivities that caused changes in the measured impedance on integrated impedance electrodes.

  16. Lateral instability of high temperature pipelines, the 20-in. Sleipner Vest pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saevik, S.; Levold, E.; Johnsen, O.K.

    1996-12-01

    The present paper addresses methods to control snaking behavior of high temperature pipelines resting on a flat sea bed. A case study is presented based on the detail engineering of the 12.5 km long 20 inch gas pipeline connecting the Sleipner Vest wellhead platform to the Sleipner T processing platform in the North Sea. The study includes screening and evaluation of alternative expansion control methods, ending up with a recommended method. The methodology and philosophy, used as the basis to ensure sufficient structural strength throughout the lifetime of the pipeline, are thereafter presented. The results show that in order to find the optimum technical solution to control snaking behavior, many aspects need to be considered, such as process requirements, allowable strain, hydrodynamic stability, vertical profile, pipelay installation and trawlboard loading. It is concluded that by proper consideration of all the above aspects, the high temperature pipeline can be designed to obtain a sufficient safety level.

  17. The Hyper Suprime-Cam software pipeline

    NASA Astrophysics Data System (ADS)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  18. 78 FR 53190 - Pipeline Safety: Notice to Operators of Hazardous Liquid and Natural Gas Pipelines of a Recall on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0185] Pipeline Safety: Notice to Operators of Hazardous Liquid and Natural Gas Pipelines of a Recall on Leak Repair Clamps Due to Defective Seal AGENCY: Pipeline and Hazardous Materials Safety...

  19. DDBJ read annotation pipeline: a cloud computing-based pipeline for high-throughput analysis of next-generation sequencing data.

    PubMed

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-08-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.

  20. DDBJ Read Annotation Pipeline: A Cloud Computing-Based Pipeline for High-Throughput Analysis of Next-Generation Sequencing Data

    PubMed Central

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-01-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/. PMID:23657089

  1. 77 FR 19414 - Pipeline Safety: Public Comment on Leak and Valve Studies Mandated by the Pipeline Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Safety, Regulatory Certainty, and Job Creation Act of 2011 AGENCY: Pipeline and Hazardous Materials... Transportation (DOT), Pipeline and Hazardous Materials Safety Administration (PHMSA) is providing an important...

  2. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

    Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology. PMID:28910313
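
    The overall shape of that winning pipeline can be sketched with standard tooling. The snippet below assumes peak detection has already reduced each measurement to (retention time, drift time) coordinates; cluster labels then become compound features for a Random Forest. All data and parameters are synthetic stand-ins, and the paper's SGLTR/LM detectors on raw spectra are not reproduced.

        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        # Toy data: 40 measurements, each with one peak near 3 shared compound locations.
        centers = np.array([[1.0, 2.0], [3.0, 1.5], [2.0, 4.0]])
        peaks = np.vstack([c + rng.normal(0, 0.05, (40, 2)) for c in centers])

        # (ii) Cluster peaks across measurements; each cluster is one putative compound.
        labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(peaks)

        # Build a measurement-by-compound feature matrix (presence/absence here).
        X = np.zeros((40, labels.max() + 1))
        for cluster in range(labels.max() + 1):
            rows = np.where(labels == cluster)[0] % 40  # measurement index of each peak
            X[rows, cluster] = 1.0
        y = np.repeat([0, 1], 20)                       # hypothetical disease labels

        # (iii) Random Forest classification, evaluated by cross-validated AUC.
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())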

  3. Toward cognitive pipelines of medical assistance algorithms.

    PubMed

    Philipp, Patrick; Maleshkova, Maria; Katic, Darko; Weber, Christian; Götz, Michael; Rettinger, Achim; Speidel, Stefanie; Kämpgen, Benedikt; Nolden, Marco; Wekerle, Anna-Laura; Dillmann, Rüdiger; Kenngott, Hannes; Müller, Beat; Studer, Rudi

    2016-09-01

    Assistance algorithms for medical tasks have great potential to support physicians with their daily work. However, medicine is also one of the most demanding domains for computer-based support systems, since medical assistance tasks are complex and the practical experience of the physician is crucial. Recent developments in the area of cognitive computing appear to be well suited to tackle medicine as an application domain. We propose a system based on the idea of cognitive computing, consisting of auto-configurable medical assistance algorithms and their self-adapting combination. The system enables automatic execution of new algorithms, given they are made available as Medical Cognitive Apps and are registered in a central semantic repository. Learning components can be added to the system to optimize the results in cases where numerous Medical Cognitive Apps are available for the same task. Our prototypical implementation is applied to the areas of surgical phase recognition based on sensor data and image processing for tumor progression mappings. Our results suggest that such assistance algorithms can be automatically configured in execution pipelines, candidate results can be automatically scored and combined, and the system can learn from experience. Furthermore, our evaluation shows that the Medical Cognitive Apps provide the same correct results as they do for local execution and run in a reasonable amount of time. The proposed solution is applicable to a variety of medical use cases and effectively supports the automated and self-adaptive configuration of cognitive pipelines based on medical interpretation algorithms.
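
    A toy sketch of the registration-and-combination idea follows; the registry, app names, and confidence-based scorer below are all hypothetical stand-ins for the paper's semantic repository and learning components.

        from statistics import mean

        REGISTRY = {}

        def register(task):
            """Register a function as a cognitive app for a given task."""
            def wrap(fn):
                REGISTRY.setdefault(task, []).append(fn)
                return fn
            return wrap

        @register("phase_recognition")
        def sensor_based_app(data):
            return {"phase": "cutting", "confidence": 0.7}

        @register("phase_recognition")
        def video_based_app(data):
            return {"phase": "cutting", "confidence": 0.9}

        def run_task(task, data):
            """Execute every registered app and combine the candidate results."""
            candidates = [app(data) for app in REGISTRY[task]]
            by_answer = {}
            for c in candidates:
                by_answer.setdefault(c["phase"], []).append(c["confidence"])
            # Score each distinct answer by mean confidence and return the best.
            return max(by_answer, key=lambda k: mean(by_answer[k]))

        print(run_task("phase_recognition", data=None))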

  4. U.S. pipeline industry enters new era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnsen, M.R.

    1999-11-01

    The largest construction project in North America this year and next--the Alliance Pipeline--marks some advances for the US pipeline industry. With the Alliance Pipeline system (Alliance), mechanized welding and ultrasonic testing are making their debuts in the US as primary mainline construction techniques. Particularly in Canada and Europe, mechanized welding technology has been used for both onshore and offshore pipeline construction for at least 15 years. However, it has never before been used to build a cross-country pipeline in the US, although it has been tested on short segments. This time, however, an accelerated construction schedule, among other reasons, necessitated the use of mechanized gas metal arc welding (GMAW). The $3-billion pipeline will deliver natural gas from northwestern British Columbia and northeastern Alberta in Canada to a hub near Chicago, Ill., where it will connect to the North American pipeline grid. Once the pipeline is completed and buried, crews will return the topsoil. Corn and other crops will reclaim the land. While the casual passerby probably won't know the Alliance pipeline is there, it may have a far-reaching effect on the way mainline pipelines are built in the US. For even though mechanized welding and ultrasonic testing are being used for the first time in the United States on this project, some US workers had already gained experience with the technology on projects elsewhere. And work on this pipeline has certainly developed a much larger pool of experienced workers for industry to draw from. The Alliance project could well signal the start of a new era in US pipeline construction.

  5. The Hyper Suprime-Cam software pipeline

    DOE PAGES

    Bosch, James; Armstrong, Robert; Bickerton, Steven; ...

    2017-10-12

    In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  6. The Hyper Suprime-Cam software pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosch, James; Armstrong, Robert; Bickerton, Steven

    In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  7. PRADA: pipeline for RNA sequencing data analysis.

    PubMed

    Torres-García, Wandaliz; Zheng, Siyuan; Sivachenko, Andrey; Vegesna, Rahulsimham; Wang, Qianghu; Yao, Rong; Berger, Michael F; Weinstein, John N; Getz, Gad; Verhaak, Roel G W

    2014-08-01

    Technological advances in high-throughput sequencing necessitate improved computational tools for processing and analyzing large-scale datasets in a systematic automated manner. For that purpose, we have developed PRADA (Pipeline for RNA-Sequencing Data Analysis), a flexible, modular and highly scalable software platform that provides many different types of information available by multifaceted analysis starting from raw paired-end RNA-seq data: gene expression levels, quality metrics, detection of unsupervised and supervised fusion transcripts, detection of intragenic fusion variants, homology scores and fusion frame classification. PRADA uses a dual-mapping strategy that increases sensitivity and refines the analytical endpoints. PRADA has been used extensively and successfully in the glioblastoma and renal clear cell projects of The Cancer Genome Atlas program. Availability: http://sourceforge.net/projects/prada/. Contact: gadgetz@broadinstitute.org or rverhaak@mdanderson.org. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Pipelining in a changing competitive environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E.G.; Wishart, D.M.

    1996-12-31

    The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting-edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long-term, cost-effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance of incidents of pipeline failure. A creative systems approach to these challenges is often the key to the project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, and international lending agencies and financial institutions.

  9. The Very Large Array Data Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a “context” structure for tracking the heuristic decisions and processing results. The pipeline “weblog” acts as the user interface in verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science ready data products. The pipeline is developed as part of the CASA software package by an
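
    The “context” pattern described above is easy to illustrate. The classes and heuristics below are invented for illustration and are not the actual CASA/VLA pipeline API; they only show how atomic tasks can read earlier results from a shared context, apply a heuristic, and record their decisions for later inspection.

        class Context(dict):
            """Shared record of heuristic decisions and per-stage results."""

        class FlagTask:
            name = "flagdata"
            def run(self, context):
                # A real task would inspect the measurement set here.
                context[self.name] = {"flagged_fraction": 0.08}

        class CalibrateTask:
            name = "calibrate"
            def run(self, context):
                # Heuristic: heavily flagged data gets a longer solution interval.
                heavy = context["flagdata"]["flagged_fraction"] > 0.25
                context[self.name] = {"solint": "inf" if heavy else "int"}

        ctx = Context()
        for task in (FlagTask(), CalibrateTask()):
            task.run(ctx)
        print(ctx)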

  10. Improved coal-slurry pipeline

    NASA Technical Reports Server (NTRS)

    Dowler, W. L.

    1979-01-01

    High strength steel pipeline carries hot mixture of powdered coal and coal derived oil to electric-power-generating station. Slurry is processed along way to remove sulfur, ash, and nitrogen and to recycle part of oil. System eliminates hazards and limitations associated with anticipated coal/water-slurry pipelines.

  11. drPACS: A Simple UNIX Execution Pipeline

    NASA Astrophysics Data System (ADS)

    Teuben, P.

    2011-07-01

    We describe a very simple yet flexible and effective pipeliner for UNIX commands. It creates a Makefile to define a set of serially dependent commands. The commands in the pipeline share a common set of parameters by which they can communicate. Commands must follow a simple convention to retrieve and store parameters. Pipeline parameters can optionally be made persistent across multiple runs of the pipeline. Tools were added to simplify running a large series of pipelines, which can then also be run in parallel.
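
    The Makefile-generation idea can be sketched in a few lines of Python. The step names, commands, and stamp-file convention below are illustrative rather than drPACS's actual conventions: each target depends on the previous one, so make runs the commands serially and skips steps whose stamp files are already up to date.

        steps = [
            ("fetch",   "wget -q http://example.org/data.fits -O data.fits"),
            ("reduce",  "python reduce.py data.fits reduced.fits"),
            ("analyze", "python analyze.py reduced.fits results.txt"),
        ]

        with open("Makefile", "w") as mk:
            mk.write("all: %s\n\n" % steps[-1][0])
            prev = ""
            for name, cmd in steps:
                # Each target depends on its predecessor and leaves a stamp file.
                mk.write(f"{name}: {prev}\n\t{cmd}\n\ttouch {name}\n\n")
                prev = name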

  12. Acoustic system for communication in pipelines

    DOEpatents

    Martin, II, Louis Peter; Cooper, John F [Oakland, CA

    2008-09-09

    A system for communication in a pipe, or pipeline, or network of pipes containing a fluid. The system includes an encoding and transmitting sub-system connected to the pipe, or pipeline, or network of pipes that transmits a signal in the frequency range of 3-100 kHz into the pipe, or pipeline, or network of pipes containing a fluid, and a receiver and processor sub-system connected to the pipe, or pipeline, or network of pipes containing a fluid that receives said signal and uses said signal for a desired application.

  13. High throughput light absorber discovery, Part 1: An algorithm for automated tauc analysis

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.

    2016-09-23

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by automated algorithm for 60 optical spectra.
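
    A simplified version of the automated Tauc step is easy to write down. The sketch below assumes an absorption coefficient alpha on an energy grid E (in eV) and a direct-allowed transition, so the Tauc variable is (alpha*E)^2; it fits a line around the steepest rise and extrapolates to zero. The published algorithm's expert-mimicking selection of fit segments is considerably more careful than this.

        import numpy as np

        def tauc_band_gap(E, alpha, fit_points=15):
            y = (alpha * E) ** 2                      # direct-allowed Tauc variable
            i = int(np.argmax(np.gradient(y, E)))     # centre the fit on the steepest rise
            lo, hi = max(i - fit_points // 2, 0), min(i + fit_points // 2, E.size)
            slope, intercept = np.polyfit(E[lo:hi], y[lo:hi], 1)
            return -intercept / slope                 # extrapolate the fitted line to y = 0

        # Toy spectrum with a 2.1 eV direct gap plus noise.
        rng = np.random.default_rng(3)
        E = np.linspace(1.5, 3.5, 300)
        alpha = np.sqrt(np.clip(E - 2.1, 0, None)) / E + rng.normal(0, 1e-3, E.size)
        print(round(tauc_band_gap(E, alpha), 2))      # approximately 2.1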

  14. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    NASA Astrophysics Data System (ADS)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) for sample exchange robots at PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.

  15. Magnetic pipeline for coal and oil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knolle, E.

    1998-07-01

    A 1994 analysis of the recorded costs of the Alaska oil pipeline, in a paper entitled Maglev Crude Oil Pipeline (NASA CP-3247, pp. 671-684), concluded that, had the Knolle Magnetrans pipeline technology been available and used, some $10 million per day in transportation costs could have been saved over the 20 years of the Alaska oil pipeline's existence. This more than 800-mile-long pipeline requires about 500 horsepower per mile in pumping power, which together with the cost of the pipeline's capital investment consumes about one-third of the energy value of the pumped oil. This does not include the cost of getting the oil out of the ground. The reason maglev technology performs better than conventional pipelines is that by magnetically levitating the oil into contact-free suspension, there is no drag-causing adhesion. In addition, by using permanent magnets in repulsion, suspension is achieved without using energy. Also, the pumped oil's adhesion to the inside of pipes limits its speed. In the case of the Alaska pipeline the speed is limited to about 7 miles per hour, which, with its 48-inch pipe diameter and 1200 psi pressure, pumps about 2 million barrels per day. The maglev system, as developed by Knolle Magnetrans, would transport oil in magnetically suspended sealed containers and, thus free of adhesion, at speeds 10 to 20 times faster. Furthermore, the diameter of the levitated containers can be made smaller with the same capacity, which makes the construction of the maglev system light and inexpensive. There are similar advantages when using maglev technology to transport coal. Also, a maglev system has advantages over railroads in mountainous regions where coal is primarily mined. A maglev pipeline can travel, all year and in all weather, in a straight line to the end user, whereas railroads have difficult circuitous routes. In contrast, a maglev pipeline can climb over steep hills without much difficulty.
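
    The quoted Alaska figures are internally consistent, which a few lines of arithmetic confirm: 2 million barrels per day through a 48-inch pipe corresponds to roughly 7 miles per hour of mean flow speed.

        import math

        BARREL_M3 = 0.158987                       # one oil barrel in cubic metres
        flow = 2e6 * BARREL_M3 / 86400             # volumetric flow, m^3/s
        area = math.pi * (48 * 0.0254 / 2) ** 2    # 48-inch pipe cross-section, m^2
        v = flow / area                            # mean flow speed, m/s
        print(f"{v:.2f} m/s = {v * 3600 / 1609.344:.1f} mph")   # ~3.15 m/s, ~7 mph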

  16. Comprehensive investigation into historical pipeline construction costs and engineering economic analysis of Alaska in-state gas pipeline

    NASA Astrophysics Data System (ADS)

    Rui, Zhenhua

    This study analyzes historical cost data of 412 pipelines and 220 compressor stations. On the basis of this analysis, the study also evaluates the feasibility of an Alaska in-state gas pipeline using Monte Carlo simulation techniques. Analysis of pipeline construction costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary by diameter, length, volume, year, and location. Overall average learning rates for pipeline material and labor costs are 6.1% and 12.4%, respectively. Overall average cost shares for pipeline material, labor, miscellaneous, and right of way (ROW) are 31%, 40%, 23%, and 7%, respectively. Regression models are developed to estimate pipeline component costs for different lengths, cross-sectional areas, and locations. An analysis of inaccuracy in pipeline cost estimation demonstrates that the cost estimation of pipeline cost components is biased except in the case of total costs. Overall overrun rates for pipeline material, labor, miscellaneous, ROW, and total costs are 4.9%, 22.4%, -0.9%, 9.1%, and 6.5%, respectively, and project size, capacity, diameter, location, and year of completion have different degrees of impact on cost overruns of pipeline cost components. Analysis of compressor station costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary in terms of capacity, year, and location. Average learning rates for compressor station material and labor costs are 12.1% and 7.48%, respectively. Overall average cost shares of material, labor, miscellaneous, and ROW are 50.6%, 27.2%, 21.5%, and 0.8%, respectively. Regression models are developed to estimate compressor station component costs in different capacities and locations. An investigation into inaccuracies in compressor station cost estimation demonstrates that the cost estimation for compressor stations is biased except in the case of material costs. Overall average
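
    The Monte Carlo feasibility step can be illustrated schematically. The sketch below reuses the overall pipeline cost shares and overrun rates quoted above but invents everything else (the $1B base estimate and the 20% per-component spread are hypothetical), so the percentiles it prints are in no way the dissertation's results.

        import numpy as np

        rng = np.random.default_rng(4)
        N = 100_000
        shares = {"material": 0.31, "labor": 0.40, "misc": 0.23, "row": 0.07}
        overrun = {"material": 0.049, "labor": 0.224, "misc": -0.009, "row": 0.091}
        estimate = 1e9                      # hypothetical $1B base estimate

        total = np.zeros(N)
        for comp, share in shares.items():
            mean = estimate * share * (1 + overrun[comp])
            total += rng.normal(mean, 0.2 * mean, N)   # assumed 20% spread per component

        p10, p50, p90 = np.percentile(total, [10, 50, 90])
        print(f"P10 ${p10/1e9:.2f}B, P50 ${p50/1e9:.2f}B, P90 ${p90/1e9:.2f}B")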

  17. STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.

    PubMed

    Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X

    2009-08-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.

  18. Comparisons of sediment losses from a newly constructed cross-country natural gas pipeline and an existing in-road pipeline

    Treesearch

    Pamela J. Edwards; Bridget M. Harrison; Daniel J. Holz; Karl W.J. Williard; Jon E. Schoonover

    2014-01-01

    Sediment loads were measured for about one year from natural gas pipelines in two studies in north central West Virginia. One study involved a 1-year-old pipeline buried within the bed of a 25-year-old skid road, and the other involved a newly constructed cross-country pipeline. Both pipelines were the same diameter and were installed using similar trenching and...

  19. 77 FR 27279 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... collections relate to the pipeline integrity management requirements for gas transmission pipeline operators... Management in High Consequence Areas Gas Transmission Pipeline Operators. OMB Control Number: 2137-0610...

  20. 75 FR 53733 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0246] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous... liquefied natural gas, hazardous liquid, and gas transmission pipeline systems operated by a company. The...

  1. 77 FR 46155 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... collections relate to the pipeline integrity management requirements for gas transmission pipeline operators... Management in High Consequence Areas Gas Transmission Pipeline Operators. OMB Control Number: 2137-0610...

  2. 78 FR 46560 - Pipeline Safety: Class Location Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-01

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... class location requirements for gas transmission pipelines. Section 5 of the Pipeline Safety, Regulatory... and, with respect to gas transmission pipeline facilities, whether applying IMP requirements to...

  3. 77 FR 15453 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-15

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... information collection titled, ``Gas Pipeline Safety Program Certification and Hazardous Liquid Pipeline... collection request that PHMSA will be submitting to OMB for renewal titled, ``Gas Pipeline Safety Program...

  4. Capsule injection system for a hydraulic capsule pipelining system

    DOEpatents

    Liu, Henry

    1982-01-01

    An injection system for injecting capsules into a hydraulic capsule pipelining system, the pipelining system comprising a pipeline adapted for flow of a carrier liquid therethrough, and capsules adapted to be transported through the pipeline by the carrier liquid flowing through the pipeline. The injection system comprises a reservoir of carrier liquid, the pipeline extending within the reservoir and extending downstream out of the reservoir, and a magazine in the reservoir for holding capsules in a series, one above another, for injection into the pipeline in the reservoir. The magazine has a lower end in communication with the pipeline in the reservoir for delivery of capsules from the magazine into the pipeline.

  5. 77 FR 51848 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-27

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Program for Gas Distribution Pipelines. DATES: Interested persons are invited to submit comments on or.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and...

  6. 77 FR 26822 - Pipeline Safety: Verification of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0068] Pipeline Safety: Verification of Records AGENCY: Pipeline and Hazardous Materials... issuing an Advisory Bulletin to remind operators of gas and hazardous liquid pipeline facilities to verify...

  7. 77 FR 74275 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and... control room. Affected Public: Operators of both natural gas and hazardous liquid pipeline systems. Annual...

  8. INTERNAL REPAIR OF PIPELINES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robin Gordon; Bill Bruce; Nancy Porter

    2003-05-01

    The two broad categories of deposited weld metal repair and fiber-reinforced composite repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repairs and for fiber-reinforced composite repair. To date, all of the experimental work pertaining to the evaluation of potential repair methods has focused on fiber-reinforced composite repairs. Hydrostatic testing was also conducted on four pipeline sections with simulated corrosion damage: twomore » with composite liners and two without.« less

  9. Pipeline monitoring with unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Kochetkova, L. I.

    2018-05-01

    Pipeline leakage during transportation of combustible substances leads to explosions and fires, causing loss of life and destruction of production and residential facilities. Continuous pipeline monitoring allows leaks to be identified in due time and measures for their elimination to be taken quickly. The paper describes a solution for identifying pipeline leaks using unmanned aerial vehicles. Spectral analysis of the RGB input signal is recommended for identifying pipeline damage. Multi-zone digital images make it possible to detect potential spills of oil hydrocarbons as well as possible soil pollution. Multi-temporal digital images within the visible region make it possible to detect changes in soil morphology for subsequent analysis. The solution is cost-efficient and reliable, reducing time and labor in comparison with other methods of pipeline monitoring.
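
    A toy version of the RGB spectral check is sketched below: oil films tend to darken and de-green vegetation and soil, so a simple per-pixel band ratio combined with a brightness test can flag candidate leak regions for closer inspection. The thresholds are purely illustrative, and a real deployment would need radiometric calibration plus the multi-temporal comparison described above.

        import numpy as np

        def leak_candidates(rgb, ratio_thresh=0.9, dark_thresh=60):
            """rgb: (H, W, 3) uint8 frame; returns a boolean candidate mask."""
            r, g, b = (rgb[..., i].astype(float) for i in range(3))
            green_ratio = g / (r + b + 1e-6)      # healthy vegetation scores high
            brightness = rgb.mean(axis=2)
            return (green_ratio < ratio_thresh) & (brightness < dark_thresh)

        frame = np.full((4, 4, 3), 120, dtype=np.uint8)
        frame[1:3, 1:3] = (40, 35, 30)            # dark, de-greened patch
        print(leak_candidates(frame).sum(), "candidate pixels")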

  10. 75 FR 73160 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...-Related Conditions on Gas, Hazardous Liquid, and Carbon Dioxide Pipelines and Liquefied Natural Gas... Pipelines and Liquefied Natural Gas Facilities.'' The Pipeline Safety Laws (49 U.S.C. 60132) require each...

  11. Extending the Fermi-LAT Data Processing Pipeline to the Grid

    NASA Astrophysics Data System (ADS)

    Zimmer, S.; Arrabito, L.; Glanzman, T.; Johnson, T.; Lavalley, C.; Tsaregorodtsev, A.

    2012-12-01

    The Data Handling Pipeline (“Pipeline”) has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite, and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. In addition it receives heavy use in performing production Monte Carlo tasks. In daily use it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing of the data before the next download arrives. The need for manual intervention has been reduced to less than 0.01% of submitted jobs. The Pipeline software is written almost entirely in Java and comprises several modules. It provides web services that allow online monitoring, with charts summarizing workflow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS and recently also Sun Grid Engine and Condor. This is accomplished through dedicated job control services that for Fermi are running at SLAC and at the other computing site involved in this large-scale framework, the Lyon computing center of IN2P3. Although the task logic differs, we are evaluating a separate interface to the DIRAC system in order to communicate with EGI sites and utilize Grid resources, using dedicated Grid-optimized systems rather than developing our own. More recently the Pipeline and its associated data catalog have been generalized for use by other experiments, and are

  12. Landslide and Land Subsidence Hazards to Pipelines

    USGS Publications Warehouse

    Baum, Rex L.; Galloway, Devin L.; Harp, Edwin L.

    2008-01-01

    Landslides and land subsidence pose serious hazards to pipelines throughout the world. Many existing pipeline corridors and more and more new pipelines cross terrain that is affected by either landslides, land subsidence, or both. Consequently the pipeline industry recognizes a need for increased awareness of methods for identifying and evaluating landslide and subsidence hazard for pipeline corridors. This report was prepared in cooperation with the U.S. Department of Transportation Pipeline and Hazardous Materials Safety Administration, and Pipeline Research Council International through a cooperative research and development agreement (CRADA) with DGH Consulting, Inc., to address the need for up-to-date information about current methods to identify and assess these hazards. Chapters in this report (1) describe methods for evaluating landslide hazard on a regional basis, (2) describe the various types of land subsidence hazard in the United States and available methods for identifying and quantifying subsidence, and (3) summarize current methods for investigating individual landslides. In addition to the descriptions, this report provides information about the relative costs, limitations and reliability of various methods.

  13. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, Yingssu; McPhillips, Scott E.

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and

  14. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast... pipeline is disturbed: (a) That segment of the pipeline must be protected, as necessary, against damage...

  15. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast... pipeline is disturbed: (a) That segment of the pipeline must be protected, as necessary, against damage...

  16. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast... pipeline is disturbed: (a) That segment of the pipeline must be protected, as necessary, against damage...

  17. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast... pipeline is disturbed: (a) That segment of the pipeline must be protected, as necessary, against damage...

  18. 30 CFR 250.1005 - Inspection requirements for DOI pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Inspection requirements for DOI pipelines. 250.1005 Section 250.1005 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT... Pipelines and Pipeline Rights-of-Way § 250.1005 Inspection requirements for DOI pipelines. (a) Pipeline...

  19. Pipeline perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kern, J.J.

    1978-01-01

    The recently completed 800-mile trans-Alaska pipeline is reviewed from the perspective of its first six months of successful operation. Because of the many environmental and political constraints, the $7.7 billion project is viewed as a triumph of both engineering and capitalism. Design problems were imposed by the harsh climate and terrain and by the constant public and bureaucratic monitoring. Specifications are reviewed for the pipes, valves, river crossings, pump stations, control stations, and the terminal at Valdez, where special ballast treatment and a vapor-recovery system were required to protect the harbor's water and air quality. The article outlines operating procedures and contingency planning for the pipeline and terminal. (DCK)

  20. Characterization and Validation of Transiting Planets in the Kepler and TESS Pipelines

    NASA Astrophysics Data System (ADS)

    Twicken, Joseph; Brownston, Lee; Catanzarite, Joseph; Clarke, Bruce; Cote, Miles; Girouard, Forrest; Li, Jie; McCauliff, Sean; Seader, Shawn; Tenenbaum, Peter; Wohler, Bill; Jenkins, Jon Michael; Batalha, Natalie; Bryson, Steve; Burke, Christopher; Caldwell, Douglas

    2015-08-01

    Light curves for Kepler targets are searched for transiting planet signatures in the Transiting Planet Search (TPS) component of the Science Operations Center (SOC) Processing Pipeline. Targets for which the detection threshold is exceeded are subsequently processed in the Data Validation (DV) Pipeline component. The primary functions of DV are to (1) characterize planets identified in the transiting planet search, (2) search for additional transiting planet signatures in light curves after modeled transit signatures have been removed, and (3) perform a comprehensive suite of diagnostic tests to aid in discrimination between true transiting planets and false positive detections. DV output products include extensive reports by target, one-page report summaries by planet candidate, and tabulated planet model fit and diagnostic test results. The DV products are employed by humans and automated systems to vet planet candidates identified in the pipeline. The final revision of the Kepler SOC codebase (9.3) was released in March 2015. It will be utilized to reprocess the complete Q1-Q17 data set later this year. At the same time, the SOC Pipeline codebase is being ported to support the Transiting Exoplanet Survey Satellite (TESS) Mission. TESS is expected to launch in 2017 and survey the entire sky for transiting exoplanets over a period of two years. We describe the final revision of the Kepler Data Validation component with emphasis on the diagnostic tests and reports. This revision also serves as the DV baseline for TESS. The diagnostic tests exploit the flux (i.e., light curve), centroid and pixel time series associated with each target to facilitate the determination of the true origin of each purported transiting planet signature. Candidate planet detections and DV products for Kepler are delivered to the Exoplanet Archive at the NASA Exoplanet Science Institute (NExScI). The Exoplanet Archive is located at exoplanetarchive.ipac.caltech.edu. Funding for the Kepler

  1. 49 CFR 192.627 - Tapping pipelines under pressure.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Tapping pipelines under pressure. 192.627 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Operations § 192.627 Tapping pipelines under pressure. Each tap made on a pipeline under pressure must be performed by a crew qualified to make...

  2. Stability of subsea pipelines during large storms

    PubMed Central

    Draper, Scott; An, Hongwei; Cheng, Liang; White, David J.; Griffiths, Terry

    2015-01-01

    On-bottom stability design of subsea pipelines transporting hydrocarbons is important to ensure safety and reliability but is challenging to achieve in the onerous metocean (meteorological and oceanographic) conditions typical of large storms (such as tropical cyclones, hurricanes or typhoons). This challenge is increased by the fact that industry design guidelines presently give no guidance on how to incorporate the potential benefits of seabed mobility, which can lead to lowering and self-burial of the pipeline on a sandy seabed. In this paper, we demonstrate recent advances in experimental modelling of pipeline scour and present results investigating how pipeline stability can change in a large storm. An emphasis is placed on the initial development of the storm, where scour is inevitable on an erodible bed as the storm velocities build up to peak conditions. During this initial development, we compare the rate at which peak near-bed velocities increase in a large storm (typically less than 10⁻³ m s⁻²) to the rate at which a pipeline scours and subsequently lowers (which is dependent not only on the storm velocities, but also on the mechanism of lowering and the pipeline properties). We show that the relative magnitude of these rates influences pipeline embedment during a storm and the stability of the pipeline. PMID:25512592

  3. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    At the commissioning of a new high-pressure helium pipeline at Kennedy Space Center, Ramon Lugo, acting executive director, JPMO , presents a plaque to Center Director Roy Bridges. The pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. Others at the ceremony were Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS); Col. Samuel Dick, representative of the 45th Space Wing; David Herst, director, Delta IV Launch Sites; Pierre Dufour, president and CEO, Air Liquide America Corporation; and Michael Butchko, president, SGS. The nine-mile-long buried pipeline will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad.

  4. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS) presents an award of appreciation to H.T. Everett, KSC Propellants manager, at the commissioning of a new high-pressure helium pipeline at Kennedy Space Center. The pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. The nine-mile-long buried pipeline will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad. Others at the ceremony were Center Director Roy Bridges; Col. Samuel Dick, representative of the 45th Space Wing; Ramon Lugo, acting executive director, JPMO; David Herst, director, Delta IV Launch Sites; Pierre Dufour, president and CEO, Air Liquide America Corporation; and Michael Butchko, president, SGS.

  5. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Jerry Jorgensen welcomes the audience to the commissioning of a new high-pressure helium pipeline at Kennedy Space Center. Jorgensen, with Space Gateway Support (SGS), is the pipeline project manager. To the right is Ramon Lugo, acting executive director, JPMO. Others at the ceremony were Center Director Roy Bridges; Col. Samuel Dick, representative of the 45th Space Wing; David Herst, director, Delta IV Launch Sites; Pierre Dufour, president and CEO, Air Liquide America Corporation; and Michael Butchko, president, SGS. The pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. The nine-mile-long buried pipeline will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad.

  6. Integrated surface management for pipeline construction: The Mid-America Pipeline Company Four Corners Project

    Treesearch

    Maria L. Sonett

    1999-01-01

    Integrated surface management techniques for pipeline construction through arid and semi-arid rangeland ecosystems are presented in a case history of a 412-mile pipeline construction project in New Mexico. Planning, implementation and monitoring for restoration of surface hydrology, soil stabilization, soil cover, and plant species succession are discussed. Planning...

  7. Pipeline transport and simultaneous saccharification of corn stover.

    PubMed

    Kumar, Amit; Cameron, Jay B; Flynn, Peter C

    2005-05-01

    Pipeline transport of corn stover delivered by truck from the field is evaluated against a range of truck transport costs. Corn stover transported by pipeline at 20% solids concentration (wet basis) or higher could directly enter an ethanol fermentation plant, and hence the investment in the pipeline inlet end processing facilities displaces comparable investment in the plant. At 20% solids, pipeline transport of corn stover costs less than trucking at capacities in excess of 1.4 M dry tonnes/yr when compared to a mid range of truck transport cost (excluding any credit for economies of scale achieved in the ethanol fermentation plant from larger scale due to multiple pipelines). Pipelining of corn stover gives the opportunity to conduct simultaneous transport and saccharification (STS). If current enzymes are used, this would require elevated temperature. Heating of the slurry for STS, which in a fermentation plant is achieved from waste heat, is a significant cost element (more than 5 cents/l of ethanol) if done at the pipeline inlet unless waste heat is available, for example from an electric power plant located adjacent to the pipeline inlet. Heat loss in a 1.26 m pipeline carrying 2 M dry tonnes/yr is about 5 degrees C at a distance of 400 km in typical prairie clay soils, and would not likely require insulation; smaller pipelines or different soil conditions might require insulation for STS. Saccharification in the pipeline would reduce the need for investment in the fermentation plant, saving about 0.2 cents/l of ethanol. Transport of corn stover in multiple pipelines offers the opportunity to develop a large ethanol fermentation plant, avoiding some of the diseconomies of scale that arise from smaller plants whose capacities are limited by issues of truck congestion.
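
    The fixed-versus-variable cost structure behind that 1.4 M dry tonnes/yr crossover can be illustrated with invented numbers: trucking scales roughly linearly with tonnage, while a pipeline adds a large fixed capital charge but a lower variable cost. Only the crossover point is taken from the abstract; the rates below are hypothetical.

        def truck_cost(tonnes):            # $/yr, hypothetical mid-range truck rate
            return 28.0 * tonnes

        def pipeline_cost(tonnes):         # $/yr, hypothetical fixed + variable cost
            return 25e6 + 10.0 * tonnes

        # Crossover at 25e6 / 18 ~ 1.4 M dry tonnes/yr with these rates.
        for m in (0.5e6, 1.0e6, 1.5e6, 2.0e6):
            cheaper = "pipeline" if pipeline_cost(m) < truck_cost(m) else "truck"
            print(f"{m/1e6:.1f} M dry tonnes/yr -> {cheaper}")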

  8. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    PubMed

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas-based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operating characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
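
    The evaluation step lends itself to a compact sketch: correct each hippocampal volume for total intracranial volume and age by taking residuals from a linear model, then score conversion prediction with the ROC AUC. All data below are synthetic, and the SPM-based volume extraction itself is not reproduced.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(5)
        n = 198
        tiv = rng.normal(1500, 120, n)             # total intracranial volume, mL
        age = rng.normal(73, 7, n)
        converter = rng.random(n) < 0.4            # synthetic MCI-to-AD labels
        hgmv = 0.004 * tiv - 0.02 * age - 0.6 * converter + rng.normal(0, 0.4, n)

        # Regress out TIV and age; the residual is the corrected volume.
        covars = np.column_stack([tiv, age])
        resid = hgmv - LinearRegression().fit(covars, hgmv).predict(covars)

        # Lower corrected volume should predict conversion, hence the sign flip.
        print(f"AUC = {roc_auc_score(converter, -resid):.2f}")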

  9. Automated generation of image products for Mars Exploration Rover Mission tactical operations

    NASA Technical Reports Server (NTRS)

    Alexander, Doug; Zamani, Payam; Deen, Robert; Andres, Paul; Mortensen, Helen

    2005-01-01

    This paper will discuss, from design to implementation, the methodologies applied to MIPL's automated pipeline processing as a 'system of systems' integrated with the MER GDS. Overviews of the interconnected product-generating systems will also be provided with emphasis on interdependencies, including those for a) geometric rectification of camera lens distortions, b) generation of stereo disparity, c) derivation of 3-dimensional coordinates in XYZ space, d) generation of unified terrain meshes, e) camera-to-target ranging (distance) and f) multi-image mosaicking.
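
    Step (c), deriving XYZ coordinates from stereo disparity, reduces to pinhole-camera triangulation. A minimal sketch with an illustrative camera geometry (the values are placeholders, not MER calibration data):

        import numpy as np

        def disparity_to_xyz(u, v, d, f, B, cx, cy):
            """Pinhole stereo: convert pixel (u, v) with disparity d to camera XYZ.

            f      : focal length in pixels
            B      : stereo baseline in metres
            cx, cy : principal point in pixels
            """
            Z = f * B / d                # range from disparity
            X = (u - cx) * Z / f
            Y = (v - cy) * Z / f
            return np.array([X, Y, Z])

        # Illustrative geometry, loosely Navcam-like; not actual calibration values.
        print(disparity_to_xyz(u=712.0, v=540.0, d=18.5,
                               f=1200.0, B=0.20, cx=512.0, cy=512.0))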

  10. Bayesian automated cortical segmentation for neonatal MRI

    NASA Astrophysics Data System (ADS)

    Chou, Zane; Paquette, Natacha; Ganesh, Bhavana; Wang, Yalin; Ceschin, Rafael; Nelson, Marvin D.; Macyszyn, Luke; Gaonkar, Bilwaj; Panigrahy, Ashok; Lepore, Natasha

    2017-11-01

    Several attempts have been made in the past few years to develop and implement automated segmentation of neonatal brain structural MRI. However, accurate automated MRI segmentation remains challenging in this population because of the low signal-to-noise ratio, large partial volume effects and inter-individual anatomical variability of the neonatal brain. In this paper, we propose a learning method for segmenting the whole brain cortical grey matter on neonatal T2-weighted images. We trained our algorithm using a neonatal dataset composed of 3 full-term and 4 preterm infants scanned at term equivalent age. Our segmentation pipeline combines the FAST algorithm from the FSL library software and a Bayesian segmentation approach to create a threshold matrix that minimizes the error of mislabeling brain tissue types. Our method shows promising results with our pilot training set. In both preterm and full-term neonates, automated Bayesian segmentation generates a smoother and more consistent parcellation compared to FAST, while successfully removing the subcortical structures and cleaning the edges of the cortical grey matter. This method shows promising refinement of the FAST segmentation by considerably reducing the manual input and editing required from the user, and by further improving the reliability and processing time of neonatal MR images. Future improvements will include a larger training dataset of images acquired on scanners from different manufacturers.

  11. Multi-atlas segmentation of subcortical brain structures via the AutoSeg software pipeline

    PubMed Central

    Wang, Jiahui; Vachet, Clement; Rumple, Ashley; Gouttard, Sylvain; Ouziel, Clémentine; Perrot, Emilie; Du, Guangwei; Huang, Xuemei; Gerig, Guido; Styner, Martin

    2014-01-01

    Automated segmentation and labeling of individual brain anatomical regions in MRI are challenging due to individual structural variability. Although atlas-based segmentation has shown its potential for both tissue and structure segmentation, the inherent natural variability as well as disease-related changes in MR appearance mean that a single atlas image is often inappropriate to represent the full population of datasets processed in a given neuroimaging study. As an alternative to single-atlas segmentation, the use of multiple atlases alongside label fusion techniques has been introduced, using a set of individual “atlases” that encompasses the expected variability in the studied population. In our study, we propose a multi-atlas segmentation scheme with a novel graph-based atlas selection technique. We first pair and co-register all atlases and the subject MR scans. A directed graph with edge weights based on intensity and shape similarity between all MR scans is then computed. The set of neighboring templates is selected via clustering of the graph. Finally, weighted majority voting is employed to create the final segmentation over the selected atlases. This multi-atlas segmentation scheme is used to extend a single-atlas-based segmentation toolkit entitled AutoSeg, an open-source, extensible C++ based software pipeline employing BatchMake for its pipeline scripting, developed at the Neuro Image Research and Analysis Laboratories of the University of North Carolina at Chapel Hill. AutoSeg performs N4 intensity inhomogeneity correction, rigid registration to a common template space, automated skull-stripping based on brain tissue classification, and the multi-atlas segmentation. The multi-atlas-based AutoSeg has been evaluated on subcortical structure segmentation with a testing dataset of 20 adult brain MRI scans and 15 atlas MRI scans. The AutoSeg achieved mean Dice coefficients of 81.73% for the subcortical structures
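
    The final fusion step, weighted majority voting over the selected atlases, is compact enough to sketch. A minimal version, assuming co-registered integer label volumes and similarity-derived weights (toy data below, not the AutoSeg implementation):

        import numpy as np

        def weighted_majority_vote(atlas_labels, weights, n_labels):
            """Fuse co-registered atlas label maps with per-atlas weights.

            atlas_labels : (n_atlases, X, Y, Z) integer label volumes
            weights      : (n_atlases,) similarity-derived weights
            """
            votes = np.zeros((n_labels,) + atlas_labels.shape[1:])
            for labels, w in zip(atlas_labels, weights):
                for lab in range(n_labels):
                    votes[lab] += w * (labels == lab)
            return votes.argmax(axis=0)   # per-voxel winning label

        # Three toy 4x4x4 atlases with two structures plus background.
        atlases = np.random.randint(0, 3, size=(3, 4, 4, 4))
        fused = weighted_majority_vote(atlases,
                                       weights=np.array([0.5, 0.3, 0.2]),
                                       n_labels=3)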

  12. California crude-pipeline plans detailed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronco, M.J.

    1986-06-09

    California and the U.S. West have recently become a center for crude-oil pipeline activity. That activity includes existing and proposed lines, offshore and onshore terminals, and some unusual permitting and construction requirements. Operation of existing pipelines is influenced by the varying gravities of crudes in the area. California has three distinct producing areas from which pipelines deliver crude to refineries or marine terminals: 1. The inland Los Angeles basin and coast from Orange County to Ventura County. 2. The San Joaquin Valley in central California, which is between the coastal mountains and the Sierras. 3. That portion of the Outer Continental Shelf (OCS) located primarily in federal waters off Santa Barbara and San Luis Obispo counties on the central coast. The Los Angeles coastal and inland basin crude-oil pipeline system consists of gathering lines to move crude from the many wells throughout Ventura, Orange, and Los Angeles counties to operating refineries in the greater Los Angeles area. Major refineries include ARCO at Carson, Chevron at El Segundo, Mobil at Torrance, and Shell, Texaco, and Unocal at Wilmington. The many different crude-oil pipelines serving these refineries from Ventura County and Orange County and from the many sites around Los Angeles County are too numerous to list.

  13. Redefining the Data Pipeline Using GPUs

    NASA Astrophysics Data System (ADS)

    Warner, C.; Eikenberry, S. S.; Gonzalez, A. H.; Packham, C.

    2013-10-01

    There are two major challenges facing the next generation of data processing pipelines: 1) handling an ever increasing volume of data as array sizes continue to increase and 2) the desire to process data in near real-time to maximize observing efficiency by providing rapid feedback on data quality. Combining the power of modern graphics processing units (GPUs), relational database management systems (RDBMSs), and extensible markup language (XML) to re-imagine traditional data pipelines will allow us to meet these challenges. Modern GPUs contain hundreds of processing cores, each of which can process hundreds of threads concurrently. Technologies such as Nvidia's Compute Unified Device Architecture (CUDA) platform and the PyCUDA (http://mathema.tician.de/software/pycuda) module for Python allow us to write parallel algorithms and easily link GPU-optimized code into existing data pipeline frameworks. This approach has produced speed gains of over a factor of 100 compared to CPU implementations for individual algorithms and overall pipeline speed gains of a factor of 10-25 compared to traditionally built data pipelines for both imaging and spectroscopy (Warner et al., 2011). However, there are still many bottlenecks inherent in the design of traditional data pipelines. For instance, file input/output of intermediate steps is now a significant portion of the overall processing time. In addition, most traditional pipelines are not designed to be able to process data on-the-fly in real time. We present a model for a next-generation data pipeline that has the flexibility to process data in near real-time at the observatory as well as to automatically process huge archives of past data by using a simple XML configuration file. XML is ideal for describing both the dataset and the processes that will be applied to the data. Meta-data for the datasets would be stored using an RDBMS (such as mysql or PostgreSQL) which
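
    The CUDA/PyCUDA pattern the authors describe (writing a GPU kernel and linking it into a Python pipeline) looks roughly like the following sketch of a single calibration step, bias subtraction and flat-fielding; the kernel and array sizes are illustrative, not taken from their pipeline:

        import numpy as np
        import pycuda.autoinit                     # creates a CUDA context on import
        import pycuda.driver as drv
        from pycuda.compiler import SourceModule

        # One thread per pixel: img = (img - bias) / flat.
        mod = SourceModule("""
        __global__ void calibrate(float *img, const float *bias,
                                  const float *flat, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) img[i] = (img[i] - bias[i]) / flat[i];
        }
        """)
        calibrate = mod.get_function("calibrate")

        n = 2048 * 2048                            # a 2k x 2k frame
        img  = np.random.rand(n).astype(np.float32)
        bias = np.full(n, 0.1, np.float32)
        flat = np.full(n, 0.9, np.float32)

        threads = 256
        calibrate(drv.InOut(img), drv.In(bias), drv.In(flat), np.int32(n),
                  block=(threads, 1, 1), grid=((n + threads - 1) // threads, 1))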

  14. Cellular and Biophysical Pipeline for the Screening of Peroxisome Proliferator-Activated Receptor Beta/Delta Agonists: Avoiding False Positives

    PubMed Central

    Batista, Fernanda Aparecida Heleno

    2018-01-01

    Peroxisome proliferator-activated receptor beta/delta (PPARβ/δ) is considered a therapeutic target for metabolic disorders, cancer, and cardiovascular diseases. Here, we developed one pipeline for the screening of PPARβ/δ agonists, which reduces the cost, time, and false-positive hits. The first step is an optimized 3-day long cellular transactivation assay based on reporter-gene technology, which is supported by automated liquid-handlers. This primary screening is followed by a confirmatory transactivation assay and by two biophysical validation methods (thermal shift assay (TSA) and ANS fluorescence quenching), which allow the calculation of the affinity constant, giving more information about the selected hits. All of the assays were validated using well-known commercial agonists providing trustworthy data. Furthermore, to validate and test this pipeline, we screened a natural extract library (560 extracts), and we found one plant extract that might be interesting for PPARβ/δ modulation. In conclusion, our results suggested that we developed a cheaper and more robust pipeline that goes beyond the single activation screening, as it also evaluates PPARβ/δ tertiary structure stabilization and the ligand affinity constant, selecting only molecules that directly bind to the receptor. Moreover, this approach might improve the effectiveness of the screening for agonists that target PPARβ/δ for drug development.
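
    The affinity constant from a quenching titration is typically obtained by fitting a one-site binding model, signal = F0 + dF*[L]/(Kd + [L]). A minimal fitting sketch with made-up titration data (the record reports no numbers, and the exact model used is not stated):

        import numpy as np
        from scipy.optimize import curve_fit

        # One-site binding: fraction bound = [L] / (Kd + [L]); signal scales linearly.
        def binding(L, Kd, F0, dF):
            return F0 + dF * L / (Kd + L)

        L = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])     # uM, ligand titration
        F = np.array([0.02, 0.05, 0.16, 0.35, 0.62, 0.82, 0.93, 0.97])  # signal

        (Kd, F0, dF), _ = curve_fit(binding, L, F, p0=[1.0, 0.0, 1.0])
        print(f"Kd = {Kd:.2f} uM")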

  15. The Oman-India gas pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, P.M.

    1995-12-31

    In March 1993, the Governments of the Sultanate of Oman and India executed a Memorandum of Understanding for a long-term Gas Supply Contract to transport natural gas from Oman to India by pipeline. A feasibility study was undertaken to determine if such a pipeline was technically achievable and economically attractive. Work was initiated with a consortium of internationally recognized major design and construction firms, as well as with consultants knowledgeable in gas supply and demand in the region. Alternative gas supply volumes as well as two distinct pipeline routes were analyzed in significant detail. As a result of this work, it was concluded that a pipeline crossing, taking a direct route from Oman to India, is economically and technically feasible. In September 1994, the Agreement on Principal Terms for supply of gas to India from Oman was agreed by the respective governmental authorities. The project and its status are described.

  16. Nearshore Pipeline Installation Methods.

    DTIC Science & Technology

    1981-08-01

    inches; b) Pipe, materials of construction: fully rigid, semi-rigid, flexible; c) Pipeline length, maximum 2 miles; d) Pipeline design life, minimum 15 ... common to their operations. Permanent facilities are specified in the Statement of Work. Therefore, a minimum design life of 15 years is chosen, which ... makes the pipe leakproof and resists corrosion and abrasion. 5) Interlocked Z-shaped steel or stainless steel carcass - resists internal and external

  17. SACCHARIS: an automated pipeline to streamline discovery of carbohydrate active enzyme activities within polyspecific families and de novo sequence datasets.

    PubMed

    Jones, Darryl R; Thomas, Dallas; Alger, Nicholas; Ghavidel, Ata; Inglis, G Douglas; Abbott, D Wade

    2018-01-01

    Deposition of new genetic sequences in online databases is expanding at an unprecedented rate. As a result, sequence identification continues to outpace functional characterization of carbohydrate active enzymes (CAZymes). In this paradigm, the discovery of enzymes with novel functions is often hindered by high volumes of uncharacterized sequences, particularly when the enzyme sequence belongs to a family that exhibits diverse functional specificities (i.e., polyspecificity). Therefore, to direct sequence-based discovery and characterization of new enzyme activities we have developed an automated in silico pipeline entitled: Sequence Analysis and Clustering of CarboHydrate Active enzymes for Rapid Informed prediction of Specificity (SACCHARIS). This pipeline streamlines the selection of uncharacterized sequences for discovery of new CAZyme or CBM specificity from families currently maintained on the CAZy website or within user-defined datasets. SACCHARIS was used to generate a phylogenetic tree of GH43, a CAZyme family with defined subfamily designations. This analysis confirmed that large datasets can be organized into sequence clusters of manageable sizes that possess related functions. Seeding this tree with a GH43 sequence from Bacteroides dorei DSM 17855 (BdGH43b) revealed that it partitioned as a single sequence within the tree. This pattern was consistent with it possessing a unique enzyme activity for GH43, as BdGH43b is the first α-glucanase described for this family. The capacity of SACCHARIS to extract and cluster characterized carbohydrate binding module sequences was demonstrated using family 6 CBMs (i.e., CBM6s). This CBM family displays a polyspecific ligand binding profile and contains many structurally determined members. Using SACCHARIS to identify a cluster of divergent sequences, a CBM6 sequence from a unique clade was demonstrated to bind yeast mannan, which represents the first description of an α-mannan binding CBM. Additionally, we

  18. Ub-ISAP: a streamlined UNIX pipeline for mining unique viral vector integration sites from next generation sequencing data.

    PubMed

    Kamboj, Atul; Hallwirth, Claus V; Alexander, Ian E; McCowage, Geoffrey B; Kramer, Belinda

    2017-06-17

    The analysis of viral vector genomic integration sites is an important component in assessing the safety and efficiency of patient treatment using gene therapy. Alongside this clinical application, integration site identification is a key step in the genetic mapping of viral elements in mutagenesis screens that aim to elucidate gene function. We have developed a UNIX-based vector integration site analysis pipeline (Ub-ISAP) that utilises a UNIX-based workflow for automated integration site identification and annotation of both single and paired-end sequencing reads. Reads that contain viral sequences of interest are selected and aligned to the host genome, and unique integration sites are then classified as transcription start site-proximal, intragenic or intergenic. Ub-ISAP provides a reliable and efficient pipeline to generate large datasets for assessing the safety and efficiency of integrating vectors in clinical settings, with broader applications in cancer research. Ub-ISAP is available as an open source software package at https://sourceforge.net/projects/ub-isap/.
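
    The classification step (TSS-proximal, intragenic or intergenic) is straightforward to sketch. A minimal version, assuming a per-chromosome gene annotation and an arbitrary TSS window (the record does not state the window size Ub-ISAP uses):

        def classify_site(pos, genes, tss_window=5000):
            """Classify an integration position against gene models on one chromosome.

            genes      : list of (start, end, tss) tuples
            tss_window : bp distance defining "TSS-proximal" (assumed value)
            """
            for start, end, tss in genes:
                if abs(pos - tss) <= tss_window:
                    return "TSS-proximal"
            for start, end, tss in genes:
                if start <= pos <= end:
                    return "intragenic"
            return "intergenic"

        genes = [(10_000, 25_000, 10_000), (60_000, 80_000, 80_000)]  # toy annotation
        for p in (12_000, 30_000, 78_000):
            print(p, classify_site(p, genes))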

  19. ViennaNGS: A toolbox for building efficient next-generation sequencing analysis pipelines

    PubMed Central

    Wolfinger, Michael T.; Fallmann, Jörg; Eggenhofer, Florian; Amman, Fabian

    2015-01-01

    Recent achievements in next-generation sequencing (NGS) technologies have led to a high demand for reusable software components to easily compile customized analysis workflows for big genomics data. We present ViennaNGS, an integrated collection of Perl modules focused on building efficient pipelines for NGS data processing. It comes with functionality for extracting and converting features from common NGS file formats, computation and evaluation of read mapping statistics, as well as normalization of RNA abundance. Moreover, ViennaNGS provides software components for identification and characterization of splice junctions from RNA-seq data, parsing and condensing sequence motif data, automated construction of Assembly and Track Hubs for the UCSC genome browser, as well as wrapper routines for a set of commonly used NGS command line tools. PMID:26236465

  20. 78 FR 52820 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0181] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice. SUMMARY: Pursuant to the Federal pipeline safety laws...

  1. 78 FR 52821 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0146] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice. SUMMARY: Pursuant to the Federal pipeline safety laws...

  2. 78 FR 5866 - Pipeline Safety: Annual Reports and Validation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0319] Pipeline Safety: Annual Reports and Validation AGENCY: Pipeline and Hazardous Materials... 2012 gas transmission and gathering annual reports, remind pipeline owners and operators to validate...

  3. 75 FR 43612 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0042] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials..., Inc., a natural gas pipeline operator, seeking relief from compliance with certain requirements in the...

  4. 77 FR 34458 - Pipeline Safety: Requests for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0112] Pipeline Safety: Requests for Special Permit AGENCY: Pipeline and Hazardous Materials... BreitBurn Energy Company LP, two natural gas pipeline operators, seeking relief from compliance with...

  5. 75 FR 66425 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0124] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials... Company, LP, a natural gas pipeline operator, seeking relief from compliance with certain requirements in...

  6. 78 FR 14877 - Pipeline Safety: Incident and Accident Reports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2013-0028] Pipeline Safety: Incident and Accident Reports AGENCY: Pipeline and Hazardous Materials... PHMSA F 7100.2--Incident Report--Natural and Other Gas Transmission and Gathering Pipeline Systems and...

  7. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  8. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  9. Regulatory reform for natural gas pipelines: The effect on pipeline and distribution company share prices

    NASA Astrophysics Data System (ADS)

    Jurman, Elisabeth Antonie

    1997-08-01

    The natural gas shortages in the 1970s focused considerable attention on the federal government's role in altering energy consumption. For the natural gas industry, these shortages eventually led to the passage of the Natural Gas Policy Act (NGPA) in 1978 as part of the National Energy Plan. A series of events in the 1980s brought about the restructuring of interstate natural gas pipelines, which were transformed by regulators and the courts from monopolies into competitive entities. This transformation also changed their relationship with their downstream customers, the LDCs, who no longer had to deal with pipelines as the only merchants of gas. Regulatory reform made it possible for LDCs to buy directly from producers, using the pipelines only for delivery of their purchases. This study tests for the existence of monopoly rents by analyzing the daily returns of natural gas pipeline and utility industry stock price data from 1982 to 1990, a period of regulatory reform for the natural gas industry. The study's main objective is to investigate the degree of empirical support for claims that regulatory reforms increase profits in the affected industry, as the normative theory of regulation expects, or decrease profits, as advocates of the positive theory of regulation believe. I also test Norton's theory of risk, which predicts that systematic risk will increase for firms undergoing deregulation. Based on a sample of twelve natural gas pipelines and 25 utilities, an event-study design was employed to measure the impact of regulatory event announcements on daily natural gas pipeline or utility industry stock price data using a market model regression equation. The results of this study provide some evidence that regulatory reforms did not increase the profits of pipeline firms, confirming the expectations of those who claim that excess profits result from regulation and will disappear, once that protection is removed and the firms are operating in
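
    The market-model event-study machinery referred to here fits R_it = alpha_i + beta_i * R_mt + e_it on an estimation window, then measures abnormal returns around each announcement. A minimal sketch on synthetic return series (window lengths are illustrative; the record does not state the ones used):

        import numpy as np

        def abnormal_returns(stock, market, event_start, est_len=120):
            """Market-model event study: estimate alpha/beta pre-event, then
            compute abnormal returns in a window around the event day."""
            est_s = stock[event_start - est_len:event_start]
            est_m = market[event_start - est_len:event_start]
            beta, alpha = np.polyfit(est_m, est_s, 1)
            window = slice(event_start - 5, event_start + 6)   # +/- 5 trading days
            ar = stock[window] - (alpha + beta * market[window])
            return ar, ar.sum()   # daily ARs and the cumulative abnormal return

        rng = np.random.default_rng(0)
        market = rng.normal(0.0004, 0.01, 500)
        stock = 0.0002 + 0.9 * market + rng.normal(0, 0.012, 500)
        ar, car = abnormal_returns(stock, market, event_start=300)
        print(f"CAR = {car:+.4f}")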

  10. InterPred: A pipeline to identify and model protein-protein interactions.

    PubMed

    Mirabello, Claudio; Wallner, Björn

    2017-06-01

    Protein-protein interactions (PPI) are crucial for protein function. There exist many techniques to identify PPIs experimentally, but determining the interactions in molecular detail is still difficult and very time-consuming. The fact that the number of PPIs is vastly larger than the number of individual proteins makes it practically impossible to characterize all interactions experimentally. Computational approaches that can bridge this gap and predict PPIs and model the interactions in molecular detail are greatly needed. Here we present InterPred, a fully automated pipeline that predicts and models PPIs from sequence using structural modeling combined with massive structural comparisons and molecular docking. A key component of the method is the use of a novel random forest classifier that integrates several structural features to distinguish correct from incorrect protein-protein interaction models. We show that InterPred represents a major improvement in protein-protein interaction detection with a performance comparable to or better than experimental high-throughput techniques. We also show that our full-atom protein-protein complex modeling pipeline performs better than state-of-the-art protein docking methods on a standard benchmark set. In addition, InterPred was also one of the top predictors in the latest CAPRI37 experiment. InterPred source code can be downloaded from http://wallnerlab.org/InterPred. Proteins 2017; 85:1159-1170. © 2017 Wiley Periodicals, Inc.
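
    The scoring component, a random forest trained on structural features to separate correct from incorrect interaction models, can be sketched with scikit-learn; the features and labels below are synthetic stand-ins, not the InterPred feature set:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Hypothetical per-model features in the spirit of the record: interface
        # size, structural-alignment scores, docking energy, etc.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=1000) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())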

  11. A graph-based approach for designing extensible pipelines

    PubMed Central

    2012-01-01

    Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism on the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline
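
    The core idea, formats as nodes, tools as edges and pipelines as paths, makes pipeline composition a shortest-path search. A minimal Python sketch (the original toolkit is Java; the tool and format names here are hypothetical):

        from collections import deque

        # Components are edges labelled by tool name; file formats are nodes.
        # A pipeline is any path from the input format to the requested output.
        edges = {
            "vcf": [("vcf2ped", "ped")],
            "ped": [("ped2bed", "bed")],
            "bed": [("bed2eigen", "eigenstrat")],
        }

        def compose_pipeline(src, dst):
            """Breadth-first search for the shortest tool chain from src to dst."""
            queue, seen = deque([(src, [])]), {src}
            while queue:
                fmt, chain = queue.popleft()
                if fmt == dst:
                    return chain
                for tool, out in edges.get(fmt, []):
                    if out not in seen:
                        seen.add(out)
                        queue.append((out, chain + [tool]))
            return None   # no conversion path exists

        print(compose_pipeline("vcf", "eigenstrat"))  # ['vcf2ped', 'ped2bed', 'bed2eigen']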

  12. Towards an Automated Pipeline for the Translation and Optimization of Geospatial Data for Virtual Environments

    DTIC Science & Technology

    2008-12-01

    clearly observed in the game industry (Introversion, 2008). Currently there are many tools available to assist in automating the production of large ... Graphics and Interactive Techniques, Melbourne, Australia, February 11-14. Introversion Software, 2008: Procedural Content Generation. http

  13. 78 FR 65429 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-31

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0041] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials...-0041 Williams Gas Pipeline 49 CFR 192.150........ To authorize the extension Company, LLC (WGP). of a...

  14. 76 FR 11853 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-03

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0027] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials... a 24-inch mainline natural gas pipeline, 595 feet in length. The first segment of the special permit...

  15. Lessons Learned from Developing and Operating the Kepler Science Pipeline and Building the TESS Science Pipeline

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.

    2017-01-01

    The experience acquired through development, implementation and operation of the Kepler/K2 science pipelines can provide lessons learned for the development of science pipelines for other missions such as NASA's Transiting Exoplanet Survey Satellite and ESA's PLATO mission.

  16. An automated, open-source (NASA Ames Stereo Pipeline) workflow for mass production of high-resolution DEMs from commercial stereo satellite imagery: Application to mountain glaciers in the contiguous US

    NASA Astrophysics Data System (ADS)

    Shean, D. E.; Arendt, A. A.; Whorton, E.; Riedel, J. L.; O'Neel, S.; Fountain, A. G.; Joughin, I. R.

    2016-12-01

    We adapted the open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline an automated processing workflow for 0.5 m GSD DigitalGlobe WorldView-1/2/3 and GeoEye-1 along-track and cross-track stereo image data. Output DEM products are posted at 2, 8, and 32 m with direct geolocation accuracy of <5.0 m CE90/LE90. An automated iterative closest-point (ICP) co-registration tool reduces absolute vertical and horizontal error to <0.5 m where appropriate ground-control data are available, with observed standard deviation of 0.1-0.5 m for overlapping, co-registered DEMs (n=14,17). While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We have leveraged these resources to produce dense time series and regional mosaics for the Earth's ice sheets. We are now processing and analyzing all available 2008-2016 commercial stereo DEMs over glaciers and perennial snowfields in the contiguous US. We are using these records to study long-term, interannual, and seasonal volume change and glacier mass balance. This analysis will provide a new assessment of regional climate change, and will offer basin-scale analyses of snowpack evolution and snow/ice melt runoff for water resource applications.
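
    One piece of such co-registration, removing the vertical bias over static (ice-free) terrain, is easy to sketch; ASP's actual tool performs full 3-D ICP alignment, so this is only the simplest special case, shown on synthetic grids:

        import numpy as np

        def remove_vertical_bias(dem, ref, static_mask):
            """Shift a DEM so its median elevation difference over static
            (ice-free) terrain matches the reference DEM."""
            diff = dem - ref
            bias = np.nanmedian(diff[static_mask])
            return dem - bias, bias

        dem = np.random.normal(1000, 50, (256, 256)) + 0.8      # 0.8 m offset
        ref = dem - 0.8 + np.random.normal(0, 0.2, (256, 256))  # noisy reference
        mask = np.ones_like(dem, dtype=bool)                    # all terrain static
        co_registered, bias = remove_vertical_bias(dem, ref, mask)
        print(f"estimated bias: {bias:.2f} m")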

  17. 75 FR 40863 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0193] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice and request for comments. SUMMARY: In...

  18. 77 FR 74276 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0302] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice and request for comments. SUMMARY: In...

  19. 76 FR 50539 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-15

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0136] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice and request for comments. SUMMARY: In...

  20. 75 FR 35516 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0147] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials... with the Class 1 location portion of a 7.4 mile natural gas pipeline to be constructed in Alaska. This...

  1. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  2. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  3. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  4. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  5. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  6. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  7. 78 FR 16764 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0302] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice and Request for Comments on a Previously...

  8. 76 FR 70217 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Administration [Docket No. PHMSA... OMB approval of new Information Collection. AGENCY: Pipeline and Hazardous Materials Safety... Pipeline and Hazardous Materials Safety Administration (PHMSA) published a notice in the Federal Register...

  9. Pipeline Reduction of Binary Light Curves from Large-Scale Surveys

    NASA Astrophysics Data System (ADS)

    Prša, Andrej; Zwitter, Tomaž

    2007-08-01

    One of the most important changes in observational astronomy of the 21st Century is a rapid shift from classical object-by-object observations to extensive automatic surveys. As CCD detectors are getting better and their prices are getting lower, more and more small and medium-size observatories are refocusing their attention to detection of stellar variability through systematic sky-scanning missions. This trend is additionally powered by the success of pioneering surveys such as ASAS, DENIS, OGLE, TASS, their space counterpart Hipparcos and others. Such surveys produce massive amounts of data and it is not at all clear how these data are to be reduced and analysed. This is especially striking in the eclipsing binary (EB) field, where most frequently used tools are optimized for object-by-object analysis. A clear need for thorough, reliable and fully automated approaches to modeling and analysis of EB data is thus obvious. This task is very difficult because of limited data quality, non-uniform phase coverage and parameter degeneracy. The talk will review recent advancements in putting together semi-automatic and fully automatic pipelines for EB data processing. Automatic procedures have already been used to process the Hipparcos data, LMC/SMC observations, OGLE and ASAS catalogs etc. We shall discuss the advantages and shortcomings of these procedures and overview the current status of automatic EB modeling pipelines for the upcoming missions such as CoRoT, Kepler, Gaia and others.

  10. 76 FR 21423 - Pipeline Safety: Request for Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-15

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0063] Pipeline Safety: Request for Special Permit AGENCY: Pipeline and Hazardous Materials... application is for two 30-inch segments, segments 3 and 4, of the TPL 330 natural gas pipeline located in St...

  11. Pipeline safety : the office of pipeline safety is changing how it oversees the pipeline industry

    DOT National Transportation Integrated Search

    2000-05-01

    Pipelines are inherently safer to the public than other modes of freight transportation for natural gas and hazardous liquids (such as oil products) because they are, for the most part, located underground. Nevertheless, the volatile nature of these ...

  12. Pipeline Safety: The Office of Pipeline Safety Is Changing How It Oversees the Pipeline Industry

    DOT National Transportation Integrated Search

    2000-05-01

    Pipelines are inherently safer to the public than other modes of freight transportation for natural gas and hazardous liquids (such as oil products) because they are, for the most part, located underground. Nevertheless, the volatile nature of these ...

  13. 75 FR 4610 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2009-0375] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration. ACTION: Notice and request for comments. SUMMARY: On November 24, 2009, as...

  14. 75 FR 67807 - Pipeline Safety: Emergency Preparedness Communications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... is issuing an Advisory Bulletin to remind operators of gas and hazardous liquid pipeline facilities... Gas Pipeline Systems. Subject: Emergency Preparedness Communications. Advisory: To further enhance the...

  15. 76 FR 65778 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-24

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: 12,120. Frequency of Collection: On occasion. 2. Title: Recordkeeping for Natural Gas Pipeline... investigating incidents. Affected Public: Operators of natural gas pipeline systems. Annual Reporting and...

  16. Seismic hazard exposure for the Trans-Alaska Pipeline

    USGS Publications Warehouse

    Cluff, L.S.; Page, R.A.; Slemmons, D.B.; Grouse, C.B.; ,

    2003-01-01

    The discovery of oil on Alaska's North Slope and the construction of a pipeline to transport that oil across Alaska coincided with the National Environmental Policy Act of 1969 and a destructive Southern California earthquake in 1971 to cause stringent stipulations, state-of-the-art investigations, and innovative design for the pipeline. The magnitude 7.9 earthquake on the Denali fault in November 2002 was remarkably consistent with the design earthquake and fault displacement postulated for the Denali crossing of the Trans-Alaska Pipeline route. The pipeline maintained its integrity, and disaster was averted. Recent probabilistic studies to update previous hazard exposure conclusions suggest continuing pipeline integrity.

  17. Automated segmentation of pulmonary structures in thoracic computed tomography scans: a review

    NASA Astrophysics Data System (ADS)

    van Rikxoort, Eva M.; van Ginneken, Bram

    2013-09-01

    Computed tomography (CT) is the modality of choice for imaging the lungs in vivo. Sub-millimeter isotropic images of the lungs can be obtained within seconds, allowing the detection of small lesions and detailed analysis of disease processes. The high resolution of thoracic CT and the high prevalence of lung diseases require a high degree of automation in the analysis pipeline. The automated segmentation of pulmonary structures in thoracic CT has been an important research topic for over a decade now. This systematic review provides an overview of current literature. We discuss segmentation methods for the lungs, the pulmonary vasculature, the airways, including airway tree construction and airway wall segmentation, the fissures, the lobes and the pulmonary segments. For each topic, the current state of the art is summarized, and topics for future research are identified.

  18. The School-to-Prison Pipeline

    ERIC Educational Resources Information Center

    Elias, Marilyn

    2013-01-01

    Policies that encourage police presence at schools, harsh tactics including physical restraint, and automatic punishments that result in suspensions and out-of-class time are huge contributors to the school-to-prison pipeline, but the problem is more complex than that. The school-to-prison pipeline starts (or is best avoided) in the classroom.…

  19. 75 FR 13807 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-23

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... of Transportation, Pipeline and Hazardous Materials Safety Administration, 1200 New Jersey Avenue, SE...: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements (One Rule). The Notice of Proposed...

  20. 76 FR 45904 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... at U.S. Department of Transportation, Pipeline and Hazardous Materials Safety Administration, 1200...: On Occasion. Title: Record Keeping for Natural Gas Pipeline Operators. OMB Control Number: 2137-0049...

  1. Pipeline enhances Norman Wells potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Approval of an oil pipeline from halfway down Canada's Mackenzie River Valley at Norman Wells to N. Alberta has raised the potential for development of large reserves, along with controversy over native claims. The project involves 2 closely related proposals. One, by Esso Resources, the exploration and production unit of Imperial Oil, will increase oil production from the Norman Wells field from 3000 bpd currently to 25,000 bpd. The other proposal, by Interprovincial Pipeline (N.W.) Ltd., calls for construction of an underground pipeline to transport the additional production from Norman Wells to Alberta. The 560-mile, 12-in. pipeline will extend from Norman Wells, which is 90 miles south of the Arctic Circle on the north shore of the Mackenzie River, south to the end of an existing line at Zama in N. Alberta. There will be 3 pumping stations en route. This work also discusses recovery, potential, drilling limitations, the processing plant, positive impact, and further development of the Norman Wells project.

  2. Project Report: Active Pipeline Encroachment Detector (Phase I)

    DOT National Transportation Integrated Search

    2008-06-10

    Of the many accident causes that affect oil and gas pipelines, approximately 40% are third-party excavation activities into the buried pipeline right-of-way (ROW). According to DOT statistics, excavation damage is the second l...

  3. Airborne LIDAR Pipeline Inspection System (ALPIS) Mapping Tests

    DOT National Transportation Integrated Search

    2003-06-06

    Natural gas and hazardous liquid pipeline operators have a need to identify where leaks are occurring along their pipelines in order to lower the risks the pipelines pose to people and the environment. Current methods of locating natural gas and haza...

  4. 49 CFR 195.210 - Pipeline location.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY... far as practicable, areas containing private dwellings, industrial buildings, and places of public...

  5. 49 CFR 195.210 - Pipeline location.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY... far as practicable, areas containing private dwellings, industrial buildings, and places of public...

  6. 49 CFR 195.210 - Pipeline location.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY... far as practicable, areas containing private dwellings, industrial buildings, and places of public...

  7. 49 CFR 195.210 - Pipeline location.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY... far as practicable, areas containing private dwellings, industrial buildings, and places of public...

  8. Academic Pipeline and Futures Lab

    DTIC Science & Technology

    2016-02-01

    Report AFRL-RY-WP-TR-2015-0186: Academic Pipeline and Futures Lab. Brian D. Rigling, Wright State University, February 2016. Final report covering 12 June 2009 – 30 September 2015, documenting the WSU academic pipeline and Layered Sensing Futures Lab.

  9. Oman-India pipeline route survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullee, J.E.

    1995-12-01

    Paper describes the geological setting in the Arabian Sea for a proposed 28-inch gas pipeline from Oman to India reaching 3,500-m water depths. Covers planning, execution, quality control and results of geophysical, geotechnical and oceanographic surveys. Outlines theory and application of pipeline stress analysis on board survey vessel for feasibility assessment, and specifies equipment used.

  10. 76 FR 68828 - Pipeline Safety: Emergency Responder Forum

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ... PHMSA-2011-0295] Pipeline Safety: Emergency Responder Forum AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of Forum. SUMMARY: PHMSA is co-sponsoring a one-day Emergency Responder Forum with the National Association of Pipeline Safety Representatives and the United...

  11. JGI Plant Genomics Gene Annotation Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Shengqiang; Rokhsar, Dan; Goodstein, David

    2014-07-14

    Plant genomes vary in size and are highly complex, with abundant repeats, genome duplication and tandem duplication. Genes encode a wealth of information useful in studying an organism, and it is critical to have high-quality, stable gene annotation. Thanks to advances in sequencing technology, many plant species' genomes have been sequenced, and their transcriptomes are also sequenced. To use these very large amounts of sequence data for gene annotation or re-annotation in a timely fashion, an automatic pipeline is needed. The JGI plant genomics gene annotation pipeline, called integrated gene call (IGC), is our effort toward this aim, with the aid of an RNA-seq transcriptome assembly pipeline. It utilizes several gene predictors based on homolog peptides and transcript ORFs. See Methods for detail. Here we present the genome annotation of JGI flagship green plants produced by this pipeline, plus Arabidopsis and rice, except for chlamy, which was done by a third party. The genome annotations of these species and others are used in our gene family build pipeline and are accessible via the JGI Phytozome portal.

  12. ORAC-DR: A generic data reduction pipeline infrastructure

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie

    2015-03-01

    ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.

  13. Rights, Bunche, Rose and the "pipeline".

    PubMed Central

    Marks, Steven R.; Wilkinson-Lee, Ada M.

    2006-01-01

    We address education "pipelines" and their social ecology, drawing on the 1930s writings of Ralph J. Bunche, a Nobel peacemaker whose war against systematic second-class education for the poor, minority and nonminority alike, is nearly forgotten; and of the epidemiologist Geoffrey Rose, whose 1985 paper spotlighted the difficulty of shifting health status and risks in a "sick society." From the perspective of human rights and human development, we offer suggestions toward the paired "ends" of the pipeline: equality of opportunity for individuals, and equality of health for populations. We offer a national "to do" list to improve pipeline flow and then reconsider the merits of the "pipeline" metaphor, which neither matches the reality of lived education pathways nor supports notions of human rights, freedoms and capabilities, but rather reflects a commoditizing stance toward free persons. PMID:17019927

  14. Evaluation of fishing gear induced pipeline damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellinas, C.P.; King, B.; Davies, R.

    1995-12-31

    Impact and damage to pipelines due to fishing activities are among the hazards faced by North Sea pipelines during their operating lives. Available data indicate that about one in ten reported incidents is due to fishing activities. This paper is concerned with one such occurrence: the assessment of the resulting damage, the methods used to confirm pipeline integrity, and the approaches developed for its repair.

  15. Streak detection and analysis pipeline for space-debris optical images

    NASA Astrophysics Data System (ADS)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for
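
    For the detection stage, a thresholded frame plus a probabilistic Hough transform is one common way to pick out long linear streaks; the sketch below (synthetic image, arbitrary thresholds, not the StreakDet algorithm itself) illustrates the idea:

        import numpy as np
        from skimage.draw import line
        from skimage.transform import probabilistic_hough_line

        # Synthetic frame with one faint streak over Gaussian background noise.
        img = np.random.normal(0.0, 1.0, (512, 512))
        rr, cc = line(40, 30, 300, 450)
        img[rr, cc] += 6.0                          # streak a few sigma above noise

        binary = img > 3.0                          # simple segmentation threshold
        streaks = probabilistic_hough_line(binary, threshold=10,
                                           line_length=100, line_gap=5)
        print(streaks[:1])                          # endpoints of a detected streak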

  16. KAMO: towards automated data processing for microcrystals.

    PubMed

    Yamashita, Keitaro; Hirata, Kunio; Yamamoto, Masaki

    2018-05-01

    In protein microcrystallography, radiation damage often hampers complete and high-resolution data collection from a single crystal, even under cryogenic conditions. One promising solution is to collect small wedges of data (5-10°) separately from multiple crystals. The data from these crystals can then be merged into a complete reflection-intensity set. However, data processing of multiple small-wedge data sets is challenging. Here, a new open-source data-processing pipeline, KAMO, which utilizes existing programs, including the XDS and CCP4 packages, has been developed to automate whole data-processing tasks in the case of multiple small-wedge data sets. Firstly, KAMO processes individual data sets and collates those indexed with equivalent unit-cell parameters. The space group is then chosen and any indexing ambiguity is resolved. Finally, clustering is performed, followed by merging with outlier rejections, and a report is subsequently created. Using synthetic and several real-world data sets collected from hundreds of crystals, it was demonstrated that merged structure-factor amplitudes can be obtained in a largely automated manner using KAMO, which greatly facilitated the structure analyses of challenging targets that only produced microcrystals.
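
    The collation step, grouping small-wedge datasets by equivalent unit cells, can be imitated with ordinary hierarchical clustering; the sketch below uses toy cell parameters and a crude Euclidean metric rather than KAMO's actual criteria:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        # Toy unit-cell parameters (a, b, c in angstroms) for small-wedge datasets;
        # two underlying crystal forms plus measurement scatter.
        rng = np.random.default_rng(2)
        cells = np.vstack([rng.normal([78, 78, 37], 0.3, (20, 3)),
                           rng.normal([60, 90, 100], 0.3, (15, 3))])

        Z = linkage(cells, method="ward")
        groups = fcluster(Z, t=5.0, criterion="distance")
        print(np.bincount(groups)[1:])   # dataset counts per unit-cell cluster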

  17. Gas pipeline leakage detection based on PZT sensors

    NASA Astrophysics Data System (ADS)

    Zhu, Junxiao; Ren, Liang; Ho, Siu-Chun; Jia, Ziguang; Song, Gangbing

    2017-02-01

    In this paper, an innovative method for rapid detection and location of pipeline leakage utilizing lead zirconate titanate (PZT) sensors is proposed. The negative pressure wave (NPW) is a stress wave generated by leakage in the pipeline, which propagates along the pipeline from the leakage point to both ends. The NPW is thus associated with hoop strain variation along the pipe wall. PZT sensors mounted on the pipeline were used to measure the strain variation and allowed accurate (within 2% error) and repeatable (within 4% variance) location of five manually controlled leakage points. Experimental results have verified the effectiveness and the location accuracy of the method for leakage in a 55-meter-long model pipeline.
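
    The localization step follows from simple arrival-time geometry: if sensors sit at both ends of a section of length L and the NPW travels at speed v, a leak at distance x from the first sensor produces an arrival-time difference Δt = (2x - L)/v, so x = (L + vΔt)/2. A minimal sketch, with an assumed wave speed (the real value depends on the fluid and pipe):

        # Two-sensor NPW leak localization from the arrival-time difference.
        def locate_leak(length_m, dt_s, wave_speed_mps=1000.0):
            """dt_s = t_sensor1 - t_sensor2; the wave speed is an assumed
            placeholder, not a value from the paper."""
            x = (length_m + wave_speed_mps * dt_s) / 2.0
            if not 0.0 <= x <= length_m:
                raise ValueError("arrival times inconsistent with pipe length")
            return x

        # A leak 20 m from sensor 1 on a 55 m model pipeline (as in the
        # study) gives dt = (20 - 35) / 1000 = -0.015 s:
        print(locate_leak(55.0, -0.015))  # -> 20.0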

  18. An automated workflow for parallel processing of large multiview SPIM recordings.

    PubMed

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-04-01

    Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, typically performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
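
    The published workflow itself is built on snakemake; purely as an illustration of why the per-timepoint independence matters, the sketch below parallelizes a hypothetical per-timepoint step with a process pool. register_views is a stand-in, not a function from the workflow.

        # Time points are independent, so they can be processed in parallel.
        from concurrent.futures import ProcessPoolExecutor

        def register_views(timepoint):
            # Placeholder for one time point's work (registration, fusion...).
            return f"tp{timepoint:04d} done"

        def process_all(n_timepoints, workers=8):
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return list(pool.map(register_views, range(n_timepoints)))

        if __name__ == "__main__":
            print(process_all(10))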

  19. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

    Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, typically performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated, and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585

  20. Urban Underground Pipelines Mapping Using Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Jaw, S. W.; Hashim, M.

    2014-02-01

    Underground space is increasingly being exploited for transportation, utilities, and public usage, and the underground has become a spider's web of utility networks. Mapping underground utility pipelines is therefore a challenging and difficult task; in practice it is often a "hit-and-miss" affair that results in catastrophic damage, particularly in urban areas. This study was conducted to extract locational information on urban underground utility pipelines using a trenchless measuring tool, namely ground penetrating radar (GPR). The focus of this study was underground utility pipeline mapping for retrieval of the geometric properties of the pipelines using GPR. In doing this, a series of tests was first conducted at a preferred test site and in a real-life experiment, followed by field-based modeling using the Finite-Difference Time-Domain (FDTD) method. The results provide locational information on underground utility pipelines together with its mapping accuracy. This locational information is beneficial to civil infrastructure management and maintenance, which in the long term saves time and is critically important for the development of metropolitan areas.

  1. Computer models of complex multiloop branched pipeline systems

    NASA Astrophysics Data System (ADS)

    Kudinov, I. V.; Kolesnikov, S. V.; Eremin, A. V.; Branfileva, A. N.

    2013-11-01

    This paper describes the principal theoretical concepts of a method for constructing computer models of complex multiloop branched pipeline networks; the method is based on graph theory and Kirchhoff's two laws for electrical circuits. The models make it possible to calculate velocities, flow rates, and pressures of a fluid medium in any section of a pipeline network when the network is considered as a single hydraulic system. On the basis of multivariant calculations, the reasons for existing problems can be identified, the least costly methods of eliminating them can be proposed, and recommendations can be made for planning the modernization of pipeline systems and the construction of new sections. The results can be applied to complex pipeline systems intended for various purposes (water pipelines, petroleum pipelines, etc.). The operability of the model has been verified on the example of a unified computer model of the heat network for centralized heat supply of the city of Samara.
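
    As a hedged sketch of the graph-theoretic idea, the code below treats each pipe as an edge with a linear conductance (flow proportional to pressure drop), imposes mass conservation at every node (the analogue of Kirchhoff's first law), and solves the resulting node-pressure system. Real hydraulic networks obey nonlinear head-loss laws and need iteration (e.g., Hardy Cross or Newton's method); the linear version only illustrates the structure.

        # Linearized pipe-network solver via the graph Laplacian.
        import numpy as np

        def solve_network(n_nodes, edges, injections, ref_node=0):
            """edges: (i, j, conductance); injections: net inflow per node,
            summing to zero. Returns node pressures and edge flows."""
            A = np.zeros((n_nodes, n_nodes))
            for i, j, g in edges:
                A[i, i] += g; A[j, j] += g
                A[i, j] -= g; A[j, i] -= g
            A[ref_node, :] = 0.0
            A[ref_node, ref_node] = 1.0      # pin the reference pressure
            b = np.array(injections, float)
            b[ref_node] = 0.0
            p = np.linalg.solve(A, b)
            return p, [(i, j, g * (p[i] - p[j])) for i, j, g in edges]

        # Two-loop toy network: unit inflow at node 0, outflow at node 3.
        p, flows = solve_network(
            4, [(0, 1, 2.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 2.0),
                (1, 2, 1.0)], injections=[1.0, 0.0, 0.0, -1.0])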

  2. Bad Actors Criticality Assessment for Pipeline system

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee

    2015-04-01

    Failure of a pipeline system can bring huge economic loss. To mitigate such catastrophic loss, the impact of each bad actor in the pipeline system must be evaluated and ranked. In this study, bad actors are the root causes, or any potential factors, leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute strongly to pipeline system failure, at 48.0%, 12.4%, and 6.0%, respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results could also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
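
    Birnbaum's importance measure has a compact definition: for basic event i, I_B(i) = P(system failure | event i occurs) - P(system failure | event i does not occur). Below is a minimal sketch for an OR-gate fault tree with independent basic events; the probabilities are made-up placeholders, not the paper's data.

        # Birnbaum importance ranking for an OR-gate fault tree.
        def p_failure(probs):
            """Top-event probability: 1 - product of survival probabilities."""
            p = 1.0
            for q in probs.values():
                p *= (1.0 - q)
            return 1.0 - p

        def birnbaum(probs, event):
            hi = {**probs, event: 1.0}   # force the event to occur
            lo = {**probs, event: 0.0}   # force the event not to occur
            return p_failure(hi) - p_failure(lo)

        probs = {"internal corrosion": 0.05, "external corrosion": 0.02,
                 "construction damage": 0.01}   # assumed, for illustration
        ranking = sorted(probs, key=lambda e: birnbaum(probs, e),
                         reverse=True)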

  3. Failure modes for pipelines in landslide areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruschi, R.; Spinazze, M.; Tomassini, D.

    1995-12-31

    In recent years a number of incidents of pipelines affected by slow soil movements have been reported in the relevant literature. Related issues such as soil-pipe interaction have been studied both theoretically and through experimental surveys, along with the environmental conditions that pose a hazard to pipeline integrity. Suitable design criteria under these circumstances have been discussed by several authors, in particular in relation to a limit state approach and hence strain-based criteria. The scope of this paper is to describe the failure mechanisms which may affect a pipeline subject to slow soil movements, both in the longitudinal and transverse directions. Particular attention is paid to the environmental, geometric, and structural parameters which steer the process towards one or another failure mechanism. Criteria for deciding upon the remedial measures required to guarantee the structural integrity of the pipeline, both in the short and in the long term, are discussed.

  4. Methods for protecting subsea pipelines and installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rochelle, W.R.; Simpson, D.M.

    1981-01-01

    The hazards for subsea pipelines and installations are described. Methods currently being used to protect subsea pipelines and installations are discussed with the emphasis on various trenching methods and equipment. Technical data on progress rates for trenching and feasible depths of trench are given. Possible methods for protection against icebergs are discussed. A case for more comprehensive data on icebergs is presented. Should a pipeline become damaged, repair methods are noted.

  5. A pipeline for the de novo assembly of the Themira biloba (Sepsidae: Diptera) transcriptome using a multiple k-mer length approach.

    PubMed

    Melicher, Dacotah; Torson, Alex S; Dworkin, Ian; Bowsher, Julia H

    2014-03-12

    The Sepsidae family of flies is a model for investigating how sexual selection shapes courtship and sexual dimorphism in a comparative framework. However, like many non-model systems, there are few molecular resources available. Large-scale sequencing and assembly have not been performed in any sepsid, and the lack of a closely related genome makes investigation of gene expression challenging. Our goal was to develop an automated pipeline for de novo transcriptome assembly, and to use that pipeline to assemble and analyze the transcriptome of the sepsid Themira biloba. Our bioinformatics pipeline uses cloud computing services to assemble and analyze the transcriptome with off-site data management, processing, and backup. It uses a multiple k-mer length approach combined with a second meta-assembly to extend transcripts and recover more bases of transcript sequences than standard single k-mer assembly. We used 454 sequencing to generate 1.48 million reads from cDNA generated from embryo, larva, and pupae of T. biloba and assembled a transcriptome consisting of 24,495 contigs. Annotation identified 16,705 transcripts, including those involved in embryogenesis and limb patterning. We assembled transcriptomes from an additional three non-model organisms to demonstrate that our pipeline assembled a higher-quality transcriptome than single k-mer approaches across multiple species. The pipeline we have developed for assembly and analysis increases contig length, recovers unique transcripts, and assembles more base pairs than other methods through the use of a meta-assembly. The T. biloba transcriptome is a critical resource for performing large-scale RNA-Seq investigations of gene expression patterns, and is the first transcriptome sequenced in this Dipteran family.
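
    A sketch of the multiple k-mer strategy the abstract describes: run the assembler once per k-mer length, pool the contigs, and pass the pool to a second meta-assembly that merges and extends overlapping contigs. The helper functions are hypothetical stand-ins for the external assembler, not part of the published pipeline, and the k values shown are arbitrary examples.

        # Orchestration sketch of multi-k assembly followed by meta-assembly.
        def run_assembler(reads, k):
            """Return contigs assembled with k-mer length k (stand-in)."""
            raise NotImplementedError("invoke your assembler of choice here")

        def meta_assemble(contigs):
            """Second-pass assembly merging/extending pooled contigs."""
            raise NotImplementedError

        def multi_kmer_assembly(reads, k_values=(23, 29, 35, 41)):
            pooled = []
            for k in k_values:               # one assembly per k-mer length
                pooled.extend(run_assembler(reads, k))
            return meta_assemble(pooled)     # merge into the final contig set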

  6. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Center Director Roy Bridges addresses the audience at the commissioning of a new high-pressure helium pipeline at Kennedy Space Center that will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. The nine-mile-long buried pipeline will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad. Others at the ceremony were Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS); Col. Samuel Dick, representative of the 45th Space Wing; Ramon Lugo, acting executive director, JPMO; David Herst, director, Delta IV Launch Sites; Pierre Dufour, president and CEO, Air Liquide America Corporation; and Michael Butchko, president, SGS.

  7. Garden Banks 388 deepwater pipeline span avoidance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, S.W.; Sawyer, M.A.; Kenney, T.D.

    1995-12-31

    This paper will describe the span avoidance measures taken for the installation of the Garden Banks 388 deepwater oil and gas gathering pipelines. The two 12 inch pipelines connect a shallow water facility in EI-315 to a deep water subsea template in GB-388. These pipelines run across the irregular continental slope typically found in moderate to deep water in the Gulf of Mexico. To minimize pipeline spans, steps were taken during the design, survey, and installation phases of the project. During each phase, as additional information became available, analyses and the resulting recommended approaches were refined. This continuity, seldom easily obtained, proved beneficial in translating design work into field results.

  8. Sensor and transmitter system for communication in pipelines

    DOEpatents

    Cooper, John F.; Burnham, Alan K.

    2013-01-29

    A system for sensing and communicating in a pipeline that contains a fluid. An acoustic signal containing information about a property of the fluid is produced in the pipeline. The signal is transmitted through the pipeline. The signal and its information are then received and used by a control system.

  9. ORAC-DR -- SCUBA Pipeline Data Reduction

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie

    ORAC-DR is a flexible data reduction pipeline designed to reduce data from many different instruments. This document describes how to use the ORAC-DR pipeline to reduce data taken with the Submillimetre Common-User Bolometer Array (SCUBA) obtained from the James Clerk Maxwell Telescope.

  10. 77 FR 73637 - Alliance Pipeline L.P.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ... Pipeline L.P.; Notice of Application Take notice that on November 26, 2012, Alliance Pipeline L.P..., Manager, Regulatory Affairs, Alliance Pipeline Ltd. on behalf of Alliance Pipeline L.P., 800, 605-5 Ave...] BILLING CODE 6717-01-P ...

  11. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast-iron pipelines. When an operator has knowledge that the support for a segment of a buried cast-iron...

  12. genipe: an automated genome-wide imputation pipeline with automatic reporting and statistical tools.

    PubMed

    Lemieux Perreault, Louis-Philippe; Legault, Marc-André; Asselin, Géraldine; Dubé, Marie-Pierre

    2016-12-01

    Genotype imputation is now commonly performed following genome-wide genotyping experiments. Imputation increases the density of analyzed genotypes in the dataset, enabling fine-mapping across the genome. However, the process of imputation using the most recent publicly available reference datasets can require considerable computation power and the management of hundreds of large intermediate files. We have developed genipe, a complete genome-wide imputation pipeline which includes automatic reporting, imputed data indexing and management, and a suite of statistical tests for imputed data commonly used in genetic epidemiology (Sequence Kernel Association Test, Cox proportional hazards for survival analysis, and linear mixed models for repeated measurements in longitudinal studies). The genipe package is an open source Python software and is freely available for non-commercial use (CC BY-NC 4.0) at https://github.com/pgxcentre/genipe. Documentation and tutorials are available at http://pgxcentre.github.io/genipe. Contact: louis-philippe.lemieux.perreault@statgen.org or marie-pierre.dube@statgen.org. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  13. 75 FR 4136 - Pipeline Safety: Request To Modify Special Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2009-0377] Pipeline Safety: Request To Modify Special Permit AGENCY: Pipeline and Hazardous... coating on its gas pipeline. DATES: Submit any comments regarding this special permit modification request...

  14. 33 CFR 88.15 - Lights on dredge pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Lights on dredge pipelines. 88.15... NAVIGATION RULES ANNEX V: PILOT RULES § 88.15 Lights on dredge pipelines. Dredge pipelines that are floating or supported on trestles shall display the following lights at night and in periods of restricted...

  15. 33 CFR 88.15 - Lights on dredge pipelines.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Lights on dredge pipelines. 88.15... NAVIGATION RULES ANNEX V: PILOT RULES § 88.15 Lights on dredge pipelines. Dredge pipelines that are floating or supported on trestles shall display the following lights at night and in periods of restricted...

  16. 33 CFR 88.15 - Lights on dredge pipelines.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Lights on dredge pipelines. 88.15... NAVIGATION RULES ANNEX V: PILOT RULES § 88.15 Lights on dredge pipelines. Dredge pipelines that are floating or supported on trestles shall display the following lights at night and in periods of restricted...

  17. 33 CFR 88.15 - Lights on dredge pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Lights on dredge pipelines. 88.15... NAVIGATION RULES ANNEX V: PILOT RULES § 88.15 Lights on dredge pipelines. Dredge pipelines that are floating or supported on trestles shall display the following lights at night and in periods of restricted...

  18. 33 CFR 88.15 - Lights on dredge pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Lights on dredge pipelines. 88.15... NAVIGATION RULES ANNEX V: PILOT RULES § 88.15 Lights on dredge pipelines. Dredge pipelines that are floating or supported on trestles shall display the following lights at night and in periods of restricted...

  19. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG

    PubMed Central

    Cowley, Benjamin U.; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705

  20. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    PubMed

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.

  1. 77 FR 7572 - Alliance Pipeline L.P.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-13

    ...] Alliance Pipeline L.P.; Notice of Application Take notice that on January 25, 2012, Alliance Pipeline L.P... Pipeline Inc., Managing General Partner of Alliance Pipeline L.P., 800, 605--5 Ave. SW., Calgary, Alberta...; 8:45 am] BILLING CODE 6717-01-P ...

  2. 76 FR 54531 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by the Passage of Hurricanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ... production and processing is prone to disruption by hurricanes. In 2005, Hurricanes Katrina and Rita caused... Hurricanes AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice... the passage of Hurricanes. ADDRESSES: This document can be viewed on the Office of Pipeline Safety...

  3. CFD analysis of onshore oil pipelines in permafrost

    NASA Astrophysics Data System (ADS)

    Nardecchia, Fabio; Gugliermetti, Luca; Gugliermetti, Franco

    2017-07-01

    Underground pipelines are built all over the world, and knowledge of their thermal interaction with the soil is crucial for their design. This paper studies the "thermal influenced zone" produced by a buried pipeline and the parameters that influence its extent, using 2D steady-state CFD simulations, with the aim of improving the design of new pipelines in permafrost. To represent a real case, the study refers to the Eastern Siberia-Pacific Ocean Oil Pipeline at the three stations of Mo'he, Jiagedaqi and Qiqi'har. Different burial depths and diameters of the pipe are analyzed; the simulation results show that the effect of the pipeline diameter on the thermal field increases with distance from the starting station.
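
    A minimal finite-difference sketch of the kind of steady-state thermal field such a study computes: Jacobi iteration on Laplace's equation over a soil cross-section, with the pipe wall held at the oil temperature and the far field held cold. Geometry, temperatures, and grid resolution are illustrative assumptions, not values from the paper.

        # 2D steady-state conduction around a buried warm pipe (Jacobi).
        import numpy as np

        n = 101
        T = np.zeros((n, n))                  # soil initially at 0 degC
        yy, xx = np.mgrid[0:n, 0:n]
        pipe = (yy - 30) ** 2 + (xx - 50) ** 2 <= 5 ** 2   # buried circle

        for _ in range(5000):
            T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                    T[1:-1, :-2] + T[1:-1, 2:])
            T[pipe] = 40.0                    # pipe wall at oil temperature
            T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 0.0  # far field

        influenced = (T > 0.5).sum()          # cells warmed by > 0.5 degC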

  4. Pipeline Expansions

    EIA Publications

    1999-01-01

    This appendix examines the nature and type of proposed pipeline projects announced or approved for construction during the next several years in the United States. It also includes those projects in Canada and Mexico that tie-in with the U.S. markets or projects.

  5. 78 FR 24309 - Pipeline and Hazardous Materials Safety Administration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-24

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration List of Special Permit Applications Delayed AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building...

  6. Porcupine: A visual pipeline tool for neuroimaging analysis

    PubMed Central

    Snoek, Lukas; Knapen, Tomas

    2018-01-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that lets researchers construct an analysis visually and automatically produces the corresponding analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461

  7. Numerical Simulation of Pipeline Deformation Caused by Rockfall Impact

    PubMed Central

    Liang, Zheng; Han, Chuanjun

    2014-01-01

    Rockfall impact is one of the fatal hazards in pipeline transportation of oil and gas. The deformation of oil and gas pipelines caused by rockfall impact was investigated using the finite element method in this paper. Pipeline deformations under radial impact, longitudinal inclined impact, transverse inclined impact, and lateral eccentric impact of spherical and cube rockfalls were discussed, respectively. The effects of impact angle and eccentricity on the plastic strain of the pipeline were analyzed. The results show that the crater on the pipeline caused by spherical rockfall impact is deeper than that caused by cube rockfall impact of the same volume. In the inclined impact condition, the maximum plastic strain of the crater caused by spherical rockfall impact appears when the incidence angle α is 45°. The pipeline is prone to rupture under cube rockfall impact when α is small. The plastic strain distribution of the impact crater becomes more uneven as the impact angle increases. In the eccentric impact condition, the plastic strain zone of the pipeline shrinks as the eccentricity k increases. PMID:24959599

  8. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the use of the replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org.
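
    The Evanno ΔK statistic that StrAuto automates is simple to state: with L(K) the log-likelihoods of replicate runs at each K, ΔK = mean(|L(K+1) - 2L(K) + L(K-1)|) / sd(L(K)). A minimal sketch, assuming consecutive K values with equal replicate counts (at least two) at every K:

        # Evanno delta-K from replicate STRUCTURE log-likelihoods.
        import statistics

        def evanno_delta_k(runs):
            """runs: dict K -> list of replicate log-likelihoods, with
            consecutive K keys. Returns dict K -> delta-K for interior K."""
            ks = sorted(runs)
            out = {}
            for k in ks[1:-1]:        # needs both neighbours K-1 and K+1
                second = [abs(a - 2 * b + c) for a, b, c in
                          zip(runs[k + 1], runs[k], runs[k - 1])]
                out[k] = statistics.mean(second) / statistics.stdev(runs[k])
            return out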

  9. Changes in the Pipeline Transportation Market

    EIA Publications

    1999-01-01

    This analysis assesses the amount of capacity that may be turned back to pipeline companies, based on shippers' actions over the past several years and the profile of contracts in place as of July 1, 1998. It also examines changes in the characteristics of contracts between shippers and pipeline companies.

  10. 77 FR 61826 - Pipeline Safety: Communication During Emergency Situations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... liquefied natural gas pipeline facilities that operators should immediately and directly notify the Public.... Background Federal regulations for gas, liquefied natural gas (LNG), and hazardous liquid pipeline facilities...

  11. Sludge pipeline design.

    PubMed

    Slatter, P T

    2001-01-01

    The need for the design engineer to have a sound basis for designing sludge pumping and pipelining plant is becoming more critical. This paper examines both a traditional textbook approach and one of the latest approaches from the literature, and compares them with experimental data. The pipelining problem can be divided into the following main areas: rheological characterisation and laminar, transitional, and turbulent flow; each is addressed in turn. Experimental data for a digested sludge tested in large pipes are analysed and compared with the two theoretical approaches. Discussion centres on the differences between the two methods and the degree of agreement with the data. It is concluded that the new approach has merit and can be used for practical design.
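
    For the laminar-flow area mentioned above, a classic worked example is the Buckingham-Reiner relation for a Bingham-plastic sludge: with wall shear stress τ_w = (ΔP/L)·R/2, the volumetric flow is Q = (πR⁴/8μ_p)(ΔP/L)[1 - (4/3)(τ_y/τ_w) + (1/3)(τ_y/τ_w)⁴] for τ_w > τ_y. This is a textbook relation offered purely as illustration; the parameter values below are assumptions.

        # Buckingham-Reiner: laminar Bingham-plastic flow in a pipe.
        import math

        def buckingham_reiner(radius, dp_per_len, tau_y, mu_p):
            """Q [m^3/s] from radius [m], pressure gradient [Pa/m],
            yield stress tau_y [Pa], plastic viscosity mu_p [Pa.s]."""
            tau_w = dp_per_len * radius / 2.0     # wall shear stress
            if tau_w <= tau_y:
                return 0.0                        # below yield: no flow
            r = tau_y / tau_w
            return (math.pi * radius ** 4 * dp_per_len / (8.0 * mu_p)
                    * (1.0 - 4.0 * r / 3.0 + r ** 4 / 3.0))

        q = buckingham_reiner(radius=0.1, dp_per_len=2000.0,
                              tau_y=5.0, mu_p=0.05)   # assumed values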

  12. Parenchymal texture analysis in digital mammography: A fully automated pipeline for breast cancer risk assessment.

    PubMed

    Zheng, Yuanjie; Keller, Brad M; Ray, Shonket; Wang, Yan; Conant, Emily F; Gee, James C; Kontos, Despina

    2015-07-01

    Mammographic percent density (PD%) is known to be a strong risk factor for breast cancer. Recent studies also suggest that parenchymal texture features, which are more granular descriptors of the parenchymal pattern, can provide additional information about breast cancer risk. To date, most studies have measured mammographic texture within selected regions of interest (ROIs) in the breast, which cannot adequately capture the complexity of the parenchymal pattern throughout the whole breast. To better characterize patterns of the parenchymal tissue, the authors have developed a fully automated software pipeline based on a novel lattice-based strategy to extract a range of parenchymal texture features from the entire breast region. Digital mammograms from 106 cases with 318 age-matched controls were retrospectively analyzed. The lattice-based approach is based on a regular grid virtually overlaid on each mammographic image. Texture features are computed from the intersection (i.e., lattice) points of the grid lines within the breast, using a local window centered at each lattice point. Using this strategy, a range of statistical (gray-level histogram, co-occurrence, and run-length) and structural (edge-enhancing, local binary pattern, and fractal dimension) features are extracted. To cover the entire breast, the size of the local window for feature extraction is set equal to the lattice grid spacing and optimized experimentally by evaluating different window sizes. The association between the lattice-based texture features and breast cancer was evaluated using logistic regression with leave-one-out cross validation and further compared to that of breast PD% and commonly used single-ROI texture features extracted from the retroareolar or the central breast region. Classification performance was evaluated using the area under the curve (AUC) of the receiver operating characteristic (ROC). DeLong's test was used to compare the different ROCs in terms of AUC
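
    A hedged sketch of the lattice strategy: walk a regular grid over the breast mask and compute one texture feature (here a grey-level co-occurrence contrast) in a local window at each lattice point. scikit-image is assumed, the image is assumed to be uint8, and the spacing/window values are illustrative rather than the paper's optimized settings; the window size equals the grid spacing, mirroring the whole-breast coverage idea described above.

        # Lattice-based texture sampling with a GLCM feature per point.
        from skimage.feature import graycomatrix, graycoprops

        def lattice_texture(image, mask, spacing=64, win=64):
            """Yield (row, col, GLCM contrast) at grid points inside mask.
            `image` must be uint8; `mask` is a boolean breast mask."""
            half = win // 2
            for r in range(half, image.shape[0] - half, spacing):
                for c in range(half, image.shape[1] - half, spacing):
                    if not mask[r, c]:
                        continue              # lattice point outside breast
                    w = image[r - half:r + half, c - half:c + half]
                    glcm = graycomatrix(w, distances=[1], angles=[0],
                                        levels=256, symmetric=True,
                                        normed=True)
                    yield r, c, graycoprops(glcm, "contrast")[0, 0]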

  13. 49 CFR 191.27 - Filing offshore pipeline condition reports.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE; ANNUAL REPORTS, INCIDENT REPORTS, AND SAFETY-RELATED..., job title, and business telephone number of person submitting the report. (4) Total length of pipeline...

  14. 77 FR 34457 - Pipeline Safety: Mechanical Fitting Failure Reports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... notice provides clarification to owners and operators of gas distribution pipeline facilities when... of a gas distribution pipeline facility to file a written report for any mechanical fitting failure...

  15. Multi-scale imaging and informatics pipeline for in situ pluripotent stem cell analysis.

    PubMed

    Gorman, Bryan R; Lu, Junjie; Baccei, Anna; Lowry, Nathan C; Purvis, Jeremy E; Mangoubi, Rami S; Lerou, Paul H

    2014-01-01

    Human pluripotent stem (hPS) cells are a potential source of cells for medical therapy and an ideal system to study fate decisions in early development. However, hPS cells cultured in vitro exhibit a high degree of heterogeneity, presenting an obstacle to clinical translation. hPS cells grow in spatially patterned colony structures, necessitating quantitative single-cell image analysis. We offer a tool for analyzing the spatial population context of hPS cells that integrates automated fluorescent microscopy with an analysis pipeline. It enables high-throughput detection of colonies at low resolution, with single-cellular and sub-cellular analysis at high resolutions, generating seamless in situ maps of single-cellular data organized by colony. We demonstrate the tool's utility by analyzing inter- and intra-colony heterogeneity of hPS cell cycle regulation and pluripotency marker expression. We measured the heterogeneity within individual colonies by analyzing cell cycle as a function of distance. Cells loosely associated with the outside of the colony are more likely to be in G1, reflecting a less pluripotent state, while cells within the first pluripotent layer are more likely to be in G2, possibly reflecting a G2/M block. Our multi-scale analysis tool groups colony regions into density classes, and cells belonging to those classes have distinct distributions of pluripotency markers and respond differently to DNA damage induction. Lastly, we demonstrate that our pipeline can robustly handle high-content, high-resolution single-molecule mRNA FISH data by using novel image processing techniques. Overall, the imaging informatics pipeline presented offers a novel approach to the analysis of hPS cells that includes not only single cell features but also colony wide, and more generally, multi-scale spatial configuration.
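
    One concrete way to express the "cell cycle as a function of distance" analysis is a Euclidean distance transform of the colony mask, which assigns every pixel its distance to the colony edge; sampling it at cell centroids then bins cells by depth. A minimal sketch assuming scipy and numpy; the data structures are illustrative, not those of the published pipeline.

        # Depth of each cell inside its colony via a distance transform.
        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def depth_in_colony(colony_mask, centroids):
            """colony_mask: 2D bool array; centroids: (N, 2) of (row, col).
            Returns each cell's distance (pixels) to the colony boundary;
            0 means on or outside the boundary."""
            depth = distance_transform_edt(colony_mask)
            rows = np.round(centroids[:, 0]).astype(int)
            cols = np.round(centroids[:, 1]).astype(int)
            return depth[rows, cols]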

  16. 25 CFR 169.25 - Oil and gas pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Oil and gas pipelines. 169.25 Section 169.25 Indians....25 Oil and gas pipelines. (a) The Act of March 11, 1904 (33 Stat. 65), as amended by the Act of March 2, 1917 (39 Stat. 973; 25 U.S.C. 321), authorizes right-of-way grants for oil and gas pipelines...

  17. 25 CFR 169.25 - Oil and gas pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Oil and gas pipelines. 169.25 Section 169.25 Indians....25 Oil and gas pipelines. (a) The Act of March 11, 1904 (33 Stat. 65), as amended by the Act of March 2, 1917 (39 Stat. 973; 25 U.S.C. 321), authorizes right-of-way grants for oil and gas pipelines...

  18. 25 CFR 169.25 - Oil and gas pipelines.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Oil and gas pipelines. 169.25 Section 169.25 Indians....25 Oil and gas pipelines. (a) The Act of March 11, 1904 (33 Stat. 65), as amended by the Act of March 2, 1917 (39 Stat. 973; 25 U.S.C. 321), authorizes right-of-way grants for oil and gas pipelines...

  19. 25 CFR 169.25 - Oil and gas pipelines.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Oil and gas pipelines. 169.25 Section 169.25 Indians....25 Oil and gas pipelines. (a) The Act of March 11, 1904 (33 Stat. 65), as amended by the Act of March 2, 1917 (39 Stat. 973; 25 U.S.C. 321), authorizes right-of-way grants for oil and gas pipelines...

  20. 25 CFR 169.25 - Oil and gas pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Oil and gas pipelines. 169.25 Section 169.25 Indians....25 Oil and gas pipelines. (a) The Act of March 11, 1904 (33 Stat. 65), as amended by the Act of March 2, 1917 (39 Stat. 973; 25 U.S.C. 321), authorizes right-of-way grants for oil and gas pipelines...

  1. Benchmark datasets for phylogenomic pipeline validation, applications for foodborne pathogen surveillance.

    PubMed

    Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S

    2017-01-01

    As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross

  2. Disrupting the School-to-Prison Pipeline

    ERIC Educational Resources Information Center

    Bahena, Sofía, Ed.; Cooc, North, Ed.; Currie-Rubin, Rachel, Ed.; Kuttner, Paul, Ed.; Ng, Monica, Ed.

    2012-01-01

    A trenchant and wide-ranging look at this alarming national trend, "Disrupting the School-to-Prison Pipeline" is unsparing in its account of the problem while pointing in the direction of meaningful and much-needed reforms. The "school-to-prison pipeline" has received much attention in the education world over the past few…

  3. Using industry ROV videos to assess fish associations with subsea pipelines

    NASA Astrophysics Data System (ADS)

    McLean, D. L.; Partridge, J. C.; Bond, T.; Birt, M. J.; Bornt, K. R.; Langlois, T. J.

    2017-06-01

    Remotely operated vehicles (ROVs) are routinely used to undertake inspection and maintenance activities on underwater pipelines in north-west Australia. In doing so, many terabytes of geo-referenced underwater video are collected at depths, and on a scale, usually unobtainable for ecological research. We assessed fish diversity and abundance from existing ROV videos collected along 2-3 km sections of two pipelines in north-west Australia, one at 60-80 m water depth and the other at 120-130 m. A total of 5962 individual fish from 92 species and 42 families were observed. Both pipelines were characterised by a high abundance of commercially important fishes, including snappers (Lutjanidae) and groupers (Epinephelidae). The presence of thousands of unidentifiable larval fish, in addition to juveniles, sub-adults and adults, suggests that the pipelines may be enhancing, rather than simply attracting, fish stocks. The prevalence and high complexity of sponges on the shallower pipeline and of deepwater corals on the deeper pipeline had a strong positive correlation with fish abundance. These habitats likely offer a significant food source and refuge for fish, but also for the invertebrates upon which fish feed. A greater diversity on the shallower pipeline, and a higher abundance of fishes on both pipelines, were associated with unsupported pipeline sections (spans), and many species appeared to be utilising pipeline spans as refuges. This study is a first look at the potential value of subsea pipelines for fishes on the north-west shelf. While the results suggest that these sections of pipeline appear to offer significant habitat that supports diverse and important commercially fished species, further work, including off-pipeline surveys on the natural seafloor, is required to determine conclusively the ecological value of pipelines and thereby inform discussions regarding the ecological implications of pipeline decommissioning.

  4. Data as a Service: A Seismic Web Service Pipeline

    NASA Astrophysics Data System (ADS)

    Martinez, E.

    2016-12-01

    Publishing data as a service pipeline provides an improved, dynamic approach over static data archives. A service pipeline is a collection of micro web services that each perform a specific task and expose the results of that task. Structured request/response formats allow micro web services to be chained together into a service pipeline to provide more complex results. The U.S. Geological Survey adopted service pipelines to publish seismic hazard and design data supporting both specific and generalized audiences. The seismic web service pipeline starts at source data and exposes probability and deterministic hazard curves, response spectra, risk-targeted ground motions, and seismic design provision metadata. This pipeline supports public/private organizations and individual engineers/researchers. Publishing data as a service pipeline provides a variety of benefits. Exposing the component services enables advanced users to inspect or use the data at each processing step. Exposing a composite service enables new users quick access to published data with a very low barrier to entry. Advanced users may re-use micro web services by chaining them in new ways or injecting new micro services into the pipeline. This allows users to test hypotheses and compare their results to published results. Exposing data at each step in the pipeline enables users to review and validate the data and process more quickly and accurately. Making the source code open source, per USGS policy, further enables this transparency. Each micro service may be scaled independently of any other micro service. This ensures data remain available and timely in a cost-effective manner regardless of load. Additionally, if a new or more efficient approach to processing the data is discovered, the new approach may replace the old one at any time, keeping the pipeline running while not affecting other micro services.
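
    A hedged sketch of the chaining idea: each micro service is a small HTTP endpoint whose JSON response feeds the next request, and a composite function strings them together. The URLs and payload fields below are hypothetical placeholders, not actual USGS endpoints; the requests package is assumed.

        # Chaining two micro web services into a composite pipeline call.
        import requests

        BASE = "https://example.org/ws"        # hypothetical service root

        def fetch_json(path, params):
            resp = requests.get(f"{BASE}/{path}", params=params, timeout=30)
            resp.raise_for_status()
            return resp.json()

        def design_spectrum(lat, lon):
            # Stage 1: hazard curve for the site (hypothetical endpoint).
            hazard = fetch_json("hazard-curve",
                                {"latitude": lat, "longitude": lon})
            # Stage 2: response spectrum derived from stage 1's output.
            return fetch_json("response-spectrum", {"curveId": hazard["id"]})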

  5. 77 FR 19799 - Pipeline Safety: Pipeline Damage Prevention Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ...,602 to $3,445,975. Evaluating just the lower range of benefits over ten years results in a total... consequences resulting from excavation damage to pipelines. A comprehensive damage prevention program requires..., including that resulting from excavation, digging, and other impacts, is also precipitated by operators...

  6. Deliverability on the interstate natural gas pipeline system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-05-01

    Deliverability on the Interstate Natural Gas Pipeline System examines the capability of the national pipeline grid to transport natural gas to various US markets. The report quantifies the capacity levels and utilization rates of major interstate pipeline companies in 1996 and the changes since 1990, as well as changes in markets and end-use consumption patterns. It also discusses the effects of proposed capacity expansions on capacity levels. The report consists of five chapters, several appendices, and a glossary. Chapter 1 discusses some of the operational and regulatory features of the US interstate pipeline system and how they affect overall system design, system utilization, and capacity expansions. Chapter 2 looks at how the exploration, development, and production of natural gas within North America is linked to the national pipeline grid. Chapter 3 examines the capability of the interstate natural gas pipeline network to link production areas to market areas, on the basis of capacity and usage levels along 10 corridors. The chapter also examines capacity expansions that have occurred since 1990 along each corridor and the potential impact of proposed new capacity. Chapter 4 discusses the last step in the transportation chain, that is, deliverability to the ultimate end user. Flow patterns into and out of each market region are discussed, as well as the movement of natural gas between States in each region. Chapter 5 examines how shippers reserve interstate pipeline capacity in the current transportation marketplace and how pipeline companies are handling the secondary market for short-term unused capacity. Four appendices provide supporting data and additional detail on the methodology used to estimate capacity. 32 figs., 15 tabs.

  7. 75 FR 72778 - Pipeline Safety: Technical Pipeline Safety Advisory Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... pipeline safety regulations to the remaining unregulated rural onshore hazardous liquid low-stress... Low-Stress Lines'' on June 22, 2010 (75 FR 35366). The meeting agenda will include the committee's...

  8. Improved third generation peristaltic crawler for removal of high-level waste plugs in United States department of energy Hanford site pipelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vazquez, Gabriela; Pribanic, Tomas

    2013-07-01

    There are approximately 56 million gallons (roughly 212,000 m³) of high level waste (HLW) at the U.S. Department of Energy (DOE) Hanford Site. It is scheduled that by the year 2040 the HLW will be completely transferred from the leaking single-shell tanks (SST) to secure double-shell tanks (DST) via a transfer pipeline system. Blockages have formed inside the pipes during transport because of the variety in composition and characteristics of the waste. These full and partial plugs delay waste transfers and require manual intervention to repair; they are therefore extremely expensive, consuming millions of dollars, and further threaten the environment. To successfully continue the transfer of waste through the pipelines, DOE site engineers need a technology that can accurately locate the blockages and unplug the pipelines. In this study, the proposed solution to remediate blockages formed in pipelines is the use of a peristaltic crawler: a pneumatically/hydraulically operated device that propels itself in a worm-like motion through sequential fluctuations of pressure in its air cavities. The crawler is also equipped with a high-pressure water nozzle used to clear blockages inside the pipelines. The crawler is now in its third generation. Previous generations showed limitations in durability, speed, and maneuverability. The latest improvements include an automated sequence that prevents kickback, a front-mounted inspection camera for visual feedback, and a thinner-walled outer bellow for improved maneuverability. Experimental tests were conducted to evaluate the improvements of the crawler relative to its predecessors using a pipeline test-bed assembly. Anchor force tests, unplugging tests, and fatigue testing for both the bellow and the rubber rims have yet to be conducted, and thus those results are not presented in this research. Experiments tested bellow force and response, cornering maneuverability, and straight-line navigational speed. The design concept and

  9. Virtual Instrumentation Corrosion Controller for Natural Gas Pipelines

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, J.; Agnihotri, G.; Deshpande, D. M.

    2012-12-01

    Corrosion is an electrochemical process. Corrosion in natural gas (methane) pipelines leads to leakages. Corrosion occurs when an anode and a cathode are connected through an electrolyte. The rate of corrosion in a metallic pipeline can be controlled by impressing a current on it, thereby making it act as the cathode of the corrosion cell. A technologically advanced and energy efficient corrosion controller is required to protect natural gas pipelines. The proposed virtual instrumentation (VI) based corrosion controller precisely controls external corrosion in underground metallic pipelines, enhances their life, and ensures safety. The design and development of a proportional-integral-derivative (PID) corrosion controller using VI (LabVIEW) are carried out. When deployed in the field, the designed controller maintains the pipe-to-soil potential (PSP) within the safe operating limit, without entering the over- or under-protection zones. The technique can be deployed horizontally to protect any metallic structure that needs corrosion protection, such as oil pipelines.
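
    A minimal sketch of the control idea, assuming a PID loop that drives the impressed current so the measured PSP tracks a setpoint (the widely used -0.85 V versus a Cu/CuSO4 reference electrode is taken as the target here). Gains, sampling interval, and I/O are illustrative placeholders, not the paper's LabVIEW design.

        # One PID step for impressed-current cathodic protection.
        def pid_step(state, measured_psp, setpoint=-0.85,
                     kp=5.0, ki=0.5, kd=0.1, dt=1.0):
            """state = (integral, prev_error) in volts; returns the new
            state and a current command (arbitrary units here)."""
            integral, prev_error = state
            error = setpoint - measured_psp
            integral += error * dt
            derivative = (error - prev_error) / dt
            output = kp * error + ki * integral + kd * derivative
            return (integral, error), output

        state = (0.0, 0.0)
        for psp in (-0.60, -0.70, -0.80, -0.84):  # simulated PSP readings [V]
            state, current_cmd = pid_step(state, psp)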

  10. Natural Gas Pipeline and System Expansions

    EIA Publications

    1997-01-01

    This special report examines recent expansions to the North American natural gas pipeline network and the nature and type of proposed pipeline projects announced or approved for construction during the next several years in the United States. It includes those projects in Canada and Mexico that tie in with U.S. markets or projects.

  11. Evaluation of Campus Pipeline, Spring 2002.

    ERIC Educational Resources Information Center

    Serban, Andreea M.; Fleming, Steve

    The Campus Pipeline at Santa Barbara City College (SBCC), California, is a portal whose purpose is to provide single web entry to relevant academic and institutional information for students and faculty. This fall 2002 evaluation of the Campus Pipeline aims to: (1) explore the degree of satisfaction of students and faculty; (2) determine which…

  12. Testing the School-to-Prison Pipeline

    ERIC Educational Resources Information Center

    Owens, Emily G.

    2017-01-01

    The School-to-Prison Pipeline is a social phenomenon where students become formally involved with the criminal justice system as a result of school policies that use law enforcement, rather than discipline, to address behavioral problems. A potentially important part of the School-to-Prison Pipeline is the use of sworn School Resource Officers…

  13. Problematizing the STEM Pipeline Metaphor: Is the STEM Pipeline Metaphor Serving Our Students and the STEM Workforce?

    ERIC Educational Resources Information Center

    Cannady, Matthew A.; Greenwald, Eric; Harris, Kimberly N.

    2014-01-01

    Researchers and policy makers often use the metaphor of an ever-narrowing pipeline to describe the trajectory to a science, technology, engineering or mathematics (STEM) degree or career. This study interrogates the appropriateness of the STEM pipeline as the dominant frame for understanding and making policies related to STEM career trajectories.…

  14. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    At the commissioning of a new high-pressure helium pipeline at Kennedy Space Center, participants cut the lines to helium-filled balloons. From left, they are Center Director Roy Bridges; Michael Butchko, president, SGS; Pierre Dufour, president and CEO, Air Liquide America Corporation; David Herst, director, Delta IV Launch Sites; Pamela Gillespie, executive administrator, office of Congressman Dave Weldon; and Col. Samuel Dick, representative of the 45th Space Wing. The nine-mile-long buried pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. It will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad. Others at the ceremony were Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS), and Ramon Lugo, acting executive director, JPMO.

  15. Commissioning of a new helium pipeline

    NASA Technical Reports Server (NTRS)

    2000-01-01

    At the commissioning of a new high-pressure helium pipeline at Kennedy Space Center, participants watch as helium-filled balloons take to the sky after their lines were cut. From left, they are Center Director Roy Bridges; Michael Butchko, president, SGS; Pierre Dufour, president and CEO, Air Liquide America Corporation; David Herst, director, Delta IV Launch Sites; Pamela Gillespie, executive administrator, office of Congressman Dave Weldon; and Col. Samuel Dick, representative of the 45th Space Wing. The nine-mile-long buried pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. It will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad. Others at the ceremony were Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS), and Ramon Lugo, acting executive director, JPMO.

  16. Color correction pipeline optimization for digital cameras

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Bruna, Arcangelo R.; Naccari, Filippo; Schettini, Raimondo

    2013-04-01

    The processing pipeline of a digital camera converts the RAW image acquired by the sensor to a representation of the original scene that should be as faithful as possible. There are mainly two modules responsible for the color-rendering accuracy of a digital camera: the former is the illuminant estimation and correction module, and the latter is the color matrix transformation aimed to adapt the color response of the sensor to a standard color space. These two modules together form what may be called the color correction pipeline. We design and test new color correction pipelines that exploit different illuminant estimation and correction algorithms that are tuned and automatically selected on the basis of the image content. Since the illuminant estimation is an ill-posed problem, illuminant correction is not error-free. An adaptive color matrix transformation module is optimized, taking into account the behavior of the first module in order to alleviate the amplification of color errors. The proposed pipelines are tested on a publicly available dataset of RAW images. Experimental results show that exploiting the cross-talks between the modules of the pipeline can lead to a higher color-rendition accuracy.
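
    To make the two-module structure concrete, here is a minimal sketch of such a color correction pipeline: gray-world illuminant estimation followed by a 3x3 color matrix transform. This is a generic textbook arrangement, not the authors' adaptive pipeline (their per-image tuning and module selection are omitted), and the matrix values are illustrative only.

      import numpy as np

      def gray_world_gains(raw):
          """Estimate per-channel gains assuming the scene average is achromatic."""
          means = raw.reshape(-1, 3).mean(axis=0)    # mean R, G, B over the image
          return means[1] / means                    # normalise so the green gain is 1.0

      def color_correct(raw, ccm):
          """Module 1: illuminant correction; module 2: 3x3 color matrix transform."""
          balanced = raw * gray_world_gains(raw)
          corrected = balanced @ ccm.T               # sensor space -> standard color space
          return np.clip(corrected, 0.0, 1.0)

      # Illustrative sensor-to-sRGB matrix; a real one comes from characterisation.
      CCM = np.array([[ 1.6, -0.4, -0.2],
                      [-0.3,  1.5, -0.2],
                      [-0.1, -0.5,  1.6]])

      raw = np.random.rand(4, 4, 3)                  # stand-in for a demosaiced RAW image
      print(color_correct(raw, CCM).shape)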

  17. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
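
    BDS is its own language, but the lazy-processing idea it relies on (skip any step whose output is already newer than its inputs, so a re-run after a failure resumes where it stopped) can be sketched in a few lines of Python. The sketch below illustrates the concept only; the step commands are placeholders, not BDS syntax.

      import os

      def stale(output, inputs):
          """Re-run a step only if its output is missing or older than any input."""
          if not os.path.exists(output):
              return True
          out_time = os.path.getmtime(output)
          return any(os.path.getmtime(p) > out_time for p in inputs)

      def run_step(inputs, output, command):
          if stale(output, inputs):
              if os.system(command) != 0:      # non-zero exit: stop the pipeline
                  raise RuntimeError("step failed: " + command)
          else:
              print("up to date, skipping:", output)

      # Example (placeholder commands; assumes the tools and input files exist):
      # run_step(["reads.fq"], "aln.sam", "bwa mem ref.fa reads.fq > aln.sam")
      # run_step(["aln.sam"], "calls.vcf", "call_variants aln.sam > calls.vcf")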

  18. BigDataScript: a scripting language for data pipelines

    PubMed Central

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778

  19. 30 CFR 250.1005 - Inspection requirements for DOI pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 CFR 250.1005, Mineral Resources, Minerals Management Service, Department of the Interior, Pipelines and Pipeline Rights-of-Way. Inspection requirements for DOI pipelines: (a) Pipeline routes shall be inspected at...

  20. Prospects for coal slurry pipelines in California

    NASA Technical Reports Server (NTRS)

    Lynch, J. F.

    1978-01-01

    The coal slurry pipeline segment of the transport industry is emerging in the United States. If accepted, it will play a vital role in meeting America's urgent energy requirements without public subsidy, tax relief, or federal grants. It is proven technology, ideally suited for transporting an abundant energy resource over thousands of miles to energy-short industrial centers, and at more than competitive costs. Briefly discussed are the following: (1) history of pipelines; (2) California market potential; (3) slurry technology; (4) environmental benefits; (5) market competition; and (6) a proposed pipeline.

  1. Influence of Anchoring on Burial Depth of Submarine Pipelines

    PubMed Central

    Zhuang, Yuan; Li, Yang; Su, Wei

    2016-01-01

    Since the beginning of the twenty-first century, there has been widespread construction of submarine oil-gas transmission pipelines due to an increase in offshore oil exploration. Vessel anchoring operations are causing more damage to submarine pipelines due to shipping transportation also increasing. Therefore, it is essential that the influence of anchoring on the required burial depth of submarine pipelines is determined. In this paper, mathematical models for ordinary anchoring and emergency anchoring have been established to derive an anchor impact energy equation for each condition. The required effective burial depth for submarine pipelines has then been calculated via an energy absorption equation for the protection layer covering the submarine pipelines. Finally, the results of the model calculation have been verified by accident case analysis, and the impact of the anchoring height, anchoring water depth and the anchor weight on the required burial depth of submarine pipelines has been further analyzed. PMID:27166952
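
    The energy balance underlying such a calculation can be sketched as follows: the anchor's kinetic energy at the seabed (here taken at terminal velocity, with buoyancy and drag included) must be absorbed by the protection layer. This is a hedged back-of-envelope sketch, not the paper's models, and every parameter value is illustrative.

      import math

      RHO_WATER = 1025.0   # sea water density, kg/m^3
      G = 9.81

      def terminal_velocity(mass, area, cd=1.0, rho_anchor=7850.0):
          """Terminal velocity of a dropped anchor, buoyancy and drag included."""
          submerged_weight = mass * G * (1.0 - RHO_WATER / rho_anchor)
          return math.sqrt(2.0 * submerged_weight / (RHO_WATER * cd * area))

      def required_cover(mass, area, absorb_per_m):
          """Cover thickness so the layer absorbs the anchor's kinetic energy.

          absorb_per_m: energy the protection layer absorbs per metre of
          penetration (J/m), a soil/rock-dump property assumed constant here.
          """
          v = terminal_velocity(mass, area)
          impact_energy = 0.5 * mass * v ** 2
          return impact_energy / absorb_per_m

      # Illustrative numbers: a 5-tonne anchor with 1.5 m^2 frontal area and a
      # protection layer absorbing 2 MJ per metre of penetration.
      print(round(required_cover(5000.0, 1.5, 2.0e6), 3), "m of cover")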

  2. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    PubMed

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
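
    The superpixel-plus-Random-Forest segmentation idea can be sketched with scikit-image and scikit-learn as below. The feature set (mean RGB per superpixel) and the toy training labels are assumptions for illustration; the published system uses richer features and curated training data.

      import numpy as np
      from skimage.segmentation import slic
      from sklearn.ensemble import RandomForestClassifier

      def superpixel_features(image, segments):
          """Mean RGB per superpixel: a deliberately minimal feature set."""
          labels = np.unique(segments)
          feats = np.array([image[segments == s].mean(axis=0) for s in labels])
          return labels, feats

      rng = np.random.default_rng(0)
      train_img = rng.random((64, 64, 3))            # stand-in for a tray image
      segments = slic(train_img, n_segments=50, start_label=0)
      _, X = superpixel_features(train_img, segments)
      y = (X[:, 1] > X[:, 0]).astype(int)            # toy labels: "greener than red" = plant

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

      # Segment a new image: classify each superpixel, then paint a pixel mask.
      test_img = rng.random((64, 64, 3))
      test_segments = slic(test_img, n_segments=50, start_label=0)
      test_labels, X_test = superpixel_features(test_img, test_segments)
      lut = np.zeros(test_labels.max() + 1, dtype=int)
      lut[test_labels] = clf.predict(X_test)         # superpixel label -> class
      mask = lut[test_segments]                      # per-pixel plant/background mask
      print("plant pixels:", int(mask.sum()))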

  3. SAMPL4 & DOCK3.7: lessons for automated docking procedures

    NASA Astrophysics Data System (ADS)

    Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.

    2014-03-01

    The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, and pose and affinity prediction with the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, but automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), while affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important; serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects are discussed.

  4. Natural disasters and the gas pipeline system.

    DOT National Transportation Integrated Search

    1996-11-01

    Episodic descriptions are provided of the effects of the Loma Prieta earthquake (1989) on the gas pipeline systems of Pacific Gas & Electric Company and the City of Palo Alto, and of the Northridge earthquake (1994) on Southern California Gas' pipeline...

  5. Development of a Free-Swimming Acoustic Tool for Liquid Pipeline Leak Detection Including Evaluation for Natural Gas Pipeline Applications

    DOT National Transportation Integrated Search

    2010-08-01

    Significant financial and environmental consequences often result from line leakage of oil product pipelines. Product can escape into the surrounding soil as even the smallest leak can lead to rupture of the pipeline. From a health perspective, water...

  6. Stress and Strain State Analysis of Defective Pipeline Portion

    NASA Astrophysics Data System (ADS)

    Burkov, P. V.; Burkova, S. P.; Knaub, S. A.

    2015-09-01

    The paper presents computer simulation results of the pipeline having defects in a welded joint. Autodesk Inventor software is used for simulation of the stress and strain state of the pipeline. Places of the possible failure and stress concentrators are predicted on the defective portion of the pipeline.

  7. An automated assay for the assessment of cardiac arrest in fish embryo.

    PubMed

    Puybareau, Elodie; Genest, Diane; Barbeau, Emilie; Léonard, Marc; Talbot, Hugues

    2017-02-01

    Studies on fish embryo models are widely developed in research. They are used in several research fields including drug discovery or environmental toxicology. In this article, we propose an entirely automated assay to detect cardiac arrest in Medaka (Oryzias latipes) based on image analysis. We propose a multi-scale pipeline based on mathematical morphology. Starting from video sequences of entire wells in 24-well plates, we focus on the embryo, detect its heart, and ascertain whether or not the heart is beating based on intensity variation analysis. Our image analysis pipeline only uses commonly available operators. It has a low computational cost, allowing analysis at the same rate as acquisition. From an initial dataset of 3192 videos, 660 were discarded as unusable (20.7%), 655 of them correctly so (99.25%) and only 5 incorrectly so (0.75%). The 2532 remaining videos were used for our test. On these, 45 errors were made, leading to a success rate of 98.23%. Copyright © 2016 Elsevier Ltd. All rights reserved.
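
    The final decision step, classifying a heart as beating or arrested from intensity variation, can be sketched as follows. The morphological detection of the embryo and heart is omitted, the heart region is assumed given as a mask, and the threshold is illustrative rather than the authors' calibrated value.

      import numpy as np

      def heart_is_beating(frames, roi, rel_threshold=0.01):
          """Classify beating vs. arrest from intensity variation in a heart ROI.

          frames: (T, H, W) grayscale video; roi: boolean (H, W) heart mask.
          A beating heart modulates the mean ROI intensity over time; a nearly
          constant signal is read as cardiac arrest.
          """
          signal = frames[:, roi].mean(axis=1)       # mean ROI intensity per frame
          variation = signal.std() / (signal.mean() + 1e-9)
          return variation > rel_threshold

      # Synthetic check: a 2 Hz pulsation versus a flat signal, 25 fps video.
      t = np.arange(100) / 25.0
      roi = np.zeros((32, 32), dtype=bool)
      roi[12:20, 12:20] = True
      flat = np.full((100, 32, 32), 100.0)
      beating = flat.copy()
      beating[:, roi] += 5.0 * np.sin(2 * np.pi * 2.0 * t)[:, None]
      print(heart_is_beating(beating, roi), heart_is_beating(flat, roi))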

  8. Sub-soil contamination due to oil spills in zones surrounding oil pipeline-pump stations and oil pipeline right-of-ways in Southwest-Mexico.

    PubMed

    Iturbe, Rosario; Flores, Carlos; Castro, Alejandrina; Torres, Luis G

    2007-10-01

    Oil spills from pipelines are a frequent problem in Mexico. Petroleos Mexicanos (PEMEX), attentive to its environmental agenda, has been developing inspection and correction plans for the zones around oil pipeline pumping stations and pipeline rights-of-way. These stations are located at regular intervals along the pipelines. In this study, two sections of an oil pipeline and two pumping-station zones are characterized in terms of the presence of Total Petroleum Hydrocarbons (TPHs) and Polycyclic Aromatic Hydrocarbons (PAHs). The study comprises sampling of the areas; delimitation of the vertical and horizontal extent of contamination; analysis of the sampled soils for TPH content and, in some cases, the 16 PAHs considered priority pollutants by USEPA; calculation of the contaminated areas and volumes (according to Mexican legislation, specifically NOM-EM-138-ECOL-2002); and, finally, a proposal of the remediation techniques best suited to the contamination levels and the localization of the contaminants.

  9. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    PubMed

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, demand for the evaluation of fMRI processing pipelines and the validation of fMRI analysis results is increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. To overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability, and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction (classification) accuracy and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.

  10. Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline

    NASA Astrophysics Data System (ADS)

    Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.

    2017-05-01

    In the oil and gas industry, pipelines are a major component of the transmission and distribution of oil and gas, and they often pass through a wide variety of environmental conditions. A pipeline should therefore operate safely so that it does not harm the surrounding environment. Corrosion remains a major cause of failure in production-facility equipment; in pipeline systems, corrosion can cause wall failures and damage to the pipeline, so periodic inspections of the pipeline system are required. Every production facility carries a level of risk for damage, determined by the likelihood and the consequences of that damage. The purpose of this research is to analyze the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection per API 581, which considers both the likelihood of failure and the consequences of the failure of a component; the result is then used to plan subsequent inspections. Nine pipeline components were examined, including straight inlet pipes, connection tees, and straight outlet pipes. Their risk levels, presented in a risk matrix, were all assessed as medium. The failure mechanism considered is thinning. From the calculated corrosion rates, the remaining age of each pipeline component can be obtained, so the remaining lifetimes are known; the calculated remaining lifetimes vary from component to component. The next step is planning the inspection of the pipeline components by external NDT methods.
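
    The thinning arithmetic behind such remaining-life estimates can be sketched as below: a corrosion rate from two wall-thickness inspections, and the years remaining until the minimum required thickness is reached. The numbers are illustrative, not the paper's data, and API 581 adds likelihood and consequence factors not shown here.

      def corrosion_rate(t_prev, t_now, years):
          """Wall-loss rate (mm/yr) between two inspections."""
          return (t_prev - t_now) / years

      def remaining_life(t_now, t_min, rate):
          """Years until the wall reaches its minimum required thickness."""
          if rate <= 0:
              return float("inf")                 # no measurable thinning
          return (t_now - t_min) / rate

      # Illustrative component: 12.7 mm wall measured in 2007, 11.9 mm in 2017,
      # 8.0 mm minimum required thickness per the design code.
      rate = corrosion_rate(12.7, 11.9, 10.0)     # -> 0.08 mm/yr
      print("rate: %.3f mm/yr, remaining life: %.1f yr"
            % (rate, remaining_life(11.9, 8.0, rate)))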

  11. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
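
    The event-driven, rule-interpreting pattern described above can be sketched as follows; the rule format and names are invented for illustration and do not reflect MATIS internals.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class Rule:
          event_type: str                         # event this rule listens for
          condition: Callable[[dict], bool]       # project-specific predicate
          action: Callable[[dict], None]          # plug-in program to invoke

      class WorkflowManager:
          def __init__(self):
              self.rules = []

          def add_rule(self, rule):
              self.rules.append(rule)

          def dispatch(self, event):
              """Run every matching rule's action; a real system adds logging,
              restart bookkeeping, and distribution across servers."""
              for rule in self.rules:
                  if rule.event_type == event["type"] and rule.condition(event):
                      rule.action(event)

      mgr = WorkflowManager()
      mgr.add_rule(Rule("file_arrived",
                        lambda e: e["name"].endswith(".dat"),
                        lambda e: print("launching product generation for", e["name"])))
      mgr.dispatch({"type": "file_arrived", "name": "instrument_042.dat"})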

  12. ORAC-DR: Astronomy data reduction pipeline

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie; Cavanagh, Brad; Currie, Malcolm J.; Gibb, Andy

    2013-10-01

    ORAC-DR is a generic data reduction pipeline infrastructure; it includes specific data processing recipes for a number of instruments. It is used at the James Clerk Maxwell Telescope, United Kingdom Infrared Telescope, AAT, and LCOGT. This pipeline runs at the JCMT Science Archive hosted by CADC to generate near-publication quality data products; the code has been in use since 1998.

  13. Simplified Technique for Predicting Offshore Pipeline Expansion

    NASA Astrophysics Data System (ADS)

    Seo, J. H.; Kim, D. K.; Choi, H. S.; Yu, S. Y.; Park, K. S.

    2018-06-01

    In this study, we propose a method for estimating the amount of expansion that occurs in subsea pipelines, which could be applied in the design of robust structures that transport oil and gas from offshore wells. We begin with a literature review and general discussion of existing estimation methods and terminologies with respect to subsea pipelines. Due to the effects of high pressure and high temperature, the production of fluid from offshore wells is typically caused by physical deformation of subsea structures, e.g., expansion and contraction during the transportation process. In severe cases, vertical and lateral buckling occurs, which causes a significant negative impact on structural safety, and which is related to on-bottom stability, free-span, structural collapse, and many other factors. In addition, these factors may affect the production rate with respect to flow assurance, wax, and hydration, to name a few. In this study, we developed a simple and efficient method for generating a reliable pipe expansion design in the early stage, which can lead to savings in both cost and computation time. As such, in this paper, we propose an applicable diagram, which we call the standard dimensionless ratio (SDR) versus virtual anchor length (L A ) diagram, that utilizes an efficient procedure for estimating subsea pipeline expansion based on applied reliable scenarios. With this user guideline, offshore pipeline structural designers can reliably determine the amount of subsea pipeline expansion and the obtained results will also be useful for the installation, design, and maintenance of the subsea pipeline.

  14. Consensus between Pipelines in Structural Brain Networks

    PubMed Central

    Parker, Christopher S.; Deligianni, Fani; Cardoso, M. Jorge; Daga, Pankaj; Modat, Marc; Dayan, Michael; Clark, Chris A.

    2014-01-01

    Structural brain networks may be reconstructed from diffusion MRI tractography data and have great potential to further our understanding of the topological organisation of brain structure in health and disease. Network reconstruction is complex and involves a series of processing methods including anatomical parcellation, registration, fiber orientation estimation and whole-brain fiber tractography. Methodological choices at each stage can affect the anatomical accuracy and graph theoretical properties of the reconstructed networks, meaning applying different combinations in a network reconstruction pipeline may produce substantially different networks. Furthermore, the choice of which connections are considered important is unclear. In this study, we assessed the similarity between structural networks obtained using two independent state-of-the-art reconstruction pipelines. We aimed to quantify network similarity and identify the core connections emerging most robustly in both pipelines. Similarity of network connections was compared between pipelines employing different atlases by merging parcels to a common and equivalent node scale. We found a high agreement between the networks across a range of fiber density thresholds. In addition, we identified a robust core of highly connected regions coinciding with a peak in similarity across network density thresholds, and replicated these results with atlases at different node scales. The binary network properties of these core connections were similar between pipelines but showed some differences in atlases across node scales. This study demonstrates the utility of applying multiple structural network reconstruction pipelines to diffusion data in order to identify the most important connections for further study. PMID:25356977
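
    One simple way to quantify between-pipeline agreement of this kind is to binarise each connectivity matrix at a series of density thresholds and compute the Jaccard overlap of the resulting edge sets, as sketched below. This is an illustration of the general approach, not the paper's exact statistic.

      import numpy as np

      def binarize_at_density(W, density):
          """Keep the strongest `density` fraction of off-diagonal edges."""
          iu = np.triu_indices_from(W, k=1)
          weights = W[iu]
          k = max(1, int(density * weights.size))
          thresh = np.sort(weights)[-k]
          return weights >= thresh                 # boolean upper-triangle edge vector

      def jaccard(a, b):
          return (a & b).sum() / float((a | b).sum())

      rng = np.random.default_rng(1)
      n = 20
      base = rng.random((n, n)); base = (base + base.T) / 2   # shared "anatomy"
      A = base + 0.05 * rng.random((n, n))                    # pipeline 1 network
      B = base + 0.05 * rng.random((n, n))                    # pipeline 2 network

      for d in (0.1, 0.2, 0.3):
          print("density %.1f: Jaccard %.2f"
                % (d, jaccard(binarize_at_density(A, d), binarize_at_density(B, d))))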

  15. Expansion of the U.S. Natural Gas Pipeline Network

    EIA Publications

    2009-01-01

    Additions in 2008 and Projects through 2011. This report examines new natural gas pipeline capacity added to the U.S. natural gas pipeline system during 2008. In addition, it discusses and analyzes proposed natural gas pipeline projects that may be developed between 2009 and 2011, and the market factors supporting these initiatives.

  16. 75 FR 32836 - Pipeline Safety: Workshop on Public Awareness Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... American Public Gas Association Association of Oil Pipelines American Petroleum Institute Interstate... the pipeline industry). Hazardous Liquid Gas Transmission/Gathering Natural Gas Distribution (10...

  17. Automation of large scale transient protein expression in mammalian cells

    PubMed Central

    Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Yvonne Jones, E.; Radu Aricescu, A.

    2011-01-01

    Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making man-power a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those produced manually, with the added benefit of reproducibility, regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins, and often require rapid characterisation of a series of modified constructs. This automated method for large scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074

  18. Bio-oil transport by pipeline: a techno-economic assessment.

    PubMed

    Pootakham, Thanyakarn; Kumar, Amit

    2010-09-01

    Bio-oil, produced by fast pyrolysis of biomass, has a high energy density compared to 'as received' biomass. The study assesses and compares the cost of transporting bio-oil ($/liter) by pipeline and by truck. The fixed and variable cost components of bio-oil transport at a pipeline capacity of 560 m³/day over a distance of 100 km are $0.0423/m³ and $0.1201/m³/km, respectively. Pipeline transport of bio-oil costs less than transport by liquid tank truck (load capacity 30 m³) and super B-train trailer (load capacity 60 m³) above pipeline capacities of 1,000 and 1,700 m³/day, respectively. When the transport distance exceeds 100 km, the bio-oil must be heated at booster stations. When transporting bio-oil by pipeline over a distance of 400 km, minimum pipeline capacities of 1,150 and 2,000 m³/day are required to compete economically with liquid tank trucks and super B-train tank trailers, respectively. Copyright 2010 Elsevier Ltd. All rights reserved.
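
    Using the cost structure quoted in the abstract (a fixed component plus a distance-proportional component per cubic metre), the delivered transport cost for the base case works out as below. The capacity dependence that drives the study's break-even results is not modelled in this sketch.

      FIXED = 0.0423     # $/m^3 at 560 m^3/day capacity (figure from the abstract)
      VARIABLE = 0.1201  # $/m^3/km (figure from the abstract)

      def pipeline_cost_per_liter(distance_km):
          """Delivered transport cost in $/liter for the abstract's base case."""
          per_m3 = FIXED + VARIABLE * distance_km
          return per_m3 / 1000.0                   # 1 m^3 = 1000 liters

      print("100 km: %.4f $/liter" % pipeline_cost_per_liter(100))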

  19. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  20. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Kepler Science Operations Center Pipeline Framework

    NASA Technical Reports Server (NTRS)

    Klaus, Todd C.; McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Middour, Christopher; Caldwell, Douglas A.; Jenkins, Jon M.

    2010-01-01

    The Kepler mission is designed to continuously monitor up to 170,000 stars at a 30 minute cadence for 3.5 years searching for Earth-size planets. The data are processed at the Science Operations Center (SOC) at NASA Ames Research Center. Because of the large volume of data and the memory and CPU-intensive nature of the analysis, significant computing hardware is required. We have developed generic pipeline framework software that is used to distribute and synchronize the processing across a cluster of CPUs and to manage the resulting products. The framework is written in Java and is therefore platform-independent, and scales from a single, standalone workstation (for development and research on small data sets) to a full cluster of homogeneous or heterogeneous hardware with minimal configuration changes. A plug-in architecture provides customized control of the unit of work without the need to modify the framework itself. Distributed transaction services provide for atomic storage of pipeline products for a unit of work across a relational database and the custom Kepler DB. Generic parameter management and data accountability services are provided to record the parameter values, software versions, and other meta-data used for each pipeline execution. A graphical console allows for the configuration, execution, and monitoring of pipelines. An alert and metrics subsystem is used to monitor the health and performance of the pipeline. The framework was developed for the Kepler project based on Kepler requirements, but the framework itself is generic and could be used for a variety of applications where these features are needed.

  2. 76 FR 35130 - Pipeline Safety: Control Room Management/Human Factors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-16

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts...: Control Room Management/Human Factors AGENCY: Pipeline and Hazardous Materials Safety Administration... safety standards, risk assessments, and safety policies for natural gas pipelines and for hazardous...

  3. 76 FR 5494 - Pipeline Safety: Mechanical Fitting Failure Reporting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-01

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Safety: Mechanical Fitting Failure Reporting Requirements AGENCY: Pipeline and Hazardous Materials Safety... tightening. A widely accepted industry guidance document, Gas Pipeline Technical Committee (GPTC) Guide, does...

  4. A Conceptual Model of the Air Force Logistics Pipeline

    DTIC Science & Technology

    1989-09-01

    [Table-of-contents excerpts] The Contracting Process; Industrial Capacity; The Disposal Pipeline Subsystem; Collective Pipeline Models; Explosion of "Industry," Acquisition and Production Process; First Level Explosion of "Attrition," the Disposal Process. ...Terminology and Phrases, a publication of The American Production and Inventory Control Society (APICS). This dictionary defines "pipeline stock" as the

  5. A distributed pipeline for DIDSON data processing

    USGS Publications Warehouse

    Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas

    2018-01-01

    Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.
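
    A single-node sketch of the motion-extraction stage (frame differencing and thresholding), with the per-file parallelism the Hadoop pipeline provides approximated by a process pool, might look as follows. The synthetic frames and threshold are stand-ins; this is not the authors' code.

      import numpy as np
      from multiprocessing import Pool

      def extract_motion(frames, threshold=10.0):
          """Per-pixel motion masks from consecutive-frame differences."""
          diffs = np.abs(np.diff(frames.astype(float), axis=0))
          return diffs > threshold                 # (T-1, H, W) boolean masks

      def motion_fraction(frames):
          """Feature for downstream classification: fraction of moving pixels."""
          return float(extract_motion(frames).mean())

      def process_file(seed):
          """Stand-in for 'read one DIDSON file, return its features'."""
          rng = np.random.default_rng(seed)
          frames = rng.integers(0, 255, size=(50, 48, 96)).astype(np.uint8)
          return motion_fraction(frames)

      if __name__ == "__main__":
          with Pool(4) as pool:                    # files are independent -> parallel
              print(pool.map(process_file, range(8)))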

  6. Optimal Energy Consumption Analysis of Natural Gas Pipeline

    PubMed Central

    Liu, Enbin; Li, Changjun; Yang, Yi

    2014-01-01

    There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported using different boot programs and import pressures, combined with temperature control parameters, and different transport methods have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are determined empirically by dispatchers, resulting in energy consumption higher than energy-reduction policies call for. Therefore, based on a full understanding of the actual needs of pipeline companies, we introduce production unit consumption indicators to establish an objective function for lowering energy consumption. Solving the model with a dynamic programming method and purpose-built calculation software keeps the solution process quick and efficient. Using the established optimization method, we analyzed the energy savings for the XQ gas pipeline. By optimizing the boot program, the import station pressure, and the temperature parameters, we achieved the optimal energy consumption. Comparison with the measured energy consumption shows that the pipeline has the potential to reduce energy consumption by 11 to 16 percent. PMID:24955410
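
    The dynamic-programming idea can be sketched as below: discretise each station's discharge pressure and keep, for every pressure level, the minimum cumulative energy over all choices at the previous station. The cost function here is a deliberate toy; a real model would include pipeline hydraulics and temperature.

      import math

      PRESSURES = [6.0, 6.5, 7.0, 7.5]   # candidate discharge pressures, MPa
      N_STATIONS = 5
      DROP = 1.2                          # friction pressure loss between stations, MPa

      def energy(p_in, p_out):
          """Compression energy grows with the pressure ratio (placeholder model)."""
          if p_out <= p_in:
              return 0.0
          return 100.0 * math.log(p_out / p_in)

      # Stage-by-stage DP: best[p] = minimum energy to leave a station at pressure p.
      best = {p: energy(5.0, p) for p in PRESSURES}        # 5.0 MPa pipeline inlet
      for _ in range(N_STATIONS - 1):
          best = {p_out: min(best[p_prev] + energy(p_prev - DROP, p_out)
                             for p_prev in PRESSURES)
                  for p_out in PRESSURES}

      print("minimum total energy:", round(min(best.values()), 1))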

  7. Amateur Image Pipeline Processing using Python plus PyRAF

    NASA Astrophysics Data System (ADS)

    Green, Wayne

    2012-05-01

    A template pipeline spanning observation planning to publishing is offered as a basis for establishing a long-term observing program. This paper introduces the technical details of a complete pipeline-processing environment using Python, PyRAF and a few other languages. The data reduction pipeline encapsulates all policy, procedures and processing decisions within an accountable, auditable framework; it quickly handles the heavy lifting of image processing and also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.

  8. 76 FR 53086 - Pipeline Safety: Safety of Gas Transmission Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-25

    ... and external corrosion (subpart I of 49 CFR part 192). Pressure tests of new pipelines (subpart J of..., integrate and validate data (e.g., review of mill inspection reports, hydrostatic tests reports, pipe leaks... chemical properties, mill inspection reports, hydrostatic tests reports, coating type and condition, pipe...

  9. Automated reduction of sub-millimetre single-dish heterodyne data from the James Clerk Maxwell Telescope using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Currie, Malcolm J.; Tilanus, Remo P. J.; Cavanagh, Brad; Berry, David S.; Leech, Jamie; Rizzi, Luca

    2015-10-01

    With the advent of modern multidetector heterodyne instruments that can result in observations generating thousands of spectra per minute it is no longer feasible to reduce these data as individual spectra. We describe the automated data reduction procedure used to generate baselined data cubes from heterodyne data obtained at the James Clerk Maxwell Telescope (JCMT). The system can automatically detect baseline regions in spectra and automatically determine regridding parameters, all without input from a user. Additionally, it can detect and remove spectra suffering from transient interference effects or anomalous baselines. The pipeline is written as a set of recipes using the ORAC-DR pipeline environment with the algorithmic code using Starlink software packages and infrastructure. The algorithms presented here can be applied to other heterodyne array instruments and have been applied to data from historical JCMT heterodyne instrumentation.
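
    The automatic baseline detection described above can be sketched with iterative sigma-clipping: mask channels that deviate from a low-order fit (the spectral lines), refit on the remaining channels, and subtract. This is a generic illustration of the technique, not the ORAC-DR recipe itself.

      import numpy as np

      def subtract_baseline(spectrum, order=1, clip=3.0, iterations=3):
          """Fit a baseline to line-free channels found by iterative sigma-clipping."""
          x = np.arange(spectrum.size)
          mask = np.ones(spectrum.size, dtype=bool)      # True = baseline channel
          for _ in range(iterations):
              coeffs = np.polyfit(x[mask], spectrum[mask], order)
              resid = spectrum - np.polyval(coeffs, x)
              sigma = resid[mask].std()
              mask = np.abs(resid) < clip * sigma        # drop channels near lines
          return spectrum - np.polyval(coeffs, x), mask

      # Synthetic spectrum: sloped baseline + Gaussian line + noise.
      rng = np.random.default_rng(2)
      x = np.arange(512)
      spec = (0.01 * x + 5.0 * np.exp(-0.5 * ((x - 200) / 8.0) ** 2)
              + rng.normal(0, 0.2, 512))
      cleaned, baseline_mask = subtract_baseline(spec)
      print("channels used for baseline:", int(baseline_mask.sum()))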

  10. FESetup: Automating Setup for Alchemical Free Energy Simulations.

    PubMed

    Loeffler, Hannes H; Michel, Julien; Woods, Christopher

    2015-12-28

    FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.
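
    The maximum common substructure step can be illustrated with RDKit (a different toolkit from the one inside FESetup): the MCS between two congeneric ligands yields the atom-to-atom mapping needed for a single-topology description.

      from rdkit import Chem
      from rdkit.Chem import rdFMCS

      # Two congeneric ligands: benzene -> toluene.
      mol_a = Chem.MolFromSmiles("c1ccccc1")
      mol_b = Chem.MolFromSmiles("Cc1ccccc1")

      mcs = rdFMCS.FindMCS([mol_a, mol_b])
      core = Chem.MolFromSmarts(mcs.smartsString)

      # Matching atom indices in each molecule give the single-topology mapping.
      map_a = mol_a.GetSubstructMatch(core)
      map_b = mol_b.GetSubstructMatch(core)
      print("common atoms:", mcs.numAtoms)
      print("atom mapping:", list(zip(map_a, map_b)))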

  11. Statistical method to compare massive parallel sequencing pipelines.

    PubMed

    Elsensohn, M H; Leblay, N; Dimassi, S; Campan-Fournier, A; Labalme, A; Roucher-Boulez, F; Sanlaville, D; Lesca, G; Bardel, C; Roy, P

    2017-03-01

    Today, sequencing is frequently carried out by massive parallel sequencing (MPS), which drastically cuts sequencing time and expense. Nevertheless, Sanger sequencing remains the main validation method to confirm the presence of variants. The analysis of MPS data involves several bioinformatic tools, academic or commercial. We present here a statistical method to compare MPS pipelines and test it in a comparison between an academic pipeline (BWA-GATK) and a commercial one (TMAP-NextGENe®), with and without reference to a gold standard (here, Sanger sequencing), on a panel of 41 genes in 43 epileptic patients. The method uses the number of variants to fit log-linear models for pairwise agreements between pipelines. To assess the heterogeneity of the margins and the odds ratios of agreement, four log-linear models were used: a full model, a homogeneous-margin model, a model with a single odds ratio for all patients, and a model with a single intercept. Then a log-linear mixed model was fitted, treating the biological variability as a random effect. Among the 390,339 base pairs sequenced, TMAP-NextGENe® and BWA-GATK found, on average, 2253.49 and 1857.14 variants (single nucleotide variants and indels), respectively. Against the gold standard, the pipelines had similar sensitivities (63.47% vs. 63.42%) and close but significantly different specificities (99.57% vs. 99.65%; p < 0.001). Same-trend results were obtained when only single nucleotide variants were considered (99.98% specificity and 76.81% sensitivity for both pipelines). The method thus allows pipeline comparison and selection, and is generalizable to all types of MPS data and all pipelines.
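
    The per-pipeline contingency counts against a Sanger gold standard that feed such a comparison reduce to simple set arithmetic, sketched below with toy data; the log-linear modelling itself is not reproduced here.

      def sensitivity_specificity(calls, truth, positions):
          """Compare one pipeline's variant calls to gold-standard truth.

          calls, truth: sets of positions with a called / confirmed variant.
          positions: all base pairs assessed.
          """
          tp = len(calls & truth)
          fn = len(truth - calls)
          fp = len(calls - truth)
          tn = len(positions) - tp - fn - fp
          return tp / (tp + fn), tn / (tn + fp)

      positions = set(range(1000))
      truth = {10, 50, 300, 301, 700}
      pipeline_a = {10, 50, 300, 999}              # misses two, one false positive
      sens, spec = sensitivity_specificity(pipeline_a, truth, positions)
      print("sensitivity %.2f  specificity %.4f" % (sens, spec))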

  12. SeqTrim: a high-throughput pipeline for pre-processing any type of sequence read

    PubMed Central

    2010-01-01

    Background: High-throughput automated sequencing has enabled an exponential growth rate of sequencing data. This requires increasing sequence quality and reliability in order to avoid database contamination with artefactual sequences. The arrival of pyrosequencing exacerbates this problem and necessitates customisable pre-processing algorithms. Results: SeqTrim has been implemented both as a Web and as a standalone command line application. Already-published and newly-designed algorithms have been included to identify sequence inserts, to remove low quality, vector, adaptor, low complexity and contaminant sequences, and to detect chimeric reads. The availability of several input and output formats allows its inclusion in sequence processing workflows. Due to its specific algorithms, SeqTrim outperforms other pre-processors implemented as Web services or standalone applications. It performs equally well with sequences from EST libraries, SSH libraries, genomic DNA libraries and pyrosequencing reads, and does not lead to over-trimming. Conclusions: SeqTrim is an efficient pipeline designed for pre-processing of any type of sequence read, including next-generation sequencing. It is easily configurable and provides a friendly interface that allows users to know what happened with sequences at every pre-processing stage, and to verify pre-processing of an individual sequence if desired. The recommended pipeline reveals more information about each sequence than previously described pre-processors and can discard more sequencing or experimental artefacts. PMID:20089148

  13. STEREO TRansiting Exoplanet and Stellar Survey (STRESS) - I. Introduction and data pipeline

    NASA Astrophysics Data System (ADS)

    Sangaralingam, Vinothini; Stevens, Ian R.

    2011-12-01

    The Solar TErrestrial RElations Observatory (STEREO) is a system of two identical spacecraft in heliocentric Earth orbit. We use the two heliospheric imagers (HI), wide-angle imagers with multibaffle systems, to perform high-precision stellar photometry in order to search for exoplanetary transits and to study variable stars. The cadence (40 min for HI-1 and 2 h for HI-2), high precision, wide magnitude range (R mag 4-12) and broad sky coverage (nearly 20 per cent of the sky for HI-1A alone, and 60 per cent of the sky in the zodiacal region for all instruments combined) of these imagers place them in a parameter space left largely unexplored by other current projects. In this paper, we describe the semi-automated pipeline devised for reduction of the data, some of the interesting characteristics of the data obtained, and the data-analysis methods used, along with some early results.

  14. 75 FR 56972 - Pipeline Safety: Control Room Management/Human Factors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Factors AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice of...: Background There are roughly 170,000 miles of hazardous liquid pipelines, 295,000 miles of gas transmission...

  15. TESS Data Processing and Quick-look Pipeline

    NASA Astrophysics Data System (ADS)

    Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office

    2018-01-01

    We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.

  16. Providing Situational Awareness for Pipeline Control Operations

    NASA Astrophysics Data System (ADS)

    Butts, Jonathan; Kleinhans, Hugo; Chandia, Rodrigo; Papa, Mauricio; Shenoi, Sujeet

    A SCADA system for a single 3,000-mile-long strand of oil or gas pipeline may employ several thousand field devices to measure process parameters and operate equipment. Because of the vital tasks performed by these sensors and actuators, pipeline operators need accurate and timely information about their status and integrity. This paper describes a realtime scanner that provides situational awareness about SCADA devices and control operations. The scanner, with the assistance of lightweight, distributed sensors, analyzes SCADA network traffic, verifies the operational status and integrity of field devices, and identifies anomalous activity. Experimental results obtained using real pipeline control traffic demonstrate the utility of the scanner in industrial settings.
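
    One check such a scanner might perform is flagging SCADA messages whose device/function pair was never observed during a training period. The message format and whitelist below are invented for illustration; the actual scanner verifies device status and integrity in ways not shown here.

      # Hypothetical message format: source device, function, target register.
      WHITELIST = {
          ("rtu-17", "read_holding_registers"),
          ("rtu-17", "write_single_register"),
          ("plc-03", "read_holding_registers"),
      }

      def scan(messages):
          """Yield messages whose device/function pair was never seen in training."""
          for msg in messages:
              if (msg["device"], msg["function"]) not in WHITELIST:
                  yield msg

      traffic = [
          {"device": "rtu-17", "function": "read_holding_registers", "register": 40001},
          {"device": "plc-03", "function": "write_single_register", "register": 40010},
      ]
      for alert in scan(traffic):
          print("anomalous SCADA operation:", alert)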

  17. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  18. Pipeline welding goes mechanized

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beeson, R.

    1999-11-01

    Spread four has bugs in the cornfield--but not to worry. The bug referred to here is a mechanized welding bug, specifically a single-welding-head, computer-aided gas metal arc welding (GMAW) system from CRC-Evans Automatic Welding powered by a Miller Electric XMT® 304 inverter-based welding machine. The bug operator and owner of 32 inverters is Welded Construction, L.P., of Perrysburgh, Ohio. Spread four is a 147-mile stretch of the Alliance Pipeline system (Alliance) cutting through the cornfields of northeast Iowa. While used successfully in Canada and Europe for onshore and offshore pipeline construction for 30 years, this is the first large-scale use of mechanized welding in the US on a cross-country pipeline. On longer, larger-diameter and thicker-wall pipe projects--the Alliance mainline has 1,844 miles of pipe, most of it 36-in. diameter with a 0.622-in. wall thickness--mechanized GMAW offers better productivity than manual shielded metal arc welding (SMAW). In addition, high-strength steels, such as the API 5L Grade X70 pipe used on the Alliance, benefit from the low-hydrogen content of certain solid and tubular wire electrodes.

  19. The inverse electroencephalography pipeline

    NASA Astrophysics Data System (ADS)

    Weinstein, David Michael

    The inverse electroencephalography (EEG) problem is defined as determining which regions of the brain are active based on remote measurements recorded with scalp EEG electrodes. An accurate solution to this problem would benefit both fundamental neuroscience research and clinical neuroscience applications. However, constructing accurate patient-specific inverse EEG solutions requires complex modeling, simulation, and visualization algorithms, and to date only a few systems have been developed that provide such capabilities. In this dissertation, a computational system for generating and investigating patient-specific inverse EEG solutions is introduced, and the requirements for each stage of this Inverse EEG Pipeline are defined and discussed. While the requirements of many of the stages are satisfied with existing algorithms, others have motivated research into novel modeling and simulation methods. The principal technical results of this work include novel surface-based volume modeling techniques, an efficient construction for the EEG lead field, and the Open Source release of the Inverse EEG Pipeline software for use by the bioelectric field research community. In this work, the Inverse EEG Pipeline is applied to three research problems in neurology: comparing focal and distributed source imaging algorithms; separating measurements into independent activation components for multifocal epilepsy; and localizing the cortical activity that produces the P300 effect in schizophrenia.
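
    The final numerical step of a distributed inverse solution can be sketched as a Tikhonov-regularised minimum-norm estimate from a lead-field matrix, as below. The hard parts the dissertation addresses, building a patient-specific lead field from surface-based volume models, are replaced here by a random stand-in.

      import numpy as np

      def minimum_norm(L, y, lam=1e-2):
          """Distributed inverse solution x minimising |y - Lx|^2 + lam*|x|^2.

          L: (n_electrodes, n_sources) lead field; y: (n_electrodes,) scalp EEG.
          """
          n = L.shape[0]
          return L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n), y)

      rng = np.random.default_rng(3)
      L = rng.normal(size=(32, 500))        # stand-in for a patient-specific lead field
      x_true = np.zeros(500)
      x_true[123] = 1.0                     # one active cortical source
      y = L @ x_true + rng.normal(0, 0.01, 32)

      x_hat = minimum_norm(L, y)
      print("strongest estimated source:", int(np.argmax(np.abs(x_hat))))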

  20. Identification of missing variants by combining multiple analytic pipelines.

    PubMed

    Ren, Yingxue; Reddy, Joseph S; Pottier, Cyril; Sarangi, Vivekananda; Tian, Shulan; Sinnwell, Jason P; McDonnell, Shannon K; Biernacka, Joanna M; Carrasquillo, Minerva M; Ross, Owen A; Ertekin-Taner, Nilüfer; Rademakers, Rosa; Hudson, Matthew; Mainzer, Liudmila Sergeevna; Asmann, Yan W

    2018-04-16

    After decades of identifying risk factors using array-based genome-wide association studies (GWAS), genetic research of complex diseases has shifted to sequencing-based rare variant discovery. This requires large sample sizes for statistical power and has raised questions about whether current variant calling practices are adequate for large cohorts. It is well known that there are discrepancies between variants called by different pipelines, and that using a single pipeline always misses true variants exclusively identifiable by other pipelines. Nonetheless, it is common practice today to call variants with one pipeline due to computational cost, assuming that false negative calls are a small percentage of the total. We analyzed 10,000 exomes from the Alzheimer's Disease Sequencing Project (ADSP) using multiple analytic pipelines consisting of different read aligners and variant calling strategies. We compared variants identified by using two aligners in 50, 100, 200, 500, 1000, and 1952 samples; and compared variants identified by adding single-sample genotyping to the default multi-sample joint genotyping in 50, 100, 500, 2000, 5000 and 10,000 samples. We found that using a single pipeline missed increasing numbers of high-quality variants as sample sizes grew. By combining two read aligners and two variant calling strategies, we rescued 30% of pass-QC variants at a sample size of 2000, and 56% at 10,000 samples. The rescued variants had higher proportions of low-frequency (minor allele frequency [MAF] 1-5%) and rare (MAF < 1%) variants, which are the very type of variants of interest. In 660 Alzheimer's disease cases with earlier onset ages of ≤65, 4 out of 13 (31%) previously published rare pathogenic and protective mutations in the APP, PSEN1, and PSEN2 genes were undetected by the default one-pipeline approach but recovered by the multi-pipeline approach. Identification of the complete variant set from sequencing data is the prerequisite of genetic
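
    The combination step reduces to set arithmetic over the call sets produced by each aligner/genotyping combination, sketched below with toy variant keys; real input would be parsed from VCF files.

      def rescued_fraction(default_calls, all_pipeline_calls):
          """Fraction of the combined call set missed by the default pipeline alone."""
          union = set().union(*all_pipeline_calls)
          return len(union - default_calls) / len(union)

      # Toy call sets keyed by (chrom, pos, alt); aligner/caller names are invented.
      bwa_joint   = {("1", 100, "A"), ("1", 250, "T"), ("2", 40, "G")}
      bwa_single  = {("1", 100, "A"), ("1", 250, "T"), ("3", 77, "C")}
      alt_joint   = {("1", 100, "A"), ("2", 40, "G"), ("4", 12, "T")}
      alt_single  = {("1", 100, "A"), ("4", 12, "T"), ("5", 9, "A")}

      pipelines = [bwa_joint, bwa_single, alt_joint, alt_single]
      print("rescued: %.0f%%" % (100 * rescued_fraction(bwa_joint, pipelines)))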

  1. Adaptability in Crisis Management: The Role of Organizational Structure

    DTIC Science & Technology

    2013-06-01

    individuals to make optimal decisions under constraints of high risk, uncertainty, high workload, and time pressure (see, e.g., Brehmer, 2007). There is...et al., 2003; Hallam & Stammers, 1981). For instance, Hallam and Stammers (1981) showed that the impact of variations in task complexity on team...each team member. FIRESCOPE, a commonly used crisis intervention plan developed in California (Office of Emergency Services, 2007) is a good

  2. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    PubMed

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

    RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive, and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses, and to gain more insights into RNA-seq datasets. In addition, we used a real world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree

  3. Research on prognostics and health management of underground pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Guangdi; Yang, Meng; Yang, Fan; Ni, Na

    2018-04-01

    As cities develop, underground pipeline networks become more and more complex; because they bear on the safety and normal operation of the city, they are known as "the lifeline of the city". This paper first introduces the principles of PHM (Prognostics and Health Management) technology, then proposes a fault diagnosis, prognostics, and health management approach for underground pipelines: faults appearing during operation are diagnosed and prognosed, and a health assessment of the whole underground pipe network is then made in order to ensure safe operation of the pipeline. Finally, future research directions are summarized and discussed.

  4. Study of stress-strain state of pipeline under permafrost conditions

    NASA Astrophysics Data System (ADS)

    Tarasenko, A. A.; Redutinskiy, M. N.; Chepur, P. V.; Gruchenkova, A. A.

    2018-05-01

    In this paper, the dependences of the stress-strain state (SSS) and subsidence of pipelines on the dimensions of the subsidence zone are obtained for the pipe sizes that have become most widespread in the construction of main oil pipelines (530x10, 820x12, 1020x12, 1020x14, 1020x16, 1220x14, 1220x16, 1220x18 mm). The true stress values in the pipeline wall, as well as the exact location of maximum stresses, are determined for subsidence zones from 5 to 60 meters. For this purpose, the authors developed a finite element model of the pipeline that takes into account the actual interaction of the pipeline with the subgrade and allows the SSS of the structure to be calculated for a variable subsidence zone. Based on the obtained dependences, it is proposed that for underground oil pipelines in permafrost areas the zone of possible subsidence be artificially limited by separation supports of soil with better building properties and physical-mechanical parameters. This technical solution would significantly reduce costs when constructing new oil pipelines in permafrost areas.
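
    For context on the pipe geometries listed, a small sketch of the elastic section properties such a stress model depends on; the bending moment below is an invented illustration, and the paper's finite-element results are not reproduced here.

    ```python
    import math

    def section_modulus(d_outer_mm, wall_mm):
        """Elastic section modulus of a pipe cross-section, in mm^3."""
        d_inner = d_outer_mm - 2 * wall_mm
        inertia = math.pi * (d_outer_mm**4 - d_inner**4) / 64  # mm^4
        return 2 * inertia / d_outer_mm

    for d, t in [(530, 10), (820, 12), (1020, 14), (1220, 16)]:
        z = section_modulus(d, t)
        moment = 1.0e9  # N*mm, illustrative bending moment only
        stress = moment / z  # MPa, since N/mm^2 = MPa
        print(f"{d}x{t} mm: Z = {z:.3e} mm^3, bending stress = {stress:.1f} MPa")
    ```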

  5. Accident Prevention and Diagnostics of Underground Pipeline Systems

    NASA Astrophysics Data System (ADS)

    Trokhimchuk, M.; Bakhracheva, Y.

    2017-11-01

    Up to forty thousand accidents involving underground pipelines occur annually due to corrosion. A comparison of methods for assessing the quality of anti-corrosion coatings is provided. It is proposed to use a device for tie-in to an existing pipeline that offers higher functionality than other device types because it can be tied in to pipelines of different diameters. Existing technologies and applied materials allow industrial production of the proposed device to be organized.

  6. 78 FR 57455 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...

  7. Pipeline oil fire detection with MODIS active fire products

    NASA Astrophysics Data System (ADS)

    Ogungbuyi, M. G.; Martinez, P.; Eckardt, F. D.

    2017-12-01

    We investigate 85,129 MODIS satellite active fire events from 2007 to 2015 in the Niger Delta of Nigeria. The region is the oil base of the Nigerian economy and the hub of oil exploration, where oil facilities (i.e. flowlines, flow stations, trunklines, oil wells and oil fields) are domiciled, and from where crude oil and refined products are transported to different Nigerian locations through a network of pipeline systems. Pipelines and other oil facilities are consistently susceptible to oil leaks due to operational or maintenance error, and to acts of deliberate sabotage of pipeline equipment, which often result in explosions and fire outbreaks. We used ground oil spill reports obtained from the National Oil Spill Detection and Response Agency (NOSDRA) database (see www.oilspillmonitor.ng) to validate the MODIS satellite data. The NOSDRA database shows an estimated 10,000 spill events from 2007 to 2015. The spill events were filtered to include the largest spills by volume and events occurring only in the Niger Delta (i.e. 386 spills). By projecting both MODIS fires and spills as `input vector' layers with `Points' geometry, and the Nigerian pipeline networks as `from vector' layers with `LineString' geometry in a geographical information system, we extracted the MODIS events nearest the pipelines (i.e. 2192), within a distance of 1000 m, in a spatial vector analysis. The extraction distance to the pipelines is based on global Right of Way (ROW) practice in pipeline management, which earmarks a 30 m strip of land for the pipeline. The KML files of the extracted fires in a Google map confirmed that they originate from oil facilities, and land cover mapping confirmed the fire anomalies. The aim of the study is to propose near-real-time monitoring of spill events along pipeline routes using the 250 m spatial resolution of the MODIS active fire detection sensor when such spills are accompanied by fire events in the study location.
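
    A minimal sketch of the nearest-distance extraction in plain Python with shapely, assuming coordinates are already in a metric projection; the geometries below are toy values, not the Niger Delta data.

    ```python
    from shapely.geometry import LineString, Point

    # toy pipeline route and fire detections, in projected metres
    pipeline = LineString([(0, 0), (5000, 0), (10000, 2000)])
    fires = [Point(100, 800), Point(4000, 1500), Point(9000, 1900)]

    # keep only fire events within 1000 m of the pipeline, as in the study
    near_pipeline = [p for p in fires if pipeline.distance(p) <= 1000.0]
    print(f"{len(near_pipeline)} of {len(fires)} fire events within 1000 m")
    ```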

  8. Generic Data Pipelining Using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair; Jenness, Tim; Economou, Frossie; Currie, Malcolm J.; Bly, Martin J.

    A generic data reduction pipeline is, perhaps, the holy grail for data reduction software. We present work which sets us firmly on the path towards this goal. ORAC-DR is an online data reduction pipeline written by the Joint Astronomy Center (JAC) and the UK Astronomy Technology Center (ATC) and distributed as part of the Starlink Software collection (SSC). It is intended to run with a minimum of observer interaction, and is able to handle data from many different instruments, including SCUBA, CGS4, UFTI, IRCAM and Michelle, with support for IRIS2 and UIST under development. Recent work by Starlink in collaboration with the JAC has resulted in an increase in the pipeline's flexibility, opening up the possibility that it could be used for truly generic data reduction for data from any imaging, and eventually spectroscopic, detector.

  9. Leak detectability of the Norman Wells pipeline by mass balance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liou, J.C.P.

    Pipeline leak detection using software-based systems is becoming common practice. The detectability of such systems is measured by how small and how quickly a leak can be detected. Algorithms used and measurement uncertainties determine leak detectability. This paper addresses leak detectability using mass balance, establishes leak detectability for the Norman Wells pipeline, and compares it with field leak test results. The pipeline is operated by Interprovincial Pipe Line (IPL) Inc., of Edmonton, Canada. It is a 12.75-inch outside diameter steel pipe with variable wall thickness. The length of the pipe is approximately 550 miles (868.9 km). The pipeline transports light crude oil at a constant flow rate of about 250 m{sup 3}/hr. The crude oil can enter the pipeline at two locations: besides the Norman Wells inlet, there is a side line near the Zama terminal that can inject crude oil into the pipeline.
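
    A minimal sketch of leak detection by mass balance: integrate the inlet-outlet flow difference over a time window and raise an alarm when the imbalance exceeds a threshold. The flow values and threshold below are illustrative, not IPL measurements.

    ```python
    def mass_balance_alarm(inflow_m3h, outflow_m3h, dt_h=1.0, threshold_m3=5.0):
        """Return True if the cumulative in-out imbalance exceeds the threshold."""
        imbalance = sum((q_in - q_out) * dt_h
                        for q_in, q_out in zip(inflow_m3h, outflow_m3h))
        return imbalance > threshold_m3

    inflow = [250.0, 250.1, 249.9, 250.0]   # m^3/h at the inlet
    outflow = [249.0, 248.7, 248.9, 249.1]  # m^3/h delivered downstream
    print("leak suspected:", mass_balance_alarm(inflow, outflow))
    ```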

  10. National Pipeline Mapping System (NPMS) : standards for creating pipeline location data : standards for electronic data submissions, including metadata standards and examples

    DOT National Transportation Integrated Search

    1997-07-14

    These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...

  11. Algeria LPG pipeline is built by Bechtel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horner, C.

    1984-08-01

    The construction of the 313 mile long, 24 in. LPG pipeline from Hassi R'Mel to Arzew, Algeria is described. The pipeline was designed to deliver 6 million tons of LPG annually using one pumping station. Eventually an additional pumping station will be added to raise the system capacity to 9 million tons annually.

  12. 30 CFR 250.1004 - Safety equipment requirements for DOI pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Pipelines and Pipeline Rights-of... a flow safety valve (FSV). (ii) For sulphur operations, incoming pipelines delivering gas to the power plant platform may be equipped with high- and low-pressure sensors (PSHL), which activate audible...

  13. 75 FR 67450 - Pipeline Safety: Control Room Management Implementation Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ... PHMSA-2010-0294] Pipeline Safety: Control Room Management Implementation Workshop AGENCY: Pipeline and...) on the implementation of pipeline control room management. The workshop is intended to foster an understanding of the Control Room Management Rule issued by PHMSA on December 3, 2009, and is open to the public...

  14. Development of Protective Coatings for Co-Sequestration Processes and Pipelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierwagen, Gordon; Huang, Yaping

    2011-11-30

    The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted regarding pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research was aimed at ensuring safety and reliability for a pipeline transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both serve as potential candidates for internal pipeline coating to transport SCCO2.

  15. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  16. Design Optimization of Innovative High-Level Waste Pipeline Unplugging Technologies - 13341

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pribanic, T.; Awwad, A.; Varona, J.

    2013-07-01

    Florida International University (FIU) is currently working on the development and optimization of two innovative pipeline unplugging methods: the asynchronous pulsing system (APS) and the peristaltic crawler system (PCS). Experiments were conducted on the APS to determine how air in the pipeline influences the system's performance, as well as to determine the effectiveness of air mitigation techniques in a pipeline. The results obtained during the experimental phase of the project, including data from pipeline pressure pulse tests along with air bubble compression tests, are presented. Single-cycle pulse amplification caused by a fast-acting cylinder piston pump in 21.8, 30.5, and 43.6 m pipelines was evaluated. Experiments were conducted on fully flooded pipelines as well as pipelines that contained various amounts of air, to evaluate the system's performance when air is present in the pipeline. Also presented are details of the improvements implemented in the third-generation crawler system (PCS). The improvements include the redesign of the rims of the unit to accommodate a camera system that provides visual feedback on the conditions inside the pipeline; visual feedback allows the crawler to be used as a pipeline unplugging and inspection tool. Tests conducted previously demonstrated a significant reduction of the crawler speed with increasing length of tether. Current improvements include the positioning of a pneumatic valve manifold system in close proximity to the crawler, rendering crawler speed independent of tether length. Additional improvements to increase the crawler's speed were also investigated and are presented. Descriptions of the test beds, which were designed to emulate possible scenarios present in Department of Energy (DOE) pipelines, are presented. Finally, conclusions and recommendations for the systems are provided. (authors)

  17. Quantitative Risk Mapping of Urban Gas Pipeline Networks Using GIS

    NASA Astrophysics Data System (ADS)

    Azari, P.; Karimi, M.

    2017-09-01

    Natural gas is considered an important source of energy in the world. With the growth of urbanization, the urban gas pipelines that transmit natural gas from transmission pipelines to consumers will become a dense network. The increasing density of urban pipelines raises the probability of severe accidents in urban areas, and such accidents have catastrophic effects on people and their property. Within the next few years, risk mapping will become an important component of urban planning and management in large cities, in order to decrease the probability of accidents and to control them. It is therefore important to assess risk values and determine their locations on an urban map using an appropriate method. In previous risk analyses of urban natural gas pipeline networks, pipelines have always been considered one by one, and their density in the urban area has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, which uses the EGIG historical data, and fatal length calculation, which involves calculating the gas release and the fatality rates of the consequences. We consider jet fire, fireball, and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is individual risk, presented as a risk map.
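
    A minimal sketch of the grid-point aggregation the study proposes: the individual risk at a location is the sum, over nearby pipelines, of failure rate times fatal length. The numbers below are invented placeholders, not EGIG values.

    ```python
    def individual_risk(contributions):
        """Sum per-pipeline risk contributions at one grid point (per year)."""
        return sum(rate_per_km_yr * fatal_length_km
                   for rate_per_km_yr, fatal_length_km in contributions)

    # three pipelines near the same grid point: (failure rate, fatal length)
    nearby = [(3.0e-4, 0.12), (3.0e-4, 0.08), (1.5e-4, 0.20)]
    print(f"individual risk: {individual_risk(nearby):.2e} per year")
    ```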

  18. Pipeline systems - safety for assets and transport regularity

    DOT National Transportation Integrated Search

    1997-01-01

    This review regarding safety for assets and financial interests for pipeline systems has showed how this aspect has been taken care of in the existing petroleum legislation. It has been demonstrated that the integrity of pipeline systems with the res...

  19. FLIM FRET Technology for Drug Discovery: Automated Multiwell-Plate High-Content Analysis, Multiplexed Readouts and Application in Situ**

    PubMed Central

    Kumar, Sunil; Alibhai, Dominic; Margineanu, Anca; Laine, Romain; Kennedy, Gordon; McGinty, James; Warren, Sean; Kelly, Douglas; Alexandrov, Yuriy; Munro, Ian; Talbot, Clifford; Stuckey, Daniel W; Kimberly, Christopher; Viellerobe, Bertrand; Lacombe, Francois; Lam, Eric W-F; Taylor, Harriet; Dallman, Margaret J; Stamp, Gordon; Murray, Edward J; Stuhmeier, Frank; Sardini, Alessandro; Katan, Matilda; Elson, Daniel S; Neil, Mark A A; Dunsby, Chris; French, Paul M W

    2011-01-01

    A fluorescence lifetime imaging (FLIM) technology platform intended to read out changes in Förster resonance energy transfer (FRET) efficiency is presented for the study of protein interactions across the drug-discovery pipeline. FLIM provides a robust, inherently ratiometric imaging modality for drug discovery that could allow the same sensor constructs to be translated from automated cell-based assays through small transparent organisms such as zebrafish to mammals. To this end, an automated FLIM multiwell-plate reader is described for high content analysis of fixed and live cells, tomographic FLIM in zebrafish and FLIM FRET of live cells via confocal endomicroscopy. For cell-based assays, an exemplar application reading out protein aggregation using FLIM FRET is presented, and the potential for multiple simultaneous FLIM (FRET) readouts in microscopy is illustrated. PMID:21337485

  20. Creation and Implementation of a Workforce Development Pipeline Program at MSFC

    NASA Technical Reports Server (NTRS)

    Hix, Billy

    2003-01-01

    Within the context of NASA's Education Programs, this Workforce Development Pipeline guide describes the goals and objectives of MSFC's Workforce Development Pipeline Program as well as the principles and strategies for guiding implementation. It is designed to support the initiatives described in the NASA Implementation Plan for Education, 1999-2003 (EP-1998-12-383-HQ) and represents the vision of the members of the Education Programs office at MSFC. This document: 1) Outlines NASA's contribution to national priorities; 2) Sets the context for the Workforce Development Pipeline Program; 3) Describes Workforce Development Pipeline Program strategies; 4) Articulates the Workforce Development Pipeline Program goals and aims; 5) Lists the actions to build a unified approach; 6) Outlines the Workforce Development Pipeline Program's guiding principles; and 7) Presents the results of implementation.

  1. Benchmark datasets for phylogenomic pipeline validation, applications for foodborne pathogen surveillance

    PubMed Central

    Rand, Hugh; Shumway, Martin; Trees, Eija K.; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E.; Defibaugh-Chavez, Stephanie; Carleton, Heather A.; Klimke, William A.; Katz, Lee S.

    2017-01-01

    Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion These five benchmark datasets will help standardize comparison of current and future phylogenomic
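
    A minimal sketch of a downloader driven by a descriptive table, in the spirit of the script the authors provide on GitHub; the two-column TSV layout (accession, url) is a hypothetical stand-in for their actual spreadsheet format.

    ```python
    import csv
    import urllib.request

    # read the descriptive table and fetch each listed sequence file
    with open("dataset_description.tsv", newline="") as fh:
        for row in csv.DictReader(fh, delimiter="\t"):
            target = f"{row['accession']}.fastq.gz"
            print(f"fetching {row['url']} -> {target}")
            urllib.request.urlretrieve(row["url"], target)
    ```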

  2. 75 FR 19957 - B-R Pipeline Company; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP10-129-000] B-R Pipeline Company; Notice of Application April 9, 2010. Take notice that on April 8, 2010, B-R Pipeline Company (B-R... should be directed to Counsel for B-R Pipeline Company, William H. Penniman or Michael Brooks, Sutherland...

  3. A Review of Fatigue Crack Growth for Pipeline Steels Exposed to Hydrogen

    PubMed Central

    Nanninga, N.; Slifka, A.; Levy, Y.; White, C.

    2010-01-01

    Hydrogen pipeline systems offer an economical means of storing and transporting energy in the form of hydrogen gas. Pipelines can be used to transport hydrogen that has been generated at solar and wind farms to and from salt cavern storage locations. In addition, pipeline transportation systems will be essential before widespread hydrogen fuel cell vehicle technology becomes a reality. Since hydrogen pipeline use is expected to grow, the mechanical integrity of these pipelines will need to be validated under the presence of pressurized hydrogen. This paper focuses on a review of the fatigue crack growth response of pipeline steels when exposed to gaseous hydrogen environments. Because of defect-tolerant design principles in pipeline structures, it is essential that designers consider hydrogen-assisted fatigue crack growth behavior in these applications. PMID:27134796

  4. Long-Term Monitoring of Cased Pipelines Using Longrange Guided-Wave Technique

    DOT National Transportation Integrated Search

    2009-05-19

    Integrity management programs for gas transmission pipelines are required by The Office of Pipeline Safety (OPS)/DOT. Direct Assessment (DA) and 'Other Technologies' have become the focus of assessment options for pipeline integrity on cased crossing...

  5. An in silico pipeline to filter the Toxoplasma gondii proteome for proteins that could traffic to the host cell nucleus and influence host cell epigenetic regulation.

    PubMed

    Syn, Genevieve; Blackwell, Jenefer M; Jamieson, Sarra E; Francis, Richard W

    2018-01-01

    Toxoplasma gondii uses epigenetic mechanisms to regulate both endogenous and host cell gene expression. To identify genes with putative epigenetic functions, we developed an in silico pipeline to interrogate the T. gondii proteome of 8313 proteins. Step 1 employs PredictNLS and NucPred to identify genes predicted to target eukaryotic nuclei. Step 2 uses GOLink to identify proteins of epigenetic function based on Gene Ontology terms. This resulted in 611 putative nuclear localised proteins with predicted epigenetic functions. Step 3 filtered for secretory proteins using SignalP, SecretomeP, and experimental data. This identified 57 of the 611 putative epigenetic proteins as likely to be secreted. The pipeline is freely available online, uses open access tools and software with user-friendly Perl scripts to automate and manage the results, and is readily adaptable to undertake any such in silico search for genes contributing to particular functions.
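
    The three filtering steps compose naturally as set intersections. A minimal sketch, assuming the external predictors (PredictNLS/NucPred, GOLink, SignalP/SecretomeP) have already written their protein ID lists to disk; the file names are hypothetical.

    ```python
    def read_ids(path):
        """Read one protein ID per line into a set."""
        with open(path) as fh:
            return {line.strip() for line in fh if line.strip()}

    nuclear = read_ids("predicted_nuclear.txt")       # Step 1 output
    epigenetic = read_ids("epigenetic_go_terms.txt")  # Step 2 output
    secreted = read_ids("predicted_secreted.txt")     # Step 3 output

    candidates = nuclear & epigenetic  # 611 proteins in the paper
    final = candidates & secreted      # 57 proteins in the paper
    print(f"{len(candidates)} nuclear+epigenetic, {len(final)} also secreted")
    ```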

  6. Investigation of CO2 release pressures in pipeline cracks

    NASA Astrophysics Data System (ADS)

    Gorenz, Paul; Herzog, Nicoleta; Egbers, Christoph

    2013-04-01

    The CCS (Carbon Capture and Storage) technology can prevent or reduce emissions of carbon dioxide. The main idea of this technology is the segregation and collection of CO2 from facilities with high emissions of that greenhouse gas, i.e. power plants which burn fossil fuels. To segregate CO2 from the exhaust gas, the power plant must be upgraded; up to now there are three possible segregation procedures, with different advantages and disadvantages. After segregation, the carbon dioxide is transported by pipeline to a subsurface storage location. As CO2 is gaseous at normal conditions (101,325 Pa; 20 °C), it must be put under high pressure to enter denser phase states and make efficient pipeline transport possible. Normally the carbon dioxide is brought into the liquid or supercritical phase state by compressor stations, which compress the gas up to 15 MPa. The pressure drop along the line makes booster stations necessary to keep the CO2 in a dense phase state; depending on the compression pressure, CO2 can be transported over 300 km without any booster station. The goal of this work is the investigation of release pressures in pipeline cracks. The high-pressure pipeline system consists of different parts with different failure probabilities; in most cases corrosion or ageing is the reason for pipeline damage. In case of a crack, CO2 will escape from the pipeline and disperse into the atmosphere, and due to its nature CO2 can remain unattended for a long time. There are some studies of the CO2 dispersion process, e.g. Mazzoldi et al. (2007, 2008 and 2011) and Wang et al. (2008), but with different assumptions concerning the pipeline release pressures. To give an idea of realistic release pressures, investigations with the CFD tool OpenFOAM were carried out and are presented within this work. To cover such a scenario with an accidental release of carbon dioxide, a pipeline section with different diameters and

  7. 78 FR 59906 - Pipeline Safety: Class Location Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... 192 [Docket No. PHMSA-2013-0161] Pipeline Safety: Class Location Requirements AGENCY: Pipeline and... Location Requirements,'' seeking comments on whether integrity management program (IMP) requirements, or... for class location requirements. PHMSA has received two requests to extend the comment period to allow...

  8. Peace in the pipeline

    NASA Astrophysics Data System (ADS)

    Luhar, Mitul

    2018-04-01

    Turbulence in pipe flows causes substantial friction and economic losses. The solution to appease the flow through pipelines might be, counterintuitively, to initially enhance turbulent mixing and get laminar flow in return.

  9. Bp'S Baku-Tbilisi-Ceyhan pipeline: the new corporate colonialism.

    PubMed

    Marriott, James; Muttitt, Greg

    2006-01-01

    An international campaign was waged questioning the benefits of BP's Baku-Tbilisi-Ceyhan pipeline in an effort to avoid a "zone of sacrifice" there. This article is an offshoot of that effort and explains the contemporary struggle over the pipeline project. The authors describe the project's background and evaluate the actual and potential impacts of the project in which they consider eight areas. They also assess BP's capacity to confront resistance to the pipeline.

  10. Fully automated, deep learning segmentation of oxygen-induced retinopathy images

    PubMed Central

    Xiao, Sa; Bucher, Felicitas; Wu, Yue; Rokem, Ariel; Lee, Cecilia S.; Marra, Kyle V.; Fallon, Regis; Diaz-Aguilar, Sophia; Aguilar, Edith; Friedlander, Martin; Lee, Aaron Y.

    2017-01-01

    Oxygen-induced retinopathy (OIR) is a widely used model to study ischemia-driven neovascularization (NV) in the retina and to serve in proof-of-concept studies in evaluating antiangiogenic drugs for ocular, as well as nonocular, diseases. The primary parameters that are analyzed in this mouse model include the percentage of retina with vaso-obliteration (VO) and NV areas. However, quantification of these two key variables comes with a great challenge due to the requirement of human experts to read the images. Human readers are costly, time-consuming, and subject to bias. Using recent advances in machine learning and computer vision, we trained deep learning neural networks using over a thousand segmentations to fully automate segmentation in OIR images. While determining the percentage area of VO, our algorithm achieved a similar range of correlation coefficients to that of expert inter-human correlation coefficients. In addition, our algorithm achieved a higher range of correlation coefficients compared with inter-expert correlation coefficients for quantification of the percentage area of neovascular tufts. In summary, we have created an open-source, fully automated pipeline for the quantification of key values of OIR images using deep learning neural networks. PMID:29263301

  11. Study on the Control of Cleanliness for X90 Pipeline in the Secondary Refining Process

    NASA Astrophysics Data System (ADS)

    Chu, Ren Sheng; Liu, Jin Gang; Li, Zhan Jun

    X90 pipeline steel requires ultra-low sulfur and gas contents in the smelting process. Secondary refining is critical for X90 pipeline steel, and cleanliness control is the key to the secondary refining stage of the steelmaking route: pretreatment of hot metal → LD → LF refining → RH refining → calcium treatment → CC. In the current paper, the cleanliness control method of secondary refining was analyzed in terms of the evolution of non-metallic inclusions during secondary refining and the related changes in steel composition. The size, composition, and type of the non-metallic inclusions were analyzed with an ASPEX Explorer automated scanning electron microscope on 20 mm × 25 mm × 25 mm X90 pipeline samples prepared by wire cutting. The results show that the number of non-metallic inclusions in the steel decreases from the beginning of LF refining to RH refining. In composition, the initial alumina inclusions are converted into two complex inclusion types: most are composed of calcium aluminate and CaS, while the others consist of a spinel core surrounded by calcium aluminate. In size, inclusions larger than 100 µm are converted into small inclusions of mainly 5-20 µm. While the S content of the steel decreased from 0.012% to 0.0012% or less, the Al content was kept between 0.025% and 0.035%, and the quality of the casting slab satisfies the requirements of the steel. The ratings for the various types of non-metallic inclusions are 1.5 or less. The control strategy for the inclusions in X90 pipeline steel is small size, diffuse distribution, and a small amount of deformation after rolling. On the contrary, the specific chemical composition of the inclusions is not

  12. Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.

    PubMed

    Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos

    2017-08-01

    Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, makes running the pipelines as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, Bio-Docklets also enables developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
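
    A minimal sketch of the single-endpoint container idea: one mounted input, one mounted output, one container launch. The image name and paths are hypothetical, and the real Bio-Docklets drive execution through BioBlend and the Galaxy API rather than this direct call.

    ```python
    import subprocess

    def run_docklet(image, input_dir, output_dir):
        """Run a single-entry pipeline container with mounted input/output."""
        subprocess.run(
            ["docker", "run", "--rm",
             "-v", f"{input_dir}:/data/input:ro",
             "-v", f"{output_dir}:/data/output",
             image],
            check=True,
        )

    run_docklet("example/rnaseq-docklet:latest", "/tmp/reads", "/tmp/results")
    ```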

  13. Bio-Docklets: virtualization containers for single-step execution of NGS pipelines

    PubMed Central

    Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis

    2017-01-01

    Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, makes running the pipelines as simple as running a single bioinformatics tool. This is achieved using a “meta-script” that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, Bio-Docklets also enables developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. PMID:28854616

  14. 78 FR 10689 - Pipeline Safety: Public Forum State One-Call Exemptions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-14

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice; public forum. SUMMARY: The Pipeline and Hazardous Materials Safety Administration will sponsor a public forum on state...

  15. 75 FR 5536 - Pipeline Safety: Control Room Management/Human Factors, Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-03

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts...: Control Room Management/Human Factors, Correction AGENCY: Pipeline and Hazardous Materials Safety... following correcting amendments: PART 192--TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM...

  16. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    NASA Astrophysics Data System (ADS)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.

  17. Numerical Modeling of Mechanical Behavior for Buried Steel Pipelines Crossing Subsidence Strata

    PubMed Central

    Han, C. J.

    2015-01-01

    This paper addresses the mechanical behavior of buried steel pipelines crossing subsidence strata. The investigation is based on numerical simulation of the nonlinear response of the pipeline-soil system through the finite element method, considering large strain and displacement, inelastic material behavior of the buried pipeline and the surrounding soil, as well as contact and friction on the pipeline-soil interface. The effects of key parameters on the mechanical behavior of the buried pipeline were investigated, such as strata subsidence, diameter-thickness ratio, buried depth, internal pressure, friction coefficient, and soil properties. The results show that the maximum strain appears on the outer transition subsidence section of the pipeline, and its cross section becomes concave. With increasing strata subsidence and diameter-thickness ratio, the out-of-roundness, longitudinal strain, and equivalent plastic strain increase gradually. With increasing buried depth, the deflection, out-of-roundness, and strain of the pipeline decrease. Internal pressure and friction coefficient have little effect on the deflection of the buried pipeline; with increasing internal pressure, out-of-roundness decreases while strain increases gradually. The physical properties of the soil have a great influence on the mechanical behavior of the buried pipeline. The results from the present study can be used for the development of optimization design and preventive maintenance for buried steel pipelines. PMID:26103460

  18. MitoFish and MiFish Pipeline: A Mitochondrial Genome Database of Fish with an Analysis Pipeline for Environmental DNA Metabarcoding.

    PubMed

    Sato, Yukuto; Miya, Masaki; Fukunaga, Tsukasa; Sado, Tetsuya; Iwasaki, Wataru

    2018-06-01

    Fish mitochondrial genome (mitogenome) data form a fundamental basis for revealing vertebrate evolution and hydrosphere ecology. Here, we report recent functional updates of MitoFish, which is a database of fish mitogenomes with a precise annotation pipeline MitoAnnotator. Most importantly, we describe implementation of MiFish pipeline for metabarcoding analysis of fish mitochondrial environmental DNA, which is a fast-emerging and powerful technology in fish studies. MitoFish, MitoAnnotator, and MiFish pipeline constitute a key platform for studies of fish evolution, ecology, and conservation, and are freely available at http://mitofish.aori.u-tokyo.ac.jp/ (last accessed April 7th, 2018).

  19. Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline

    DOE PAGES

    Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.

    2016-09-28

    A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering and permitting follow-up observations of young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.

  20. Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.

    A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering and permitting follow-up observations of young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.

  1. Automated flow cytometric analysis across large numbers of samples and cell types.

    PubMed

    Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno

    2015-04-01

    Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
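
    A minimal sketch of the clustering strategy described above, using scikit-learn: fit Gaussian mixtures of increasing size and keep the component count with the lowest BIC. Synthetic data stands in for compensated cytometry events.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # two synthetic 4-parameter populations standing in for cell types
    events = np.vstack([rng.normal(0, 1, (500, 4)),
                        rng.normal(5, 1, (300, 4))])

    # fit mixtures of increasing size; the Bayesian Information Criterion
    # picks the number of clusters, as in the FlowGM approach
    fits = [GaussianMixture(n_components=k, random_state=0).fit(events)
            for k in range(1, 7)]
    best = min(fits, key=lambda m: m.bic(events))
    print(f"BIC selects {best.n_components} clusters")
    ```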

  2. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  3. Building the Pipeline for Hubble Legacy Archive Grism data

    NASA Astrophysics Data System (ADS)

    Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Lombardi, M.; Micol, A.; Rosa, M.; Stoehr, F.; Walsh, J. R.

    2008-10-01

    The Pipeline for Hubble Legacy Archive Grism data (PHLAG) is currently being developed as an end-to-end pipeline for the Hubble Legacy Archive (HLA). The inputs to PHLAG are slitless spectroscopic HST data with only the basic calibrations from standard HST pipelines applied; the outputs are fully calibrated, Virtual Observatory-compatible spectra, which will be made available through a static HLA archive. We give an overview of the various aspects of PHLAG. The pipeline consists of several subcomponents -- data preparation, data retrieval, image combination, object detection, spectral extraction using the aXe software, and quality control -- which are discussed in detail. As a pilot project, PHLAG is currently being applied to NICMOS G141 grism data. Examples of G141 spectra reduced with PHLAG are shown.

  4. Thinking on Sichuan-Chongqing gas pipeline transportation system reform under market-oriented conditions

    NASA Astrophysics Data System (ADS)

    Duan, Yanzhi

    2017-01-01

    The gas pipeline networks in the Sichuan and Chongqing (Sichuan-Chongqing) region form a fully fledged gas pipeline transportation system in China, which supports and promotes the rapid development of the gas market in the Sichuan-Chongqing region. As the market-oriented economy develops further, it is necessary to deepen pipeline system reform in the areas of the investment/financing system, the operation system, and the pricing system, so as to lay a solid foundation for improving future gas production and marketing capability, to adapt to the national gas system reform, and to achieve the objectives of pipeline construction with multiparty participation, improved pipeline transportation efficiency, and fair and rational pipeline transportation prices. In this article, the main thinking on reform in these three areas and the major deployment are addressed, and corresponding measures are recommended: developing a shared pipeline economy, providing financial support for pipeline construction, setting up an independent regulatory agency to strengthen industry supervision of gas pipeline transportation, and promoting the construction of a regional gas trading market.

  5. Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard

    NASA Astrophysics Data System (ADS)

    Voronin, K. S.

    2016-10-01

    During operation, main gas pipelines are under the influence of permanent pressure drops, which lead to their lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can be the basis for developing a method of monitoring the technical condition of a gas pipeline, and forecasting possible emergency situations allows reconstruction works on sections of the gas pipeline with possible deviation from the design position to be planned and carried out in due time.

  6. INTERNAL REPAIR OF PIPELINES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Bruce; Nancy Porter; George Ritter

    2005-07-20

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Principal conclusions from a survey of natural gas transmission industry pipeline operators can be summarized in terms of the following performance requirements for internal repair: (1) Use of internal repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal direct drilling when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents; the third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) The most common size range for 80% to 90% of operators surveyed is 508 mm (20 in.) to 762 mm (30 in.), with 95% using 558.8 mm (22 in.) pipe. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than that of pipe sections

  7. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  8. 77 FR 17119 - Pipeline Safety: Cast Iron Pipe (Supplementary Advisory Bulletin)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... national attention and highlight the need for continued safety improvements to aging gas pipeline systems... 26, 1992) covering the continued use of cast iron pipe in natural gas distribution pipeline systems...

  9. 78 FR 6402 - Pipeline Safety: Accident and Incident Notification Time Limit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... SUMMARY: Owners and operators of gas and hazardous liquid pipeline systems and liquefied natural gas (LNG... operators of gas and hazardous liquids pipeline systems and LNG facilities that, ``at the earliest...

  10. Ultrasonic wave based pressure measurement in small diameter pipeline.

    PubMed

    Wang, Dan; Song, Zhengxiang; Wu, Yuan; Jiang, Yuan

    2015-12-01

    An effective non-intrusive ultrasound-based method that allows monitoring of liquid pressure in small-diameter pipelines (less than 10 mm) is presented in this paper. Ultrasonic waves can penetrate a medium, and the properties of the medium are reflected in representative information acquired from the echoes. This pressure measurement is difficult because echo information is not easy to obtain in small-diameter pipelines. The proposed method, studied on pipelines carrying Kneser liquids, is based on the principle that the transmission speed of ultrasonic waves in the pipeline liquid correlates with liquid pressure, and that this speed is reflected in the ultrasonic propagation time, provided the acoustic distance is fixed. Therefore, variation of the ultrasonic propagation time reflects variation of the pressure in the pipeline. The ultrasonic propagation time is obtained by an electronic processing approach and is measured to nanosecond accuracy with a high-resolution time measurement module. To reduce environmental influences, we used the ultrasonic propagation time difference to reflect the actual pressure. The corresponding pressure values are finally obtained from the relationship between the variation of the ultrasonic propagation time difference and pressure, using a neural network analysis method; the results show that this method is accurate and can be used in practice. Copyright © 2015 Elsevier B.V. All rights reserved.
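
    A minimal sketch of the calibration idea: map measured propagation-time differences to pressure. The paper fits this relationship with a neural network; a polynomial fit serves here as a simple stand-in, and the calibration numbers are invented.

    ```python
    import numpy as np

    # calibration: propagation-time difference (ns) vs known pressure (MPa)
    dt_ns = np.array([0.0, 2.1, 4.0, 6.2, 8.1])
    p_mpa = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

    # fit a simple calibration curve, then evaluate a new measurement
    coeffs = np.polyfit(dt_ns, p_mpa, deg=2)
    print(f"estimated pressure at dt=5.0 ns: {np.polyval(coeffs, 5.0):.2f} MPa")
    ```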

  11. 49 CFR 192.620 - Alternative maximum allowable operating pressure for certain steel pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY...) At points where gas with potentially deleterious contaminants enters the pipeline, use filter...

  12. Decision Through Optimism: The North Peruvian Pipeline.

    DTIC Science & Technology

    1987-05-01

    ... corporations. Another factor, optimism, is more intangible, but influenced the decision strongly. This paper discusses the need for, construction of, and results of building the Northern Peru Oil Pipeline. It reviews the decision, the construction effort, and the financing of the endeavor, and finally notes Peru's oil situation after completion of the pipeline.

  13. Superpixel-based and boundary-sensitive convolutional neural network for automated liver segmentation

    NASA Astrophysics Data System (ADS)

    Qin, Wenjian; Wu, Jia; Han, Fei; Yuan, Yixuan; Zhao, Wei; Ibragimov, Bulat; Gu, Jia; Xing, Lei

    2018-05-01

    Segmentation of the liver in abdominal computed tomography (CT) is an important step in radiation therapy planning for hepatocellular carcinoma. In practice, fully automatic liver segmentation remains challenging because of the low soft-tissue contrast between the liver and its surrounding organs and the liver's highly deformable shape. The purpose of this work is to develop a novel superpixel-based and boundary-sensitive convolutional neural network (SBBS-CNN) pipeline for automated liver segmentation. The CT images were first partitioned into superpixel regions, in which nearby pixels with similar CT numbers were aggregated. Secondly, we converted the conventional binary segmentation into a multinomial classification by labeling the superpixels into three classes: interior liver, liver boundary, and non-liver background. In this way, the boundary region of the liver was explicitly identified and highlighted for the subsequent classification. Thirdly, we computed an entropy-based saliency map for each CT volume and leveraged this map to guide the sampling of image patches over the superpixels, so that more patches were extracted from informative regions (e.g., the liver boundary with its irregular changes) and fewer from homogeneous regions. Finally, a deep CNN pipeline was built and trained to predict the probability map of the liver boundary. We tested the proposed algorithm in a cohort of 100 patients. With 10-fold cross-validation, the SBBS-CNN achieved a mean Dice similarity coefficient of 97.31 ± 0.36% and an average symmetric surface distance of 1.77 ± 0.49 mm. Moreover, it showed superior performance in comparison with state-of-the-art methods, including U-Net, pixel-based CNN, active contour, level-set, and graph-cut algorithms. SBBS-CNN provides an accurate and effective tool for automated liver segmentation, and the proposed framework is expected to be directly applicable to other medical image segmentation scenarios.
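
    A minimal sketch of the three-class superpixel labeling step is given below, assuming a boolean liver mask and using scikit-image's SLIC implementation; the segment count, overlap threshold, and class encoding are illustrative choices, not the authors' settings.

    ```python
    # Hedged sketch of three-class superpixel labeling (background / boundary /
    # interior), in the spirit of SBBS-CNN. Thresholds are assumptions.
    import numpy as np
    from skimage.segmentation import slic, find_boundaries

    def label_superpixels(ct_slice, liver_mask, n_segments=400):
        """Label each SLIC superpixel: 0 = background, 1 = boundary, 2 = interior."""
        segments = slic(ct_slice, n_segments=n_segments, channel_axis=None)
        boundary = find_boundaries(liver_mask, mode="thick")
        labels = {}
        for sp in np.unique(segments):
            region = segments == sp
            if (boundary & region).any():
                labels[sp] = 1                       # touches the liver boundary
            elif (liver_mask & region).mean() > 0.5:
                labels[sp] = 2                       # mostly inside the liver
            else:
                labels[sp] = 0                       # non-liver background
        return segments, labels
    ```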

  14. Diagnostic Inspection of Pipelines for Estimating the State of Stress in Them

    NASA Astrophysics Data System (ADS)

    Subbotin, V. A.; Kolotilov, Yu. V.; Smirnova, V. Yu.; Ivashko, S. K.

    2017-12-01

    The diagnostic inspection used to estimate the technical state of a pipeline is described. The tasks of the inspection work are listed, and a functional-structural scheme is developed for estimating the state of stress in a pipeline. Final conclusions regarding the actual loading of a pipeline section are drawn from a cross-analysis of all the information obtained during the inspection.

  15. Mathematical modeling of non-stationary gas flow in gas pipeline

    NASA Astrophysics Data System (ADS)

    Fetisov, V. G.; Nikolaev, A. K.; Lykov, Y. V.; Duchnevich, L. N.

    2018-03-01

    An analysis of the operation of gas transportation systems shows that pipelines operate under unsteady gas-flow regimes for a considerable fraction of the time. Pressure and flow rate vary along the length of the pipeline and over time as a result of uneven consumption and offtake, compressor units switching on and off, the closing of shutoff valves, and the emergence of emergency leaks. Operational management of such regimes is complicated by the difficulty of reconciling the operating modes of individual pipeline sections with each other and with the compressor stations. Identifying the causes of changes in the operating mode of the pipeline system, and the patterns those changes follow, determines the choice of its parameters. Knowledge of the laws governing the main technological parameters of gas pumping through pipelines under non-stationary flow is therefore of great practical importance.
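
    For reference, a standard textbook form of the 1-D non-stationary isothermal gas-pipeline equations is reproduced below; the paper's exact formulation may differ. Here ρ is the gas density, w the flow velocity, p the pressure, λ the hydraulic friction factor, d the pipe diameter, α the pipeline inclination, and c the isothermal speed of sound.

    ```latex
    \[
    \begin{aligned}
    \frac{\partial \rho}{\partial t} + \frac{\partial (\rho w)}{\partial x} &= 0, \\
    \frac{\partial (\rho w)}{\partial t} + \frac{\partial}{\partial x}\left(\rho w^{2} + p\right)
      &= -\frac{\lambda}{2d}\,\rho w \lvert w \rvert - \rho g \sin\alpha, \\
    p &= c^{2}\rho .
    \end{aligned}
    \]
    ```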

  16. Natural Gas Pipeline Statistics

    DOT National Transportation Integrated Search

    1980-01-01

    Federal regulation CFR 49, part 191 requires that all gas pipeline operators file annual reports with the U.S. Department of Transportation's Materials Transportation Bureau. These reports contain a wide range of safety and operational data involving...

  17. NPDES Permit for Northern Border Pipeline Company in Montana

    EPA Pesticide Factsheets

    Under NPDES permit MT-0030791, the Northern Border Pipeline Company is authorized to discharge from locations along the Northern Border Gas Transmission Pipeline located within the exterior boundaries of the Fort Peck Indian Reservation, Montana.

  18. Structural reliability assessment of the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Sharif, A.M.; Preston, R.

    1996-12-31

    Reliability techniques are increasingly finding application in design. The special design conditions for the deep-water sections of the Oman India Pipeline dictate their use, since the experience base for applying standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first- and second-order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.
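
    As a crude stand-in for the first- and second-order reliability analyses described, the sketch below estimates a collapse probability by Monte Carlo simulation, using the thin-wall elastic collapse pressure p_c = 2E(t/D)^3/(1 - ν^2); the distributions, water depth, and pipe dimensions are invented for illustration and are not values from the paper.

    ```python
    # Illustrative Monte Carlo collapse-reliability sketch. A real FORM/SORM
    # analysis and collapse model would be more elaborate; every number here
    # is an assumption for demonstration only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    E, nu = 207e9, 0.3                          # steel elastic constants
    D = rng.normal(0.6096, 0.002, n)            # outer diameter (m), ~24 in
    t = rng.normal(0.028, 0.001, n)             # wall thickness (m)
    p_ext = 1025 * 9.81 * 3500                  # hydrostatic pressure, ~3500 m depth

    p_collapse = 2 * E / (1 - nu**2) * (t / D) ** 3   # elastic collapse pressure
    pf = np.mean(p_collapse < p_ext)                  # probability of failure
    print(f"estimated collapse probability: {pf:.2e}")
    ```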

  19. 77 FR 65508 - Annual Charge Filing Procedures for Natural Gas Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-29

    ...] Annual Charge Filing Procedures for Natural Gas Pipelines AGENCY: Federal Energy Regulatory Commission... FERC) is proposing to amend its regulations to revise the filing requirements for natural gas pipelines...) clause. Currently, natural gas pipelines utilizing an ACA clause must make a tariff filing to reflect a...

  20. 40 CFR 761.250 - Sample site selection for pipeline section abandonment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample site selection for pipeline... Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.250 Sample site selection for pipeline section abandonment. This procedure...

  1. 40 CFR 761.250 - Sample site selection for pipeline section abandonment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sample site selection for pipeline... Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.250 Sample site selection for pipeline section abandonment. This procedure...

  2. 18 CFR 284.142 - Sales by intrastate pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Sales by intrastate... AUTHORITIES CERTAIN SALES AND TRANSPORTATION OF NATURAL GAS UNDER THE NATURAL GAS POLICY ACT OF 1978 AND RELATED AUTHORITIES Certain Sales by Intrastate Pipelines § 284.142 Sales by intrastate pipelines. Any...

  3. 18 CFR 284.142 - Sales by intrastate pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Sales by intrastate... AUTHORITIES CERTAIN SALES AND TRANSPORTATION OF NATURAL GAS UNDER THE NATURAL GAS POLICY ACT OF 1978 AND RELATED AUTHORITIES Certain Sales by Intrastate Pipelines § 284.142 Sales by intrastate pipelines. Any...

  4. Improved, Low-Stress Economical Submerged Pipeline

    NASA Technical Reports Server (NTRS)

    Jones, Jack A.; Chao, Yi

    2011-01-01

    A preliminary study has shown that the use of a high-strength composite fiber cloth material may greatly reduce the fabrication and deployment costs of a subsea offshore pipeline. The problem is to develop an inexpensive submerged pipeline that can safely and economically transport large quantities of fresh water, oil, and natural gas underwater over long distances. Above-water pipelines are often not feasible due to safety, cost, and environmental problems, and present fixed-wall submerged pipelines are often very expensive. The solution is a submerged, compliant-walled tube that, when filled, is lighter than the surrounding medium. Examples include compliant tubes for transporting fresh water under the ocean, crude oil underneath salt or fresh water, and high-pressure natural gas from offshore to onshore. In each case, the fluid transported is lighter than the surrounding fluid, so the flexible tube tends to float. The tube should be ballasted to the ocean floor to limit its motion in the horizontal and vertical directions, and placed below 100-m depth to minimize biofouling and turbulence from surface storms. The tube may have periodic pumps to maintain flow without over-pressurizing, or a single pump at the beginning, and it may have periodic valves that allow sections of the tube to be repaired or maintained. Tube materials particularly suited to these applications include non-porous composites made of high-performance fibers such as Kevlar, Spectra, PBO, aramid, carbon fibers, or high-strength glass. Above-ground pipes for transporting water, oil, and natural gas have typically been fabricated from fiber-reinforced plastic or from more costly high-strength steel, and previously suggested subsea pipeline designs have included only heavy fixed-wall pipes that can be very expensive initially, and can be difficult and expensive ...
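
    The buoyancy argument (transported fluid lighter than the surrounding fluid, so the tube floats and must be ballasted) is easy to quantify. The back-of-envelope sketch below uses assumed densities and an assumed tube diameter, not figures from the study.

    ```python
    # Net buoyant lift per metre of a freshwater-filled tube in seawater.
    # Densities and diameter are assumptions for illustration.
    import math

    RHO_SEA, RHO_FRESH, G = 1025.0, 1000.0, 9.81   # kg/m^3, kg/m^3, m/s^2
    d = 2.0                                         # assumed tube diameter (m)
    area = math.pi * d ** 2 / 4                     # cross-sectional area (m^2)

    lift_per_m = (RHO_SEA - RHO_FRESH) * G * area   # N per metre of tube
    print(f"net upward force to ballast: {lift_per_m:.0f} N/m")  # ~770 N/m
    ```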

  5. An Autonomous Data Reduction Pipeline for Wide Angle EO Systems

    NASA Astrophysics Data System (ADS)

    Privett, G.; George, S.; Feline, W.; Ash, A.; Routledge, G.

    The UK's National Space and Security Policy states that identifying potential on-orbit collisions and providing re-entry warning over the UK are of high importance, and this is driving requirements for indigenous Space Situational Awareness (SSA) systems. To meet these requirements, options are being examined, including the creation of a distributed network of simple, low-cost, commercial off-the-shelf electro-optical sensors to support survey work and catalogue maintenance. This paper outlines work at Dstl examining whether data obtained using readily deployable equipment could significantly enhance UK SSA capability and support cross-cueing between multiple deployed systems. To exploit data from this distributed sensor architecture effectively, a data-handling system is required that autonomously detects satellite trails in a manner that pragmatically handles highly variable target intensities, periodicities, and rates of apparent motion. The processing and collection strategies must be tailored to specific mission sets to ensure effective detection of platforms as diverse as stable geostationary satellites and low-altitude CubeSats. Data captured during the Automated Transfer Vehicle-5 (ATV-5) de-orbit trial, together with images of a rocket-body break-up and a deployed de-orbit sail, have been used to inform the development of a prototype processing pipeline for autonomous on-site processing. The approach employs tools from the astronomical community, such as Astrometry.Net and DAOPHOT, together with image processing and orbit determination software developed in-house by Dstl. Interim results from the automated analysis of data collected from wide-angle sensors are described, together with the currently perceived limitations of the proposed system and plans for future development.
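
    One generic flavour of the trail-detection step is sketched below, using a Canny edge map and a probabilistic Hough transform from scikit-image to pick out line segments; this is a hedged illustration with arbitrary thresholds, not Dstl's in-house code.

    ```python
    # Hedged sketch: find candidate satellite-trail segments in a frame.
    # Thresholds and minimum length are arbitrary illustrative choices.
    from skimage.feature import canny
    from skimage.transform import probabilistic_hough_line

    def find_trails(image, min_len=50):
        """Return candidate trail segments as ((x0, y0), (x1, y1)) pairs."""
        edges = canny(image / image.max(), sigma=2.0)   # normalise, edge-detect
        return probabilistic_hough_line(
            edges, threshold=10, line_length=min_len, line_gap=5
        )
    ```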

  6. The X-shooter pipeline

    NASA Astrophysics Data System (ADS)

    Modigliani, Andrea; Goldoni, Paolo; Royer, Frédéric; Haigron, Regis; Guglielmi, Laurent; François, Patrick; Horrobin, Matthew; Bristow, Paul; Vernet, Joel; Moehler, Sabine; Kerber, Florian; Ballester, Pascal; Mason, Elena; Christensen, Lise

    2010-07-01

    The X-shooter data reduction pipeline, as part of the ESO-VLT Data Flow System, provides recipes for Paranal Science Operations and for Data Product and Quality Control Operations at the Garching headquarters. At Paranal, it is used for quick-look data evaluation. The pipeline recipes can be executed either with EsoRex at the command line or through the Gasgano graphical user interface, and are implemented with the ESO Common Pipeline Library (CPL). X-shooter is the first of the second generation of VLT instruments. It makes it possible to collect, in one shot, the full spectrum of a target from 300 to 2500 nm, subdivided into three arms optimised for the UVB, VIS, and NIR ranges, with an efficiency between 15% and 35% (including the telescope and the atmosphere) and a spectral resolution varying between 3000 and 17,000. It allows observations in stare and offset modes, using the slit or an IFU, and observing sequences that nod the target along the slit. Data reduction can be performed either with a classical approach, determining the spectral format via 2D polynomial transformations, or with the help of a dedicated physical model of the instrument, which gives insight into the instrument and allows a constrained solution that depends on a few physically meaningful parameters. In this paper we describe the data reduction steps necessary to fully reduce science observations in the different modes, with examples of typical calibration and observation sequences.

  7. 30 CFR 250.1751 - How do I decommission a pipeline in place?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to be decommissioned; and (4) Length (feet) of segment remaining. (b) Pig the pipeline, unless the Regional Supervisor determines that pigging is not practical; (c) Flush the pipeline; (d) Fill the pipeline...

  8. 30 CFR 250.1751 - How do I decommission a pipeline in place?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to be decommissioned; and (4) Length (feet) of segment remaining. (b) Pig the pipeline, unless the Regional Supervisor determines that pigging is not practical; (c) Flush the pipeline; (d) Fill the pipeline...

  9. 30 CFR 250.1751 - How do I decommission a pipeline in place?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to be decommissioned; and (4) Length (feet) of segment remaining. (b) Pig the pipeline, unless the Regional Supervisor determines that pigging is not practical; (c) Flush the pipeline; (d) Fill the pipeline...

  10. 30 CFR 250.1751 - How do I decommission a pipeline in place?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to be decommissioned; and (4) Length (feet) of segment remaining. (b) Pig the pipeline, unless the Regional Supervisor determines that pigging is not practical; (c) Flush the pipeline; (d) Fill the pipeline...

  11. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...

  12. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...

  13. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...

  14. 75 FR 80300 - Five-Year Review of Oil Pipeline Pricing Index

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    .... On September 24, 2010, the U.S. Department of Transportation, Pipeline and Hazardous Materials Safety... pipeline cost changes for the 2004-2009 period: [12] AOPL states that Dr. Shehadeh began his analysis using... typical pipeline operator. Valero states that Mr. O'Loughlin's analysis applied an objective filter which...

  15. 78 FR 39720 - Atmos Pipeline and Storage, LLC; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... and Storage, LLC; Notice of Application Take notice that on June 14, 2013, Atmos Pipeline and Storage... authorizing the construction and operation of the Fort Necessity Gas Storage Project (Project) and associated...) [2]. [1] Atmos Pipeline and Storage, LLC, 127 FERC ¶ 61,260 (2009). [2] Atmos Pipeline and Storage...

  16. Prokaryotic Contig Annotation Pipeline Server: Web Application for a Prokaryotic Genome Annotation Pipeline Based on the Shiny App Package.

    PubMed

    Park, Byeonghyeok; Baek, Min-Jeong; Min, Byoungnam; Choi, In-Geol

    2017-09-01

    Genome annotation is a primary step in genomic research. To establish a lightweight, portable prokaryotic genome annotation pipeline for use in individual laboratories, we developed a Shiny app package designated "P-CAPS" (Prokaryotic Contig Annotation Pipeline Server). The package is composed of R and Python scripts that integrate publicly available annotation programs into a server application. P-CAPS is not only a browser-based interactive application but also a distributable Shiny app package that can be installed on any personal computer. The final annotation is provided in various standard formats and is summarized in an R Markdown document. Annotations can be visualized and examined with a public genome browser. A benchmark test showed that the annotation quality and completeness of P-CAPS are reliable and comparable to those of currently available public pipelines.

  17. New gas-pipeline system planned for Argentina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrich, V.

    1979-03-01

    A new gas-pipeline system planned for Argentina by Gas del Estado will carry up to 10 million cu m/day to Mendoza, San Juan, and San Luis from the Neuquen basin. Operating on a toll system of transport payment, the Centro-Oeste pipeline system will consist of 1100 km of over 30 in. dia trunk line and 600 km of 8, 12, and 18 in. dia lateral lines.

  18. 30 CFR 250.1751 - How do I decommission a pipeline in place?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Length (feet) of segment remaining. (b) Pig the pipeline, unless the Regional Supervisor determines that pigging is not practical; (c) Flush the pipeline; (d) Fill the pipeline with seawater; (e) Cut and plug...

  19. 76 FR 45332 - Pipeline and Hazardous Materials Safety Administration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Office of... Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: List of Applications for Modification of..., 2011. ADDRESSES: Record Center, Pipeline and Hazardous Materials Safety Administration, U.S. Department...

  20. Development of a design methodology for pipelines in ice scoured seabeds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, J.I.; Paulin, M.J.; Lach, P.R.

    1994-12-31

    Large areas of the continental shelves of northern oceans are frequently scoured or gouged by moving bodies of ice, such as icebergs and the sea-ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by scouring ice. It is generally acknowledged that if a pipeline laid on the seabed were hit by an iceberg or a pressure-ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important, for both economic and environmental reasons, to determine with some assurance the minimum burial depth required for safety. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice-scoured soil. Forces transmitted through the soil, and soil displacement around the pipeline, could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology based on phenomenological studies of ice-scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice-scoured terrain, are also discussed.