NASA Astrophysics Data System (ADS)
Duan, Yanzhi
2017-01-01
The gas pipeline networks in the Sichuan and Chongqing (Sichuan-Chongqing) region have formed a fully fledged gas pipeline transportation system in China, which supports and promotes the rapid development of the region's gas market. As the market-oriented economy develops further, it is necessary to deepen pipeline system reform in the areas of investment and financing, operation, and pricing, in order to lay a solid foundation for improving future gas production and marketing capability, to adapt to the national gas system reform, and to achieve the objectives of pipeline construction with multiparty participation, improved pipeline transportation efficiency, and fair and rational pipeline transportation prices. This article addresses the main thinking on reform in these three areas and the major deployments, and recommends corresponding measures: developing a shared pipeline economy, providing financial support for pipeline construction, setting up an independent regulatory agency to strengthen industrial supervision of gas pipeline transportation, and promoting the construction of a regional gas trade market.
About U.S. Natural Gas Pipelines
2007-01-01
This information product provides the interested reader with a broad and non-technical overview of how the U.S. natural gas pipeline network operates, along with some insights into the many individual pipeline systems that make up the network. While the focus of the presentation is the transportation of natural gas over the interstate and intrastate pipeline systems, information on subjects related to pipeline development, such as system design and pipeline expansion, are also included.
Pipeline repair development in support of the Oman to India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abadie, W.; Carlson, W.
1995-12-01
This paper provides a summary of development which has been conducted to date for the ultra deep, diverless pipeline repair system for the proposed Oman to India Gas Pipeline. The work has addressed critical development areas involving testing and/or prototype development of tools and procedures required to perform a diverless pipeline repair in water depths of up to 3,525 m.
Development of Protective Coatings for Co-Sequestration Processes and Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwagen, Gordon; Huang, Yaping
2011-11-30
The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed new coating systems to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted on pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research aimed to ensure safety and reliability for pipelines transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for an internal pipeline coating to transport SCCO2.
77 FR 74275 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and... control room. Affected Public: Operators of both natural gas and hazardous liquid pipeline systems. Annual...
Drive Control System for Pipeline Crawl Robot Based on CAN Bus
NASA Astrophysics Data System (ADS)
Chen, H. J.; Gao, B. T.; Zhang, X. H.; Deng, Z. Q.
2006-10-01
The drive control system plays an important role in a pipeline robot. In order to inspect flaws and corrosion in seabed crude oil pipelines, an original mobile pipeline robot was developed, comprising a crawler drive unit, a power and monitoring unit, a central control unit, and an ultrasonic inspection device. A CAN bus connects these function units and provides a reliable information channel. Given the limited space, a compact hardware system was designed around an ARM processor with two CAN controllers. With a made-to-order CAN protocol for the crawl robot, an intelligent drive control system was developed. The implementation of the crawl robot demonstrates that the presented drive control scheme can meet the motion control requirements of an underwater pipeline crawl robot.
Development of the updated system of city underground pipelines based on Visual Studio
NASA Astrophysics Data System (ADS)
Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong
2009-10-01
Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the base database for storing data, ArcGIS SDE 9.1 as the spatial data engine, and Visual Studio visual development tools for the management software. Because the system's pipeline update function suffered from slow updates and occasional data loss, and to ensure that the underground pipeline data can be updated conveniently and frequently in real time while preserving its currency and integrity, we developed and added a new update module to the system. The module provides powerful data update functions, including data input, data output, and rapid updating of large data volumes. The new module was developed with Visual Studio and uses Access as the base database for storing data. Graphics can be edited in AutoCAD, and the database is updated through a link between the graphics and the system. Practice shows that the update module is well compatible with the original system, reliable, and efficient at updating the database.
Development of a robotic system of nonstripping pipeline repair by reinforced polymeric compositions
NASA Astrophysics Data System (ADS)
Rybalkin, L. A.
2018-03-01
The article considers the possibility of creating a robotic system for pipeline repair. The repair is performed by forming an inner layer of special polyurethane compositions reinforced with short glass fiber strands. This approach makes it possible to repair pipelines without excavation work or pipe replacement.
Guidelines for riser splash zone design and repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-02-01
Many years of offshore oil and gas development have established the subsea pipeline as a reliable and cost-effective means of transporting produced hydrocarbons. With future oil and gas exploration, subsea pipeline systems will continue to move into deeper water and more remote locations. The integrity of subsea pipeline and riser systems throughout their operating lifetime is an important area for operators to consider in maximizing reliability and serviceability for economic, contractual, and environmental reasons. Adequate design and installation are the basis for ensuring the integrity of any subsea pipeline and riser system. In the event of system damage, from any source, quick and accurate repair and reinstatement of the pipeline system is essential. This report has been developed to provide guidelines for riser and splash zone design, to perform a detailed overview of existing riser repair techniques and products, and to prepare comprehensive guidelines identifying the capabilities and limits of riser reinstatement systems.
A graph-based approach for designing extensible pipelines
2012-01-01
Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems as a pipeline expands, because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities requiring different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where multiple software tools are needed to perform comprehensive analyses, such as gene expression and proteomics analyses.
The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
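The edge-as-component idea described above (formats as nodes, tools as edges, pipelines as paths) can be sketched in a few lines of Python. The formats and tool names below are invented stand-ins, not the actual converters from the paper:

```python
from collections import deque

# Hypothetical conversion graph: file formats are nodes, conversion
# tools are edges, and a pipeline is any path through the graph.
EDGES = {
    ("vcf", "ped"): "vcf2ped",
    ("ped", "bed"): "ped2bed",
    ("vcf", "csv"): "vcf2csv",
}

def compose_pipeline(source, target):
    """Breadth-first search: return the shortest tool chain converting
    `source` into `target`, or None if no pipeline exists."""
    graph = {}
    for (src, dst), tool in EDGES.items():
        graph.setdefault(src, []).append((dst, tool))
    queue, seen = deque([(source, [])]), {source}
    while queue:
        fmt, tools = queue.popleft()
        if fmt == target:
            return tools
        for nxt, tool in graph.get(fmt, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, tools + [tool]))
    return None

print(compose_pipeline("vcf", "bed"))  # ['vcf2ped', 'ped2bed']
```

Adding a new tool is then a matter of registering one more edge; every pipeline that can use it is discovered automatically, which is the extensibility property the abstract emphasizes.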
PipelineDog: a simple and flexible graphic pipeline construction and maintenance tool.
Zhou, Anbo; Zhang, Yeting; Sun, Yazhou; Xing, Jinchuan
2018-05-01
Analysis pipelines are an essential part of bioinformatics research, and ad hoc pipelines are frequently created by researchers for prototyping and proof-of-concept purposes. However, most existing pipeline management systems or workflow engines are too complex for rapid prototyping or for learning the pipeline concept. A lightweight, user-friendly and flexible solution is thus desirable. In this study, we developed a new pipeline construction and maintenance tool, PipelineDog. This is a web-based integrated development environment with a modern web graphical user interface. It offers cross-platform compatibility, project management capabilities, code formatting and error checking functions, and an online repository. It uses an easy-to-read/write script system that encourages code reuse. With the online repository, it also encourages the sharing of pipelines, which enhances analysis reproducibility and accountability. For most users, PipelineDog requires no software installation. Overall, this web application provides a way to rapidly create and easily manage pipelines. The PipelineDog web app is freely available at http://web.pipeline.dog. The command line version is available at http://www.npmjs.com/package/pipelinedog and the online repository at http://repo.pipeline.dog. ysun@kean.edu or xing@biology.rutgers.edu or ysun@diagnoa.com. Supplementary data are available at Bioinformatics online.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
Pipelining in a changing competitive environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.G.; Wishart, D.M.
1996-12-31
The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations, and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting-edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long-term, cost-effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance of incidents of pipeline failure. A creative systems approach to these challenges is often the key to a project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, international lending agencies, and financial institutions.
The SCUBA Data Reduction Pipeline: ORAC-DR at the JCMT
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
The ORAC data reduction pipeline, developed for UKIRT, has been designed to be a completely general approach to writing data reduction pipelines. This generality has enabled the JCMT to adapt the system for use with SCUBA with minimal development time using the existing SCUBA data reduction algorithms (Surf).
NASA Astrophysics Data System (ADS)
Wen, Shipeng; Xu, Jishang; Hu, Guanghai; Dong, Ping; Shen, Hong
2015-08-01
The safety of submarine pipelines is largely influenced by free spans and corrosion. Previous studies on free spans caused by seabed scour are mainly based on a stable environment, where the background seabed scour is in equilibrium and the soil is homogeneous. To study the effects of background erosion on the free span development of subsea pipelines, a submarine pipeline located at the abandoned Yellow River subaqueous delta lobe was investigated with an integrated surveying system comprising a multibeam bathymetric system, a dual-frequency side-scan sonar, a high-resolution sub-bottom profiler, and a Magnetic Flux Leakage (MFL) sensor. We found that seabed homogeneity has a great influence on the free span development of the pipeline. More specifically, for homogeneous background scour, the morphology of the scour hole below the pipeline is quite similar to that without background scour, whereas for inhomogeneous background scour, the nature of spanning depends mainly on the evolution of the seabed morphology near the pipeline. MFL detection results also reveal a possible connection between long free spans and accelerated corrosion of the pipeline.
NASA Astrophysics Data System (ADS)
Pérez-López, F.; Vallejo, J. C.; Martínez, S.; Ortiz, I.; Macfarlane, A.; Osuna, P.; Gill, R.; Casale, M.
2015-09-01
BepiColombo is an interdisciplinary ESA mission to explore the planet Mercury in cooperation with JAXA. The mission consists of two separate orbiters: ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (MMO), which are dedicated to the detailed study of the planet and its magnetosphere. The MPO scientific payload comprises eleven instrument packages covering different disciplines, developed by several European teams. This paper describes the design and development approach of the framework required to support the operation of the distributed BepiColombo MPO instrument pipelines, developed and operated from different locations but designed as a single entity. An architecture based on a primary-redundant configuration, fully integrated into the BepiColombo Science Operations Control System (BSCS), has been selected: some instrument pipelines will be operated from the instrument teams' data processing centres, with a pipeline replica that can be run from the Science Ground Segment (SGS), while others will be executed as primary pipelines from the SGS, with the SGS taking the pipeline orchestration role.
Freight pipelines: Current status and anticipated future use
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-07-01
This report is issued by the Task Committee on Freight Pipelines, Pipeline Division, ASCE. Freight pipelines of various types (including slurry pipelines, pneumatic pipelines, and capsule pipelines) have been used throughout the world for over a century for transporting solids and sometimes even packaged products. Recent advancements in pipeline technology, aided by advanced computer control systems and trenchless technologies, have greatly facilitated the transportation of solids by pipelines. Today, in many situations, freight pipelines are not only the most economical and practical means for transporting solids, they are also the most reliable, safest, and most environmentally friendly transportation mode. Increased use of underground pipelines to transport freight is anticipated in the future, especially as the technology continues to improve and surface transportation modes such as highways become more congested. This paper describes the state of the art and expected future uses of various types of freight pipelines. Obstacles hindering the development and use of the most advanced freight pipeline systems, such as the pneumatic capsule pipeline for interstate transport of freight, are discussed.
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Bad Actors Criticality Assessment for Pipeline system
NASA Astrophysics Data System (ADS)
Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee
2015-04-01
Failure of a pipeline system can bring huge economic loss. To mitigate such catastrophic loss, it is necessary to evaluate and rank the impact of each bad actor in the pipeline system. In this study, bad actors are the root causes, or any potential factors, leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute strongly to pipeline system failure, with 48.0%, 12.4%, and 6.0%, respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results can also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
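As a rough illustration of how Birnbaum's measure ranks bad actors, the sketch below computes it for a simple OR-gate fault tree, in which the pipeline fails if any independent bad actor occurs. The occurrence probabilities are invented for the example and are not the study's data:

```python
def top_event_prob(q):
    """P(pipeline failure) for an OR gate over independent bad actors:
    the top event occurs unless every actor fails to occur."""
    p_no_failure = 1.0
    for qi in q.values():
        p_no_failure *= (1.0 - qi)
    return 1.0 - p_no_failure

def birnbaum(q, actor):
    """Birnbaum importance: sensitivity of the top-event probability to
    the actor, i.e. P(top | actor occurs) - P(top | actor does not)."""
    return (top_event_prob({**q, actor: 1.0})
            - top_event_prob({**q, actor: 0.0}))

# Illustrative (made-up) occurrence probabilities per bad actor:
q = {"internal corrosion": 0.20,
     "external corrosion": 0.05,
     "construction damage": 0.02}

ranking = sorted(q, key=lambda a: birnbaum(q, a), reverse=True)
print(ranking)  # internal corrosion ranks first
```

For an OR gate the measure reduces to the probability that no other actor occurs, so the most probable actor also carries the highest improvement leverage, matching the study's conclusion about internal corrosion.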
Expansion of the U.S. Natural Gas Pipeline Network
2009-01-01
Additions in 2008 and Projects through 2011. This report examines new natural gas pipeline capacity added to the U.S. natural gas pipeline system during 2008. In addition, it discusses and analyzes proposed natural gas pipeline projects that may be developed between 2009 and 2011, and the market factors supporting these initiatives.
Design Optimization of Innovative High-Level Waste Pipeline Unplugging Technologies - 13341
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pribanic, T.; Awwad, A.; Varona, J.
2013-07-01
Florida International University (FIU) is currently working on the development and optimization of two innovative pipeline unplugging methods: the asynchronous pulsing system (APS) and the peristaltic crawler system (PCS). Experiments were conducted on the APS to determine how air in the pipeline influences the system's performance and to determine the effectiveness of air mitigation techniques in a pipeline. The results obtained during the experimental phase of the project, including data from pipeline pressure pulse tests and air bubble compression tests, are presented. Single-cycle pulse amplification caused by a fast-acting cylinder piston pump in 21.8, 30.5, and 43.6 m pipelines was evaluated. Experiments were conducted on fully flooded pipelines as well as pipelines that contained various amounts of air, to evaluate the system's performance when air is present. Also presented are details of the improvements implemented in the third-generation crawler system (PCS). The improvements include a redesign of the rims of the unit to accommodate a camera system that provides visual feedback on the conditions inside the pipeline. Visual feedback allows the crawler to be used as a pipeline unplugging and inspection tool. Tests conducted previously demonstrated a significant reduction of the crawler speed with increasing tether length. Current improvements include positioning a pneumatic valve manifold system in close proximity to the crawler, rendering crawler speed independent of tether length. Additional improvements to increase the crawler's speed were also investigated and are presented. Descriptions of the test beds, which were designed to emulate possible scenarios present in Department of Energy (DOE) pipelines, are presented. Finally, conclusions and recommendations for the systems are provided.
NASA Astrophysics Data System (ADS)
Artana, K. B.; Pitana, T.; Dinariyana, D. P.; Ariana, M.; Kristianto, D.; Pratiwi, E.
2018-06-01
The aim of this research is to develop an algorithm and application that can perform real-time monitoring of the safe operation of offshore platforms and subsea gas pipelines, as well as determine the need for ship inspection, using data obtained from the automatic identification system (AIS). The research also focuses on integrating the shipping database, AIS data, and other sources to develop a prototype for a real-time monitoring system for offshore platforms and pipelines. A simple concept is used in the development of this prototype: an overlay map compares the coordinates of the offshore platform and subsea gas pipeline with the ship's coordinates (longitude/latitude) as detected by AIS. Using such information, we can then build an early warning system (EWS), relayed through short message service (SMS), email, or other means, that triggers when a ship enters the restricted and exclusion zones of platforms and pipelines. The ship inspection system is developed by combining several attributes. Decision analysis software is employed to prioritize four vessel attributes: ship age, ship type, classification, and flag state. Results show that the EWS can increase the safety level of offshore platforms and pipelines, as well as the efficiency of patrol boats in monitoring the safety of the facilities. Meanwhile, ship inspection enables the port to prioritize the ships to be inspected in accordance with the priority ranking inspection score.
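The overlay check at the core of such an EWS can be sketched minimally, assuming a simple circular exclusion zone around a platform; the coordinates and the 500 m radius below are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_exclusion_zone(ship, platform, radius_km=0.5):
    """True if an AIS-reported ship position lies inside the zone,
    i.e. the condition under which the EWS would raise an alert."""
    return haversine_km(ship[0], ship[1],
                        platform[0], platform[1]) <= radius_km

platform = (-5.5, 112.3)  # hypothetical platform position
print(in_exclusion_zone((-5.5005, 112.3005), platform))  # ship ~80 m away
```

In a full system this predicate would run per incoming AIS message, with the alert relayed by SMS or email as the abstract describes; a pipeline route would be checked as distance to a polyline rather than to a single point.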
The Hyper Suprime-Cam software pipeline
Bosch, James; Armstrong, Robert; Bickerton, Steven; ...
2017-10-12
In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demand for evaluation of fMRI processing pipelines and validation of fMRI analysis results is increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. To overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability, and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.
2015-02-01
Specific features relating to development of the information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the lifetime to reaching inadmissible wall thinning in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.
Digital Mapping of Buried Pipelines with a Dual Array System
DOT National Transportation Integrated Search
2003-06-06
The objective of this research is to develop a non-invasive system for detecting, mapping, and inspecting ferrous and plastic pipelines in place using technology that combines and interprets measurements from ground penetrating radar and electromagnetic ...
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system, tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes them on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines. PMID:18269742
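The GUI/Scheduler/Executor split described above can be sketched in a few lines. This is a hypothetical illustration of the design pattern, not the actual Cyrille2 code: names and classes are invented, and a real Executor would submit jobs to a compute cluster rather than run them in-process.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch of the Scheduler/Executor split described in the
# abstract; names and structure are illustrative, not the Cyrille2 API.

@dataclass
class Job:
    name: str
    func: Callable
    deps: list = field(default_factory=list)  # names of upstream jobs

class Scheduler:
    """Tracks what data has entered the system and which jobs are ready."""
    def __init__(self, jobs):
        self.jobs = {j.name: j for j in jobs}
        self.done = {}          # job name -> result

    def ready(self):
        return [j for j in self.jobs.values()
                if j.name not in self.done
                and all(d in self.done for d in j.deps)]

class Executor:
    """Polls the scheduler for runnable jobs and executes them
    (a real executor would submit these to a compute cluster)."""
    def run(self, sched, data):
        while True:
            batch = sched.ready()
            if not batch:
                break
            for job in batch:
                inputs = [sched.done[d] for d in job.deps] or [data]
                sched.done[job.name] = job.func(inputs)

# Toy three-step pipeline: preprocess -> analyze -> integrate
jobs = [
    Job("preprocess", lambda x: [v * 2 for v in x[0]]),
    Job("analyze", lambda x: sum(x[0]), deps=["preprocess"]),
    Job("integrate", lambda x: {"total": x[0]}, deps=["analyze"]),
]
sched = Scheduler(jobs)
Executor().run(sched, [1, 2, 3])
print(sched.done["integrate"])  # {'total': 12}
```

The dependency check in `ready()` is what lets interdependent analyses be chained into a pipeline without manual ordering.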
Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark
2016-07-05
There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area, and if they allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others.
These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
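The modular, repurposable design described above can be sketched as plain functions composed into a "turnkey" pipeline that also records provenance. This is an illustrative stand-in only: the real modules are Kepler workflows and R scripts, and the stubs below substitute for external tools such as an aligner and MACS.

```python
# Minimal sketch of modules composed into a turnkey pipeline with
# provenance tracking. Module bodies are stubs standing in for the
# real steps (read mapping, MACS peak calling, TSS plots, ...).

provenance = []  # ordered record of (module name, parameters)

def module(name):
    def wrap(func):
        def inner(data, **params):
            provenance.append((name, params))
            return func(data, **params)
        return inner
    return wrap

@module("map_reads")
def map_reads(reads, genome="hg19"):
    return [(r, genome) for r in reads]           # stub for an aligner

@module("call_peaks")
def call_peaks(alignments, qvalue=0.05):
    return [a for a in alignments if qvalue < 1]  # stub for MACS

def turnkey(reads):
    """A full pipeline is just an ordered composition of modules."""
    return call_peaks(map_reads(reads), qvalue=0.01)

peaks = turnkey(["read1", "read2"])
print(provenance)
```

Because each module is independently callable, the same code blocks can be run standalone or repurposed into new pipelines, mirroring the turnkey-plus-components design above.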
NASA Astrophysics Data System (ADS)
Lan, G.; Jiang, J.; Li, D. D.; Yi, W. S.; Zhao, Z.; Nie, L. N.
2013-12-01
The calculation of water-hammer pressure for a single-phase liquid is already well established for pipelines of uniform characteristics, but less research has addressed water-hammer pressures in complex pipelines carrying slurry flows with solid particles. In this paper, based on developments in slurry pipelines at home and abroad, the fundamental principles and methods of numerical simulation of transient processes are presented, and several boundary conditions are given. A model for calculating the water-hammer interaction of the solid and fluid phases is established for a practical long-distance slurry transportation pipeline system. After numerical simulation of the transient process and analysis and comparison of the results, effective protection measures and operating advice are recommended, which provide guidance for the design and operating management of practical long-distance slurry pipeline transportation systems.
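The transient simulations described above build on the classic method of characteristics (MOC). Below is a hedged sketch of the single-phase MOC interior-point update; a slurry model adds solid-phase corrections (mixture density, modified wavespeed) that are omitted here, and all numbers are illustrative.

```python
import numpy as np

# Single-phase method-of-characteristics (MOC) interior-point update
# for water hammer. Slurry-specific terms are intentionally omitted.

g = 9.81      # gravitational acceleration, m/s^2
a = 1000.0    # pressure-wave speed, m/s
D = 0.5       # pipe diameter, m
f = 0.0       # Darcy friction factor (0 = frictionless, for clarity)

def moc_step(H, V, dt):
    """Advance head H (m) and velocity V (m/s) one step at interior nodes."""
    Hn, Vn = H.copy(), V.copy()
    R = f * dt * a / (2 * g * D)   # friction term coefficient
    for i in range(1, len(H) - 1):
        Cp = H[i-1] + (a/g)*V[i-1] - R*V[i-1]*abs(V[i-1])  # C+ characteristic
        Cm = H[i+1] - (a/g)*V[i+1] + R*V[i+1]*abs(V[i+1])  # C- characteristic
        Hn[i] = 0.5 * (Cp + Cm)
        Vn[i] = (g / (2*a)) * (Cp - Cm)
    return Hn, Vn

# Sanity check: uniform frictionless steady flow is preserved exactly.
H = np.full(11, 50.0)   # head, m
V = np.full(11, 2.0)    # velocity, m/s
Hn, Vn = moc_step(H, V, dt=0.01)
```

Boundary conditions (valves, pumps, reservoirs) replace the update at the first and last nodes, which is where the protection measures discussed above enter the model.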
Nayor, Jennifer; Borges, Lawrence F; Goryachev, Sergey; Gainer, Vivian S; Saltzman, John R
2018-07-01
The adenoma detection rate (ADR) is a widely used colonoscopy quality indicator. Calculation of ADR is labor-intensive and cumbersome using current electronic medical databases. Natural language processing (NLP) is a method used to extract meaning from unstructured or free-text data. Our aim was to develop and validate an accurate automated process for calculation of ADR and serrated polyp detection rate (SDR) on data stored in widely used electronic health record systems, specifically the Epic electronic health record system, the Provation® endoscopy reporting system, and the Sunquest PowerPath pathology reporting system. Screening colonoscopies performed between June 2010 and August 2015 were identified using the Provation® reporting tool. An NLP pipeline was developed to identify adenomas and sessile serrated polyps (SSPs) on pathology reports corresponding to these colonoscopy reports. The pipeline was validated against a manual search, and its precision, recall, and effectiveness were calculated. ADR and SDR were then calculated. We identified 8032 screening colonoscopies that were linked to 3821 pathology reports (47.6%). The NLP pipeline had an accuracy of 100% for adenomas and 100% for SSPs. Mean total ADR was 29.3% (range 14.7-53.3%); mean male ADR was 35.7% (range 19.7-62.9%); and mean female ADR was 24.9% (range 9.1-51.0%). Mean total SDR was 4.0% (range 0-9.6%). We developed and validated an NLP pipeline that accurately and automatically calculates ADRs and SDRs using data stored in Epic, Provation®, and Sunquest PowerPath. This NLP pipeline can be used to evaluate colonoscopy quality parameters at both individual and practice levels.
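The two stages described above — flag pathology reports containing adenomas or SSPs, then compute detection rates over screening exams — can be sketched with simple pattern rules. This is a toy illustration: the patterns, example reports, and function names are invented, and a real pipeline must handle negation, report linkage, and many more phrasings.

```python
import re

# Toy sketch: (1) classify pathology reports with regex rules,
# (2) compute ADR/SDR as the fraction of screening colonoscopies
# with at least one qualifying finding. Reports are invented examples.

ADENOMA = re.compile(r"\btubular adenoma|tubulovillous adenoma|adenomatous\b", re.I)
SSP = re.compile(r"\bsessile serrated (polyp|adenoma|lesion)\b", re.I)

def classify(report):
    return bool(ADENOMA.search(report)), bool(SSP.search(report))

def detection_rates(reports, n_screens):
    adr = sum(classify(r)[0] for r in reports) / n_screens
    sdr = sum(classify(r)[1] for r in reports) / n_screens
    return adr, sdr

reports = [
    "Polyp, ascending colon: tubular adenoma.",
    "Sigmoid polyp: sessile serrated polyp without dysplasia.",
    "Rectal polyp: hyperplastic polyp.",
]
adr, sdr = detection_rates(reports, n_screens=4)
print(adr, sdr)  # 0.25 0.25
```

The denominator is all screening colonoscopies, not just those with linked pathology, which is why the linkage rate reported above (47.6%) matters for the calculation.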
NASA Astrophysics Data System (ADS)
Dudin, S. M.; Novitskiy, D. V.
2018-05-01
The works of researchers at VNIIgaz, Giprovostokneft, Kuibyshev NIINP, the Grozny Petroleum Institute, and others are devoted to modeling heterogeneous medium flows in pipelines under laboratory conditions. Taken together, the empirical relationships obtained and the calculation procedures for pipelines transporting multiphase products form a bank of experimental data on the problem of pipeline transportation of multiphase systems. Based on an analysis of the published works, the main design requirements for experimental installations intended to study the flow regimes of gas-liquid flows in pipelines were formulated and taken into account by the authors when creating the experimental stand. The article describes the results of experimental studies of the flow regimes of a gas-liquid mixture in a pipeline and gives a methodological description of the experimental installation. The article also describes the software of the experimental scientific and educational stand developed with the participation of the authors.
Use of FBG sensors for health monitoring of pipelines
NASA Astrophysics Data System (ADS)
Felli, Ferdinando; Paolozzi, Antonio; Vendittozzi, Cristian; Paris, Claudio; Asanuma, Hiroshi
2016-04-01
The infrastructures for oil and gas production and distribution need reliable monitoring systems. The risks for pipelines, in particular, are not limited to natural disasters (landslides, earthquakes, extreme environmental conditions) and accidents, but also include damage related to criminal activities, such as oil theft. The existing monitoring systems are not adequate for detecting damage from oil theft, and on several occasions the illegal activities have resulted in leakage of oil and catastrophic environmental pollution. Systems based on fiber optic FBG (Fiber Bragg Grating) sensors present a number of advantages for pipeline monitoring. FBG sensors can withstand harsh environments, are immune to electromagnetic interference, and can be used to develop a smart system that monitors several physical quantities at the same time, such as strain, temperature, acceleration, pressure, and vibration. The monitoring station can be positioned tens of kilometers away from the measuring points, lowering the cost and complexity of the system. This paper describes tests on a sensor, based on FBG technology, developed specifically for detecting pipeline damage due to illegal activities (drilling of the pipes), which can be integrated into a smart monitoring chain.
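FBG strain sensing rests on the standard Bragg-wavelength relation, which can be worked through in a few lines. The coefficient values below are typical silica-fiber figures used for illustration, not the calibration of the sensor described in the paper.

```python
# Worked sketch of the standard FBG relation
#   dlambda / lambda0 = (1 - p_e) * strain + (alpha + xi) * dT
# with typical (illustrative) silica-fiber coefficients.

p_e = 0.22        # effective photo-elastic coefficient of silica
alpha = 0.55e-6   # thermal expansion coefficient, 1/degC
xi = 8.6e-6       # thermo-optic coefficient, 1/degC

def strain_from_shift(dlam_nm, lam0_nm=1550.0, dT=0.0):
    """Recover mechanical strain from a Bragg wavelength shift,
    after removing the temperature contribution."""
    rel = dlam_nm / lam0_nm
    return (rel - (alpha + xi) * dT) / (1 - p_e)

# A 1.2 nm shift at 1550 nm under isothermal conditions:
eps = strain_from_shift(1.2)
print(round(eps * 1e6))  # about 993 microstrain
```

The temperature term is why practical installations pair a strain-coupled grating with a strain-free reference grating, so the ΔT contribution can be subtracted before inverting for strain.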
Viability of using different types of main oil pipelines pump drives
NASA Astrophysics Data System (ADS)
Zakirzakov, A. G.; Zemenkov, Yu D.; Akulov, K. A.
2018-05-01
The choice of the pump drive for main oil pipelines is of great importance both in the design of new pipelines and in the modernization of existing ones. At the beginning of oil pipeline transport development, owing to the limited number and types of energy sources, the choice was not difficult. The combustion energy of the pumped product was often the only available energy resource for its transportation. In this regard, pipelines that had autonomous energy sources differed favorably from other energy consumers in the sector. Over time, with the development of the country's electricity supply system, the electric drive for the power equipment of oil pipelines became the dominant type of pumping station drive. Nowadays, tradition remains an essential factor in choosing the drive type: for many years, oil companies have used electric drives for pumps, while gas transport enterprises prefer self-contained gas turbines.
Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2012-06-01
Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
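One plausible way to combine two engines running in parallel, consistent with the high-precision / low-sensitivity trade-off reported above, is to accept a field value only when both engines agree and abstain otherwise. The sketch below simulates engine outputs; the real pipeline wraps the Nuance and LEADTOOLS engines, whose actual combination logic is not detailed here.

```python
# Agreement voting between two OCR engines: accept on agreement,
# abstain (defer to human chart abstraction) on disagreement or when
# either engine returns nothing. Engine outputs are simulated.

def combine(engine_a, engine_b):
    """Return the agreed value, or None to abstain."""
    if engine_a is not None and engine_a == engine_b:
        return engine_a
    return None

# (engine A output, engine B output) for three simulated form fields
fields = [("20/40", "20/40"), ("20/40", "20/10"), (None, "OD")]
accepted = [combine(a, b) for a, b in fields]
print(accepted)  # ['20/40', None, None]
```

Requiring agreement drives positive predictive value up (both engines must make the same mistake to emit a wrong value) at the cost of sensitivity, matching the 94.6% / 13.5% figures above.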
Capsule injection system for a hydraulic capsule pipelining system
Liu, Henry
1982-01-01
An injection system for injecting capsules into a hydraulic capsule pipelining system, the pipelining system comprising a pipeline adapted for flow of a carrier liquid therethrough, and capsules adapted to be transported through the pipeline by the carrier liquid flowing through the pipeline. The injection system comprises a reservoir of carrier liquid, the pipeline extending within the reservoir and extending downstream out of the reservoir, and a magazine in the reservoir for holding capsules in a series, one above another, for injection into the pipeline in the reservoir. The magazine has a lower end in communication with the pipeline in the reservoir for delivery of capsules from the magazine into the pipeline.
Using steady-state equations for transient flow calculation in natural gas pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddox, R.N.; Zhou, P.
1984-04-02
Maddox and Zhou have extended their technique for calculating the unsteady-state behavior of straight gas pipelines to complex pipeline systems and networks. After developing the steady-state flow rate and pressure profile for each pipe in the network, analysts can perform the transient-state analysis in the real-time step-wise manner described for this technique.
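The steady-state initialization this technique relies on can be sketched for the simplest case: for isothermal steady flow in a horizontal gas pipeline, the pressure squared varies linearly along the line, p(x)² = p1² − (p1² − p2²)·(x/L), and the transient analysis then marches step-wise in time from this profile. This is a textbook simplification, not the Maddox-Zhou formulation itself; pressures below are illustrative.

```python
# Steady-state pressure profile for isothermal flow in a horizontal
# gas pipeline: pressure squared is linear in normalized distance x/L.

def steady_pressure_profile(p1, p2, n):
    """Pressures at n+1 evenly spaced points from inlet (p1) to outlet (p2)."""
    return [(p1**2 - (p1**2 - p2**2) * (i / n)) ** 0.5 for i in range(n + 1)]

profile = steady_pressure_profile(p1=6000.0, p2=4000.0, n=4)  # kPa
print([round(p) for p in profile])  # [6000, 5568, 5099, 4583, 4000]
```

For a network, this profile is computed per pipe with node pressures matched at junctions, which is the starting condition for the step-wise transient analysis described above.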
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of the ANN, while "the number of monthly breaks" was used as the prediction output. This study is the first attempt to incorporate earthquake data in a break-event ANN prediction model. The spatial distribution of the pipeline break-event data was analyzed and visualized with GIS, so that users can swiftly identify leakage hotspots. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where break-event records are unavailable.
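The input/output structure described above can be sketched as a small network's forward pass. The weights here are arbitrary placeholders and the architecture is invented for illustration; the actual model is trained on the qualified break-event records.

```python
import numpy as np

# Illustrative forward pass of a small ANN mapping the three input
# factors (pipe diameter, pipe material code, number of magnitude-3(+)
# earthquakes) to a monthly break-count prediction. Weights are
# random placeholders, not a trained model.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 5))    # input -> hidden weights
W2 = rng.normal(size=5)         # hidden -> output weights

def predict_breaks(diameter_mm, material_code, quakes_m3plus):
    x = np.array([diameter_mm / 1000.0, material_code, quakes_m3plus])
    h = np.tanh(x @ W1)                # hidden-layer activations
    return float(max(h @ W2, 0.0))     # break counts cannot be negative

y = predict_breaks(diameter_mm=200, material_code=1, quakes_m3plus=3)
```

Feeding the per-segment predictions back into GIS layers is what turns the model output into the leakage-hotspot maps used for replacement prioritization.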
Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines
Tuck, Jeffrey; Lee, Pedro
2013-01-01
Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost-effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high-speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptibility of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system, an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated.
It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important to the accuracy of the inverse analysis procedure and can be used to differentiate the observed transient behaviour caused by changes in wall thickness from that caused by other known faults such as leaks. Further application of the method to real pipelines is discussed.
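The inverse-transient idea above — evolve candidate (wavespeed, diameter) pairs so a forward model's pressure trace matches a measured one — can be sketched with a minimal genetic algorithm. The forward model below is a toy damped oscillation standing in for the one-dimensional method-of-characteristics model used in the paper; all values are illustrative.

```python
import numpy as np

# Minimal genetic algorithm fitting (wavespeed a, diameter d) so a toy
# forward model matches a synthetic "measured" pressure trace.

rng = np.random.default_rng(1)
L = 37.0                              # pipe length, m (illustrative)
t = np.linspace(0.0, 1.0, 200)

def forward(a, d):
    # fundamental water-hammer period is 4L/a; amplitude scales with d
    return d * np.cos(2 * np.pi * a * t / (4 * L)) * np.exp(-2 * t)

measured = forward(1200.0, 0.10)      # synthetic "measurement"

def fitness(p):
    return -np.sum((forward(*p) - measured) ** 2)

# Keep the best half each generation, mutate it to refill the population.
pop = np.column_stack([rng.uniform(800, 1600, 40), rng.uniform(0.05, 0.2, 40)])
init_best = max(fitness(p) for p in pop)
for _ in range(60):
    order = np.argsort([fitness(p) for p in pop])[::-1]   # best first
    parents = pop[order][:20]
    children = parents + rng.normal(scale=[5.0, 0.002], size=parents.shape)
    pop = np.vstack([parents, children])
final_best = max(fitness(p) for p in pop)
```

Because the best candidates are always retained, the fit can only improve or stay level between generations; the paper's version evaluates the full MOC model at each fitness call, which dominates simulation run time.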
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spread-sheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
Modelling of non-equilibrium flow in the branched pipeline systems
NASA Astrophysics Data System (ADS)
Sumskoi, S. I.; Sverchkov, A. M.; Lisanov, M. V.; Egorov, A. F.
2016-09-01
This article presents a mathematical model and a numerical method for solving the water-hammer problem in a branched pipeline system. The problem is considered in a one-dimensional, non-stationary formulation taking into account such realities as changes in the diameter of the pipeline and its branches. By comparison with an existing analytic solution it has been shown that the proposed method possesses good accuracy. With the help of the developed model and numerical method, the problem of the propagation of a complex of compression waves in a branching pipeline system when several shut-off valves operate has been solved. It should be noted that the proposed model and method may easily be applied to a number of other problems, for example, to describe the flow of blood in vessels.
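The key ingredient that makes a branched network tractable in such a formulation is the junction boundary condition: all pipes meeting at a node share a single head Hp, and the flows balance. With the usual characteristic relations Q_i = (Cp_i − Hp)/B_i for an arriving pipe and Q_j = (Hp − Cm_j)/B_j for a leaving pipe, continuity gives Hp in closed form. The sketch below is a textbook version of this condition, not the paper's specific scheme; numbers are illustrative.

```python
# Junction boundary condition for branched water-hammer networks:
# solve continuity sum(Q_in) = sum(Q_out) for the common head Hp.

def junction_head(incoming, outgoing):
    """incoming: list of (Cp, B) per arriving pipe;
    outgoing: list of (Cm, B) per leaving pipe."""
    num = sum(c / b for c, b in incoming) + sum(c / b for c, b in outgoing)
    den = sum(1.0 / b for _, b in incoming) + sum(1.0 / b for _, b in outgoing)
    return num / den

# One supply pipe feeding two branches, all with impedance B = 100:
Hp = junction_head([(120.0, 100.0)], [(80.0, 100.0), (100.0, 100.0)])
print(Hp)  # 100.0
```

With Hp known, each pipe's flow follows from its own characteristic equation, so the branched network decomposes into independent single-pipe updates between junctions.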
A real-time coherent dedispersion pipeline for the giant metrewave radio telescope
NASA Astrophysics Data System (ADS)
De, Kishalay; Gupta, Yashwant
2016-02-01
A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with a NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
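The core operation of coherent dedispersion can be sketched briefly: interstellar dispersion applies a phase-only transfer function H(f) to the raw voltage signal, so multiplying the voltage spectrum by the conjugate "chirp" H*(f) removes the dispersion exactly. The chirp phase below follows the standard cold-plasma form; the band, DM, and signal are illustrative, not the GSB's actual configuration.

```python
import numpy as np

# Coherent dedispersion sketch: disperse a complex voltage stream with
# a unit-modulus chirp H(f), then recover it with the conjugate chirp.

DM = 30.0              # dispersion measure, pc cm^-3
D_CONST = 4.148808e3   # dispersion constant, MHz^2 s / (pc cm^-3)
f0 = 320.0             # band centre frequency, MHz
n = 4096

rng = np.random.default_rng(2)
voltage = rng.normal(size=n) + 1j * rng.normal(size=n)  # intrinsic signal

f = np.fft.fftfreq(n, d=1 / 32.0)   # frequency offsets for a 32 MHz band
phase = 2 * np.pi * D_CONST * 1e6 * DM * f**2 / (f0**2 * (f0 + f))
H = np.exp(1j * phase)              # unit-modulus dispersion chirp

dispersed = np.fft.ifft(np.fft.fft(voltage) * H)           # as received
recovered = np.fft.ifft(np.fft.fft(dispersed) * H.conj())  # dedispersed
```

Because H(f) has unit modulus, the correction is exact in principle, unlike incoherent dedispersion, which can only shift whole filterbank channels; the GPU in the real pipeline exists to keep these FFTs running faster than the data arrive.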
NASA Technical Reports Server (NTRS)
Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris
2016-01-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages each addressing separate challenge - workflow integration, parallel execution in either cloud or HPC environments and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD), where we are developing a new QA pipeline for the 25PB system.
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This is a short paper on the history and development of the Platte Pipe Line which stretches 1156 miles from Byron, Wyoming, to Wood River, Illinois. It discusses the development and significance of one of the most used crude oil pipelines in the United States. It also discusses its role in advanced pipeline control technology and the future of the system.
Gaps of Decision Support Models for Pipeline Renewal and Recommendations for Improvement
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less w...
GAPS OF DECISION SUPPORT MODELS FOR PIPELINE RENEWAL AND RECOMMENDATIONS FOR IMPROVEMENT (SLIDE)
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less wor...
CCDLAB: A Graphical User Interface FITS Image Data Reducer, Viewer, and Canadian UVIT Data Pipeline
NASA Astrophysics Data System (ADS)
Postma, Joseph E.; Leahy, Denis
2017-11-01
CCDLAB was originally developed as a FITS image data reducer and viewer, and development was then continued to provide ground support for the development of the UVIT detector system provided by the Canadian Space Agency to the Indian Space Research Organization’s ASTROSAT satellite and UVIT telescopes. After the launch of ASTROSAT and during UVIT’s first-light and PV phase starting in 2015 December, necessity required the development of a data pipeline to produce scientific images out of the Level 1 format data produced for UVIT by ISRO. Given the previous development of CCDLAB for UVIT ground support, the author provided a pipeline for the new Level 1 format data to be run through CCDLAB with the additional satellite-dependent reduction operations required to produce scientific data. Features of the pipeline are discussed with focus on the relevant data-reduction challenges intrinsic to UVIT data.
Bioinformatic pipelines in Python with Leaf
2013-01-01
Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce the pipeline formality on top of a dynamic development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user's Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
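The general idea — declaring a pipeline structure on top of ordinary Python functions — can be illustrated with a small decorator. This is a hypothetical approximation only: Leaf's actual LGL syntax and its features (persistence, consistency checks, hypertext export) are richer than this sketch, and the names below are invented.

```python
# Hypothetical sketch of pipeline formality layered over plain Python
# functions: a decorator records the dependency graph as the functions
# are defined, with no change to how they are called.

graph = {}   # node name -> list of dependency node names

def node(*deps):
    def wrap(func):
        graph[func.__name__] = [d.__name__ for d in deps]
        return func
    return wrap

@node()
def load():
    return [3, 1, 2]

@node(load)
def sort_data():
    return sorted(load())

@node(sort_data)
def report():
    return {"n": len(sort_data()), "max": sort_data()[-1]}

print(graph)     # {'load': [], 'sort_data': ['load'], 'report': ['sort_data']}
print(report())  # {'n': 3, 'max': 3}
```

Capturing the graph at definition time is what allows a tool to check consistency between steps, cache intermediate results, or publish the protocol, all without disturbing the incremental coding style the abstract describes.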
Improved satellite and geospatial tools for pipeline operator decision support systems.
DOT National Transportation Integrated Search
2017-01-06
Under Cooperative Agreement No. OASRTRS-14-H-CAL, California Polytechnic State University San Luis Obispo (Cal Poly), partnered with C-CORE, MDA, PRCI, and Electricore to design and develop improved satellite and geospatial tools for pipeline operato...
Deliverability on the interstate natural gas pipeline system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-05-01
Deliverability on the Interstate Natural Gas Pipeline System examines the capability of the national pipeline grid to transport natural gas to various US markets. The report quantifies the capacity levels and utilization rates of major interstate pipeline companies in 1996 and the changes since 1990, as well as changes in markets and end-use consumption patterns. It also discusses the effects of proposed capacity expansions on capacity levels. The report consists of five chapters, several appendices, and a glossary. Chapter 1 discusses some of the operational and regulatory features of the US interstate pipeline system and how they affect overall system design, system utilization, and capacity expansions. Chapter 2 looks at how the exploration, development, and production of natural gas within North America are linked to the national pipeline grid. Chapter 3 examines the capability of the interstate natural gas pipeline network to link production areas to market areas, on the basis of capacity and usage levels along 10 corridors. The chapter also examines capacity expansions that have occurred since 1990 along each corridor and the potential impact of proposed new capacity. Chapter 4 discusses the last step in the transportation chain, that is, deliverability to the ultimate end user. Flow patterns into and out of each market region are discussed, as well as the movement of natural gas between States in each region. Chapter 5 examines how shippers reserve interstate pipeline capacity in the current transportation marketplace and how pipeline companies are handling the secondary market for short-term unused capacity. Four appendices provide supporting data and additional detail on the methodology used to estimate capacity. 32 figs., 15 tabs.
Fuchs, Helmut; Aguilar-Pimentel, Juan Antonio; Amarie, Oana V; Becker, Lore; Calzada-Wack, Julia; Cho, Yi-Li; Garrett, Lillian; Hölter, Sabine M; Irmler, Martin; Kistler, Martin; Kraiger, Markus; Mayer-Kuckuk, Philipp; Moreth, Kristin; Rathkolb, Birgit; Rozman, Jan; da Silva Buttkus, Patricia; Treise, Irina; Zimprich, Annemarie; Gampe, Kristine; Hutterer, Christine; Stöger, Claudia; Leuchtenberger, Stefanie; Maier, Holger; Miller, Manuel; Scheideler, Angelika; Wu, Moya; Beckers, Johannes; Bekeredjian, Raffi; Brielmeier, Markus; Busch, Dirk H; Klingenspor, Martin; Klopstock, Thomas; Ollert, Markus; Schmidt-Weber, Carsten; Stöger, Tobias; Wolf, Eckhard; Wurst, Wolfgang; Yildirim, Ali Önder; Zimmer, Andreas; Gailus-Durner, Valérie; Hrabě de Angelis, Martin
2017-09-29
For decades, model organisms have provided an important approach for understanding the mechanistic basis of human diseases. The German Mouse Clinic (GMC) was the first phenotyping facility to establish a collaboration-based platform for phenotype characterization of mouse lines. In order to address individual projects with a tailor-made phenotyping strategy, the GMC has developed a series of pipelines with tests for the analysis of specific disease areas. For a general broad analysis, there is a screening pipeline that covers the key parameters for the most relevant disease areas. For hypothesis-driven phenotypic analyses, there are thirteen additional pipelines focusing on neurological and behavioral disorders, metabolic dysfunction, respiratory system malfunctions, immune-system disorders and imaging techniques. In this article, we give an overview of the pipelines and describe the scientific rationale behind the different test combinations. Copyright © 2017 Elsevier B.V. All rights reserved.
Simulation and Experiment Research on Fatigue Life of High Pressure Air Pipeline Joint
NASA Astrophysics Data System (ADS)
Shang, Jin; Xie, Jianghui; Yu, Jian; Zhang, Deman
2017-12-01
The high pressure air pipeline joint is an important part of a high pressure air system, and its reliability bears on the safety and stability of the system. This work developed a new type of high pressure air pipeline joint and carried out dynamic analyses of the CB316-1995 joint and the new joint using the finite element method, examining in depth how different joint designs and materials affect the stress, tightening torque and fatigue life of the joint. The research team set up a vibration/pulse test bench and carried out comparative joint fatigue life tests. The results show that the maximum stress of the joint is located on the inner side of the outer sleeve nut, which is consistent with the failure mode of cracking of the outer sleeve nut observed in practice. In both simulation and experiment, the fatigue life and tightening torque of the new high pressure air pipeline joint are better than those of the CB316-1995 joint.
Aerodynamics of electrically driven freight pipeline system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundgren, T.S.; Zhao, Y.
2000-06-01
This paper examines the aerodynamic characteristics of a freight pipeline system in which freight capsules are individually propelled by electric motors. The fundamental difference between this system and the more extensively studied pneumatic capsule pipeline is the different role played by aerodynamic forces. In a driven system the propelled capsules are resisted by aerodynamic forces and, in reaction, pump air through the tube. In contrast, in a pneumatically propelled system external blowers pump air through the tubes, and this provides the thrust for the capsules. An incompressible transient analysis is developed to study the aerodynamics of multiple capsules in a cross-linked two-bore pipeline. An aerodynamic friction coefficient is used as a cost parameter to compare the effects of capsule blockage and headway and to assess the merits of adits and vents. The authors conclude that optimum efficiency for off-design operation is obtained with long platoons of capsules in vented or adit-connected tubes.
The Kepler Science Data Processing Pipeline Source Code Road Map
NASA Technical Reports Server (NTRS)
Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima;
2016-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.
Aerial image databases for pipeline rights-of-way management
NASA Astrophysics Data System (ADS)
Jadkowski, Mark A.
1996-03-01
Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with fewer people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership between NASA and the James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company, which operates major gas pipelines in New England, New York, and New Jersey.
ERIC Educational Resources Information Center
Rodriguez, Louie F.
2016-01-01
The educational system continues to inadequately serve Latina/o students across the educational pipeline. A key shortcoming is the system's inability to develop, support, and grow educational leaders that can respond. In this article, the author poses a series of pedagogical approaches using a Community Cultural Wealth (Yosso, 2005) lens. In the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... interconnect pipelines to four existing offshore pipelines (Dauphin Natural Gas Pipeline, Williams Natural Gas Pipeline, Destin Natural Gas Pipeline, and Viosca Knoll Gathering System [VKGS] Gas Pipeline) that connect to the onshore natural gas transmission pipeline system. Natural gas would be delivered to customers...
Pipeline inspection using an autonomous underwater vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egeskov, P.; Bech, M.; Bowley, R.
1995-12-31
Pipeline inspection can be carried out by means of small Autonomous Underwater Vehicles (AUVs), operating either with a control link to a surface vessel or totally independently. The AUV offers an attractive alternative to conventional inspection methods where Remotely Operated Vehicles (ROVs) or paravanes are used. A flatfish-type AUV, "MARTIN" (Marine Tool for Inspection), has been developed for this purpose. The paper describes the proposed types of inspection jobs to be carried out by "MARTIN". The design and construction of the vessel, its hydrodynamic properties, and its propulsion and control systems are discussed. The pipeline tracking and survey systems, as well as the launch and recovery systems, are described.
Generating disease-pertinent treatment vocabularies from MEDLINE citations.
Wang, Liqin; Del Fiol, Guilherme; Bray, Bruce E; Haug, Peter J
2017-01-01
Healthcare communities have identified a significant need for disease-specific information. Disease-specific ontologies are useful in assisting the retrieval of disease-relevant information from various sources. However, building these ontologies is labor intensive. Our goal is to develop a system for the automated generation of disease-pertinent concepts from a popular knowledge resource to support the building of disease-specific ontologies. A pipeline system was developed with an initial focus on generating disease-specific treatment vocabularies. It comprised components for disease-specific citation retrieval, predication extraction, treatment predication extraction, treatment concept extraction, and relevance ranking. A semantic schema was developed to support the extraction of treatment predications and concepts. Four ranking approaches (i.e., occurrence, interest, degree centrality, and weighted degree centrality) were proposed to measure the relevance of treatment concepts to the disease of interest. We measured the performance of the four rankings in terms of mean precision at the top 100 concepts for five diseases, as well as precision-recall curves against two reference vocabularies. The performance of the system was also compared to two baseline approaches. The pipeline system achieved a mean precision of 0.80 for the top 100 concepts with ranking by interest. There were no significant differences among the four rankings (p=0.53). However, the pipeline-based system had significantly better performance than the two baselines. The pipeline system can be useful for the automated generation of disease-relevant treatment concepts from the biomedical literature. Copyright © 2016 Elsevier Inc. All rights reserved.
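Two of the ranking approaches named above, occurrence and degree centrality, can be sketched over a list of extracted treatment predications. This is an illustrative reimplementation (the pair representation and sample concepts are assumptions, not the authors' data model):

```python
# Sketch of occurrence and degree-centrality ranking of treatment
# concepts; predications are simplified to (treatment, object) pairs.
from collections import Counter, defaultdict

def rank_concepts(predications):
    """Return treatment concepts ranked by occurrence and by degree."""
    # Occurrence: how many predications mention each treatment concept
    occurrence = Counter(t for t, _ in predications)

    # Degree centrality: number of distinct concepts each treatment
    # is linked to in the predication graph
    neighbors = defaultdict(set)
    for t, obj in predications:
        neighbors[t].add(obj)
    degree = {t: len(objs) for t, objs in neighbors.items()}

    by_occurrence = sorted(occurrence, key=occurrence.get, reverse=True)
    by_degree = sorted(degree, key=degree.get, reverse=True)
    return by_occurrence, by_degree
```

The two orderings can differ: a concept mentioned many times with one partner outranks others by occurrence, while a concept linked to many distinct partners wins on degree centrality.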
GIS characterization of spatially distributed lifeline damage
Toprak, Selcuk; O'Rourke, Thomas; Tutuncu, Ilker
1999-01-01
This paper describes the visualization of spatially distributed water pipeline damage following an earthquake using geographic information systems (GIS). Pipeline damage is expressed as a repair rate (RR). Repair rate contours are developed with GIS by dividing the study area into grid cells (n × n), determining the number of pipeline repairs in each grid cell, and dividing the number of repairs by the length of pipeline in each cell area. The resulting contour plot is a two-dimensional visualization of point source damage. High damage zones are defined herein as areas with an RR value greater than the mean RR for the entire study area of interest. A hyperbolic relationship between the visual display of high pipeline damage zones and grid size, n, was developed. The relationship is expressed in terms of two dimensionless parameters, threshold area coverage (TAC) and dimensionless grid size (DGS). The relationship is valid over a wide range of map scales, spanning approximately 1,200 km² for the largest portion of the Los Angeles water distribution system to 1 km² for the Marina in San Francisco. This relationship can help GIS users obtain sufficiently refined, but easily visualized, maps of damage patterns.
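The grid-cell repair-rate computation described above can be sketched in a few lines. This is an illustrative version under stated assumptions (planar coordinates, and each pipe segment short enough to be assigned to the cell containing its midpoint), not the study's GIS procedure:

```python
# Hypothetical sketch of the RR = repairs / pipeline-length-per-cell
# computation; data layout and cell assignment rule are assumptions.
from collections import defaultdict

def repair_rate_grid(repairs, pipes, cell_size):
    """repairs: list of (x, y) repair locations.
    pipes: list of ((x1, y1), (x2, y2)) short pipeline segments.
    Returns {cell: repair rate} for cells containing pipeline."""
    def cell(x, y):
        return (int(x // cell_size), int(y // cell_size))

    # Count repairs falling in each grid cell
    counts = defaultdict(int)
    for x, y in repairs:
        counts[cell(x, y)] += 1

    # Accumulate pipeline length per cell, assigning each segment
    # to the cell containing its midpoint
    lengths = defaultdict(float)
    for (x1, y1), (x2, y2) in pipes:
        mid = cell((x1 + x2) / 2.0, (y1 + y2) / 2.0)
        lengths[mid] += ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

    # RR is defined only where pipeline exists in the cell
    return {c: counts[c] / lengths[c] for c in lengths}
```

In practice a GIS would clip segments at cell boundaries rather than use midpoints; the sketch only shows the repairs-over-length definition of RR.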
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-03
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Distribution Systems, Gas Transmission and Gathering Systems, and Hazardous Liquid Systems AGENCY: Pipeline and.... SUMMARY: This notice advises owners and operators of gas pipeline facilities and hazardous liquid pipeline...
DOT National Transportation Integrated Search
2010-06-18
The potential exists for stress corrosion cracking (SCC) of carbon steel pipelines transporting fuel grade ethanol (FGE) and FGE- gasoline blends. The objectives of SCC 4-4 were to: 1. Develop data necessary to make engineering assessments of the fea...
Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard
NASA Astrophysics Data System (ADS)
Voronin, K. S.
2016-10-01
Main gas pipelines in operation are under the influence of permanent pressure drops, which lead to their lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can form the basis of a method for monitoring the technical condition of a gas pipeline, and forecasting possible emergencies allows reconstruction works to be planned and carried out in due time on sections of the pipeline that may deviate from the design position.
Magnetic pipeline for coal and oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knolle, E.
1998-07-01
A 1994 analysis of the recorded costs of the Alaska oil pipeline, in a paper entitled "Maglev Crude Oil Pipeline" (NASA CP-3247, pp. 671-684), concluded that, had the Knolle Magnetrans pipeline technology been available and used, some $10 million per day in transportation costs could have been saved over the 20 years of the Alaska oil pipeline's existence. This over 800-mile-long pipeline requires about 500 horsepower per mile in pumping power, which together with the cost of the pipeline's capital investment consumes about one-third of the energy value of the pumped oil. This does not include the cost of getting the oil out of the ground. The reason maglev technology performs better than conventional pipelines is that, by magnetically levitating the oil into contact-free suspension, there is no drag-causing adhesion. In addition, by using permanent magnets in repulsion, suspension is achieved without using energy. The pumped oil's adhesion to the inside of pipes also limits its speed. In the case of the Alaska pipeline the speed is limited to about 7 miles per hour, which, with its 48-inch pipe diameter and 1200 psi pressure, pumps about 2 million barrels per day. The maglev system, as developed by Knolle Magnetrans, would transport oil in magnetically suspended sealed containers and, thus free of adhesion, at speeds 10 to 20 times faster. Furthermore, the diameter of the levitated containers can be made smaller with the same capacity, which makes the construction of the maglev system light and inexpensive. There are similar advantages when using maglev technology to transport coal. A maglev system also has advantages over railroads in mountainous regions where coal is primarily mined: a maglev pipeline can travel, all year and in all weather, in a straight line to the end user, whereas railroads must follow difficult circuitous routes; a maglev pipeline, in contrast, can climb over steep hills without much difficulty.
Design of cylindrical pipe automatic welding control system based on STM32
NASA Astrophysics Data System (ADS)
Chen, Shuaishuai; Shen, Weicong
2018-04-01
The development of the modern economy has rapidly increased the demand for pipeline construction, and pipeline welding has become an important link in that construction. At present, manual welding methods are still widely used at home and abroad, and field pipe welding in particular lacks miniature, portable automatic welding equipment. An automated welding system consists of a control system, comprising a lower-computer control panel and a host-computer operating interface, together with the automatic welding machine mechanisms and welding power systems coordinated by the control system. In this paper, a new control system for automatic pipe welding based on a lower-computer control panel and a host-computer interface is proposed, which has many advantages over traditional automatic welding machines.
78 FR 42889 - Pipeline Safety: Reminder of Requirements for Utility LP-Gas and LPG Pipeline Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket No. PHMSA-2013-0097] Pipeline Safety: Reminder of Requirements for Utility LP-Gas and LPG Pipeline Systems AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION...
Design of oil pipeline leak detection and communication system based on optical fiber technology
NASA Astrophysics Data System (ADS)
Tu, Yaqing; Chen, Huabo
1999-08-01
The integrity of an oil pipeline is always a major concern of operators. A pipeline leak not only leads to loss of oil but also pollutes the environment. A new pipeline leak detection and communication system based on optical fiber technology is presented to ensure pipeline reliability. By combining a direct leak detection method with an indirect one, the system greatly reduces the rate of false alarms. According to the practical features of oil pipelines, the pipeline communication system is designed using state-of-the-art optical fiber communication technology. The system features high leak-location accuracy, good real-time behavior, and other characteristics that effectively overcome the disadvantages of traditional leak detection methods and communication systems.
Development of an Automated Imaging Pipeline for the Analysis of the Zebrafish Larval Kidney
Westhoff, Jens H.; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L.; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen
2013-01-01
The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems. PMID:24324758
Development and Applications of Pipeline Steel in Long-Distance Gas Pipeline of China
NASA Astrophysics Data System (ADS)
Chunyong, Huo; Yang, Li; Lingkang, Ji
In past decades, with the wide use of microalloying and Thermo-Mechanical Control Processing (TMCP) technology, a good match of strength, toughness, plasticity and weldability in pipeline steel has been achieved, so that oil and gas pipelines have developed greatly in China to meet strong domestic energy demand. In this paper, the development history of pipeline steel and gas pipelines in China is briefly reviewed. The microstructural characteristics and mechanical performance of the pipeline steels used in representative Chinese gas pipelines built at different stages are summarized. Through analysis of the evolution of the pipeline service environment, prospective trends in the application of pipeline steel in China are also presented.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Distribution systems reporting transmission pipelines; transmission or gathering systems reporting distribution pipelines. 191.13 Section 191.13 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF...
NASA Technical Reports Server (NTRS)
Tang, Henry H.; Le, Suy Q.; Orndoff, Evelyne S.; Smith, Frederick D.; Tapia, Alma S.; Brower, David V.
2012-01-01
Integrity and performance monitoring of subsea pipelines and structures provides critical information for managing offshore oil and gas production operations and preventing environmentally damaging and costly catastrophic failures. Currently, pipeline monitoring devices require ground assembly and installation prior to the underwater deployment of the pipeline. A monitoring device that could be installed in situ on operating underwater structures could enhance the productivity and improve the safety of current offshore operations. Through a Space Act Agreement (SAA) between the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) and Astro Technology, Inc. (ATI), JSC provides technical expertise and testing facilities to support the development of fiber optic sensor technologies by ATI. This paper details the first collaborative effort between NASA JSC and ATI in evaluating underwater-applicable adhesives and friction coatings for attaching a fiber optic sensor system to subsea pipeline. A market survey was conducted to examine different commercial-off-the-shelf (COTS) underwater adhesive systems and to select adhesive candidates for testing and evaluation. Four COTS epoxy-based underwater adhesives were selected and evaluated. The adhesives were applied and cured in simulated seawater conditions and then evaluated for application characteristics and adhesive strength. The adhesive that demonstrated the best underwater application characteristics and highest adhesive strength was identified for further evaluation in developing an attachment system that could be deployed in the harsh subsea environment. Various friction coatings were also tested in this study to measure their shear strengths for a mechanical clamping design concept for attaching the fiber optic sensor system. A COTS carbide alloy coating was found to increase the shear strength of a metal-to-metal clamping interface by up to 46 percent.
This study provides valuable data for assessing the feasibility of developing a next-generation fiber optic sensor system that could be retrofitted onto existing subsea pipeline structures.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0024] Pipeline Safety: Information Collection Activities, Revision to Gas Transmission and Gathering Pipeline Systems Annual Report, Gas Transmission and Gathering Pipeline Systems Incident Report...
Development of a Carbon Management Geographic Information System (GIS) for the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard Herzog; Holly Javedan
In this project a Carbon Management Geographic Information System (GIS) for the US was developed. The GIS stored, integrated, and manipulated information relating to the components of carbon management systems. Additionally, the GIS was used to interpret and analyze the effect of developing these systems. This report documents the key deliverables from the project: (1) Carbon Management Geographic Information System (GIS) Documentation; (2) Stationary CO2 Source Database; (3) Regulatory Data for CCS in the United States; (4) CO2 Capture Cost Estimation; (5) CO2 Storage Capacity Tools; (6) CO2 Injection Cost Modeling; (7) CO2 Pipeline Transport Cost Estimation; (8) CO2 Source-Sink Matching Algorithm; and (9) CO2 Pipeline Transport and Cost Model.
Mittra, James; Tait, Joyce; Wield, David
2011-03-01
The pharmaceutical and agro-biotechnology industries have been confronted by dwindling product pipelines and rapid developments in life sciences, thus demanding a strategic rethink of conventional research and development. Despite offering both industries a solution to the pipeline problem, the life sciences have also brought complex regulatory challenges for firms. In this paper, we comment on the response of these industries to the life science trajectory, in the context of maturing conventional small-molecule product pipelines and routes to market. The challenges of managing transition from maturity to new high-value-added innovation models are addressed. Furthermore, we argue that regulation plays a crucial role in shaping the innovation systems of both industries, and as such, we suggest potentially useful changes to the current regulatory system. Copyright © 2010 Elsevier Ltd. All rights reserved.
Aerial surveillance for gas and liquid hydrocarbon pipelines using a flame ionization detector (FID)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riquetti, P.V.; Fletcher, J.I.; Minty, C.D.
1996-12-31
A novel application for the detection of airborne hydrocarbons has been successfully developed by means of a highly sensitive, fast-responding Flame Ionization Detector (FID). The traditional way to monitor pipeline leaks has been by ground crews using specific sensors or by airborne crews highly trained to observe anomalies associated with leaks during periodic surveys of the pipeline right-of-way. The goal has been to detect leaks in a fast and cost-effective way before the associated spill becomes a costly and hazardous problem. This paper describes a leak detection system combined with a global positioning system (GPS) and a computerized data output designed to pinpoint the presence of hydrocarbons in the air space of the pipeline's right-of-way. Fixed-wing aircraft as well as helicopters have been successfully used as airborne platforms. Natural gas, crude oil, and finished-products pipelines in Canada and the US have been surveyed using this technology, with excellent correlation between the aircraft detection and in situ ground detection. The information obtained is processed with proprietary software and reduced to simple coordinates. Results are transferred to ground crews to effect the necessary repairs.
Designing a reliable leak bio-detection system for natural gas pipelines.
Batzias, F A; Siontorou, C G; Spanidis, P-M P
2011-02-15
Monitoring of natural gas (NG) pipelines is an important task for economical and safe operation, loss prevention, and environmental protection. Timely and reliable leak detection in gas pipelines therefore plays a key role in the overall integrity management of the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, research on new detector systems is still thriving. Biosensors are considered worldwide as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained with the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge-based approach and relies on Fuzzy Multicriteria Analysis (FMCA) for selecting the biosensor design that best suits both the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.
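The FMCA selection step can be illustrated with a deliberately simplified sketch: collapsing the fuzzy criterion scores to crisp values, design selection becomes a weighted ranking. The criteria, weights, candidate names, and scores below are invented for illustration and are not from the paper.

```python
# Crisp weighted-scoring stand-in for the paper's Fuzzy Multicriteria
# Analysis (FMCA); all names and numbers here are hypothetical.

def rank_designs(candidates, weights):
    """Return candidate design names sorted by weighted score, best first."""
    def total(scores):
        return sum(weights[c] * s for c, s in scores.items())
    return sorted(candidates, key=lambda name: total(candidates[name]), reverse=True)

weights = {"sensitivity": 0.4, "robustness": 0.35, "cost": 0.25}
candidates = {
    "enzyme_electrode": {"sensitivity": 0.9, "robustness": 0.5, "cost": 0.7},
    "whole_cell":       {"sensitivity": 0.6, "robustness": 0.8, "cost": 0.9},
    "immunosensor":     {"sensitivity": 0.8, "robustness": 0.6, "cost": 0.4},
}
best = rank_designs(candidates, weights)[0]
```

A full FMCA would keep the scores as fuzzy numbers and defuzzify only at the ranking stage; the crisp version above preserves only the ranking idea.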
An integrated GPS-FID system for airborne gas detection of pipeline right-of-ways
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gehue, H.L.; Sommer, P.
1996-12-31
Pipeline integrity, safety and environmental concerns are of prime importance in the Canadian natural gas industry. Terramatic Technology Inc. (TTI) has developed an integrated GPS/FID gas detection system known as TTI-AirTrac™ for use in airborne gas detection (AGD) along pipeline right-of-ways. The Flame Ionization Detector (FID), which has traditionally been used to monitor air quality for gas plants and refineries, has been integrated with the Global Positioning System (GPS) via a 486 DX2-50 computer and specialized open-architecture data acquisition software. The purpose of this technology marriage is to be able to continuously monitor air quality during airborne pipeline inspection. Event tagging from visual surveillance is used to determine an explanation of any delta line deviations (DLD). These deviations are an indication of hydrocarbon gases present in the plume that the aircraft has passed through. The role of the GPS system is to provide mapping information and coordinate data for ground inspections. The ground-based inspection using a handheld multi-gas detector will confirm whether or not a leak exists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawn Lenz; Raymond T. Lines; Darryl Murdock
ITT Industries Space Systems Division (Space Systems) has developed an airborne natural gas leak detection system designed to detect, image, quantify, and precisely locate leaks from natural gas transmission pipelines. This system is called the Airborne Natural Gas Emission Lidar (ANGEL) system. The ANGEL system uses a highly sensitive differential absorption lidar technology to remotely detect pipeline leaks. The ANGEL system is operated from a fixed-wing aircraft and includes automatic scanning, pointing, and pilot guidance systems. During a pipeline inspection, the ANGEL system aircraft flies at an elevation of 1000 feet above the ground at speeds of between 100 and 150 mph. Under this contract with DOE/NETL, Space Systems was funded to integrate the ANGEL sensor into a test aircraft and conduct a series of flight tests over a variety of test targets, including simulated natural gas pipeline leaks. Following early tests in upstate New York in the summer of 2004, the ANGEL system was deployed to Casper, Wyoming to participate in a set of DOE-sponsored field tests at the Rocky Mountain Oilfield Testing Center (RMOTC). At RMOTC the Space Systems team completed integration of the system and flew an operational system for the first time. The ANGEL system flew 2 missions/day for the duration of the 5-day test. Over the course of the week the ANGEL system detected leaks ranging from 100 to 5,000 scfh.
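The differential-absorption measurement behind ANGEL reduces, in its textbook form, to a two-way Beer-Lambert ratio of the returns at the on- and off-absorption wavelengths. The function below is a minimal sketch of that relation, not ANGEL's actual retrieval; the name and units are assumptions.

```python
import math

def dial_column_density(p_on, p_off, dsigma):
    """Path-integrated gas concentration CL from DIAL returns.

    p_on, p_off -- received powers at the absorbed and reference wavelengths
    dsigma      -- differential absorption cross-section (cm^2/molecule)

    Textbook two-way Beer-Lambert relation: p_on/p_off = exp(-2*dsigma*CL),
    so CL = ln(p_off/p_on) / (2*dsigma), in molecules/cm^2.
    """
    return math.log(p_off / p_on) / (2.0 * dsigma)
```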
The Minimal Preprocessing Pipelines for the Human Connectome Project
Glasser, Matthew F.; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark
2013-01-01
The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinates spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP’s acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements for the pipelines. PMID:23668970
Acoustic system for communication in pipelines
Martin, II, Louis Peter; Cooper, John F [Oakland, CA
2008-09-09
A system for communication in a pipe, or pipeline, or network of pipes containing a fluid. The system includes an encoding and transmitting sub-system connected to the pipe, or pipeline, or network of pipes that transmits a signal in the frequency range of 3-100 kHz into the pipe, or pipeline, or network of pipes containing a fluid, and a receiver and processor sub-system connected to the pipe, or pipeline, or network of pipes containing a fluid that receives said signal and uses said signal for a desired application.
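The patent claims a frequency range but does not specify a modulation scheme; as one concrete possibility, a binary frequency-shift-keyed burst inside the 3-100 kHz band could be generated as below. The tone frequencies, symbol length, and sample rate are assumptions for illustration only.

```python
import math

SAMPLE_RATE = 400_000    # Hz; comfortably above the band's upper edge
F0, F1 = 20_000, 40_000  # tone frequencies for bits 0 and 1, both in-band
SYMBOL_SECONDS = 0.001   # 1 ms per bit (assumed)

def modulate(bits):
    """Encode bits as consecutive sine tone bursts (binary FSK sketch)."""
    samples = []
    n = int(SAMPLE_RATE * SYMBOL_SECONDS)  # samples per symbol
    for bit in bits:
        f = F1 if bit else F0
        for i in range(n):
            samples.append(math.sin(2 * math.pi * f * i / SAMPLE_RATE))
    return samples

signal = modulate([1, 0, 1])
```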
Theory and Application of Magnetic Flux Leakage Pipeline Detection.
Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei
2015-12-10
Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetically sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted.
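The simplest identification scheme is amplitude thresholding against a baseline field; the sketch below flags contiguous runs of anomalous samples as defect candidates. Data, baseline, and threshold are invented, and real MFL identification uses calibrated models or learned classifiers rather than this crude stand-in.

```python
def find_defects(signal, baseline, threshold):
    """Return (start, end) index ranges where |signal - baseline| > threshold."""
    regions, start = [], None
    for i, v in enumerate(signal):
        if abs(v - baseline) > threshold:
            if start is None:
                start = i                    # open a new candidate region
        elif start is not None:
            regions.append((start, i - 1))   # close the region
            start = None
    if start is not None:
        regions.append((start, len(signal) - 1))
    return regions

# Hypothetical leakage-field samples along the pipe axis:
mfl = [0.1, 0.1, 0.9, 1.2, 0.8, 0.1, 0.1, 0.7, 0.1]
defects = find_defects(mfl, baseline=0.1, threshold=0.3)  # [(2, 4), (7, 7)]
```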
Simulation of pipeline in the area of the underwater crossing
NASA Astrophysics Data System (ADS)
Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.
2014-08-01
The article studies the stress-strain behavior of a section of the Alexandrovskoye-Anzhero-Sudzhensk trunk oil pipeline using the ANSYS software system. This method of examining and assessing the technical condition of pipeline transport facilities studies the objects and the processes that affect their technical condition, including research based on computer simulation. Such an approach makes it possible to develop the theory, calculation methods, and design of pipeline transport facilities and of machine units and parts, regardless of industry or purpose, with a view to improving existing structures and creating new, competitive structures and machines of high performance, durability, reliability, and maintainability with low material consumption and cost.
Numerical Modeling of Mechanical Behavior for Buried Steel Pipelines Crossing Subsidence Strata
Han, C. J.
2015-01-01
This paper addresses the mechanical behavior of buried steel pipelines crossing subsidence strata. The investigation is based on numerical simulation of the nonlinear response of the pipeline-soil system through the finite element method, considering large strain and displacement, inelastic material behavior of the buried pipeline and the surrounding soil, as well as contact and friction on the pipeline-soil interface. The effects of key parameters on the mechanical behavior of the buried pipeline were investigated, such as strata subsidence, diameter-to-thickness ratio, burial depth, internal pressure, friction coefficient, and soil properties. The results show that the maximum strain appears on the outer transition subsidence section of the pipeline, and its cross section is concave shaped. As strata subsidence and the diameter-to-thickness ratio increase, the out-of-roundness, longitudinal strain, and equivalent plastic strain increase gradually. As the burial depth increases, the deflection, out-of-roundness, and strain of the pipeline decrease. Internal pressure and friction coefficient have little effect on the deflection of the buried pipeline. Out-of-roundness is reduced and strain increases gradually with increasing internal pressure. The physical properties of the soil have a great influence on the mechanical properties of the buried pipeline. The results from the present study can be used for the development of optimization design and preventive maintenance for buried steel pipelines. PMID:26103460
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-09
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... installed to lessen the volume of natural gas and hazardous liquid released during catastrophic pipeline... p.m. Panel 3: Considerations for Natural Gas Pipeline Leak Detection Systems 3:30 p.m. Break 3:45 p...
Detection of underground pipeline based on Golay waveform design
NASA Astrophysics Data System (ADS)
Dai, Jingjing; Xu, Dazhuan
2017-08-01
The detection of underground pipelines is an important problem in urban development, but research on it is not yet mature. In this paper, based on the principle of waveform design in wireless communication, we design an acoustic signal detection system to locate underground pipelines. Following the principle of acoustic localization, we chose the DSP-F28335 as the development board and master control chip, together with DA and AD modules. The DA module emits a complementary Golay sequence as the transmitted signal, and the AD module acquires data synchronously, so that the echo signals containing the position information of the target can be recovered through signal processing. The test results show that the method in this paper can not only estimate the sound velocity in the soil but also locate underground pipelines accurately.
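The reason complementary Golay sequences suit echo ranging is that the aperiodic autocorrelations of the pair cancel at every nonzero lag, leaving a single clean peak. A short sketch using the standard doubling construction (function names are assumptions, not from the paper):

```python
def golay_pair(order):
    """Complementary Golay pair of length 2**order via the doubling
    recursion (a, b) -> (a concat b, a concat -b)."""
    a, b = [1], [1]
    for _ in range(order):
        a, b = a + b, a + [-x for x in b]
    return a, b

def autocorr(x, k):
    """Aperiodic autocorrelation of x at non-negative lag k."""
    return sum(x[i] * x[i + k] for i in range(len(x) - k))

a, b = golay_pair(3)  # length-8 pair
peaks = [autocorr(a, k) + autocorr(b, k) for k in range(len(a))]
# peaks = [2N, 0, 0, ...]: all sidelobes cancel exactly
```

Correlating the received echo against each sequence of the pair and summing the two results therefore localizes the reflection without range sidelobes.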
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
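The pipeline's linear structure lends itself to a tiny generic runner that threads a shared context through named stages; the stage names follow the paper, but the bodies below are stubs, not the actual registration or parcellation tools.

```python
# Minimal linear pipeline runner; each stage takes and returns a dict.

def rigid_registration(ctx):
    ctx["registered"] = True          # stub for the real registration step
    return ctx

def skull_stripping(ctx):
    ctx["skull_stripped"] = True      # stub
    return ctx

def region_statistics(ctx):
    ctx["stats"] = {}                 # stub: would hold per-region values
    return ctx

PIPELINE = [rigid_registration, skull_stripping, region_statistics]

def run_pipeline(ctx, stages=PIPELINE):
    for stage in stages:
        ctx = stage(ctx)
    return ctx

result = run_pipeline({"subject": "rat_01"})
```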
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1992-12-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
GIS least-cost analysis approach for siting gas pipeline ROWs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1994-09-01
Geographic-information-system applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation corridors, endangered species habitats, wetlands, and public line surveys. A geographic information system was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas-pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
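The least-cost siting idea reduces to a shortest-path search over a raster cost surface in which each cell's value aggregates the environmental and engineering constraint weights. A minimal Dijkstra sketch over a hypothetical surface (the grid values are invented):

```python
import heapq

def least_cost_path_cost(cost, start, goal):
    """Total cost of the cheapest 4-connected route on a 2-D raster,
    charging the cost of each cell entered (Dijkstra's algorithm)."""
    rows, cols = len(cost), len(cost[0])
    best = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > best.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None

# Hypothetical surface: 9 marks an environmentally sensitive cell.
surface = [
    [1, 1, 9, 1],
    [1, 9, 9, 1],
    [1, 1, 1, 1],
]
route_cost = least_cost_path_cost(surface, (0, 0), (0, 3))  # routes around the 9s
```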
An evolutionary approach to the architecture of effective healthcare delivery systems.
Towill, D R; Christopher, M
2005-01-01
Aims to show that material flow concepts developed and successfully applied to commercial products and services can equally well form the architectural infrastructure of effective healthcare delivery systems. The methodology is based on the "power of analogy", which demonstrates that healthcare pipelines may be classified via the Time-Space Matrix. A small number (about four) of substantially different healthcare delivery pipelines will cover the vast majority of patient needs and simultaneously create adequate added value from their perspective. The emphasis is firmly placed on total process mapping and analysis via established identification techniques. Healthcare delivery pipelines must be properly engineered and matched to life-cycle phase if the service is to be effective. This small family of healthcare delivery pipelines needs to be designed via adherence to very specific-to-purpose principles, varying from "lean production" through to "agile delivery". The proposition of a strategic approach to healthcare delivery pipeline design is novel and positions much currently isolated research within a comprehensive organisational framework. It therefore provides a synthesis of the needs of global healthcare.
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Building a virtual ligand screening pipeline using free software: a survey.
Glaab, Enrico
2016-03-01
Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
Continuous Turbidity Monitoring in the Indian Creek Watershed, Tazewell County, Virginia, 2006-08
Moyer, Douglas; Hyer, Kenneth
2009-01-01
Thousands of miles of natural gas pipelines are installed annually in the United States. These pipelines commonly cross streams, rivers, and other water bodies during pipeline construction. A major concern associated with pipelines crossing water bodies is increased sediment loading and the subsequent impact to the ecology of the aquatic system. Several studies have investigated the techniques used to install pipelines across surface-water bodies and their effect on downstream suspended-sediment concentrations. These studies frequently employ the evaluation of suspended-sediment or turbidity data that were collected using discrete sample-collection methods. No studies, however, have evaluated the utility of continuous turbidity monitoring for identifying real-time sediment input and providing a robust dataset for the evaluation of long-term changes in suspended-sediment concentration as it relates to a pipeline crossing. In 2006, the U.S. Geological Survey, in cooperation with East Tennessee Natural Gas and the U.S. Fish and Wildlife Service, began a study to monitor the effects of construction of the Jewell Ridge Lateral natural gas pipeline on turbidity conditions below pipeline crossings of Indian Creek and an unnamed tributary to Indian Creek, in Tazewell County, Virginia. The potential for increased sediment loading to Indian Creek is of major concern for watershed managers because Indian Creek is listed as one of Virginia's Threatened and Endangered Species Waters and contains critical habitat for two freshwater mussel species, purple bean (Villosa perpurpurea) and rough rabbitsfoot (Quadrula cylindrical strigillata). Additionally, Indian Creek contains the last known reproducing population of the tan riffleshell (Epioblasma florentina walkeri). Therefore, the objectives of the U.S. 
Geological Survey monitoring effort were to (1) develop a continuous turbidity monitoring network that attempted to measure real-time changes in suspended sediment (using turbidity as a surrogate) downstream from the pipeline crossings, and (2) provide continuous turbidity data that enable the development of a real-time turbidity-input warning system and assessment of long-term changes in turbidity conditions. Water-quality conditions were assessed using continuous water-quality monitors deployed upstream and downstream from the pipeline crossings in Indian Creek and the unnamed tributary. These paired upstream and downstream monitors were outfitted with turbidity, pH (for Indian Creek only), specific-conductance, and water-temperature sensors. Water-quality data were collected continuously (every 15 minutes) during three phases of the pipeline construction: pre-construction, during construction, and post-construction. Continuous turbidity data were evaluated at various time steps to determine whether the construction of the pipeline crossings had an effect on downstream suspended-sediment conditions in Indian Creek and the unnamed tributary. These continuous turbidity data were analyzed in real time with the aid of a turbidity-input warning system. A warning occurred when turbidity values downstream from the pipeline were 6 Formazin Nephelometric Units or 15 percent (depending on the observed range) greater than turbidity upstream from the pipeline crossing. Statistical analyses also were performed on monthly and phase-of-construction turbidity data to determine if the pipeline crossing served as a long-term source of sediment. Results of this intensive water-quality monitoring effort indicate that values of turbidity in Indian Creek increased significantly between the upstream and downstream water-quality monitors during the construction of the Jewell Ridge pipeline. 
The magnitude of the significant turbidity increase, however, was small (less than 2 Formazin Nephelometric Units). Patterns in the continuous turbidity data indicate that the actual pipeline crossing of Indian Creek had little influence on downstream water quality.
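The study's warning rule is simple enough to state in code. The switchover between the absolute (6 FNU) and relative (15 percent) criteria is described only as "depending on the observed range", so the 40-FNU cutoff below is an assumption, not the USGS implementation.

```python
def turbidity_warning(upstream_fnu, downstream_fnu, range_cutoff=40.0):
    """Flag a turbidity-input warning: downstream exceeds upstream by
    more than 6 FNU (low range) or by more than 15 percent (high range).
    The 40-FNU range cutoff is an assumed, illustrative value."""
    if upstream_fnu < range_cutoff:
        return downstream_fnu - upstream_fnu > 6.0
    return downstream_fnu > upstream_fnu * 1.15
```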
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products through an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves, and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum products of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are widely used in logistics and scheduling, are incorporated in this study. The distribution of multiple products through an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, along with a solution methodology to compute an input schedule that yields the minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. The dissertation also extends the study to examine the operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm, modified from the one used for single-source pipeline problems, is introduced; it likewise runs in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the problems tested exceeded the optimal values by more than 30%, and approximately 40% of the tested problems were solved optimally by the algorithms.
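The objective that such a heuristic works against can be illustrated with a small sketch (hypothetical data and function names, not the dissertation's algorithm): the total violation of delivery time-windows for a batch schedule, which the reversed-flow approach seeks to minimize.

```python
# Hypothetical illustration of the minimized objective: total time-window
# violation of a delivery schedule. Data and names are invented for clarity.

def total_window_violation(deliveries):
    """deliveries: list of (arrival_time, window_start, window_end)."""
    violation = 0
    for arrival, start, end in deliveries:
        if arrival < start:
            violation += start - arrival   # delivered too early
        elif arrival > end:
            violation += arrival - end     # delivered too late
    return violation

# Three product batches with their due windows (arbitrary example data):
schedule = [(5, 4, 8), (12, 6, 10), (3, 5, 9)]
print(total_window_violation(schedule))  # 2 late units + 2 early units = 4
```

A schedule with zero total violation delivers every batch inside its window; the heuristic searches over feasible input schedules for the one minimizing this sum.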
ML-o-Scope: A Diagnostic Visualization System for Deep Machine Learning Pipelines
2014-05-16
ML-o-scope: a diagnostic visualization system for deep machine learning pipelines. Daniel Bruckner, Electrical Engineering and Computer Sciences... the system as a support for tuning large-scale object-classification pipelines. A new generation of pipelined machine learning models...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... natural gas from existing pipeline systems to the LNG terminal facilities. The Project would be... room, warehouse, and shop. Pipeline Header System: A 29-mile-long, 42-inch-diameter natural gas pipeline extending northward from the shoreside facilities to nine natural gas interconnects southwest of...
Creation and Implementation of a Workforce Development Pipeline Program at MSFC
NASA Technical Reports Server (NTRS)
Hix, Billy
2003-01-01
Within the context of NASA's Education Programs, this Workforce Development Pipeline guide describes the goals and objectives of MSFC's Workforce Development Pipeline Program as well as the principles and strategies for guiding implementation. It is designed to support the initiatives described in the NASA Implementation Plan for Education, 1999-2003 (EP-1998-12-383-HQ) and represents the vision of the members of the Education Programs office at MSFC. This document: 1) Outlines NASA's contribution to national priorities; 2) Sets the context for the Workforce Development Pipeline Program; 3) Describes Workforce Development Pipeline Program strategies; 4) Articulates the Workforce Development Pipeline Program goals and aims; 5) Lists the actions to build a unified approach; 6) Outlines the Workforce Development Pipeline Program's guiding principles; and 7) Describes the results of implementation.
Method and system for pipeline communication
Richardson, John G. [Idaho Falls, ID]
2008-01-29
A pipeline communication system and method includes a pipeline having a surface extending along at least a portion of the length of the pipeline. A conductive bus is formed on and extends along a portion of the surface of the pipeline. The conductive bus includes a first conductive trace and a second conductive trace, the first and second conductive traces being adapted to conformally couple with the pipeline at the surface extending along at least a portion of the length of the pipeline. A transmitter for sending information along the conductive bus on the pipeline is coupled thereto, and a receiver for receiving the information from the conductive bus on the pipeline is also coupled to the conductive bus.
Kepler Science Operations Center Architecture
NASA Technical Reports Server (NTRS)
Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal;
2010-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
A hydrogen energy carrier. Volume 2: Systems analysis
NASA Technical Reports Server (NTRS)
Savage, R. L. (Editor); Blank, L. (Editor); Cady, T. (Editor); Cox, K. (Editor); Murray, R. (Editor); Williams, R. D. (Editor)
1973-01-01
A systems analysis of hydrogen as an energy carrier in the United States indicated that it is feasible to use hydrogen in all energy use areas, except some types of transportation. These use areas are industrial, residential and commercial, and electric power generation. Saturation concept and conservation concept forecasts of future total energy demands were made. Projected costs of producing hydrogen from coal or from nuclear heat combined with thermochemical decomposition of water are in the range $1.00 to $1.50 per million Btu of hydrogen produced. Other methods are estimated to be more costly. The use of hydrogen as a fuel will require the development of large-scale transmission and storage systems. A pipeline system similar to the existing natural gas pipeline system appears practical, if design factors are included to avoid hydrogen environment embrittlement of pipeline metals. Conclusions from the examination of the safety, legal, environmental, economic, political and societal aspects of hydrogen fuel are that a hydrogen energy carrier system would be compatible with American values and the existing energy system.
ORAC-DR: A generic data reduction pipeline infrastructure
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
2015-03-01
ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.
U.S. pipeline industry enters new era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnsen, M.R.
1999-11-01
The largest construction project in North America this year and next--the Alliance Pipeline--marks some advances for the US pipeline industry. With the Alliance Pipeline system (Alliance), mechanized welding and ultrasonic testing are making their debuts in the US as primary mainline construction techniques. Particularly in Canada and Europe, mechanized welding technology has been used for both onshore and offshore pipeline construction for at least 15 years. However, it has never before been used to build a cross-country pipeline in the US, although it has been tested on short segments. This time, an accelerated construction schedule, among other reasons, necessitated the use of mechanized gas metal arc welding (GMAW). The $3-billion pipeline will deliver natural gas from northwestern British Columbia and northeastern Alberta in Canada to a hub near Chicago, Ill., where it will connect to the North American pipeline grid. Once the pipeline is completed and buried, crews will return the topsoil, and corn and other crops will reclaim the land. While the casual passerby probably won't know the Alliance pipeline is there, it may have a far-reaching effect on the way mainline pipelines are built in the US. For even though mechanized welding and ultrasonic testing are being used for the first time in the United States on this project, some US workers had already gained experience with the technology on projects elsewhere. And work on this pipeline has certainly developed a much larger pool of experienced workers for industry to draw from. The Alliance project could well signal the start of a new era in US pipeline construction.
State of art of seismic design and seismic hazard analysis for oil and gas pipeline system
NASA Astrophysics Data System (ADS)
Liu, Aiwen; Chen, Kun; Wu, Jian
2010-06-01
The purpose of this paper is to adopt a uniform-confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Different from a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan MS8.0 earthquake is introduced as an example in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pribanic, Tomas; Awwad, Amer; Crespo, Jairo
2012-07-01
Transferring high-level waste (HLW) between storage tanks or to treatment facilities is a common practice performed at Department of Energy (DOE) sites. Changes in the chemical and/or physical properties of the HLW slurry during the transfer process may lead to the formation of blockages inside the pipelines, resulting in schedule delays and increased costs. To improve DOE's capabilities in the event of a pipeline plugging incident, FIU has continued to develop two novel unplugging technologies: an asynchronous pulsing system and a peristaltic crawler. The asynchronous pulsing system uses a hydraulic pulse generator to create pressure disturbances at two opposite inlet locations of the pipeline to dislodge blockages by attacking the plug from both sides remotely. The peristaltic crawler is a pneumatically/hydraulically operated crawler that propels itself by a sequence of pressurization/depressurization of cavities (inner tubes). The crawler includes a frontal attachment that has a hydraulically powered unplugging tool. In this paper, details of the asynchronous pulsing system's ability to unplug a pipeline on a small-scale test bed and results from the experimental testing of the second-generation peristaltic crawler are provided. The paper concludes with future improvements for the third-generation crawler and a recommended path forward for the asynchronous pulsing testing.
Completion of development of robotics systems for inspecting unpiggable transmission pipelines.
DOT National Transportation Integrated Search
2013-02-01
This document presents the final report for a program focusing on the completion of the : research, development and demonstration effort, which was initiated in 2001, for the : development of two robotic systems for the in-line, live inspection of un...
77 FR 17119 - Pipeline Safety: Cast Iron Pipe (Supplementary Advisory Bulletin)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-23
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... national attention and highlight the need for continued safety improvements to aging gas pipeline systems... 26, 1992) covering the continued use of cast iron pipe in natural gas distribution pipeline systems...
78 FR 6402 - Pipeline Safety: Accident and Incident Notification Time Limit
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... SUMMARY: Owners and operators of gas and hazardous liquid pipeline systems and liquefied natural gas (LNG... operators of gas and hazardous liquids pipeline systems and LNG facilities that, ``at the earliest...
Merged GIS, GPS data assist siting for gulf gas line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, D.R.; Schmidt, J.A.
1998-06-29
A GIS-based decision-support system was developed for a US Gulf of Mexico onshore and offshore pipeline that has assisted in locating a cost-effective pipeline route based on landcover type, wetland distribution, and proximity to other environmentally sensitive resources. Described here are the methods used to integrate various sources of available GIS data with satellite imagery and surveyed information. Costs of collecting and processing these data are compared with benefits of the system over use of manual methods.
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
Li, Weifeng; Ling, Wencui; Liu, Suoxiang; Zhao, Jing; Liu, Ruiping; Chen, Qiuwen; Qiang, Zhimin; Qu, Jiuhui
2011-01-01
Water leakage in drinking water distribution systems is a serious problem for many cities and a huge challenge for water utilities. An integrated system for the detection, early warning, and control of pipeline leakage has been developed and successfully used to manage the pipeline networks in selected areas of Beijing. A method based on the geographic information system has been proposed to quickly and automatically optimize the layout of the instruments which detect leaks. Methods are also proposed to estimate the probability of each pipe segment leaking (on the basis of historic leakage data), and to assist in locating the leakage points (based on leakage signals). The district metering area (DMA) strategy is used. Guidelines and a flowchart for establishing a DMA to manage the large-scale looped networks in Beijing are proposed. These different functions have been implemented into a central software system to simplify the day-to-day use of the system. In 2007 the system detected 102 non-obvious leakages (i.e., 14.2% of the total detected in Beijing) in the selected areas, which was estimated to save a total volume of 2,385,000 m3 of water. These results indicate the feasibility, efficiency and wider applicability of this system.
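One of the proposed functions, estimating the leak probability of each pipe segment from historic leakage data, can be sketched as a simple per-km-year rate (the data and names below are invented for illustration; the abstract does not specify the actual estimator used):

```python
# Illustrative sketch (assumed data): ranking pipe segments by leak
# likelihood from historic leakage records, as a leak rate per km-year.

records = {                      # segment id -> (leaks observed, km, years)
    "A1": (4, 2.5, 10),
    "B7": (1, 4.0, 10),
    "C3": (7, 3.0, 10),
}

def leak_rate(leaks, km, years):
    return leaks / (km * years)  # leaks per km per year

# Segments with the highest historic rate are prioritized for detection.
ranked = sorted(records, key=lambda s: leak_rate(*records[s]), reverse=True)
print(ranked)  # ['C3', 'A1', 'B7']
```

Within a DMA strategy, such a ranking would steer where leak-detection instruments are placed first.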
Historical analysis of US pipeline accidents triggered by natural hazards
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2015-04-01
Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records.
This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
The value of pipelines to the transportation system of Texas : year one report
DOT National Transportation Integrated Search
2000-10-01
Pipelines represent a major transporter of petrochemical commodities in Texas. The Texas pipeline system represents as much as 17% of the total pipeline mileage in the U.S. and links many segments of the country with energy sources located on the Gul...
78 FR 41991 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice; Issuance of Advisory... Gas and Hazardous Liquid Pipeline Systems. Subject: Potential for Damage to Pipeline Facilities Caused...
78 FR 57455 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...
Reliability-based management of buried pipelines considering external corrosion defects
NASA Astrophysics Data System (ADS)
Miran, Seyedeh Azadeh
Corrosion is one of the main deterioration mechanisms that degrade energy pipeline integrity, as pipelines transfer corrosive fluids or gases and interact with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. To ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed approach is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The growth models' unknown parameters are evaluated from the ILI data through Bayesian updating with the Markov Chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and are able to consider defects newly generated since the last inspection. Results of this part of the study show that both the depth and length growth models predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models, considering the prevailing uncertainties, where three failure modes, namely small leak, large leak and rupture, are considered. Performance of the pipeline is evaluated through the failure probability per km (called a sub-system), where each sub-system is considered as a series system of the detected and newly generated defects within that sub-system.
Sensitivity analysis is also performed to determine which of the parameters incorporated in the growth models the reliability of the studied pipeline is most sensitive to. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for prediction of the long-term performance of the pipeline; the impact of the statistical uncertainty in the model parameters is also significant and should be considered in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspection, repair, and failure. A repair is conducted when the failure probability from any of the described failure modes exceeds a pre-defined probability threshold after an inspection. Moreover, this study investigates the impact of repair threshold values and the unit costs of inspection and failure on the expected total life-cycle cost and the optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but can lower the failure cost; the repair cost is less significant compared with the inspection and failure costs.
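As a minimal illustration of this kind of reliability calculation, the sketch below combines a power-law depth-growth model with a Monte Carlo estimate of the probability that a defect exceeds a leak criterion. All parameter values are assumed for illustration, not the paper's fitted values, and the real study uses Bayesian-updated parameters and three distinct failure modes.

```python
import math
import random

# Illustrative sketch: power-law defect depth growth d(t) = a * (t - t0)^b
# with uncertain (a, b), and a Monte Carlo estimate of the probability that
# depth exceeds 80% of wall thickness by time t. Parameters are assumed.

WALL = 10.0          # wall thickness, mm (assumed)
T0 = 0.0             # defect initiation time, years
FAIL_FRAC = 0.8      # leak criterion: depth exceeds 80% of wall

def sample_depth(t, rng):
    a = rng.lognormvariate(math.log(0.8), 0.3)  # growth coefficient, mm/yr^b
    b = rng.normalvariate(0.9, 0.1)             # growth exponent
    return a * (t - T0) ** b

def failure_probability(t, n=20_000, seed=42):
    rng = random.Random(seed)
    fails = sum(sample_depth(t, rng) > FAIL_FRAC * WALL for _ in range(n))
    return fails / n

# Failure probability rises with time as defects deepen.
for year in (5, 10, 15):
    print(year, failure_probability(year))
```

In the actual study this per-defect probability would be aggregated into a series-system probability per km and fed into the life-cycle cost optimization.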
Automatic Generalizability Method of Urban Drainage Pipe Network Considering Multi-Features
NASA Astrophysics Data System (ADS)
Zhu, S.; Yang, Q.; Shao, J.
2018-05-01
Urban drainage systems are an indispensable dataset for storm-flood simulation. Given data availability and current computing power, the structure and complexity of urban drainage systems must be simplified. To date, however, the simplification procedure has depended mainly on manual operation, which leads to mistakes and low work efficiency. This work draws on the classification methodology used for road systems and proposes the concept of a pipeline stroke. Pipeline length, the angle between two pipelines, the road level to which a pipeline belongs, and pipeline diameter are chosen as the similarity criteria used to generate pipeline strokes. Finally, an automatic method is designed to generalize drainage systems while accounting for these multiple features. This technique improves the efficiency and accuracy of the generalization of drainage systems and benefits the study of urban storm floods.
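The stroke-building criterion can be sketched as a pairwise merge test on connected segments (thresholds and function names below are assumptions for illustration, not values from the paper):

```python
import math

# Toy sketch of the "pipeline stroke" idea: two connected segments are merged
# into one stroke when the deflection angle between them is small and their
# diameters are similar. Thresholds are assumed for illustration.

ANGLE_THRESHOLD = 30.0   # degrees (assumed)
DIAMETER_RATIO = 0.8     # minimum diameter similarity (assumed)

def azimuth(p, q):
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def same_stroke(seg_a, seg_b, dia_a, dia_b):
    """seg_* are ((x1, y1), (x2, y2)), with seg_a ending where seg_b starts."""
    deflection = abs(azimuth(*seg_a) - azimuth(*seg_b)) % 360
    deflection = min(deflection, 360 - deflection)
    similar_dia = min(dia_a, dia_b) / max(dia_a, dia_b) >= DIAMETER_RATIO
    return deflection <= ANGLE_THRESHOLD and similar_dia

# A nearly straight continuation merges; a right-angle branch does not.
print(same_stroke(((0, 0), (1, 0)), ((1, 0), (2, 0.1)), 300, 300))  # True
print(same_stroke(((0, 0), (1, 0)), ((1, 0), (1, 1)), 300, 300))    # False
```

Chaining this test along connected pipes groups the network into strokes, which are then ranked (e.g. by length or road level) to decide what survives generalization.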
Sensor and transmitter system for communication in pipelines
Cooper, John F.; Burnham, Alan K.
2013-01-29
A system for sensing and communicating in a pipeline that contains a fluid. An acoustic signal containing information about a property of the fluid is produced in the pipeline. The signal is transmitted through the pipeline. The signal and its information are received and used by a control system.
DOT National Transportation Integrated Search
1997-07-14
These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.
2017-01-01
The experience acquired through the development, implementation, and operation of the Kepler/K2 science pipelines can provide lessons learned for the development of science pipelines for other missions, such as NASA's Transiting Exoplanet Survey Satellite and ESA's PLATO mission.
Assessing fugitive emissions of CH4 from high-pressure gas pipelines in the UK
NASA Astrophysics Data System (ADS)
Clancy, S.; Worrall, F.; Davies, R. J.; Almond, S.; Boothroyd, I.
2016-12-01
Concern over the greenhouse gas impact of the exploitation of unconventional natural gas from shale deposits has shone a spotlight on the entire hydrocarbon industry. Numerous studies have developed life-cycle emissions inventories to assess the impact that hydraulic fracturing has upon greenhouse gas emissions. Incorporated within life-cycle assessments are transmission and distribution losses, including infrastructure such as pipelines and the compressor stations that pressurise natural gas for transport along pipelines. Estimates of fugitive emissions from transmission, storage and distribution have been criticized for their reliance on old data from inappropriate sources (1970s Russian gas pipelines). In this study, we investigate fugitive emissions of CH4 from the UK high-pressure national transmission system. The study took two approaches. First, CH4 concentration was measured by driving along roads bisecting high-pressure gas pipelines, and along an equivalent distance on routes with no high-pressure gas pipeline nearby. Five pipeline routes and five equivalent control routes were driven, and the test was that CH4 measurements, when adjusted for distance and wind speed, should be greater on any route with a pipe than on any route without one. Second, 5 km of a high-pressure gas pipeline and 5 km of equivalent farmland were walked, and soil gas (above the pipeline where present) was analysed every 7 m using a tunable diode laser. In total, 92 km of high-pressure pipeline and 72 km of control route were driven over a 10-day period. When adjusted for wind and distance, CH4 fluxes were significantly greater on routes with a pipeline than on those without. The smallest detectable leak was 3% above ambient (1.03 relative concentration); any reading below 3% above ambient was assumed to be ambient.
The number of leaks detected along the pipelines correlates with the estimated length of pipe joints, implying constant fugitive CH4 emissions from these joints. Scaled up to the UK National Transmission System's pipeline length of 7,600 km, this gives a fugitive CH4 flux of 62.6 kt CH4/yr, with a CO2 equivalent of 1,570 kt CO2eq/yr; this fugitive emission from high-pressure pipelines is 0.14% of the annual gas supply.
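The scale-up arithmetic in the abstract can be checked directly. Assuming a 100-year global warming potential of about 25 for CH4 (the IPCC AR4 value, which is an assumption here; the abstract does not state which GWP was used) reproduces the quoted CO2-equivalent figure:

```python
# Reproducing the abstract's scale-up arithmetic: a fugitive flux of
# 62.6 kt CH4/yr over the 7,600 km network, converted to CO2 equivalent.

GWP_CH4 = 25                       # kg CO2-eq per kg CH4 (assumed AR4 value)
ch4_flux_kt = 62.6                 # kt CH4 per year
co2_eq_kt = ch4_flux_kt * GWP_CH4
print(round(co2_eq_kt))            # 1565, i.e. ~1,570 kt CO2-eq/yr as quoted

per_km = ch4_flux_kt * 1e6 / 7600  # kg CH4 per km per year
print(round(per_km))               # 8237
```

The rounding to 1,570 in the abstract is consistent with a GWP near 25; a more recent GWP (e.g. 28 or higher) would give a proportionally larger figure.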
Oil and gas pipeline construction cost analysis and developing regression models for cost estimation
NASA Astrophysics Data System (ADS)
Thaduri, Ravi Kiran
In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, right-of-way (ROW), and miscellaneous costs make up the total cost of a pipeline construction project. The pipelines are analyzed by pipeline length, diameter, location, pipeline volume, and year of completion. In pipeline construction, labor costs dominate the total costs, with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths, and locations. The compressor stations are analyzed by capacity, year of completion, and location. Unlike pipeline costs, material costs dominate the total costs in the construction of a compressor station, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
Improving the result of forecasting using reservoir and surface network simulation
NASA Astrophysics Data System (ADS)
Hendri, R. S.; Winarta, J.
2018-01-01
This study aimed to obtain more representative production forecasts using integrated simulation of the pipeline gathering system of the X field. There are five main scenarios, consisting of production forecasts for the existing condition, workover, and infill drilling; the best development scenario is then determined. The method of this study couples a reservoir simulator with a pipeline simulator, a so-called Integrated Reservoir and Surface Network Simulation. Well data from the reservoir simulator were integrated with those of the pipeline network simulator to construct a new schedule, which served as the input for the whole simulation procedure. The well design produced by the well modeling simulator was exported into the pipeline simulator. The reservoir prediction depends on the minimum value of tubing head pressure (THP) for each well, where the pressure drop in the gathering network is not otherwise calculated. The same scenarios were also run as single-reservoir simulations. The integrated simulation produces results approaching the actual condition of the reservoir, as confirmed by the THP profiles, which differ between the two methods. The difference between the integrated simulation and the single-model simulation is 6-9%. The aim of solving the back-pressure problem in the pipeline gathering system of the X field was achieved.
The visual and radiological inspection of a pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.
1995-01-01
In the 1950s, the Savannah River Site built an open, unlined retention basin to temporarily store potentially radionuclide-contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility that stored large quantities of nuclear fission byproducts in carbon steel tanks. The retention basin was retired from service in 1972 when a new, lined basin was completed. In 1978, the old retention basin was excavated, backfilled with uncontaminated dirt, and covered with grass. At the same time, much of the underground process pipeline leading to the basin was abandoned. Since the closure of the retention basin, new environmental regulations require that the basin undergo further assessment to determine whether additional remediation is required. A visual and radiological inspection of the pipeline was necessary to aid in the remediation decision-making process for the retention basin system. A teleoperated pipe crawler inspection system was developed to survey the abandoned sections of underground pipelines leading to the retired retention basin. This paper will describe the background to this project, the scope of the investigation, the equipment requirements, and the results of the pipeline inspection.
The inspection of a radiologically contaminated pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.A.
1995-08-01
In the 1950s, the Savannah River Site built an open, unlined retention basin to temporarily store potentially radionuclide-contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility that stored large quantities of nuclear fission byproducts in carbon steel tanks. The retention basin was retired from service in 1972 when a new, lined basin was completed. In 1978, the old retention basin was excavated, backfilled with uncontaminated dirt, and covered with grass. At the same time, much of the underground process pipeline leading to the basin was abandoned. Since the closure of the retention basin, new environmental regulations require that the basin undergo further assessment to determine whether additional remediation is required. A visual and radiological inspection of the pipeline was necessary to aid in the remediation decision-making process for the retention basin system. A teleoperated pipe crawler inspection system was developed to survey the abandoned sections of underground pipelines leading to the retired retention basin. This paper will describe the background to this project, the scope of the investigation, the equipment requirements, and the results of the pipeline inspection.
Pipeline systems - safety for assets and transport regularity
DOT National Transportation Integrated Search
1997-01-01
This review regarding safety for assets and financial interests for pipeline systems has shown how this aspect has been addressed in the existing petroleum legislation. It has been demonstrated that the integrity of pipeline systems with the res...
New Research on MEMS Acoustic Vector Sensors Used in Pipeline Ground Markers
Song, Xiaopeng; Jian, Zeming; Zhang, Guojun; Liu, Mengran; Guo, Nan; Zhang, Wendong
2015-01-01
According to the demands of current pipeline detection systems, the above-ground marker (AGM) system based on the sound detection principle has become a major development trend in pipeline technology. A novel MEMS acoustic vector sensor for AGM systems, which has the advantages of high sensitivity, high signal-to-noise ratio (SNR), and good low-frequency performance, is put forward. First, it is shown that the frequency of the detected sound signal is concentrated in a low frequency range, and that sound attenuation is relatively low in soil. Second, the MEMS acoustic vector sensor's structure and basic principles are introduced. Finally, experimental tests are conducted, and the results show that in the range of 0°∼90°, when r = 5 m, the proposed MEMS acoustic vector sensor can effectively detect sound signals in soil. The measurement errors of all angles are less than 5°. PMID:25609046
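The bearing estimation performed with a two-axis acoustic vector sensor can be sketched as follows. This is a simplified illustration with simulated signals and an assumed least-squares projection estimator; the abstract does not describe the sensor's actual signal processing.

```python
import math
import random

def estimate_azimuth(ch_x, ch_y):
    """Estimate the source bearing from two orthogonal particle-velocity
    channels: least-squares projection of the y channel onto the x channel,
    then atan2. Valid for bearings in the 0-90 degree quadrant."""
    sxx = sum(a * a for a in ch_x)
    sxy = sum(a * b for a, b in zip(ch_x, ch_y))
    return math.degrees(math.atan2(sxy, sxx))

true_deg = 37.0
rnd = random.Random(0)
tone = [math.sin(2 * math.pi * 50 * i / 1000) for i in range(1000)]  # 50 Hz
ch_x = [v * math.cos(math.radians(true_deg)) + rnd.gauss(0, 0.01) for v in tone]
ch_y = [v * math.sin(math.radians(true_deg)) + rnd.gauss(0, 0.01) for v in tone]
est = estimate_azimuth(ch_x, ch_y)  # close to the true 37-degree bearing
```

With the modest noise level simulated here, the angular error stays well inside the 5° bound reported in the paper.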
NASA Astrophysics Data System (ADS)
Kyrychok, Vladyslav; Torop, Vasyl
2018-03-01
The present paper is devoted to the problem of assessing probable crack growth in the nozzle zone of pressure vessels under cyclic seismic loads. Approaches to modeling distributed pipeline systems connected to equipment are proposed. The possibility of using different finite element program packages in combination for accurate estimation of the strength of bonded pipeline and pressure vessel systems is shown and justified. The authors propose checking the danger of defects in the nozzle domain and evaluating the residual life of the system based on the developed approach.
East Spar: Alliance approach for offshore gasfield development
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-01
East Spar is a gas/condensate field 25 miles west of Barrow Island, offshore Western Australia. Proved plus probable reserves at the time of development were estimated at 430 Bcf gas and 28 million bbl of condensate. The field was discovered in early 1993, when the Western Australia gas market was deregulated and the concept of a gas pipeline to the gold fields was proposed. This created a window of opportunity for East Spar, but only if plans could be established quickly. A base-case development plan was established to support gas marketing while alternative plans were developed in parallel. The completed East Spar facilities comprise two subsea wells, a subsea gathering system, and a multiphase (gas/condensate/water) pipeline to new gas-processing facilities. The subsea facilities are controlled through a navigation, communication, and control (NCC) buoy. The control room and gas-processing plant are 39 miles east of the field on Varanus Island. Sales gas is exported through a pre-existing gas-sales pipeline to the Dampier-Bunbury and Goldfields Gas Transmission pipelines. Condensate is stored in and exported by use of pre-existing facilities on Varanus Island. Field development from approval to first production took 22 months. The paper describes the field development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADE is a software package for rapidly assembling analytic pipelines to manipulate data. The package consists of the engine that manages the data and coordinates the movement of data between the tasks performing a function; a set of core libraries consisting of plugins that perform common tasks; and a framework to extend the system, supporting the development of new plugins. Currently, through configuration files, a pipeline can be defined that maps the routing of data through a series of plugins. Pipelines can be run in batch mode or can process streaming data; they can be executed from the command line or run through a Windows background service. There currently exist over a hundred plugins and over fifty pipeline configurations, and the software is now being used by about a half-dozen projects.
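The plugin-routing idea described above can be illustrated with a minimal sketch. The plugin names and record format are hypothetical; this is not SADE's actual API, only an illustration of routing records through a series of plugins.

```python
class Pipeline:
    """Minimal plugin pipeline: each plugin is a callable that takes a
    record and returns a transformed record, or None to drop it."""

    def __init__(self, plugins):
        self.plugins = plugins

    def run(self, records):
        for rec in records:
            for plugin in self.plugins:
                rec = plugin(rec)
                if rec is None:
                    break  # record dropped; skip remaining plugins
            else:
                yield rec

# hypothetical plugins: parse, filter, annotate
to_int = lambda r: dict(r, value=int(r["value"]))
drop_negative = lambda r: r if r["value"] >= 0 else None
tag = lambda r: dict(r, source="demo")

pipe = Pipeline([to_int, drop_negative, tag])
out = list(pipe.run([{"value": "3"}, {"value": "-1"}, {"value": "7"}]))
# the negative record is dropped; the other two pass through all plugins
```

Because `run` is a generator, the same pipeline serves both batch lists and streaming iterables, matching the batch/streaming duality mentioned in the abstract.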
75 FR 53733 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2010-0246] Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous... liquefied natural gas, hazardous liquid, and gas transmission pipeline systems operated by a company. The...
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, P.J.
1991-12-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de-novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package includes a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems after the necessary external programs are installed.
DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... PHMSA-2013-0248] Pipeline Safety: Random Drug Testing Rate; Contractor Management Information System Reporting; and Obtaining Drug and Alcohol Management Information System Sign-In Information AGENCY: Pipeline... Management Information System (MIS) Data; and New Method for Operators to Obtain User Name and Password for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childers, M.; Barnes, J.
The phased field development of the Lion and Panthere fields, offshore the Ivory Coast, includes a small floating production, storage, and offloading (FPSO) tanker with minimal processing capability as an early oil production system (EPS). For the long-term production scheme, the FPSO will be replaced by a converted jack up mobile offshore production system (MOPS) with full process equipment. The development also includes guyed-caisson well platforms, pipeline export for natural gas to fuel an onshore power plant, and a floating storage and offloading (FSO) tanker for oil export. Pipeline export for oil is a future possibility. This array of innovative strategies and techniques seldom has been brought together in a single project. The paper describes the development plan, early oil, jack up MOPS, and transport and installation.
Pipelines subject to slow landslide movements: Structural modeling vs field measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruschi, R.; Glavina, S.; Spinazze, M.
1996-12-01
In recent years finite element techniques have been increasingly used to investigate the behavior of buried pipelines subject to soil movements. The use of these tools provides a rational basis for the definition of minimum wall thickness requirements in landslide crossings. Furthermore, the design of mitigation measures or monitoring systems which control the development of undesirable strains in the pipe wall over time requires a detailed structural modeling. The scope of this paper is to discuss the use of dedicated structural modeling with relevant calibration to field measurements. The strain measurements used were regularly gathered from pipe sections, in two different sites, over a period of time long enough to record changes of axial strain due to soil movement. Detailed structural modeling of pipeline layout in both sites and for operating conditions is applied. Numerical simulations show the influence of the distribution of soil movement acting on the pipeline with regards to the state of strain which can be developed in certain locations. The role of soil nature and direction of relative movements in the definition of loads transferred to the pipeline is also discussed.
Experimental and analytical study of water pipe's rupture for damage identification purposes
NASA Astrophysics Data System (ADS)
Papakonstantinou, Konstantinos G.; Shinozuka, Masanobu; Beikae, Mohsen
2011-04-01
A malfunction, local damage or sudden pipe break of a pipeline system can trigger significant flow variations. As shown in the paper, pressure variations and pipe vibrations are two strongly correlated parameters. A sudden change in the flow velocity and pressure of a pipeline system can induce pipe vibrations. Thus, based on acceleration data, a rapid detection and localization of a possible damage may be carried out by inexpensive, nonintrusive monitoring techniques. To illustrate this approach, an experiment on a single pipe was conducted in the laboratory. Pressure gauges and accelerometers were installed and their correlation was checked during an artificially created transient flow. The experimental findings validated the correlation between the parameters. The interaction between pressure variations and pipe vibrations was also theoretically justified. The developed analytical model explains the connection among flow pressure, velocity, pressure wave propagation and pipe vibration. The proposed method provides a rapid, efficient and practical way to identify and locate sudden failures of a pipeline system and sets firm foundations for the development and implementation of an advanced, new generation Supervisory Control and Data Acquisition (SCADA) system for continuous health monitoring of pipe networks.
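The correlation check between pressure records and pipe vibration records that underpins this approach can be sketched with a Pearson correlation on synthetic signals. The damped oscillation and the perfectly linear pressure-acceleration coupling are assumptions for illustration; field data would show weaker, noisier correlation.

```python
import math

def pearson(a, b):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# synthetic transient: a decaying pressure oscillation after a sudden break,
# with pipe acceleration modeled (for illustration only) as linearly coupled
t = [i / 500 for i in range(1000)]
pressure = [math.exp(-2 * ti) * math.sin(2 * math.pi * 5 * ti) for ti in t]
accel = [0.8 * p for p in pressure]
r = pearson(pressure, accel)  # exactly linear coupling gives r = 1
```

In a monitoring setting, a high correlation between nearby pressure and accelerometer channels during a transient is the signature that locates the disturbance.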
Computer models of complex multiloop branched pipeline systems
NASA Astrophysics Data System (ADS)
Kudinov, I. V.; Kolesnikov, S. V.; Eremin, A. V.; Branfileva, A. N.
2013-11-01
This paper describes the principal theoretical concepts of the method used for constructing computer models of complex multiloop branched pipeline networks; this method is based on the theory of graphs and Kirchhoff's two laws for electrical circuits. The models make it possible to calculate velocities, flow rates, and pressures of a fluid medium in any section of pipeline networks, when the latter are considered as single hydraulic systems. On the basis of multivariant calculations the reasons for existing problems can be identified, the least costly methods of their elimination can be proposed, and recommendations for planning the modernization of pipeline systems and construction of their new sections can be made. The results obtained can be applied to complex pipeline systems intended for various purposes (water pipelines, petroleum pipelines, etc.). The operability of the model has been verified on an example of designing a unified computer model of the heat network for centralized heat supply of the city of Samara.
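A classic way to balance flows in one loop of such a network, consistent with Kirchhoff's second law (head losses around a closed loop sum to zero), is the Hardy Cross method, sketched here for a single two-pipe loop. This is a named stand-in for illustration; the paper's graph-based formulation generalizes to multiloop branched systems.

```python
def hardy_cross(resistances, flows, signs, iters=50):
    """Balance flows around one loop: head loss h = r*Q*|Q| per pipe,
    loop correction dQ = -sum(h) / sum(2*r*|Q|), applied until the head
    losses around the loop cancel (Kirchhoff's second law)."""
    q = list(flows)
    for _ in range(iters):
        h = sum(s * r * qi * abs(qi) for r, qi, s in zip(resistances, q, signs))
        grad = sum(2 * r * abs(qi) for r, qi in zip(resistances, q))
        dq = -h / grad
        q = [qi + s * dq for qi, s in zip(q, signs)]
    return q

# two parallel pipes between the same nodes, traversed clockwise (+)
# and counterclockwise (-); total inflow of 10 is initially split evenly
resist = [4.0, 1.0]                # hydraulic resistances
q = hardy_cross(resist, [5.0, 5.0], signs=[+1, -1])
# at balance 4*q1**2 == q2**2 and q1 + q2 == 10, so q1 = 10/3, q2 = 20/3
```

Each correction preserves node flow balance (Kirchhoff's first law) by construction, since the same dQ is added and subtracted along the loop.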
Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing
NASA Astrophysics Data System (ADS)
Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson
2014-07-01
As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, and operators who can create master calibration products and produce standardized calibrated data, with a short turn-around time. Upon completion, the data are ingested into the archive and portal, and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products all while allowing the user to monitor the process status, and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.
NASA Astrophysics Data System (ADS)
Vetrov, A.
2009-05-01
The condition of underground constructions, communication, and supply systems in cities has to be periodically monitored and controlled in order to prevent breakage, which can result in serious accidents, especially in urban areas. At most risk of damage are underground constructions made of steel, such as the pipelines widely used for water, gas, and heat supply. To ensure pipeline survivability it is necessary to carry out operative and inexpensive monitoring of pipeline condition. Induced electromagnetic methods of geophysics can be applied to provide such diagnostics. The highly developed surface in urban areas is one cause hampering the application of electromagnetic diagnostic methods. The main problem is finding an appropriate place for the source line and electrodes on a limited surface area, and their optimal position relative to the observation path, to minimize their influence on the observed data. The author made a number of experiments on diagnostics of an underground heating system pipeline using different positions of the source line and electrodes. The experiments were made on a 200-meter section over a 2-meter-deep pipeline. The admissible length of the source line and the angle between the source line and the observation path were determined. The minimal length of the source line for the experimental conditions and accuracy was 30 meters; the maximum admissible angle departure from the perpendicular position was 30 degrees. The work was undertaken in cooperation with the diagnostics company DIsSO, Saint-Petersburg, Russia.
Ye, Weixing; Zhu, Lei; Liu, Yingying; Crickmore, Neil; Peng, Donghai; Ruan, Lifang; Sun, Ming
2012-07-01
We have designed a high-throughput system for the identification of novel crystal protein genes (cry) from Bacillus thuringiensis strains. The system was developed with two goals: (i) to acquire the mixed plasmid-enriched genomic sequence of B. thuringiensis using next-generation sequencing biotechnology, and (ii) to identify cry genes with a computational pipeline (using BtToxin_scanner). In our pipeline method, we employed three different kinds of well-developed prediction methods, BLAST, hidden Markov model (HMM), and support vector machine (SVM), to predict the presence of Cry toxin genes. The pipeline proved to be fast (average speed, 1.02 Mb/min for proteins and open reading frames [ORFs] and 1.80 Mb/min for nucleotide sequences), sensitive (it detected 40% more protein toxin genes than a keyword extraction method using genomic sequences downloaded from GenBank), and highly specific. Twenty-one strains from our laboratory's collection were selected based on their plasmid pattern and/or crystal morphology. The plasmid-enriched genomic DNA was extracted from these strains and mixed for Illumina sequencing. The sequencing data were de novo assembled, and a total of 113 candidate cry sequences were identified using the computational pipeline. Twenty-seven candidate sequences were selected on the basis of their low level of sequence identity to known cry genes, and eight full-length genes were obtained with PCR. Finally, three new cry-type genes (primary ranks) and five cry holotypes, which were designated cry8Ac1, cry7Ha1, cry21Ca1, cry32Fa1, and cry21Da1 by the B. thuringiensis Toxin Nomenclature Committee, were identified. The system described here is both efficient and cost-effective and can greatly accelerate the discovery of novel cry genes.
HESP: Instrument control, calibration and pipeline development
NASA Astrophysics Data System (ADS)
Anantha, Ch.; Roy, Jayashree; Mahesh, P. K.; Parihar, P. S.; Sangal, A. K.; Sriram, S.; Anand, M. N.; Anupama, G. C.; Giridhar, S.; Prabhu, T. P.; Sivarani, T.; Sundararajan, M. S.
Hanle Echelle SPectrograph (HESP) is a fibre-fed, high-resolution (R = 30,000 and 60,000) spectrograph being developed for the 2m HCT telescope at IAO, Hanle. The major components of the instrument are (a) the Cassegrain unit and (b) the spectrometer. An instrument control system interacting with a guiding unit at the Cassegrain interface as well as handling spectrograph functions is being developed. On-axis auto-guiding using the spill-over angular ring around the input pinhole is also being developed. The stellar light from the Cassegrain unit is taken to the spectrograph using an optical fiber, which is being characterized for spectral transmission, focal ratio degradation, and scrambling properties. The design of the thermal enclosure and thermal control for the spectrograph housing is presented. A data pipeline for the entire echelle spectral reduction is being developed. We also plan to implement an instrument physical-model-based calibration into the main data pipeline and in the maintenance and quality control operations.
A Review of Fatigue Crack Growth for Pipeline Steels Exposed to Hydrogen
Nanninga, N.; Slifka, A.; Levy, Y.; White, C.
2010-01-01
Hydrogen pipeline systems offer an economical means of storing and transporting energy in the form of hydrogen gas. Pipelines can be used to transport hydrogen that has been generated at solar and wind farms to and from salt cavern storage locations. In addition, pipeline transportation systems will be essential before widespread hydrogen fuel cell vehicle technology becomes a reality. Since hydrogen pipeline use is expected to grow, the mechanical integrity of these pipelines will need to be validated under the presence of pressurized hydrogen. This paper focuses on a review of the fatigue crack growth response of pipeline steels when exposed to gaseous hydrogen environments. Because of defect-tolerant design principles in pipeline structures, it is essential that designers consider hydrogen-assisted fatigue crack growth behavior in these applications. PMID:27134796
78 FR 14877 - Pipeline Safety: Incident and Accident Reports
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-07
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2013-0028] Pipeline Safety: Incident and Accident Reports AGENCY: Pipeline and Hazardous Materials... PHMSA F 7100.2--Incident Report--Natural and Other Gas Transmission and Gathering Pipeline Systems and...
Method for Stereo Mapping Based on Objectarx and Pipeline Technology
NASA Astrophysics Data System (ADS)
Liu, F.; Chen, T.; Lin, Z.; Yang, Y.
2012-07-01
Stereo mapping is an important way to acquire 4D production. Based on the development of the stereo mapping and the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme which can realize the interaction between the AutoCAD and digital photogrammetry system is offered by ObjectARX and pipeline technology. An experiment is made in order to make sure the feasibility with the example of the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation), the experimental results show that this scheme is feasible and it has very important meaning for the realization of the acquisition and edit integration.
Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration
2018-05-01
The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
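The early variable-versus-transient categorization stage can be illustrated with a toy nearest-centroid classifier. The two-dimensional features and the training points are hypothetical; the actual pipeline uses richer multiband photometric features and more capable classifiers.

```python
import math

def centroid(points):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify(x, centroids):
    """Assign a feature vector to the class with the nearest centroid."""
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

# hypothetical 2-D features per alert, e.g. (amplitude, recurrence score):
# transients brighten once and fade; variables keep recurring
train = {
    "transient": [(1.2, 0.10), (1.5, 0.20), (0.9, 0.15)],
    "variable":  [(0.4, 0.90), (0.6, 1.10), (0.5, 0.80)],
}
cents = {label: centroid(pts) for label, pts in train.items()}
label = classify((1.1, 0.2), cents)  # lands nearest the transient centroid
```

A real broker would replace the centroid rule with a trained model, but the interface is the same: features in, class label out, fast enough for a live alert stream.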
Manifold Coal-Slurry Transport System
NASA Technical Reports Server (NTRS)
Liddle, S. G.; Estus, J. M.; Lavin, M. L.
1986-01-01
Feeding several slurry pipes into main pipeline reduces congestion in coal mines. System based on manifold concept: feeder pipelines from each working entry joined to main pipeline that carries coal slurry out of panel and onto surface. Manifold concept makes coal-slurry haulage much simpler than existing slurry systems.
43 CFR 2801.9 - When do I need a grant?
Code of Federal Regulations, 2014 CFR
2014-10-01
..., pipelines, tunnels, and other systems which impound, store, transport, or distribute water; (2) Pipelines and other systems for transporting or distributing liquids and gases, other than water and other than... and terminal facilities used in connection with them; (3) Pipelines, slurry and emulsion systems, and...
43 CFR 2801.9 - When do I need a grant?
Code of Federal Regulations, 2012 CFR
2012-10-01
..., pipelines, tunnels, and other systems which impound, store, transport, or distribute water; (2) Pipelines and other systems for transporting or distributing liquids and gases, other than water and other than... and terminal facilities used in connection with them; (3) Pipelines, slurry and emulsion systems, and...
43 CFR 2801.9 - When do I need a grant?
Code of Federal Regulations, 2011 CFR
2011-10-01
..., pipelines, tunnels, and other systems which impound, store, transport, or distribute water; (2) Pipelines and other systems for transporting or distributing liquids and gases, other than water and other than... and terminal facilities used in connection with them; (3) Pipelines, slurry and emulsion systems, and...
43 CFR 2801.9 - When do I need a grant?
Code of Federal Regulations, 2013 CFR
2013-10-01
..., pipelines, tunnels, and other systems which impound, store, transport, or distribute water; (2) Pipelines and other systems for transporting or distributing liquids and gases, other than water and other than... and terminal facilities used in connection with them; (3) Pipelines, slurry and emulsion systems, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0097] Pipeline Safety: Reminder of Requirements for Liquefied Petroleum Gas and Utility Liquefied Petroleum Gas Pipeline Systems AGENCY: Pipeline and Hazardous Materials Safety Administration...
Methods of increasing efficiency and maintainability of pipeline systems
NASA Astrophysics Data System (ADS)
Ivanov, V. A.; Sokolov, S. M.; Ogudova, E. V.
2018-05-01
This study is dedicated to the issue of pipeline transportation system maintenance. The article identifies two classes of technical-and-economic indices, which are used to select an optimal pipeline transportation system structure. Further, the article determines various system maintenance strategies and strategy selection criteria. However, these maintenance strategies turn out to be insufficiently effective due to non-optimal maintenance intervals. This problem could be solved by running an adaptive maintenance system, which includes a pipeline transportation system reliability improvement algorithm, in particular an equipment degradation computer model. In conclusion, three model-building approaches for determining the optimal duration of technical system verification inspections were considered.
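The idea of choosing an optimal inspection interval can be sketched with a simple long-run cost-rate model. The cost figures and the linear-risk assumption are illustrative only, not the article's actual degradation model.

```python
import math

def cost_rate(T, c_insp, c_fail, lam):
    """Long-run cost per unit time for inspection interval T: inspection
    cost amortized over T, plus expected failure cost assuming defects
    arrive at rate lam and go undetected for T/2 on average."""
    return c_insp / T + c_fail * lam * T / 2

c_insp, c_fail, lam = 100.0, 5000.0, 0.01  # hypothetical cost figures
# the cost rate is convex in T; its analytic minimum is
t_star = math.sqrt(2 * c_insp / (c_fail * lam))  # = 2.0 time units here
# a grid search over candidate intervals recovers the same optimum
best = min((cost_rate(k / 10, c_insp, c_fail, lam), k / 10)
           for k in range(1, 2001))[1]
```

Too-short intervals waste inspection effort; too-long intervals let degradation accumulate, which is exactly the trade-off the adaptive maintenance system is meant to tune.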
Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua
2017-11-24
Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community.
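The component-chaining idea behind toolkits like CLAMP can be illustrated with a minimal sketch, assuming a hypothetical pipeline in which each stage enriches a shared document record. None of the function names or the tiny lexicon below are the actual CLAMP API; they are invented for illustration only.

```python
# Hypothetical sketch of a component-based clinical NLP pipeline in the
# spirit of CLAMP: each stage consumes and enriches a shared document dict.
# All names here are illustrative, not the actual CLAMP interface.

def tokenize(doc):
    doc["tokens"] = doc["text"].split()
    return doc

def tag_entities(doc):
    # Toy named-entity recognizer: flag tokens found in a tiny lexicon.
    lexicon = {"smoker": "SMOKING_STATUS", "hemoglobin": "LAB_TEST"}
    doc["entities"] = [(t, lexicon[t.lower()])
                       for t in doc["tokens"] if t.lower() in lexicon]
    return doc

def run_pipeline(text, stages):
    # A "customized pipeline" is just an ordered list of stages.
    doc = {"text": text}
    for stage in stages:
        doc = stage(doc)
    return doc

doc = run_pipeline("Patient is a smoker with low hemoglobin",
                   [tokenize, tag_entities])
```

Swapping, reordering, or inserting stages in the list is what a graphical pipeline builder exposes to non-programmers.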
MetaStorm: A Public Resource for Customizable Metagenomics Annotation
Arango-Argoty, Gustavo; Singh, Gargi; Heath, Lenwood S.; Pruden, Amy; Xiao, Weidong; Zhang, Liqing
2016-01-01
Metagenomics is a trending research area, calling for the need to analyze large quantities of data generated from next generation DNA sequencing technologies. The need to store, retrieve, analyze, share, and visualize such data challenges current online computational systems. Interpretation and annotation of specific information is especially a challenge for metagenomic data sets derived from environmental samples, because current annotation systems only offer broad classification of microbial diversity and function. Moreover, existing resources are not configured to readily address common questions relevant to environmental systems. Here we developed a new online user-friendly metagenomic analysis server called MetaStorm (http://bench.cs.vt.edu/MetaStorm/), which facilitates customization of computational analysis for metagenomic data sets. Users can upload their own reference databases to tailor the metagenomics annotation to focus on various taxonomic and functional gene markers of interest. MetaStorm offers two major analysis pipelines: an assembly-based annotation pipeline and the standard read annotation pipeline used by existing web servers. These pipelines can be selected individually or together. Overall, MetaStorm provides enhanced interactive visualization to allow researchers to explore and manipulate taxonomy and functional annotation at various levels of resolution. PMID:27632579
The optimization of design parameters for surge relief valve for pipeline systems
NASA Astrophysics Data System (ADS)
Kim, Hyunjun; Hur, Jisung; Kim, Sanghyun
2017-06-01
Surge is an abnormal pressure induced by rapid changes of flow rate in pipeline systems. In order to protect pipeline systems from surge pressure, various hydraulic devices have been developed. The surge-relief valve (SRV) is one of the most widely applied devices for controlling surge because of its feasibility in application, efficiency, and cost-effectiveness. An SRV is designed to open automatically under abnormal pressure and discharge flow, dropping the system pressure to an allowable level. The performance of the SRV is influenced by hydraulics. According to previous studies, several factors determine the performance of the SRV, including design parameters (e.g., size of the valve), system parameters (e.g., number and location of the valves), and operation parameters (e.g., set point and operation time). Therefore, systematic consideration of the factors affecting SRV performance is required for proper installation of an SRV in the system. In this study, a methodology for finding optimum parameters of the SRV is explored through the integration of a genetic algorithm (GA) into surge analysis.
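The GA-based search described above can be sketched as follows. The quadratic `peak_pressure` objective is an invented stand-in for a real transient (water-hammer) simulation, and all parameter ranges are illustrative, not the study's values:

```python
import random

random.seed(1)  # reproducible toy run

# Invented toy objective: peak surge pressure as a function of the valve
# set point (bar) and valve diameter (mm). A real study would evaluate
# each candidate with a transient hydraulic model.
def peak_pressure(set_point, diameter):
    return (set_point - 6.0) ** 2 + 0.01 * (diameter - 150.0) ** 2

def evolve(pop_size=30, generations=60):
    # Random initial population over plausible design ranges.
    pop = [(random.uniform(1, 12), random.uniform(50, 300))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: peak_pressure(*p))
        parents = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Averaging crossover plus small Gaussian mutation.
            children.append(((a[0] + b[0]) / 2 + random.gauss(0, 0.1),
                             (a[1] + b[1]) / 2 + random.gauss(0, 2.0)))
        pop = parents + children
    return min(pop, key=lambda p: peak_pressure(*p))

best = evolve()
```

Because the elite half of each generation survives unchanged, the best candidate can only improve, and the population converges toward the (6.0 bar, 150 mm) optimum of the toy objective.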
75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), Department of... Gas Pipeline Safety Act of 1968, Public Law 90-481, delegated to DOT the authority to develop...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
INTERNAL REPAIR OF GAS PIPELINES SURVEY OF OPERATOR EXPERIENCE AND INDUSTRY NEEDS REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ian D. Harris
2003-09-01
A repair method that can be applied from the inside of a gas transmission pipeline (i.e., a trenchless repair) is an attractive alternative to conventional repair methods since the need to excavate the pipeline is precluded. This is particularly true for pipelines in environmentally sensitive and highly populated areas. The objectives of the project are to evaluate, develop, demonstrate, and validate internal repair methods for pipelines; develop a functional specification for an internal pipeline repair system; and prepare a recommended practice for internal repair of pipelines. The purpose of this survey is to better understand the needs and performance requirements of the natural gas transmission industry regarding internal repair. A total of fifty-six surveys were sent to pipeline operators. A total of twenty completed surveys were returned, representing a 36% response rate, which is considered very good given that tailored surveys are known in the marketing industry to seldom attract more than a 10% response rate. The twenty survey responses produced the following principal conclusions: (1) Use of internal weld repair is most attractive for river crossings, under other bodies of water (e.g., lakes and swamps), in difficult soil conditions, under highways, under congested intersections, and under railway crossings. All these areas tend to be very difficult and very costly if, and where, conventional excavated repairs may be currently used. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling (HDD) when a new bore must be created to solve a leak or other problem in a water/river crossing. (3) The typical travel distances required can be divided into three distinct groups: up to 305 m (1,000 ft); between 305 m and 610 m (1,000 ft and 2,000 ft); and beyond 914 m (3,000 ft). In concept, these groups require pig-based systems; despooled umbilical systems could be considered for the first two groups. For the last group, a self-propelled system with an onboard self-contained power and welding system is required. (4) Pipe size requirements range from 50.8 mm (2 in.) through 1,219.2 mm (48 in.) in diameter. The most common size range for 80% to 90% of operators surveyed is 508 mm to 762 mm (20 in. to 30 in.) diameter, with 95% using 558.8 mm (22 in.) diameter pipe.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most of computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive and the number of protein sequences from even small genomes such as prokaryotes is large typically containing many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread--a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize computational complexity of eThread and EC2 infrastructure. Based on results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly, amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
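The task-level parallelism the eThread pipeline relies on can be sketched with a standard-library executor. Here `thread_sequence` is a hypothetical stand-in for one per-sequence eThread invocation, not the actual tool interface:

```python
# Sketch of task-level parallelism for a per-sequence modeling pipeline,
# in the spirit of the pilot-based approach described above. In the real
# system, each task would launch eThread on one protein sequence across
# distributed resources; here the worker is a trivial stand-in.
from concurrent.futures import ThreadPoolExecutor

def thread_sequence(seq):
    # Hypothetical stand-in for a compute-intensive threading/modeling call.
    return (seq, len(seq))

sequences = ["MKTAYIAKQR", "MSDNE", "MGSSHHHHHH"]

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order, so results line up with sequences
    # even though tasks run concurrently.
    results = list(pool.map(thread_sequence, sequences))
```

A pilot-job framework generalizes this picture across many nodes and handles the large per-task variation in runtime that the abstract mentions.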
78 FR 28837 - Acadian Gas Pipeline System; Notice of Petition
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-129-001] Acadian Gas Pipeline System; Notice of Petition Take notice that on May 6, 2013, Acadian Gas Pipeline System (Acadian... concerns filed in the September 26, 2011 filing, as more fully detailed in the petition. Any person...
40 CFR 761.60 - Disposal requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... a disposal facility approved under this part. (5) Natural gas pipeline systems containing PCBs. The owner or operator of natural gas pipeline systems containing ≥50 ppm PCBs, when no longer in use, shall... the PCB concentrations in natural gas pipeline systems shall do so in accordance with paragraph (b)(5...
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: First, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult. Second, most pipelines are implemented in special hardware, resulting in limited flexibility of implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to a clinical system and, as backed by point spread function measurements, comparable resolution. Including all processing stages of a typical ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
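The first stage of such a software pipeline, beamforming, can be illustrated with a toy delay-and-sum sketch. SUPRA's actual GPU implementation is far more involved; the channel data and delays here are invented:

```python
# Toy delay-and-sum beamforming: for one focal point, apply per-element
# focusing delays to the raw channel data and sum the aligned samples.
def delay_and_sum(channel_data, delays_samples):
    # channel_data: per-element sample lists; delays_samples: per-element
    # integer focusing delays (in samples) for this focal point.
    total = 0.0
    for samples, d in zip(channel_data, delays_samples):
        if 0 <= d < len(samples):
            total += samples[d]
    return total

# Two elements; an echo arrives at sample 3 on channel 0 and sample 4 on
# channel 1. Delays matched to the echo's arrival add coherently.
channels = [[0, 0, 0, 1.0, 0], [0, 0, 0, 0, 1.0]]
focused = delay_and_sum(channels, [3, 4])
```

Repeating this sum for every focal point in the image grid, with delays derived from geometry and sound speed, yields the beamformed frame that later stages (envelope detection, log compression) turn into a B-mode image.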
Natural disasters and the gas pipeline system.
DOT National Transportation Integrated Search
1996-11-01
Episodic descriptions are provided of the effects of the Loma Prieta earthquake (1989) on the gas pipeline systems of Pacific Gas & Electric Company and the City of Palo Alto and of the Northridge earthquake (1994) on Southern California Gas' pipeline...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... Reports for Gas Pipeline Operators AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Pipeline Systems; PHMSA F 7100.2-1 Annual Report for Calendar Year 20xx Natural and Other Gas Transmission...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR 191... Reports AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Issuance of... Pipeline and Hazardous Materials Safety Administration (PHMSA) published a final rule on November 26, 2010...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-02-17
The Natural Gas Transmission and Distribution Model (NGTDM) is the component of the National Energy Modeling System (NEMS) that is used to represent the domestic natural gas transmission and distribution system. NEMS was developed in the Office of Integrated Analysis and Forecasting of the Energy Information Administration (EIA). NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the EIA and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. The NGTDM is the model within NEMS that represents the transmission, distribution, and pricing of natural gas. The model also includes representations of the end-use demand for natural gas, the production of domestic natural gas, and the availability of natural gas traded on the international market based on information received from other NEMS models. The NGTDM determines the flow of natural gas in an aggregate, domestic pipeline network, connecting domestic and foreign supply regions with 12 demand regions. The methodology employed allows the analysis of impacts of regional capacity constraints in the interstate natural gas pipeline network and the identification of pipeline capacity expansion requirements. There is an explicit representation of core and noncore markets for natural gas transmission and distribution services, and the key components of pipeline tariffs are represented in a pricing algorithm. Natural gas pricing and flow patterns are derived by obtaining a market equilibrium across the three main elements of the natural gas market: the supply element, the demand element, and the transmission and distribution network that links them. The NGTDM consists of four modules: the Annual Flow Module, the Capacity Expansion Module, the Pipeline Tariff Module, and the Distributor Tariff Module. A model abstract is provided in Appendix A.
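The market-equilibrium idea at the core of the NGTDM can be illustrated with a one-region toy: bisect on the wellhead price until supply meets demand net of a fixed tariff wedge. The linear curves and tariff value below are invented, not NGTDM's:

```python
# Toy single-region market equilibrium: supply rises with wellhead price,
# demand falls with burner-tip price, and a fixed transmission tariff
# links the two prices. All coefficients are illustrative.
def supply(price):
    return 10.0 + 3.0 * price      # quantity supplied at wellhead price

def demand(price):
    return 40.0 - 2.0 * price      # quantity demanded at burner-tip price

TARIFF = 1.0  # transmission/distribution markup over the wellhead price

def equilibrium(lo=0.0, hi=20.0, tol=1e-9):
    # Bisect on the wellhead price until excess demand vanishes.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if supply(mid) < demand(mid + TARIFF):
            lo = mid               # excess demand: price must rise
        else:
            hi = mid               # excess supply: price must fall
    return (lo + hi) / 2

p = equilibrium()
```

The real model solves this balance simultaneously over a 12-region network with capacity constraints and tariff components determined endogenously, but the fixed-point structure is the same.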
Air-cushion tankers for Alaskan North Slope oil
NASA Technical Reports Server (NTRS)
Anderson, J. L.
1973-01-01
A concept is described for transporting oil from the Arctic to southern markets in 10,000-ton, chemically fueled air-cushion vehicles (ACV's) configured as tankers. Based on preliminary cost estimates the conceptual ACV tanker system as tailored to the transportation of Alaskan North Slope oil could deliver the oil for about the same price per barrel as the proposed trans-Alaska pipeline with only one-third of the capital investment. The report includes the description of the conceptual system and its operation; preliminary cost estimates; an appraisal of ACV tanker development; and a comparison of system costs, versatility, vulnerability, and ecological effect with those of the trans-Alaska pipeline.
Low-cost failure sensor design and development for water pipeline distribution systems.
Khan, K; Widdop, P D; Day, A J; Wood, A S; Mounce, S R; Machell, J
2002-01-01
This paper describes the design and development of a new sensor which is low cost to manufacture and install and is reliable in operation with sufficient accuracy, resolution and repeatability for use in newly developed systems for pipeline monitoring and leakage detection. To provide an appropriate signal, the concept of a "failure" sensor is introduced, in which the output is not necessarily proportional to the input, but is unmistakably affected when an unusual event occurs. The design of this failure sensor is based on the water opacity which can be indicative of an unusual event in a water distribution network. The laboratory work and field trials necessary to design and prove out this type of failure sensor are described here. It is concluded that a low-cost failure sensor of this type has good potential for use in a comprehensive water monitoring and management system based on Artificial Neural Networks (ANN).
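The "failure sensor" concept (an output that need not track the input proportionally, only trip decisively on an unusual event) can be sketched as a simple trip rule on opacity readings. The window length and trip ratio below are illustrative values, not the paper's:

```python
# Sketch of a failure-sensor trip rule: compare the latest opacity
# reading against a rolling baseline of recent readings and trip only
# on a decisive departure. Thresholds are illustrative assumptions.
from statistics import mean

def failure_signal(opacity_readings, window=5, trip_ratio=2.0):
    # Baseline = mean of the `window` readings preceding the latest one.
    baseline = mean(opacity_readings[-window - 1:-1])
    return opacity_readings[-1] > trip_ratio * baseline

quiet = [0.10, 0.11, 0.09, 0.10, 0.12, 0.11]   # normal variation
burst = [0.10, 0.11, 0.09, 0.10, 0.12, 0.45]   # sudden turbidity spike
```

In a deployed system the binary trip signals from many cheap sensors, rather than calibrated opacity values, would feed the ANN-based monitoring layer the paper describes.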
First Retrieval of Surface Lambert Albedos From Mars Reconnaissance Orbiter CRISM Data
NASA Astrophysics Data System (ADS)
McGuire, P. C.; Arvidson, R. E.; Murchie, S. L.; Wolff, M. J.; Smith, M. D.; Martin, T. Z.; Milliken, R. E.; Mustard, J. F.; Pelkey, S. M.; Lichtenberg, K. A.; Cavender, P. J.; Humm, D. C.; Titus, T. N.; Malaret, E. R.
2006-12-01
We have developed a pipeline-processing software system to convert radiance-on-sensor for each of 72 out of 544 CRISM spectral bands used in global mapping to the corresponding surface Lambert albedo, accounting for atmospheric, thermal, and photoclinometric effects. We will present and interpret first results from this software system for the retrieval of Lambert albedos from CRISM data. For the multispectral mapping modes, these pipeline-processed 72 spectral bands constitute all of the available bands, for wavelengths from 0.362-3.920 μm, at 100-200 m/pixel spatial resolution, and ~0.006 μm spectral resolution. For the hyperspectral targeted modes, these pipeline-processed 72 spectral bands are only a selection of all of the 544 spectral bands, but at a resolution of 15-38 m/pixel. The pipeline processing for both types of observing modes (multispectral and hyperspectral) will use climatology, based on data from MGS/TES, in order to estimate ice- and dust-aerosol optical depths, prior to the atmospheric correction with lookup tables based upon radiative-transport calculations via DISORT. There is one DISORT atmospheric-correction lookup table for converting radiance-on-sensor to Lambert albedo for each of the 72 spectral bands. The measurements of the Emission Phase Function (EPF) during targeting will not be employed in this pipeline processing system. We are developing a separate system for extracting more accurate aerosol optical depths and surface scattering properties. This separate system will use direct calls (instead of lookup tables) to the DISORT code for all 544 bands, and it will use the EPF data directly, bootstrapping from the climatology data for the aerosol optical depths.
The pipeline processing will thermally correct the albedos for the spectral bands above ~ 2.6 μm, by a choice between 4 different techniques for determining surface temperature: 1) climatology, 2) empirical estimation of the albedo at 3.9 μm from the measured albedo at 2.5 μm, 3) a physical thermal model (PTM) based upon maps of thermal inertia from TES and coarse-resolution surface slopes (SS) from MOLA, and 4) a photoclinometric extension to the PTM that uses CRISM albedos at 0.41 μm to compute the SS at CRISM spatial resolution. For the thermal correction, we expect that each of these 4 different techniques will be valuable for some fraction of the observations.
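The per-band lookup-table correction can be illustrated as simple interpolation: one DISORT-derived table per band maps radiance-on-sensor to Lambert albedo. The table values below are invented for illustration:

```python
# Sketch of a per-band atmospheric-correction lookup table: a sorted list
# of (radiance, Lambert albedo) pairs precomputed by radiative transfer,
# queried by linear interpolation. The table entries here are invented.
from bisect import bisect_left

def interp_albedo(radiance, table):
    xs = [r for r, _ in table]
    i = bisect_left(xs, radiance)
    if i == 0:
        return table[0][1]          # clamp below the table
    if i == len(table):
        return table[-1][1]         # clamp above the table
    (x0, y0), (x1, y1) = table[i - 1], table[i]
    return y0 + (y1 - y0) * (radiance - x0) / (x1 - x0)

# One hypothetical band's table (radiance units arbitrary here).
band_table = [(0.0, 0.0), (10.0, 0.1), (20.0, 0.25), (40.0, 0.6)]
albedo = interp_albedo(15.0, band_table)
```

Precomputing one such table per band is what makes the 72-band pipeline tractable; the separate EPF system trades this speed for accuracy by calling DISORT directly.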
Gazda, Nicholas P; Griffin, Emily; Hamrick, Kasey; Baskett, Jordan; Mellon, Meghan M; Eckel, Stephen F; Granko, Robert P
2018-04-01
Purpose: The purpose of this article is to share experiences after the development of a health-system pharmacy administration residency with a MS degree and express the need for additional programs in nonacademic medical center health-system settings. Summary: Experiences with the development and implementation of a health-system pharmacy administration residency at a large community teaching hospital are described. Resident candidates benefit from collaborations with other health-systems through master's degree programs and visibility to leaders at your health-system. Programs benefit from building a pipeline of future pharmacy administrators and by leveraging the skills of residents to contribute to projects and department-wide initiatives. Tools to assist in the implementation of a new pharmacy administration program are also described and include rotation and preceptor development, marketing and recruiting, financial evaluation, and steps to prepare for accreditation. Conclusion: Health-system pharmacy administration residents provide the opportunity to build a pipeline of high-quality leaders, provide high-level project involvement, and produce a positive return on investment (ROI) for health-systems. These programs should be explored in academic and nonacademic-based health-systems.
77 FR 51848 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Program for Gas Distribution Pipelines. DATES: Interested persons are invited to submit comments on or.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and...
NASA Astrophysics Data System (ADS)
Castaneda-Lopez, Homero
A methodology for detecting and locating defects or discontinuities on the outside covering of coated metal underground pipelines subjected to cathodic protection has been addressed. On the basis of wide range AC impedance signals for various frequencies applied to a steel-coated pipeline system and by measuring its corresponding transfer function under several laboratory simulation scenarios, a physical laboratory setup of an underground cathodic-protected, coated pipeline was built. This model included different variables and elements that exist under real conditions, such as soil resistivity, soil chemical composition, defect (holiday) location in the pipeline covering, defect area and geometry, and level of cathodic protection. The AC impedance data obtained under different working conditions were used to fit an electrical transmission line model. This model was then used as a tool to fit the impedance signal for different experimental conditions and to establish trends in the impedance behavior without the necessity of further experimental work. However, due to the chaotic nature of the transfer function response of this system under several conditions, it is believed that non-deterministic models based on pattern recognition algorithms are suitable for field condition analysis. A non-deterministic approach was used for experimental analysis by applying an artificial neural network (ANN) algorithm based on classification analysis capable of studying the pipeline system and differentiating the variables that can change impedance conditions. These variables include level of cathodic protection, location of discontinuities (holidays), and severity of corrosion. This work demonstrated a proof-of-concept for a well-known technique and a novel algorithm capable of classifying impedance data for experimental results to predict the exact location of the active holidays and defects on the buried pipelines. 
Laboratory findings from this procedure are promising, and efforts to develop it for field conditions should continue.
Microcomputers, software combine to provide daily product, movement inventory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cable, T.
1985-06-01
This paper describes the efforts of Santa Fe Pipelines Inc. in keeping track of product inventory on the 810-mile, 12-in. Chaparral Pipeline and the 1,913-mile, 8- and 10-in. Gulf Central Pipeline. The decision to use a PC for monitoring the inventory was significant. The application was completed by TRON, Inc. The system is actually two major subsystems. The pipeline system accounts for injections into the pipeline and deliveries of product. This feeds the storage and terminal inventory system, where inventories are maintained at storage locations by shipper and supplier account. The paper further explains the inventory monitoring process in detail. Communications software is described as well.
49 CFR 195.401 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... its pipeline system according to the following requirements: (1) Non Integrity management repairs... affected part of the system until it has corrected the unsafe condition. (2) Integrity management repairs... this part: (1) An interstate pipeline, other than a low-stress pipeline, on which construction was...
49 CFR 195.401 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... its pipeline system according to the following requirements: (1) Non Integrity management repairs... affected part of the system until it has corrected the unsafe condition. (2) Integrity management repairs... this part: (1) An interstate pipeline, other than a low-stress pipeline, on which construction was...
49 CFR 195.401 - General requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... its pipeline system according to the following requirements: (1) Non Integrity management repairs... affected part of the system until it has corrected the unsafe condition. (2) Integrity management repairs... this part: (1) An interstate pipeline, other than a low-stress pipeline, on which construction was...
49 CFR 195.401 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... its pipeline system according to the following requirements: (1) Non Integrity management repairs... affected part of the system until it has corrected the unsafe condition. (2) Integrity management repairs... this part: (1) An interstate pipeline, other than a low-stress pipeline, on which construction was...
NASA Astrophysics Data System (ADS)
Branch, B. D.; Raskin, R. G.; Rock, B.; Gagnon, M.; Lecompte, M. A.; Hayden, L. B.
2009-12-01
With the nation challenged to comply with Executive Order 12906 and its need to augment the Science, Technology, Engineering and Mathematics (STEM) pipeline, applied focus on the geosciences pipeline issue may be at risk. The geosciences pipeline may require intentional K-12 standard course of study consideration in the form of project-based, science-based, and evidence-based learning. Thus, the K-12 to geosciences to informatics pipeline may benefit from an earth science experience that utilizes a community-based "learning by doing" approach. Terms such as Community GIS, Community Remote Sensing, and Community-Based Ontology development are termed Community Informatics. Here, approaches of interdisciplinary work to promote earth science literacy are affordable, consisting of low-cost equipment that builds the GIS/remote sensing data processing skills necessary in the workforce. Hence, informal community ontology development may evolve or mature from a local community towards formal scientific community collaboration. Such consideration may become a means to engage educational policy towards earth science paradigms and needs, specifically linking synergy among the Math, Computer Science, and Earth Science disciplines.
Guo, Li; Allen, Kelly S; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M; Wick, Robert L; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host-pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems.
75 FR 67807 - Pipeline Safety: Emergency Preparedness Communications
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-03
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... is issuing an Advisory Bulletin to remind operators of gas and hazardous liquid pipeline facilities... Gas Pipeline Systems. Subject: Emergency Preparedness Communications. Advisory: To further enhance the...
76 FR 65778 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: 12,120. Frequency of Collection: On occasion. 2. Title: Recordkeeping for Natural Gas Pipeline... investigating incidents. Affected Public: Operators of natural gas pipeline systems. Annual Reporting and...
Rapid Processing of Radio Interferometer Data for Transient Surveys
NASA Astrophysics Data System (ADS)
Bourke, S.; Mooley, K.; Hallinan, G.
2014-05-01
We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-Band data archive, imaging each integration to search for short-duration transients.
Distributed fiber optic system for oil pipeline leakage detection
NASA Astrophysics Data System (ADS)
Paranjape, R.; Liu, N.; Rumple, C.; Hara, Elmer H.
2003-02-01
We present a novel approach for the detection of leakage in oil pipelines using distributed fiber optic sensing, a presence-of-oil-based actuator, and Optical Time Domain Reflectometry (OTDR). While the basic concepts of our approach are well understood, the integration of the components into a complete system is a real-world engineering design problem. Our focus has been on the development of the actuator design and on testing using installed dark fiber. Initial results are promising; however, studies of the long-term effects of environmental exposure are still pending.
iGAS: A framework for using electronic intraoperative medical records for genomic discovery.
Levin, Matthew A; Joseph, Thomas T; Jeff, Janina M; Nadukuru, Rajiv; Ellis, Stephen B; Bottinger, Erwin P; Kenny, Eimear E
2017-03-01
Design and implement a HIPAA- and Integrating the Healthcare Enterprise (IHE)-profile-compliant automated pipeline, the integrated Genomics Anesthesia System (iGAS), linking genomic data from the Mount Sinai Health System (MSHS) BioMe biobank to electronic anesthesia records, including physiological data collected during the perioperative period. The resulting repository of multi-dimensional data can be used for precision medicine analysis of physiological readouts, acute medical conditions, and adverse events that can occur during surgery. A structured pipeline was developed atop our existing anesthesia data warehouse using open-source tools and is automated using scheduled tasks. The pipeline runs weekly, identifying all new and existing anesthetic records for BioMe participants; each run completes in minimal time. The pipeline went live in June 2015 with 49.2% (n=15,673) of BioMe participants linked to 40,947 anesthetics. After eighteen months, an additional 3671 participants were enrolled in BioMe and the number of matched anesthetic records grew 21% to 49,545. The overall percentage of BioMe patients with anesthetics remained similar at 51.1% (n=18,128). Seven patients opted out during this time. The median number of anesthetics per participant was 2 (range 1-144). Collectively, over 35 million physiologic data points and 480,000 medication administrations were linked to genomic data. To date, two projects at MSHS are using the pipeline. Automated integration of biobank and anesthetic data sources is feasible and practical. This integration enables large-scale genomic analyses that might inform variable physiological response to anesthetic and surgical stress, and examine genetic factors underlying adverse outcomes during and after surgery. Copyright © 2017 Elsevier Inc. All rights reserved.
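The weekly record-matching step can be sketched roughly as below; the field names (`mrn`, `case_id`) and record layout are hypothetical, not the iGAS schema:

```python
# Hypothetical sketch of the weekly matching step: join biobank enrollment
# IDs against anesthesia case records by medical record number (MRN).
# Field names and record layout are illustrative, not the iGAS schema.

def link_records(biobank_mrns, anesthesia_cases):
    """Return {mrn: [case_ids]} for biobank participants with >= 1 case."""
    linked = {}
    for case in anesthesia_cases:
        mrn = case["mrn"]
        if mrn in biobank_mrns:
            linked.setdefault(mrn, []).append(case["case_id"])
    return linked

participants = {"MRN001", "MRN002"}
cases = [
    {"mrn": "MRN001", "case_id": "A-1"},
    {"mrn": "MRN001", "case_id": "A-2"},
    {"mrn": "MRN999", "case_id": "A-3"},  # not enrolled, excluded
]
print(link_records(participants, cases))  # {'MRN001': ['A-1', 'A-2']}
```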
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... Federal agency for pipeline security, it is important for TSA to have contact information for company... DEPARTMENT OF HOMELAND SECURITY Transportation Security Administration Extension of Agency Information Collection Activity Under OMB Review: Pipeline System Operator Security Information AGENCY...
The Eastring gas pipeline in the context of the Central and Eastern European gas supply challenge
NASA Astrophysics Data System (ADS)
Mišík, Matúš; Nosko, Andrej
2017-11-01
Ever since the 2009 natural gas crisis, energy security has been a crucial priority for countries of Central and Eastern Europe. Escalating in 2014, the conflict between Ukraine and Russia further fuelled negative expectations about the future development of energy relations for the region predominantly supplied by Russia. As a response to the planned cessation of gas transit through the Brotherhood pipeline, which brings Russian gas to Europe via Ukraine and Slovakia, the Slovak transmission system operator Eustream proposed the Eastring pipeline. This Perspective analyses this proposal and argues that neither the perceived decrease in Slovak energy security nor the loss of economic rent from the international gas transit should be the main policy driver behind such a major infrastructure project. Although marketed as an answer to current Central and Eastern European gas supply security challenges, the Eastring pipeline is actually mainly focused on issues connected to the Slovak gas transit.
Rapid, Vehicle-Based Identification of Location and Magnitude of Urban Natural Gas Pipeline Leaks.
von Fischer, Joseph C; Cooley, Daniel; Chamberlain, Sam; Gaylord, Adam; Griebenow, Claire J; Hamburg, Steven P; Salo, Jessica; Schumacher, Russ; Theobald, David; Ham, Jay
2017-04-04
Information about the location and magnitudes of natural gas (NG) leaks from urban distribution pipelines is important for minimizing greenhouse gas emissions and optimizing investment in pipeline management. To enable rapid collection of such data, we developed a relatively simple method using high-precision methane analyzers in Google Street View cars. Our data indicate that this automated leak survey system can document patterns in leak location and magnitude within and among cities, even without wind data. We found that urban areas with prevalent corrosion-prone distribution lines (Boston, MA, Staten Island, NY, and Syracuse, NY), leaked approximately 25-fold more methane than cities with more modern pipeline materials (Burlington, VT, and Indianapolis, IN). Although this mobile monitoring method produces conservative estimates of leak rates and leak counts, it can still help prioritize both leak repairs and replacement of leak-prone sections of distribution lines, thus minimizing methane emissions over short and long terms.
GPU-Powered Coherent Beamforming
NASA Astrophysics Data System (ADS)
Magro, A.; Adami, K. Zarb; Hickish, J.
2015-03-01
Graphics processing unit (GPU)-based beamforming is a relatively unexplored area in radio astronomy, possibly due to the assumption that any such system will be severely limited by the PCIe bandwidth required to transfer data to the GPU. We have developed a CUDA-based GPU implementation of a coherent beamformer, specifically designed and optimized for deployment at the BEST-2 array, which can generate an arbitrary number of synthesized beams for a wide range of parameters. It achieves ~1.3 TFLOPS on an NVIDIA Tesla K20, approximately 10x faster than an optimized, multithreaded CPU implementation. This kernel has been integrated into two real-time, GPU-based time-domain software pipelines deployed at the BEST-2 array in Medicina: a standalone beamforming pipeline and a transient detection pipeline. We present performance benchmarks for the beamforming kernel and for the transient detection pipeline with beamforming capabilities, as well as results of test observations.
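A minimal phase-and-sum (coherent) beamformer of the kind described can be sketched as follows; the array geometry and observing wavelength are illustrative, not the BEST-2 configuration:

```python
# Minimal sketch of coherent (phase-and-sum) beamforming for a linear array.
# Geometry and wavelength are illustrative, not the BEST-2 configuration.
import cmath, math

def beamform(samples, positions, wavelength, theta):
    """Phase-align antenna voltages toward angle theta (rad) and sum.
    samples[i] is the complex voltage from antenna i at one instant."""
    k = 2 * math.pi / wavelength
    out = 0j
    for v, x in zip(samples, positions):
        # steering phase compensates the geometric delay x*sin(theta)
        out += v * cmath.exp(-1j * k * x * math.sin(theta))
    return out

positions = [0.0, 1.0, 2.0, 3.0]   # antenna x-coordinates (m)
wavelength = 0.7                    # illustrative
theta0 = 0.3                        # source direction (rad)
k = 2 * math.pi / wavelength
# simulate a unit plane wave arriving from theta0
samples = [cmath.exp(1j * k * x * math.sin(theta0)) for x in positions]

on_source = abs(beamform(samples, positions, wavelength, theta0))
off_source = abs(beamform(samples, positions, wavelength, 0.0))
print(round(on_source, 3))     # 4.0: all four antennas add coherently
print(off_source < on_source)  # True: mispointed beam loses sensitivity
```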
Airborne LIDAR Pipeline Inspection System (ALPIS) Mapping Tests
DOT National Transportation Integrated Search
2003-06-06
Natural gas and hazardous liquid pipeline operators have a need to identify where leaks are occurring along their pipelines in order to lower the risks the pipelines pose to people and the environment. Current methods of locating natural gas and haza...
Engineering considerations for corrosion monitoring of gas gathering pipeline systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braga, T.G.; Asperger, R.G.
1987-01-01
Proper corrosion monitoring of gas gathering pipelines requires a system review to determine the appropriate monitor locations and types of monitoring techniques. This paper develops and discusses a classification of conditions such as flow regime and gas composition. Also discussed are junction categories which, for corrosion monitoring, need to be considered from two points of view: the first relates to fluid flow in the line and the second to corrosion inhibitor movement along the pipeline. The appropriate application of the various monitoring techniques, such as coupons, hydrogen detectors, electrical resistance probes and linear polarization probes, is discussed in relation to flow regime and gas composition. Problems caused by semi-conduction from iron sulfide are considered. Advantages and disadvantages of fluid gathering methods such as pots and flow-through drips are discussed in relation to their reliability as on-line monitoring locations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grafe, J.L.
During the past decade many changes have taken place in the natural gas industry, not the least of which is the way information (data) is acquired, moved, compiled, integrated and disseminated within organizations. At El Paso Natural Gas Company (EPNG) the Operations Control Department has been at the center of these changes. The Systems Section within Operations Control has been instrumental in developing the computer programs that acquire and store real-time operational data, and then make it available not only to the Gas Control function, but also to anyone else within the company who might require it and, to a limited degree, any supplier or purchaser of gas utilizing the El Paso pipeline. These computer programs, which make up the VISA system, are, in effect, the tools that help move the data that flows in the pipeline of information within the company. Their integration into this pipeline process is the topic of this paper.
Finite Element Analysis and Experimental Study on Elbow Vibration Transmission Characteristics
NASA Astrophysics Data System (ADS)
Qing-shan, Dai; Zhen-hai, Zhang; Shi-jian, Zhu
2017-11-01
Pipeline system vibration is one of the significant factors leading to vessel vibration and noise. Elbows are widely used in pipeline systems, yet research on elbow vibration is scarce and unsystematic. In this research, we first analysed the relationship between elbow vibration transmission characteristics and bending radius using ABAQUS finite element simulation. We then conducted vibration tests to observe the vibration transmission characteristics of elbows with the same diameter but different bending radii under different flow velocities. Both the simulation and the experiment showed that the vibration acceleration levels of the pipeline system decreased as the bending radius of the elbow increased, which is beneficial for reducing the transmission of vibration in the pipeline system. The results can serve as a reference for further studies and for low-noise installation designs for pipeline systems.
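The quantity compared here, vibration acceleration level, is conventionally a logarithmic measure; a minimal sketch, assuming the common 1 µm/s² reference of ISO 1683 (the paper does not state which reference it uses):

```python
# Hedged sketch: vibration acceleration level in decibels. The reference
# acceleration (1e-6 m/s^2, per ISO 1683) is an assumption; the paper does
# not state which reference value it uses.
import math

def acceleration_level_db(a_rms, a_ref=1e-6):
    """L_a = 20 * log10(a_rms / a_ref), in dB."""
    return 20 * math.log10(a_rms / a_ref)

# an elbow vibrating at 0.01 m/s^2 rms, re 1 um/s^2
print(round(acceleration_level_db(0.01), 6))  # 80.0
```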
Diagnostic layer integration in FPGA-based pipeline measurement systems for HEP experiments
NASA Astrophysics Data System (ADS)
Pozniak, Krzysztof T.
2007-08-01
Integrated triggering and data acquisition systems for high energy physics experiments may be considered fast, multichannel, synchronous, distributed, pipeline measurement systems. A considerable extension of the functional, technological and monitoring demands recently imposed on them has forced the common usage of large field-programmable gate array (FPGA), digital-signal-processing-enhanced matrices and fast optical transmission for their realization. This paper discusses the modelling, design, realization and testing of pipeline measurement systems. The distribution of synchronous data stream flows in the network is considered. A general functional structure of a single network node is presented. The suggested, novel block structure of the node model facilitates full implementation in the FPGA chip, circuit standardization and parametrization, as well as integration of the functional and diagnostic layers. A general method for pipeline system design is derived, based on a unified model of the synchronous data network node. A few examples of practically realized, FPGA-based pipeline measurement systems are presented. The described systems have been applied in ZEUS and CMS.
An integrated pipeline to create and experience compelling scenarios in virtual reality
NASA Astrophysics Data System (ADS)
Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina
2011-03-01
One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort of modeling, element integration, and the software development needed to properly display and interact with the content in the available systems. Still today, most virtual reality applications are tedious to create and they are hard-wired to the specific display and interaction system available to the developers when creating the application. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modification of information from the virtual environment.
One of the interesting capabilities of our pipeline is that scenarios can be built and modified on-the-fly as they are being presented in the virtual-reality systems. Users can quickly prototype the basic scene using the designer and the editor on a control workstation. More elements can then be introduced into the scene from both the editor and the virtual-reality display. In this manner, users are able to gradually increase the complexity of the scenario with immediate feedback. The main use of this pipeline is the rapid development of scenarios for human-factors studies. However, it is applicable in a much more general context.
NASA Astrophysics Data System (ADS)
Huang, Li-Xin; Gao, Hai-Xia; Li, Chun-Shu; Xiao, Chang-Ming
2009-08-01
In a colloidal system confined by a small cylindrical pipeline, the depletion interaction between two large spheres differs from that in a system confined by two plates, and the influence of the pipeline on the depletion interaction is related to both its size and its shape. In this paper, the depletion interactions in systems confined by pipelines of different sizes or shapes are studied by Monte Carlo simulations. The numerical results show that the influence of the cylindrical pipeline on the depletion force is stronger than that of two parallel plates, and that the depletion force is strengthened when the diameter of the cylinder is decreased. In addition, we find that the depletion interaction is noticeably affected even by a slight change in the shape of the pipeline, and that the influence of a shape change on the depletion force is stronger than that of a size change.
Genetically engineered plants in the product development pipeline in India.
Warrier, Ranjini; Pande, Hem
2016-01-02
In order to proactively identify emerging issues that may impact the risk assessment and risk management functions of the Indian biosafety regulatory system, the Ministry of Environment, Forests and Climate Change sought to understand the nature and diversity of genetically engineered crops that may move to product commercialization within the next 10 y. This paper describes the findings from a questionnaire designed to solicit information about public and private sector research and development (R&D) activities in plant biotechnology. It is the first comprehensive overview of the R&D pipeline for GE crops in India.
49 CFR 191.11 - Distribution system: Annual report.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Distribution system: Annual report. 191.11 Section 191.11 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE;...
49 CFR 191.9 - Distribution system: Incident report.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Distribution system: Incident report. 191.9 Section 191.9 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE;...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-09
...'s Belle River-St. Clair Pipeline into the new 21-mile long Dawn Gateway Pipeline system, which... & Optimization, DTE Pipeline/Dawn Gateway LLC, One Energy Plaza, Detroit, MI 48226, phone (313) 235-6531 or e...
New gas-pipeline system planned for Argentina
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrich, V.
1979-03-01
A new gas-pipeline system planned for Argentina by Gas del Estado will carry up to 10 million cu m/day to Mendoza, San Juan, and San Luis from the Neuquen basin. Operating on a toll system of transport payment, the Centro-Oeste pipeline system will consist of 1100 km of over 30 in. dia trunk line and 600 km of 8, 12, and 18 in. dia lateral lines.
The Herschel Data Processing System - Hipe And Pipelines - During The Early Mission Phase
NASA Astrophysics Data System (ADS)
Ardila, David R.; Herschel Science Ground Segment Consortium
2010-01-01
The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on the 14th of May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55 - 672 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Here we summarize the state of the Herschel Data Processing System and give an overview of future development milestones and plans. The development of the Herschel Data Processing System started seven years ago to support the data analysis for Instrument Level Tests. Resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution and scientific analysis in one single environment. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. The Herschel Interactive Processing Environment (HIPE) is the user-friendly face of Herschel Data Processing. The first PACS preview observation of M51 was processed with HIPE, using basic pipeline scripts, into a fantastic image within 30 minutes of data reception. The first HIFI observations of DR-21 were also successfully reduced to high quality spectra, followed by SPIRE observations of M66 and M74. The Herschel Data Processing System is a joint development by the Herschel Science Ground Segment Consortium, consisting of ESA, the NASA Herschel Science Center, and the HIFI, PACS and SPIRE consortium members.
Code of Federal Regulations, 2010 CFR
2010-10-01
... addressing time dependent and independent threats for a transmission pipeline operating below 30% SMYS not in... pipeline system are covered for purposes of the integrity management program requirements, an operator must... system, or an operator may apply one method to individual portions of the pipeline system. (Refer to...
Hydrocarbons pipeline transportation risk assessment
NASA Astrophysics Data System (ADS)
Zanin, A. V.; Milke, A. A.; Kvasov, I. N.
2018-04-01
The issue of risk assessment for pipeline transportation in Arctic conditions is addressed in the paper. Pipeline quality characteristics in the given environment have been assessed. To achieve the stated objective, a mathematical model of the pipelines was designed and visualized using the software product SOLIDWORKS. The results obtained from the mathematical model made it possible to define the optimal pipeline characteristics for designs on the Arctic sea bottom. In the course of the research, the risks of avalanche pipe collapse were examined, the internal longitudinal and circumferential loads acting on the pipeline were analyzed, and the hydrodynamic force of water impact was taken into consideration. The calculations can contribute to the further development of pipeline transport under the harsh climate conditions of the Arctic shelf territory of the Russian Federation.
30 CFR 250.905 - How do I get approval for the installation, modification, or repair of my platform?
Code of Federal Regulations, 2013 CFR
2013-07-01
... foundations; drilling, production, and pipeline risers and riser tensioning systems; turrets and turret-and... component design; pile foundations; drilling, production, and pipeline risers and riser tensioning systems... Loads imposed by jacket; decks; production components; drilling, production, and pipeline risers, and...
30 CFR 250.905 - How do I get approval for the installation, modification, or repair of my platform?
Code of Federal Regulations, 2014 CFR
2014-07-01
... foundations; drilling, production, and pipeline risers and riser tensioning systems; turrets and turret-and... component design; pile foundations; drilling, production, and pipeline risers and riser tensioning systems... Loads imposed by jacket; decks; production components; drilling, production, and pipeline risers, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Liquid Systems AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice; Issuance of Advisory Bulletin. SUMMARY: This notice advises owners and operators of gas pipeline...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washenfelder, D. J.; Girardot, C. L.; Wilson, E. R.
The twenty-eight double-shell underground radioactive waste storage tanks at the U.S. Department of Energy’s Hanford Site near Richland, WA are interconnected by the Waste Transfer System network of buried steel encased pipelines and pipe jumpers in below-grade pits. The pipeline material is stainless steel or carbon steel in 51 mm to 152 mm (2 in. to 6 in.) sizes. The pipelines carry slurries ranging up to 20 volume percent solids and supernatants with less than one volume percent solids at velocities necessary to prevent settling. The pipelines, installed between 1976 and 2011, were originally intended to last until the 2028 completion of the double-shell tank storage mission. The mission has been subsequently extended. In 2010 the Tank Operating Contractor began a systematic evaluation of the Waste Transfer System pipeline conditions applying guidelines from API 579-1/ASME FFS-1 (2007), Fitness-For-Service. Between 2010 and 2014 Fitness-for-Service examinations of the Waste Transfer System pipeline materials, sizes, and components were completed. In parallel, waste throughput histories were prepared allowing side-by-side pipeline wall thinning rate comparisons between carbon and stainless steel, slurries and supernatants and throughput volumes. The work showed that for transfer volumes up to 6.1E+05 m³ (161 million gallons), the highest throughput of any pipeline segment examined, there has been no detectable wall thinning in either stainless or carbon steel pipeline material regardless of waste fluid characteristics or throughput. The paper describes the field and laboratory evaluation methods used for the Fitness-for-Service examinations, the results of the examinations, and the data reduction methodologies used to support Hanford Waste Transfer System pipeline wall thinning conclusions.
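The wall-thinning assessment can be illustrated with a simple rate and remaining-life calculation; the thickness values and service period below are illustrative, not Hanford measurements, and the actual Fitness-for-Service procedure in API 579-1/ASME FFS-1 involves far more than this:

```python
# Illustrative sketch: average wall-thinning rate from two ultrasonic (UT)
# wall-thickness readings, and a nominal remaining life against a minimum
# allowable thickness. Values are hypothetical, not Hanford data.

def thinning_rate(t_initial_mm, t_current_mm, years_in_service):
    """Average wall-loss rate in mm/yr over the service period."""
    return (t_initial_mm - t_current_mm) / years_in_service

def remaining_life(t_current_mm, t_min_mm, rate_mm_per_yr):
    """Years until the minimum allowable thickness is reached."""
    if rate_mm_per_yr <= 0:
        return float("inf")  # no detectable thinning, as the paper reports
    return (t_current_mm - t_min_mm) / rate_mm_per_yr

rate = thinning_rate(7.11, 7.11, 35.0)  # identical readings -> 0.0 mm/yr
print(rate, remaining_life(7.11, 5.5, rate))  # 0.0 inf
```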
Oman India Pipeline: An operational repair strategy based on a rational assessment of risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
German, P.
1996-12-31
This paper describes the development of a repair strategy for the operational phase of the Oman India Pipeline based upon the probability and consequences of a pipeline failure. The risk analyses and cost-benefit analyses performed provide guidance on the level of deepwater repair development effort appropriate for the Oman India Pipeline project and identify critical areas toward which more intense development effort should be directed. The risk analysis results indicate that the likelihood of a failure of the Oman India Pipeline during its 40-year life is low. Furthermore, the probability of operational failure of the pipeline in deepwater regions is extremely low, the major proportion of operational failure risk being associated with the shallow water regions.
Analysis of Wastewater and Water System Renewal Decision-Making Tools and Approaches
With regard to the development of software for decision support for pipeline renewal, most of the attention to date has been paid to the development of asset management models, which help an owner decide which portions of a system to prioritize for needed actions. There has not ...
Mathematical simulation for compensation capacities area of pipeline routes in ship systems
NASA Astrophysics Data System (ADS)
Ngo, G. V.; Sakhno, K. N.
2018-05-01
In this paper, the authors considered the problem of enhancing the manufacturability of ship system pipelines at the design stage. An analysis of the arrangements and possibilities for compensating deviations of pipeline routes has been carried out. The task was set to produce the “fit pipe” together with the rest of the pipes in the route. It was proposed to compensate for deviations by movement of the pipeline route during pipe installation and to calculate the maximum values of these displacements in the analyzed path. Theoretical bases of deviation compensation for pipeline routes using rotations of parallel pairs of pipe sections are developed. Mathematical and graphical simulations of the compensation capacity areas of pipeline routes with various configurations are completed. Prerequisites have been created for an automated program that will allow one to determine the values of the compensatory capacity area for pipeline routes and to assign the quantities of necessary allowances.
Leak detectability of the Norman Wells pipeline by mass balance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liou, J.C.P.
Pipeline leak detection using software-based systems is becoming common practice. The detectability of such systems is measured by how small a leak can be detected and how quickly. The algorithms used and the measurement uncertainties determine leak detectability. This paper addresses leak detectability using mass balance, establishes the leak detectability of the Norman Wells pipeline, and compares it with field leak test results. The pipeline is operated by Interprovincial Pipe Line (IPL) Inc. of Edmonton, Canada. It is a 12.75-inch outside diameter steel pipe with variable wall thickness. The length of the pipe is approximately 550 miles (868.9 km). The pipeline transports light crude oil at a constant flow rate of about 250 m³/hr. The crude oil can enter the pipeline at two locations: besides the Norman Wells inlet, there is a side line near the Zama terminal that can inject crude oil into the pipeline.
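The mass-balance principle can be sketched as a simple window imbalance check; the threshold and line-pack handling below are illustrative, not IPL's algorithm:

```python
# Hedged sketch of leak detection by mass balance: over a time window,
# compare metered volume in against volume out plus line-pack change.
# The threshold and uncertainty handling are illustrative, not IPL's method.

def leak_alarm(vol_in_m3, vol_out_m3, linepack_change_m3, threshold_m3):
    """Flag a leak when the unaccounted imbalance exceeds the threshold,
    which must be set above the metering uncertainty for the window."""
    imbalance = vol_in_m3 - vol_out_m3 - linepack_change_m3
    return imbalance > threshold_m3, imbalance

# one hour at ~250 m^3/hr with a 1% detection threshold (2.5 m^3)
alarm, imb = leak_alarm(250.0, 246.0, 1.0, 2.5)
print(alarm, imb)  # True 3.0 -> 3 m^3/hr unaccounted, above threshold
```

The trade-off the abstract describes falls out directly: a lower threshold detects smaller leaks but demands smaller metering uncertainty, and a longer averaging window detects smaller leaks more slowly.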
NASA Astrophysics Data System (ADS)
Henclik, Sławomir
2018-03-01
The influence of dynamic fluid-structure interaction (FSI) on the course of water hammer (WH) can be significant in non-rigid pipeline systems. The essence of this effect is the dynamic transfer of liquid energy to the pipeline structure and back, which is important for elastic structures and can be negligible for rigid ones. In the paper a special model of such behavior is analyzed: a straight pipeline with a steady flow, fixed to the floor with several rigid supports. The transient is generated by a quickly closed valve installed at the end of the pipeline. FSI effects are assumed to be present mainly at the valve, which is fixed with a spring dash-pot attachment. Analysis of WH runs, especially transient pressure changes, for various stiffness and damping parameters of the spring dash-pot valve attachment is presented in the paper. The solutions are found analytically and numerically. Numerical results have been computed with our own computer program developed on the basis of the four-equation model of WH-FSI and the specific boundary conditions formulated at the valve. Analytical solutions have been found with the separation-of-variables method under slightly simplified assumptions. Damping at the dash-pot is taken into account within the numerical study. The influence of the valve attachment parameters on the WH courses was determined, and it was found that the transient amplitudes can be reduced. Such a system, an elastically attached shut-off valve in a pipeline or another equivalent design, can be a real solution applicable in practice.
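For context, the classical rigid-pipe bound on the transient pressure rise is the Joukowsky estimate, which neglects the FSI coupling studied in the paper; the values below are illustrative:

```python
# Joukowsky estimate of the water-hammer pressure rise for rapid valve
# closure (closure time < 2L/a). This is the rigid-pipe baseline that the
# paper's elastic valve attachment is designed to reduce; values are
# illustrative.

def joukowsky_surge_pa(rho, wave_speed, delta_v):
    """dP = rho * a * dv, in Pa."""
    return rho * wave_speed * delta_v

rho = 1000.0   # water, kg/m^3
a = 1200.0     # pressure wave speed in a steel pipe, m/s
dv = 2.0       # flow velocity stopped, m/s
print(joukowsky_surge_pa(rho, a, dv) / 1e5)  # 24.0 (bar pressure rise)
```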
NASA Astrophysics Data System (ADS)
Yang, Xiaojun; Zhu, Xiaofei; Deng, Chi; Li, Junyi; Liu, Cheng; Yu, Wenpeng; Luo, Hui
2017-10-01
To improve the management and monitoring of leakage and abnormal disturbances along long-distance oil pipelines, a distributed optical fiber temperature and vibration sensing system was employed to test the feasibility of health monitoring of a domestic oil pipeline. Simulated leakage and abnormal disturbance events on the oil pipeline were performed in the experiment. It is demonstrated that leakage and abnormal disturbance events can be monitored and located accurately with the distributed optical fiber sensing system, which exhibits good performance in sensitivity, reliability, operation and maintenance, etc., and shows good market application prospects.
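The locating capability of such distributed sensing rests on optical time-of-flight ranging; a minimal sketch, assuming a typical silica-fiber group index (the deployed system's processing chain is not detailed in the abstract):

```python
# Hedged sketch: locating an event along a fiber from the round-trip time
# of backscattered light (OTDR-style ranging). The group index is a typical
# value for silica fiber at 1550 nm, an assumption rather than the paper's.

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
GROUP_INDEX = 1.468        # typical for silica fiber at 1550 nm

def event_distance_m(round_trip_s):
    """Distance to the scattering event: z = c * t / (2 * n)."""
    return C_VACUUM * round_trip_s / (2 * GROUP_INDEX)

# a backscatter return arriving 100 microseconds after launch
print(round(event_distance_m(100e-6)))  # 10211 m down the fiber
```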
Progress in the planar CPn SOFC system design verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elangovan, S.; Hartvigsen, J.; Khandkar, A.
1996-04-01
SOFCo is developing a high efficiency, modular and scalable planar SOFC module termed the CPn design. This design has been verified in a 1.4 kW module test operated directly on pipeline natural gas. The design features multistage oxidation of fuel, wherein the fuel is consumed incrementally over several stages. High efficiency is achieved by a uniform current density distribution per stage, which lowers the stack resistance. Additional benefits include thermal regulation and compactness. Test results from stack modules operating on pipeline natural gas are presented.
Simulation of systems for shock wave/compression waves damping in technological plants
NASA Astrophysics Data System (ADS)
Sumskoi, S. I.; Sverchkov, A. M.; Lisanov, M. V.; Egorov, A. F.
2016-09-01
In operating pipeline systems, the flow velocity in the pipeline can decrease as a result of pump stops or valve shutdowns. Compression waves then appear and propagate through the pipeline system, potentially leading to its destruction. This phenomenon is called water hammer. The most dangerous situations occur when the flow is stopped quickly. Such urgent flow cutoff often takes place in an emergency during the loading of liquid hydrocarbons into sea tankers: to prevent environmental pollution, the loading must be stopped urgently, and the flow is cut off within a few seconds. To prevent a pressure rise in a pipeline system during water hammer, special protective systems (pressure relief systems) are installed. This paper describes approaches to modeling such water hammer protection systems. A model of a particular pressure relief system is considered, and it is shown that when the intensity of hydrocarbon loading at a sea tanker increases, the presence of the pressure relief system allows a safe loading mode to be organized.
Liu, Wenbin; Liu, Aimin
2018-01-01
With the exploitation of offshore oil and gas gradually moving to deep water, higher temperature and pressure differences are applied to the pipeline system, making global buckling of the pipeline more serious. For unburied deep-water pipelines, lateral buckling is the major buckling form. Initial imperfections are widespread in pipeline systems due to manufacturing defects or the influence of an uneven seabed, and their distribution and geometry are random. Based on shape, they can be divided into two kinds: single-arch imperfections and double-arch imperfections. This paper analyzes the global buckling process of a pipeline with two initial imperfections using a numerical simulation method and reveals how the ratio of the imperfections' spacing to the imperfection wavelength, and the combination of imperfections, affect the buckling process. The results show that a pipeline with two initial imperfections may suffer superposition of global buckling. The growth ratios of buckling displacement, axial force and bending moment in the superposition zone are several times larger than in a pipeline without buckling superposition. The ratio of the imperfections' spacing to the imperfection wavelength decides whether a pipeline suffers buckling superposition. The potential failure point of a pipeline exhibiting buckling superposition is the same as for a pipeline without it, but the failure risk is much higher. The shape and direction of two nearby imperfections also affect the failure risk under global buckling superposition: the failure risk of a pipeline with two double-arch imperfections is higher than that of a pipeline with two single-arch imperfections. PMID:29554123
NASA Astrophysics Data System (ADS)
Zemenkov, Y. D.; Zemenkova, M. Y.; Vengerov, A. A.; Brand, A. E.
2016-10-01
This article investigates the technology of hydrodynamic cavitational processing of viscous and high-viscosity oils and the possibility of its application in pipeline transport systems to improve the rheological properties of the transported oils, including dynamic viscosity and shear stress. The possibility of combining hydrodynamic cavitational processing with the addition of a depressant additive is considered in order to identify a synergistic effect. A laboratory bench was developed, and the results of modeling and laboratory research are presented. A hardware and technological scheme is developed for applying the equipment at industrial pipeline transport facilities.
Special-purpose computer for holography HORN-4 with recurrence algorithm
NASA Astrophysics Data System (ADS)
Shimobaba, Tomoyoshi; Hishinuma, Sinsuke; Ito, Tomoyoshi
2002-10-01
We designed and built a special-purpose computer for holography, HORN-4 (HOlographic ReconstructioN), using PLD (Programmable Logic Device) technology. HORN computers have a pipeline architecture. We use HORN-4 as an attached processor to enhance the performance of a general-purpose computer when it generates holograms using the "recurrence formulas" algorithm developed in our previous paper. In the HORN-4 system, we designed the pipeline around this recurrence algorithm, which calculates the phase on a hologram. As a result, we could integrate a pipeline composed of 21 units into one PLD chip. The units in the pipeline consist of one BPU (Basic Phase Unit) and twenty CUs (Cascade Units). The CU units can compute twenty light intensities on a hologram plane at one time. By mounting two of the PLD chips on a PCI (Peripheral Component Interconnect) universal board, HORN-4 can calculate holograms at a high speed equivalent to about 42 Gflops. The HORN-4 board costs about 1,700 US dollars. With the HORN-4 system, we could obtain an 800×600-grid hologram from a 3D image composed of 415 points in about 0.45 s.
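The recurrence idea above exploits the fact that, in the Fresnel approximation, the phase at successive hologram pixels is quadratic in the pixel index, so its second difference is constant and each new phase costs only two additions. A minimal software sketch of this (constants and the single-point scene are illustrative assumptions, not the HORN-4 hardware design):

```python
import math

# Sketch of a recurrence-formula hologram phase calculation (Fresnel
# approximation), in the spirit of the HORN pipeline. Wavelength, pixel
# pitch and the single object point are assumed for illustration.
WAVELEN = 633e-9      # laser wavelength, m (assumed)
PITCH = 10e-6         # hologram pixel pitch, m (assumed)

def row_phases(n_pixels: int, x_obj: float, z_obj: float) -> list:
    """Phase at each pixel of one hologram row for a single object point.

    The Fresnel phase is quadratic in the pixel index, so its second
    difference is constant: two additions per pixel replace a multiply,
    which is the idea behind chaining a BPU with cascaded CU units.
    """
    k = math.pi / (WAVELEN * z_obj)              # quadratic coefficient
    d0 = x_obj                                   # offset of pixel 0
    theta = k * d0 * d0                          # phase at pixel 0 (BPU)
    delta = k * (PITCH**2 - 2.0 * d0 * PITCH)    # first difference
    ddelta = 2.0 * k * PITCH**2                  # constant second difference
    out = []
    for _ in range(n_pixels):
        out.append(theta)
        theta += delta                           # recurrence step (CU)
        delta += ddelta
    return out

# Check the recurrence against the direct quadratic formula.
phases = row_phases(8, x_obj=1e-3, z_obj=0.2)
k = math.pi / (WAVELEN * 0.2)
direct = [k * (1e-3 - n * PITCH) ** 2 for n in range(8)]
```

Summing the cosines of such phases over all object points gives the light intensity at each pixel, which is what the twenty CU units evaluate in parallel.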
The inverse electroencephalography pipeline
NASA Astrophysics Data System (ADS)
Weinstein, David Michael
The inverse electroencephalography (EEG) problem is defined as determining which regions of the brain are active based on remote measurements recorded with scalp EEG electrodes. An accurate solution to this problem would benefit both fundamental neuroscience research and clinical neuroscience applications. However, constructing accurate patient-specific inverse EEG solutions requires complex modeling, simulation, and visualization algorithms, and to date only a few systems have been developed that provide such capabilities. In this dissertation, a computational system for generating and investigating patient-specific inverse EEG solutions is introduced, and the requirements for each stage of this Inverse EEG Pipeline are defined and discussed. While the requirements of many of the stages are satisfied with existing algorithms, others have motivated research into novel modeling and simulation methods. The principal technical results of this work include novel surface-based volume modeling techniques, an efficient construction for the EEG lead field, and the Open Source release of the Inverse EEG Pipeline software for use by the bioelectric field research community. In this work, the Inverse EEG Pipeline is applied to three research problems in neurology: comparing focal and distributed source imaging algorithms; separating measurements into independent activation components for multifocal epilepsy; and localizing the cortical activity that produces the P300 effect in schizophrenia.
Exploration Systems Health Management Facilities and Testbed Workshop
NASA Technical Reports Server (NTRS)
Wilson, Scott; Waterman, Robert; McCleskey, Carey
2004-01-01
Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... pipeline infrastructure to receive natural gas produced from Marcellus Shale production areas for delivery to existing interstate pipeline systems of Tennessee Gas Pipeline Company (TGP), CNYOG, and Transcontinental Gas Pipeline Corporation (Transco). It would also provide for bi-directional transportation...
An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting prospect for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs, in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. It is very useful for the plant genetics and breeding community with no computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as a standalone free software.
Duthu, Ray C.; Bradley, Thomas H.
2017-01-01
The process of hydraulic fracturing for recovery of oil and natural gas uses large amounts of fresh water and produces a comparable amount of wastewater, much of which is typically transported by truck. Truck transport of water is an expensive and energy-intensive process with significant external costs, including road damage and pollution. The integrated development plan (IDP) is the industry nomenclature for an integrated oil and gas infrastructure system incorporating pipeline-based transport of water and wastewater, centralized water treatment, and high rates of wastewater recycling. IDPs have been proposed as an alternative to truck transport systems to mitigate many of the economic and environmental problems associated with natural gas production, but the economic and environmental performance of these systems had not been analyzed to date. This study presents a quantification of lifecycle greenhouse gas (GHG) emissions and road damage for a generic oil and gas field, and for an oil and gas development sited in the Denver-Julesburg basin in the northern Colorado region of the US. Results demonstrate that a reduction in economic and environmental externalities can be derived from the development of these IDP-based pipeline water transportation systems. IDPs have marginal utility in reducing GHG emissions and road damage when they replace in-field water transport, but can reduce GHG emissions and road damage by factors of as much as 6 and 7, respectively, when used to replace fresh water transport and waste-disposal routes for exemplar northern Colorado oil and gas fields. PMID:28686682
Guo, Li; Allen, Kelly S.; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M.; Wick, Robert L.; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host–pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems. PMID:27462318
GRAPE-4: A special-purpose computer for gravitational N-body problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makino, Junichiro; Taiji, Makoto; Ebisuzaki, Toshikazu
1995-12-01
We describe GRAPE-4, a special-purpose computer for gravitational N-body simulations. In gravitational N-body simulations, almost all computing time is spent calculating the interactions between particles. GRAPE-4 is specialized hardware to calculate these interactions. It is used with a general-purpose host computer that performs all calculations other than the force calculation. With this architecture, it is relatively easy to realize a massively parallel system. In 1991, we developed the GRAPE-3 system with a peak speed equivalent to 14.4 Gflops; it consists of 48 custom pipelined processors. In 1992 we started the development of GRAPE-4. The GRAPE-4 system will consist of 1920 custom pipeline chips, each with a speed of 600 Mflops when operated on a 30 MHz clock. A prototype system with two custom LSIs was completed in July 1994, and the full system is now under manufacture.
Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie
2017-01-27
Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed with various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, that uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using GlimmerHMM, SNAP, and AUGUSTUS, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's plantae dataset. Our evaluation shows that Seqping generated better gene predictions than three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that Seqping's predictions are more accurate than gene predictions using the other three approaches with the default or available HMMs.
Research on application of GIS and GPS in inspection and management of city gas pipeline network
NASA Astrophysics Data System (ADS)
Zhou, Jin; Meng, Xiangyin; Tao, Tao; Zhang, Fengpei
2018-01-01
To address problems in current gas company patrol management, such as inaccurate attendance records and the difficulty of verifying whether patrol personnel stay within their inspection scope, this paper proposes a gas pipeline inspection management system built on the SuperMap iDeskTop 8C plug-in desktop GIS application and development platform, the positioning function of GPS, and the data transmission capabilities of 3G/4G/GPRS/Ethernet. Associations are built between real-time data, pipe network information, patrol data, map information, spatial data and so on to achieve bottom-level data fusion, and a mobile location system and patrol management client are used to achieve real-time interaction between the client and the mobile terminal. Practical application shows that the system achieves standardized management of patrol tasks, reasonable evaluation of patrol work and maximum utilization of patrol resources.
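The core of the "patrol scope" check described above is a point-to-polyline distance test: a GPS fix is flagged when it strays farther than a corridor width from the pipeline centerline. A minimal sketch, assuming planar coordinates in meters and an illustrative polyline and corridor width (not the system's actual SuperMap data model):

```python
import math

# Hedged sketch of a patrol-scope check: flag a GPS fix that strays
# farther than a corridor width from the pipeline centerline. The
# polyline, corridor width and coordinates are illustrative assumptions.

def point_segment_dist(p, a, b):
    """Distance from point p to segment ab (all (x, y) tuples, meters)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    if seg2 == 0.0:                       # degenerate segment: a point
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def within_patrol_scope(fix, pipeline, corridor_m=50.0):
    """True if the GPS fix lies within corridor_m of any pipeline segment."""
    return any(
        point_segment_dist(fix, a, b) <= corridor_m
        for a, b in zip(pipeline, pipeline[1:])
    )

pipeline = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0)]
print(within_patrol_scope((50.0, 10.0), pipeline))   # True: 10 m off the line
print(within_patrol_scope((30.0, 80.0), pipeline))   # False: 70 m at closest
```

A production system would work in projected map coordinates from the GIS platform and log out-of-scope fixes against the patrol record.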
The Very Large Array Data Processing Pipeline
NASA Astrophysics Data System (ADS)
Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako
2018-01-01
We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA), and Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface in verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500 hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline.
Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomical Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
Environmental impact analysis; the example of the proposed Trans-Alaska Pipeline
Brew, David A.
1974-01-01
The environmental impact analysis made as required by the National Environmental Policy Act of 1969 for the proposed trans-Alaska pipeline included consideration of the (1) technologically complex and geographically extensive proposed project, (2) extremely different physical environments across Alaska along the proposed route and elsewhere in Alaska and in Canada along alternative routes, (3) socioeconomic environment of the State of Alaska, and (4) a wide variety of alternatives. The analysis was designed specifically to fit the project and environment that would be affected. The environment was divided into two general parts--natural physical systems and superposed socioeconomic systems--and those parts were further divided into discipline-oriented systems or components that were studied and analyzed by scientists of the appropriate discipline. Particular attention was given to potential feedback loops in the impact network and to linkages between the project's impacting effects and the environment. The results of the analysis as reported in the final environmental impact statement were that both unavoidable and threatened environmental impacts would result from construction, operation, and maintenance of the proposed pipeline system and the developments related to it. The principal unavoidable effects would be (1) disturbances of terrain, fish and wildlife habitat, and human environs, (2) the results of the discharge of effluent from the tanker-ballast-treatment facility into Port Valdez and of some indeterminate amount of oil released into the ocean from tank-cleaning operations at sea, and (3) the results associated with increased human pressures of all kinds on the environment. Other unavoidable effects would be those related to increase of State and Native Corporation revenues, accelerated cultural change of the Native population, and extraction of the oil and gas resource. 
The main threatened environmental effects would all be related to unintentional oil loss from the pipeline, from tankers, or in the oil field. Oil losses from the pipeline could be caused by direct or indirect effects of earthquakes, destructive sea waves, slope failure caused by natural or artificial processes, thaw-plug instability (in permafrost), differential settlement of permafrost terrain, and bed scour and bank erosion at stream crossings. Oil loss from tankers could be caused by accidents during transfer operations at Valdez and at destination ports and by casualties involving tankers and other ships. Comparison of alternative routes and transportation systems and of their environmental impacts provided information which indicates to the author that one corridor containing both oil and gas pipelines would have less environmental impact than would separate corridors. Considering also the threat to the marine environment that any tanker system would impose and the threat that zones of high earthquake frequency and magnitude would impose on pipelines, it is apparent to the author that environmental impact and cost would be least for a single-corridor on-land route that avoided earthquake zones. The alternative trans-Alaska-Canada routes would meet these criteria. The decisions of the U.S. Department of the Interior, the U.S. Congress, and the President of the United States in favor of the proposed trans-Alaska pipeline system indicate the relative weight given by the decision makers in balancing the importance of potential environmental consequences against the advantages to be derived from rapid resource development.
The LCOGT Observation Portal, Data Pipeline and Science Archive
NASA Astrophysics Data System (ADS)
Lister, Tim; LCOGT Science Archive Team
2014-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable the diverse LCOGT user community of scientific and educational users to request observations on the LCOGT Network and to see their progress and get access to their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
Design and Operation of the World's First Long Distance Bauxite Slurry Pipeline
NASA Astrophysics Data System (ADS)
Gandhi, Ramesh; Weston, Mike; Talavera, Maru; Brittes, Geraldo Pereira; Barbosa, Eder
Mineração Bauxita Paragominas (MBP) is the first long distance slurry pipeline transporting bauxite slurry. Bauxite had developed a reputation for being difficult to hydraulically transport using long distance pipelines. This myth has now been proven wrong. The 245-km-long, 13.5 MTPY capacity MBP pipeline was designed and commissioned by PSI for CVRD. The pipeline is located in the State of Para, Brazil. The Miltonia bauxite mine is in a remote location with no other efficient means of transport. The bauxite slurry is delivered to the Alunorte alumina refinery located near Barcarena. This first-of-its-kind pipeline required significant development work in order to assure technical and economic feasibility. This paper describes the technical aspects of the design of the pipeline. It also summarizes the operating experience gained during the first year of operation.
Jayashree, B; Hanspal, Manindra S; Srinivasan, Rajgopal; Vigneshwaran, R; Varshney, Rajeev K; Spurthi, N; Eshwar, K; Ramesh, N; Chandra, S; Hoisington, David A
2007-01-01
The large amounts of EST sequence data available from a single species as well as from several species within a genus provide an easy source for identifying intra- and interspecies single nucleotide polymorphisms (SNPs). In the case of model organisms, the available data are numerous, given the degree of redundancy in the deposited EST data. Several bioinformatics tools can be used to mine this data; however, using them requires a certain level of expertise: the tools have to be used sequentially, with accompanying format conversion, and steps like clustering and assembly of sequences become time-intensive even for moderately sized datasets. We report here a pipeline of open source software, extended to run on multiple CPU architectures, that can be used to mine large EST datasets for SNPs and to identify restriction sites for assaying the SNPs, so that cost-effective CAPS assays can be developed for SNP genotyping in genetics and breeding applications. At the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), the pipeline has been implemented on a Paracel high-performance system consisting of four dual AMD Opteron processors running Linux with MPICH. The pipeline can be accessed through user-friendly web interfaces at http://hpc.icrisat.cgiar.org/PBSWeb and is available on request for academic use. We validated the pipeline by mining chickpea ESTs for interspecies SNPs, developing CAPS assays for SNP genotyping, and confirming the restriction digestion pattern at the sequence level.
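The CAPS step above amounts to checking whether the two SNP alleles yield different restriction-site counts in the flanking sequence, so that digestion distinguishes the genotypes. A toy sketch of that test; the two-enzyme table and the sequences are illustrative assumptions, not the pipeline's actual enzyme list:

```python
# Toy CAPS-assay check: does a SNP create or destroy a restriction site
# in its flanking sequence? A real pipeline would scan a full
# REBASE-style enzyme list; this two-enzyme table is illustrative.
ENZYMES = {"EcoRI": "GAATTC", "HindIII": "AAGCTT"}

def caps_candidates(flank5, allele1, allele2, flank3):
    """Return enzymes whose site count differs between the two SNP alleles."""
    seq1 = flank5 + allele1 + flank3
    seq2 = flank5 + allele2 + flank3
    return {
        name: (seq1.count(site), seq2.count(site))
        for name, site in ENZYMES.items()
        if seq1.count(site) != seq2.count(site)
    }

# An A/G SNP that completes an EcoRI site (GAATTC) only for the A allele:
hits = caps_candidates("ACGGA", "A", "G", "TTCGA")
print(hits)  # {'EcoRI': (1, 0)}
```

Any enzyme returned this way is a candidate for a cost-effective CAPS genotyping assay at that SNP.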
NASA Astrophysics Data System (ADS)
Goldoni, P.
2011-03-01
The X-shooter data reduction pipeline is an integral part of the X-shooter project; it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium, with contributions from France, the Netherlands and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline was developed for two main functions. The first is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for quick-look control, and in Garching, for a more thorough evaluation. The second is to allow optimized data reduction for a scientific user. In the following I first outline the main steps of data reduction with the pipeline, then briefly show two examples of optimizing the results for science reduction.
General Investigation Reconnaissance Report Provo and Vicinity, Utah
1997-04-01
However, most development relies on curbs and gutters rather than on pipelines to get water to the Provo River. The local drainage system within the...this study. RECREATION The need for recreation facilities will also grow with the rise in population. Provo has a well developed trail system in place...Northeast and Southeast Drainages will be developed to minimize conflicts with this trail system. SUMMARY There is a significant flood threat in Provo from
Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis
2015-01-01
A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, revealing new features of liver tissue organization. The pipeline also proved effective for analysing lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893
Southeast geysers effluent pipeline project. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dellinger, M.
1998-01-15
The project concept originated in 1990 with the convergence of two problems: (1) a need for augmented injection to mitigate declining reservoir productivity at The Geysers; and (2) a need for a new method of wastewater disposal for Lake County communities near The Geysers. A public/private partnership of Geysers operators and the Lake County Sanitation District (LACOSAN) was formed in 1991 to conduct a series of engineering, environmental, and financing studies of transporting treated wastewater effluent from the communities to the southeast portion of The Geysers via a 29-mile pipeline. By 1994, these evaluations concluded that the concept was feasible, and the stakeholders proceeded to formally develop the project, including pipeline and associated facilities design; preparation of an environmental impact statement; negotiation of construction and operating agreements; and assembly of $45 million in construction funding from the stakeholders and from state and federal agencies with related program goals. The project development process culminated in the system's dedication on October 16, 1997. As of this writing, all project components have been constructed or installed, successfully tested in compliance with design specifications, and are operating satisfactorily.
Full image-processing pipeline in field-programmable gate array for a small endoscopic camera
NASA Astrophysics Data System (ADS)
Mostafa, Sheikh Shanawaz; Sousa, L. Natércia; Ferreira, Nuno Fábio; Sousa, Ricardo M.; Santos, Joao; Wäny, Martin; Morgado-Dias, F.
2017-01-01
Endoscopy is an imaging procedure used for diagnosis as well as for some surgical purposes. The camera used for endoscopy should be small and able to produce a good-quality image or video, to reduce patient discomfort and to increase the efficiency of the medical team. To achieve these goals, a small endoscopy camera with a footprint of 1 mm×1 mm×1.65 mm is used. Due to the physical properties of the sensor and the limitations of the human visual system, different image-processing algorithms, such as noise reduction, demosaicking, and gamma correction, among others, are needed to faithfully reproduce the image or video. A full image-processing pipeline is implemented on a field-programmable gate array (FPGA) to accomplish a high frame rate of 60 fps with minimal processing delay. Along with this, a viewer has also been developed to display and control the image-processing pipeline. Control and data transfer are handled by a USB 3.0 endpoint in the computer. The full system achieves real-time processing of the image and fits in a Xilinx Spartan-6 LX150 FPGA.
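As a minimal illustration of one stage of such a pipeline, gamma correction maps linear sensor values to display values with a power law; on an FPGA this is typically realized as a precomputed lookup table in block RAM rather than per-pixel arithmetic. This sketch describes a typical implementation under that assumption, not the paper's actual design.

```python
# Sketch of gamma correction as used in camera pipelines (assumed typical
# implementation, not the paper's FPGA design): out = max * (in/max)^(1/gamma).

def gamma_correct(pixel, gamma=2.2, max_val=255):
    """Map a linear 8-bit sensor value to a display value via the power law."""
    return round(max_val * (pixel / max_val) ** (1.0 / gamma))

def build_lut(gamma=2.2, max_val=255):
    """Precompute the full lookup table once, as an FPGA block RAM would hold it,
    so the per-pixel operation is a single table read."""
    return [gamma_correct(v, gamma, max_val) for v in range(max_val + 1)]
```

With gamma > 1 the curve brightens midtones while leaving black and white fixed, which is why the table, not a per-pixel `pow()`, is the natural hardware form.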
Hollow-core fiber sensing technique for pipeline leak detection
NASA Astrophysics Data System (ADS)
Challener, W. A.; Kasten, Matthias A.; Karp, Jason; Choudhury, Niloy
2018-02-01
Recently there has been increased interest on the part of federal and state regulators in detecting and quantifying emissions of methane, an important greenhouse gas, from various parts of the oil and gas infrastructure, including well pads and pipelines. Pressure and/or flow anomalies are typically used to detect leaks along natural gas pipelines, but these methods are generally very insensitive and subject to false alarms. We have developed a system to detect and localize methane leaks along gas pipelines that is an order of magnitude more sensitive, combining tunable diode laser absorption spectroscopy (TDLAS) with conventional sensor tube technology. This technique can potentially localize leaks along pipelines of up to 100 km length with an accuracy of +/-50 m or less. A sensor tube buried along the pipeline with a gas-permeable membrane collects leaking gas during a soak period. The leak plume within the tube is then carried to the nearest sensor node along the tube in a purge cycle. The time to detection is used to determine the leak location. Multiple sensor nodes are situated along the pipeline to minimize the time to detection, and each node is composed of a short segment of hollow-core fiber (HCF) into which leaking gas is transported quickly through a small pressure differential. The HCF sensing node is spliced to standard telecom solid-core fiber, which transports the laser light for spectroscopy to a remote interrogator. The interrogator is multiplexed across the sensor nodes to minimize equipment cost and complexity.
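The time-to-detection localization can be sketched as follows: during the purge cycle the collected plume is carried toward the detecting node at the purge flow velocity, so the leak lies roughly (velocity × time) upstream of that node. The function and its sign convention are illustrative assumptions, not the published system's implementation.

```python
# Illustrative sketch of time-to-detection leak localization (assumed geometry:
# purge flow moves gas toward increasing position x, so the leak sits upstream
# of the detecting node). Not the published system's algorithm.

def locate_leak(node_positions_m, detecting_node, time_to_detection_s,
                purge_velocity_mps):
    """Estimate the leak position along the sensor tube.

    node_positions_m: positions of the sensor nodes along the pipeline (m)
    detecting_node: index of the node that first registered the plume
    time_to_detection_s: purge time until the plume reached that node (s)
    purge_velocity_mps: purge flow velocity in the tube (m/s)
    """
    node_x = node_positions_m[detecting_node]
    return node_x - purge_velocity_mps * time_to_detection_s
```

For instance, a plume arriving at a node located at 1000 m after 100 s of purging at 2 m/s places the leak near the 800 m mark.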
Development of 3-Year Roadmap to Transform the Discipline of Systems Engineering
2010-03-31
quickly humans could physically construct them. Indeed, magnetic core memory was entirely constructed by human hands until it was superseded by... For their mainframe computers, IBM develops the applications, operating system, computer hardware and microprocessors (off the shelf standard memory... processor developers work on potential computational and memory pipelines to support the required performance capabilities and use the available transistors
Cathodic Protection Measurement Through Inline Inspection Technology Uses and Observations
NASA Astrophysics Data System (ADS)
Ferguson, Briana Ley
This research supports the evaluation of an impressed current cathodic protection (CP) system on a buried, coated steel pipeline through alternative technology and methods, via an inline inspection device (ILI, CP ILI tool, or tool), in order to prevent and mitigate external corrosion. This thesis investigates the ability to measure the current density of a pipeline's CP system from inside the pipeline rather than manually from outside, and then to convert that CP ILI tool reading into a pipe-to-soil potential as required by regulations and standards. This was demonstrated through a mathematical model that applies Ohm's law, circuit concepts, and attenuation principles to match the ILI sample data by varying parameters of the model (i.e., values for overpotential and coating resistivity). No previous research has determined whether the protected potential range can be achieved with respect to the current density predicted by the CP ILI device. Kirchhoff's method was explored, but certain principles could not be used in the model because manual measurements would have been required. The model is based on circuit concepts that indirectly reflect the underlying electrochemical processes. Through Ohm's law, the results show that a constant current density is possible within the protected potential range; this indicates polarization of the pipeline, which leads to the development of calcareous deposits. Calcareous deposits are desirable in industry because they increase the resistance of the pipeline coating and lower the current, thus slowing the oxygen diffusion process. This research shows that an alternative method of CP evaluation from inside the pipeline is possible, in which the pipe-to-soil potential can be estimated (as required by regulations) from the ILI tool's current density measurement.
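The Ohm's-law conversion described above can be sketched roughly: the polarized pipe-to-soil potential is the native (free-corrosion) potential shifted by the IR drop of the CP current density across the coating resistance, then compared against the widely used -850 mV (vs. Cu/CuSO4) protection criterion. The function names and default values are illustrative assumptions, not the thesis model.

```python
# Rough Ohm's-law sketch of converting a measured CP current density to a
# pipe-to-soil potential. Parameter defaults are illustrative assumptions;
# the -0.85 V (CSE) criterion is the standard industry protection threshold.

def pipe_to_soil_potential(current_density_a_m2, coating_resistance_ohm_m2,
                           native_potential_v=-0.65):
    """Polarized potential = native potential minus the IR drop across the
    coating, V = V_native - i * R_coating (all potentials vs. CSE)."""
    return native_potential_v - current_density_a_m2 * coating_resistance_ohm_m2

def is_protected(potential_v, criterion_v=-0.85):
    """Protection requires the potential to be at least as negative as the
    -850 mV (CSE) criterion."""
    return potential_v <= criterion_v
```

For example, 10 mA/m2 across a 50 ohm·m2 coating shifts a -0.65 V native potential to -1.15 V, comfortably inside the protected range.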
The Snow Data System at NASA JPL
NASA Astrophysics Data System (ADS)
Laidlaw, R.; Painter, T. H.; Mattmann, C. A.; Ramirez, P.; Bormann, K.; Brodzik, M. J.; Burgess, A. B.; Rittger, K.; Goodale, C. E.; Joyce, M.; McGibbney, L. J.; Zimdars, P.
2014-12-01
NASA JPL's Snow Data System has a data-processing pipeline powered by Apache OODT, an open source software tool. The pipeline has been running for several years and has successfully generated a significant amount of cryosphere data, including MODIS-based products such as MODSCAG, MODDRFS and MODICE, with historical and near-real-time windows and covering regions such as the Arctic, Western US, Alaska, Central Europe, Asia, South America, Australia and New Zealand. The team continues to improve the pipeline, using monitoring tools such as Ganglia to give an overview of operations, and improving fault tolerance with automated recovery scripts. Several alternative adaptations of the Snow Covered Area and Grain size (SCAG) algorithm are being investigated, including the use of VIIRS and Landsat TM/ETM+ satellite data as inputs. Parallel computing techniques are being considered for core SCAG processing, such as using the PyCUDA Python API to utilize multi-core GPU architectures. An experimental version of MODSCAG is also being developed for the Google Earth Engine platform, a cloud-based service.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
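A minimal version of the ensemble summaries that such a tool automates: per-time-point mean and standard deviation across repeated stochastic simulation runs sampled on a common time grid. This is a generic sketch of the statistic, not the tool's actual code.

```python
# Generic sketch (not the tool's code): summarize an ensemble of stochastic
# simulation runs, all sampled on the same time grid, by the per-time-point
# mean and (population) standard deviation.

def ensemble_stats(runs):
    """runs: list of equal-length lists, one per simulation repetition.
    Returns (means, stds), each indexed by time point."""
    n = len(runs)
    length = len(runs[0])
    means, stds = [], []
    for t in range(length):
        vals = [run[t] for run in runs]
        m = sum(vals) / n
        var = sum((v - m) ** 2 for v in vals) / n  # population variance
        means.append(m)
        stds.append(var ** 0.5)
    return means, stds
```

Batch scripts can then plot the mean trajectory with a std band, the usual first look at a stochastic model's behaviour.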
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pineda Porras, Omar Andrey; Ordaz, Mario
2009-01-01
Though Differential Ground Subsidence (DGS) impacts the seismic response of segmented buried pipelines, augmenting their vulnerability, fragility formulations to estimate repair rates under such conditions are not available in the literature. Physical models to estimate pipeline seismic damage considering other cases of permanent ground subsidence (e.g. faulting, tectonic uplift, liquefaction, and landslides) have been extensively reported, but this is not the case for DGS. The refined study of two important phenomena in Mexico City - the 1985 Michoacan earthquake scenario and the sinking of the city due to ground subsidence - has contributed to the analysis of the interrelation of pipeline damage, ground motion intensity, and DGS. From the analysis of the 48-inch pipeline network of Mexico City's Water System, fragility formulations for segmented buried pipeline systems are proposed for two DGS levels. The novel parameter PGV^2/PGA, where PGV is peak ground velocity and PGA is peak ground acceleration, is used as the seismic parameter in these formulations, since it has shown better correlation with pipeline damage than PGV alone according to previous studies. Comparison of the proposed fragilities shows that a change in the DGS level (from Low-Medium to High) could increase the pipeline repair rates (number of repairs per kilometer) by factors ranging from 1.3 to 2.0, with higher seismic intensities corresponding to lower factors.
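The composite intensity measure and the reported scaling can be sketched as follows; the function names are illustrative, and only the 1.3-2.0 range of factors comes from the abstract.

```python
# Illustrative sketch: the composite seismic parameter PGV^2/PGA used in the
# proposed fragilities, and the abstract's reported DGS scaling range.
# Function names are hypothetical; only the 1.3-2.0 range is from the study.

def seismic_parameter(pgv_m_s, pga_m_s2):
    """PGV^2/PGA: with PGV in m/s and PGA in m/s^2 this has units of length (m)."""
    return pgv_m_s ** 2 / pga_m_s2

def scale_repair_rate(repair_rate_low_medium, factor):
    """Scale a Low-Medium DGS repair rate (repairs/km) to the High DGS level.
    Per the abstract, the factor lies between 1.3 and 2.0."""
    if not 1.3 <= factor <= 2.0:
        raise ValueError("factor outside the 1.3-2.0 range reported")
    return repair_rate_low_medium * factor
```

E.g., a ground motion with PGV = 0.4 m/s and PGA = 2.0 m/s2 gives PGV2/PGA = 0.08 m, and a 0.5 repairs/km rate scaled by 1.3 becomes 0.65 repairs/km.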
Characteristics of vibrational wave propagation and attenuation in submarine fluid-filled pipelines
NASA Astrophysics Data System (ADS)
Yan, Jin; Zhang, Juan
2015-04-01
As an important part of lifeline engineering in the development and utilization of marine resources, the submarine fluid-filled pipeline is a complex coupled system subjected to both internal and external flow fields. By utilizing Kennard's shell equations combined with the Helmholtz equations of the flow field, the coupled equations of a submarine fluid-filled pipeline for n=0 axisymmetric wave motion are set up. Analytical expressions for the wave speed are obtained for both the s=1 and s=2 waves, which correspond to a fluid-dominated wave and an axial shell wave, respectively. Numerical results for wave speed and wave attenuation are obtained and discussed. They show that the phase velocity depends on frequency, and that the attenuation of this mode depends strongly on the material parameters of the pipe and on the internal and external fluid fields. The characteristics of a PVC pipe are studied for comparison. The effects of the shell thickness/radius ratio and the density of the contained fluid on the model are also discussed. The study provides a theoretical basis for accurately predicting the condition of submarine pipelines and has practical application prospects in pipeline leak detection.
NASA Astrophysics Data System (ADS)
Longmore, S. N.; Collins, R. P.; Pfeifer, S.; Fox, S. E.; Mulero-Pazmany, M.; Bezombes, F.; Goodwind, A.; de Juan Ovelar, M.; Knapen, J. H.; Wich, S. A.
2017-02-01
In this paper we describe an unmanned aerial system equipped with a thermal-infrared camera and software pipeline that we have developed to monitor animal populations for conservation purposes. Taking a multi-disciplinary approach to tackle this problem, we use freely available astronomical source detection software and the associated expertise of astronomers, to efficiently and reliably detect humans and animals in aerial thermal-infrared footage. Combining this astronomical detection software with existing machine learning algorithms into a single, automated, end-to-end pipeline, we test the software using aerial video footage taken in a controlled, field-like environment. We demonstrate that the pipeline works reliably and describe how it can be used to estimate the completeness of different observational datasets to objects of a given type as a function of height, observing conditions etc. - a crucial step in converting video footage to scientifically useful information such as the spatial distribution and density of different animal species. Finally, having demonstrated the potential utility of the system, we describe the steps we are taking to adapt the system for work in the field, in particular systematic monitoring of endangered species at National Parks around the world.
Maser: one-stop platform for NGS big data from analysis to visualization
Kinjo, Sonoko; Monma, Norikazu; Misu, Sadahiko; Kitamura, Norikazu; Imoto, Junichi; Yoshitake, Kazutoshi; Gojobori, Takashi; Ikeo, Kazuho
2018-01-01
Abstract A major challenge in analyzing data from high-throughput next-generation sequencing (NGS) is how to handle the huge amounts of data and the variety of NGS tools, and how to visualize the resultant outputs. To address these issues, we developed a cloud-based data analysis platform, Maser (Management and Analysis System for Enormous Reads), and an original genome browser, Genome Explorer (GE). Maser enables users to manage up to 2 terabytes of data and to conduct analyses with easy graphical user interface operations, and it offers analysis pipelines in which several individual tools are combined into a single pipeline for very common and standard analyses. GE automatically visualizes genome assembly and mapping results output from Maser pipelines, without requiring additional data upload. With this function, the Maser pipelines can graphically display the results output from all the embedded tools and the mapping results in a web browser. Maser thus provides a more user-friendly analysis platform, especially for beginners, by improving the graphical display and offering selected standard pipelines that work with the built-in genome browser. In addition, all analyses executed on Maser are recorded in the analysis history, helping users to trace and repeat analyses. The entire process of analysis and its history can be shared with collaborators or opened to the public. In conclusion, our system is useful for managing, analyzing, and visualizing NGS data, and it achieves traceability, reproducibility, and transparency of NGS analysis. Database URL: http://cell-innovation.nig.ac.jp/maser/ PMID:29688385
A-Track: A New Approach for Detection of Moving Objects in FITS Images
NASA Astrophysics Data System (ADS)
Kılıç, Yücel; Karapınar, Nurdan; Atay, Tolga; Kaplan, Murat
2016-07-01
Minor planet and asteroid observations are important for understanding the origin and evolution of the Solar System. In this work, we have developed a fast and robust pipeline, called A-Track, for detecting asteroids and comets in sequential telescope images. The moving objects are detected using a modified line detection algorithm, called ILDA. We have coded the pipeline in Python 3, making use of various scientific modules in Python to process the FITS images. We tested the code on photometric data taken by an SI-1100 CCD with a 1-meter telescope at TUBITAK National Observatory, Antalya. The pipeline can be used to analyze large data archives or daily sequential data. The code is hosted on GitHub under the GNU GPL v3 license.
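A much-simplified version of the line-detection idea behind moving-object pipelines like A-Track (this is not the published ILDA algorithm): a candidate source appearing in several sequential frames is flagged as a moving object if its positions over time are consistent with constant linear motion, since stars stay fixed and noise detections scatter randomly.

```python
# Simplified illustration of the line-detection idea (not the published ILDA
# algorithm): a candidate tracked across >= 3 frames is a moving object if its
# (t, x, y) samples lie on a line, i.e. are consistent with uniform motion.

def is_moving_object(track, tol=1.0):
    """track: list of (t, x, y) tuples in time order, length >= 3.
    Returns True if every interior point lies within tol pixels of the
    straight line interpolated between the first and last detections."""
    (t0, x0, y0), (t1, x1, y1) = track[0], track[-1]
    span = t1 - t0
    for t, x, y in track[1:-1]:
        frac = (t - t0) / span
        # predicted position under uniform motion between the endpoints
        px = x0 + frac * (x1 - x0)
        py = y0 + frac * (y1 - y0)
        if abs(x - px) > tol or abs(y - py) > tol:
            return False
    return True
```

A detection drifting 2 px/frame in x passes; a detection that jumps off the line is rejected as noise or an unrelated source.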
Diagnostic Inspection of Pipelines for Estimating the State of Stress in Them
NASA Astrophysics Data System (ADS)
Subbotin, V. A.; Kolotilov, Yu. V.; Smirnova, V. Yu.; Ivashko, S. K.
2017-12-01
The diagnostic inspection used to estimate the technical state of a pipeline is described. The tasks of the inspection work are listed, and a functional-structural scheme is developed for estimating the state of stress in a pipeline. Final conclusions regarding the actual loading of a pipeline section are drawn from a cross-analysis of all the information obtained during pipeline inspection.
Underground pipeline laying using the pipe-in-pipe system
NASA Astrophysics Data System (ADS)
Antropova, N.; Krets, V.; Pavlov, M.
2016-09-01
The problems of resource saving and environmental safety during the installation and operation of underwater crossings are always relevant. The paper describes the existing methods of trenchless pipeline technology, the structure of multi-channel pipelines, and the types of supporting and guiding systems. A rational design is suggested for the pipe-in-pipe system. A finite element model is presented for the most dangerous sections of the inner pipes, and the optimum distance between the roller supports is determined.
1. FIRST SECTION OF PIPELINE BETWEEN CONFLUENCE POOL AND FISH SCREEN. NOTE RETAINING WALL BESIDE PIPE. VIEW TO NORTH-NORTHEAST. - Santa Ana River Hydroelectric System, Pipeline to Fish Screen, Redlands, San Bernardino County, CA
Mathematical modeling of non-stationary gas flow in gas pipeline
NASA Astrophysics Data System (ADS)
Fetisov, V. G.; Nikolaev, A. K.; Lykov, Y. V.; Duchnevich, L. N.
2018-03-01
An analysis of the operation of the gas transportation system shows that for a considerable part of the time, pipelines operate in an unsteady regime of gas flow. The pressure and flow rate vary along the length of the pipeline and over time as a result of uneven consumption and offtake, switching of compressor units on and off, closing of stop valves, and the emergence of emergency leaks. The operational management of such regimes is complicated by the difficulty of reconciling the operating modes of individual sections of the gas pipeline with each other, as well as with the compressor stations. Determining the factors that cause changes in the operating mode of the pipeline system, and revealing the patterns of these changes, determines the choice of its parameters. Therefore, knowledge of the laws governing the main technological parameters of gas pumping through pipelines under non-stationary flow conditions is of great practical importance.
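A standard one-dimensional isothermal formulation of such non-stationary pipeline flow (a textbook sketch, not necessarily the authors' exact model) couples continuity, momentum with wall friction, and the gas equation of state:

```latex
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho w)}{\partial x} = 0,
\qquad
\frac{\partial (\rho w)}{\partial t}
  + \frac{\partial \left(\rho w^{2} + p\right)}{\partial x}
  = -\frac{\lambda\, \rho\, w\, |w|}{2D} - \rho g \sin\alpha,
\qquad
p = c^{2}\rho,
```

where $\rho$ is the gas density, $w$ the flow velocity, $p$ the pressure, $\lambda$ the hydraulic friction factor, $D$ the pipe diameter, $\alpha$ the pipe inclination, and $c$ the isothermal speed of sound. Transients from valve closures or compressor switching are obtained by solving this hyperbolic system along the pipeline.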
NASA Astrophysics Data System (ADS)
Rizzuto, Aaron C.; Mann, Andrew W.; Vanderburg, Andrew; Kraus, Adam L.; Covey, Kevin R.
2017-12-01
Detection of transiting exoplanets around young stars is more difficult than for older systems owing to increased stellar variability. Nine young open cluster planets have been found in the K2 data, but no single analysis pipeline identified all planets. We have developed a transit search pipeline for young stars that uses a transit-shaped notch and quadratic continuum in a 12 or 24 hr window to fit both the stellar variability and the presence of a transit. In addition, for the most rapid rotators (P_rot < 2 days) we model the variability using a linear combination of observed rotations of each star. To maximally exploit our new pipeline, we update the membership for four stellar populations observed by K2 (Upper Scorpius, Pleiades, Hyades, Praesepe) and conduct a uniform search of the members. We identify all known transiting exoplanets in the clusters, 17 eclipsing binaries, one transiting planet candidate orbiting a potential Pleiades member, and three orbiting unlikely members of the young clusters. Limited injection recovery testing on the known planet hosts indicates that for the older Praesepe systems we are sensitive to additional exoplanets as small as 1-2 R⊕, and for the larger Upper Scorpius planet host (K2-33) our pipeline is sensitive to ~4 R⊕ transiting planets. The lack of detected multiple systems in the young clusters is consistent with the expected frequency from the original Kepler sample, within our detection limits. With a robust pipeline that detects all known planets in the young clusters, occurrence rate testing at young ages is now possible.
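The local model described above, a quadratic continuum plus a transit-shaped notch in a sliding window, can be sketched in simplified form. In the real pipeline the coefficients are fitted within each 12 or 24 hr window, whereas here they are supplied directly, and the box-shaped notch is a simplification of the transit shape.

```python
# Simplified sketch of the paper's local model: quadratic continuum plus a
# box-shaped "notch" of given depth inside the transit window. The actual
# pipeline fits these coefficients in a sliding 12/24 hr window; here they
# are supplied, and the box notch stands in for the true transit shape.

def notch_model(times, t0, depth, duration, c0, c1, c2):
    """Evaluate flux(t) = c0 + c1*t + c2*t^2, minus `depth` for points
    within +/- duration/2 of the transit midpoint t0."""
    out = []
    for t in times:
        flux = c0 + c1 * t + c2 * t * t          # local quadratic continuum
        if abs(t - t0) < duration / 2.0:         # inside the transit window
            flux -= depth
        out.append(flux)
    return out
```

Fitting with and without the notch term, and comparing the goodness of fit, is what lets such a pipeline separate a transit from smooth stellar variability.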
A Pipeline for 3D Digital Optical Phenotyping Plant Root System Architecture
NASA Astrophysics Data System (ADS)
Davis, T. W.; Shaw, N. M.; Schneider, D. J.; Shaff, J. E.; Larson, B. G.; Craft, E. J.; Liu, Z.; Kochian, L. V.; Piñeros, M. A.
2017-12-01
This work presents a new pipeline for digital optical phenotyping of the root system architecture of agricultural crops. The pipeline begins with a 3D root-system imaging apparatus for hydroponically grown crop lines of interest. The apparatus acts as a self-contained darkroom, which includes an imaging tank, a motorized rotating bearing and a digital camera. The pipeline continues with the Plant Root Imaging and Data Acquisition (PRIDA) software, which is responsible for image capture and storage. Once root images have been captured, image post-processing is performed using the Plant Root Imaging Analysis (PRIA) command-line tool, which extracts root pixels from color images. Following this pre-processing binarization of the digital root images, 3D trait characterization is performed using the next-generation RootReader3D software. RootReader3D measures global root system architecture traits, such as total root system volume and length, total number of roots, and maximum rooting depth and width. While designed to work together, the four stages of the phenotyping pipeline are modular and stand-alone, which provides flexibility and adaptability for various research endeavors.
Melicher, Dacotah; Torson, Alex S; Dworkin, Ian; Bowsher, Julia H
2014-03-12
The Sepsidae family of flies is a model for investigating how sexual selection shapes courtship and sexual dimorphism in a comparative framework. However, like many non-model systems, there are few molecular resources available. Large-scale sequencing and assembly have not been performed in any sepsid, and the lack of a closely related genome makes investigation of gene expression challenging. Our goal was to develop an automated pipeline for de novo transcriptome assembly, and to use that pipeline to assemble and analyze the transcriptome of the sepsid Themira biloba. Our bioinformatics pipeline uses cloud computing services to assemble and analyze the transcriptome with off-site data management, processing, and backup. It uses a multiple k-mer length approach combined with a second meta-assembly to extend transcripts and recover more bases of transcript sequences than standard single k-mer assembly. We used 454 sequencing to generate 1.48 million reads from cDNA generated from embryo, larva, and pupae of T. biloba and assembled a transcriptome consisting of 24,495 contigs. Annotation identified 16,705 transcripts, including those involved in embryogenesis and limb patterning. We assembled transcriptomes from an additional three non-model organisms to demonstrate that our pipeline assembled a higher-quality transcriptome than single k-mer approaches across multiple species. The pipeline we have developed for assembly and analysis increases contig length, recovers unique transcripts, and assembles more base pairs than other methods through the use of a meta-assembly. The T. biloba transcriptome is a critical resource for performing large-scale RNA-Seq investigations of gene expression patterns, and is the first transcriptome sequenced in this Dipteran family.
Study on the flow in the pipelines of the support system of circulating fluidized bed
NASA Astrophysics Data System (ADS)
Meng, L.; Yang, J.; Zhou, L. J.; Wang, Z. W.; Zhuang, X. H.
2013-12-01
In the support system of the Circulating Fluidized Bed (hereafter CFB) of a thermal power plant, the primary-air pipelines transport cold air to the boiler, which is important for combustion control. The pipeline design greatly affects the energy loss of the system, and accordingly the thermal power plant's economic benefits and production environment. In this paper, three-dimensional numerical simulation is carried out for the pipeline internal flow field of a thermal power plant. First, three turbulence models were compared, and the results showed that the SST k-ω model converged better and that its predicted energy losses were closer to the experimental results. The influence of the pipeline design on the flow characteristics is analysed, and optimized pipeline designs are then proposed according to the energy loss distribution of the flow field, in order to reduce energy loss and improve the efficiency of the duct system. The optimized design proved effective: pressure loss is reduced by about 36%.
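For orientation, the pressure loss that such an optimization targets can be estimated in closed form with the textbook Darcy-Weisbach relation; this back-of-the-envelope sketch is not the paper's 3D CFD model, and the example values below are illustrative assumptions.

```python
# Back-of-the-envelope Darcy-Weisbach estimate of pipeline pressure loss,
# dp = f * (L/D) * rho * v^2 / 2. A textbook sketch for orientation only;
# the paper's results come from 3D CFD, and the inputs here are assumed.

def darcy_pressure_loss(friction_factor, length_m, diameter_m,
                        density_kg_m3, velocity_mps):
    """Pressure loss (Pa) over a straight pipe run of length L and diameter D
    carrying fluid of the given density at the given mean velocity."""
    return (friction_factor * (length_m / diameter_m)
            * density_kg_m3 * velocity_mps ** 2 / 2.0)
```

For assumed values f = 0.02, L = 100 m, D = 0.5 m, air at 1.2 kg/m3 and 10 m/s, the loss is 240 Pa, and the quadratic dependence on velocity shows why duct-geometry changes that lower local velocities pay off.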
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout the life sciences as an effective method of quantitatively measuring RNA abundance in tissues and cells. The increasing use of RNA-seq technology has led to the continuous development of new tools for every step of analysis, from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (complementarity-determining region) reconstruction. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively, with a built-in capacity for customization and expansion.
An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next-generation sequencing data commonly require working knowledge of a command line interface, massive computational resources, and expertise, which is daunting for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next-generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC), and interfaces for developing genotyping assays. The pipeline outputs a list of high-quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value, with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise, enabling SNP discovery and use in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next-generation sequencing datasets. It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software. PMID:25003610
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
DOT National Transportation Integrated Search
2010-08-01
Significant financial and environmental consequences often result from line leakage of oil product pipelines. Even the smallest leak can allow product to escape into the surrounding soil, and an undetected leak can ultimately lead to rupture of the pipeline. From a health perspective, water...
DKIST visible broadband imager data processing pipeline
NASA Astrophysics Data System (ADS)
Beard, Andrew; Cowan, Bruce; Ferayorni, Andrew
2014-07-01
The Daniel K. Inouye Solar Telescope (DKIST) Data Handling System (DHS) provides the technical framework and building blocks for developing on-summit instrument quality assurance and data reduction pipelines. The DKIST Visible Broadband Imager (VBI) is a first-light instrument that alone will create two data streams with a bandwidth of 960 MB/s each. The high data rate and data volume of the VBI require near-real-time processing for quality assurance and data reduction, which will be performed on-summit using Graphics Processing Unit (GPU) technology. The VBI data processing pipeline (DPP) is the first designed and developed using the DKIST DHS components, and therefore provides insight into the strengths and weaknesses of the framework. In this paper we lay out the design of the VBI DPP, examine how the underlying DKIST DHS components are utilized, and discuss how integration of the DHS framework with GPUs was accomplished. We present results from the VBI DPP alpha release implementation of the calibration, frame selection reduction, and quality assurance display processing nodes.
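The "frame selection" node mentioned above is a standard solar-imaging technique: score each short-exposure frame by a sharpness metric and keep only the best. The sketch below illustrates the idea with RMS contrast (standard deviation over mean) on toy data; the metric choice, the flat-list frame representation, and the 50% selection fraction are illustrative assumptions, not the VBI DPP implementation.

```python
# Illustrative frame selection: rank frames by RMS contrast, keep the sharpest.

def rms_contrast(frame):
    """RMS contrast of a frame given as a flat list of pixel intensities."""
    n = len(frame)
    mean = sum(frame) / n
    var = sum((p - mean) ** 2 for p in frame) / n
    return (var ** 0.5) / mean

def select_frames(frames, keep_fraction=0.5):
    """Return the best frames ranked by RMS contrast (higher = sharper)."""
    ranked = sorted(frames, key=rms_contrast, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]

frames = [
    [10, 10, 10, 10],      # flat frame: zero contrast (blurred)
    [5, 15, 5, 15],        # high contrast
    [9, 11, 9, 11],        # low contrast
    [2, 18, 2, 18],        # highest contrast
]
best = select_frames(frames, keep_fraction=0.5)  # keeps the two sharpest frames
```

A GPU implementation would evaluate the metric for thousands of frames per second, which is why the DPP offloads this stage to GPUs.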
Shaikh, Faiq; Franc, Benjamin; Allen, Erastus; Sala, Evis; Awan, Omer; Hendrata, Kenneth; Halabi, Safwan; Mohiuddin, Sohaib; Malik, Sana; Hadley, Dexter; Shrestha, Rasu
2018-03-01
Enterprise imaging has channeled various technological innovations to the field of clinical radiology, ranging from advanced imaging equipment and postacquisition iterative reconstruction tools to image analysis and computer-aided detection tools. More recently, the advancement in the field of quantitative image analysis coupled with machine learning-based data analytics, classification, and integration has ushered in the era of radiomics, a paradigm shift that holds tremendous potential in clinical decision support as well as drug discovery. However, there are important issues to consider to incorporate radiomics into a clinically applicable system and a commercially viable solution. In this two-part series, we offer insights into the development of the translational pipeline for radiomics from methodology to clinical implementation (Part 1) and from that point to enterprise development (Part 2). In Part 2 of this two-part series, we study the components of the strategy pipeline, from clinical implementation to building enterprise solutions. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... maps, security plans, etc.); and Actual or suspected cyber-attacks that could impact pipeline... suspected attacks on pipeline systems, facilities, or assets; Bomb threats or weapons of mass destruction...
49 CFR 193.2019 - Mobile and temporary LNG facilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Section 193.2019 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY... during gas pipeline systems repair/alteration, or for other short term applications need not meet the...
49 CFR 193.2513 - Transfer procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY LIQUEFIED NATURAL GAS FACILITIES... transfer position; and (7) Verify that transfers into a pipeline system will not exceed the pressure or...
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
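A Jenkins-CI job in a setup like this typically runs CellProfiler headlessly on a cluster node via a shell build step. The sketch below builds such a command line; the flags (-c, -r, -p, -i, -o) follow CellProfiler's documented batch mode, but the pipeline name, paths, and the idea of wrapping this in a Python helper are assumptions for illustration.

```python
# Hypothetical helper: build the argument list for a headless CellProfiler run,
# as a Jenkins-CI build step might execute on a cluster node.
import shlex

def cellprofiler_command(pipeline, input_dir, output_dir):
    """Build the argument list for a headless CellProfiler run."""
    return ["cellprofiler",
            "-c",              # run without a GUI (console mode)
            "-r",              # run the pipeline on startup
            "-p", pipeline,    # .cppipe pipeline exported from the desktop client
            "-i", input_dir,   # image input directory
            "-o", output_dir]  # measurements/output directory

cmd = cellprofiler_command("screen_qc.cppipe", "/data/plate01", "/results/plate01")
print(shlex.join(cmd))   # shell-quoted form suitable for a Jenkins shell step
```

Keeping the pipeline file (.cppipe) in the centralized Jenkins-CI repository, as the abstract describes, lets the same desktop-authored pipeline be re-run reproducibly on the cluster.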
Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.
2016-01-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rieber, M.; Soo, S.L.
1977-08-01
A coal slurry pipeline system requires that the coal go through a number of processing stages before it is used by the power plant. Once mined, the coal is delivered to a preparation plant where it is pulverized to sizes between 18 and 325 mesh and then suspended in about an equal weight of water. This 50-50 slurry mixture has a consistency approximating toothpaste. It is pushed through the pipeline via electric pumping stations 70 to 100 miles apart. Flow velocity through the line must be maintained within a narrow range. For example, if a 3.5 mph design is used at 5 mph, the system must be able to withstand double the horsepower, peak pressure, and wear. Minimum flowrate must be maintained to avoid particle settling and plugging. However, in general, once a pipeline system has been designed, because of economic considerations on the one hand and design limits on the other, flowrate is rather inflexible. Pipelines that have a slowly moving throughput and a water carrier may be subject to freezing in northern areas during periods of severe cold. One of the problems associated with slurry pipeline analyses is the lack of operating experience.
The Vulnerability Formation Mechanism and Control Strategy of the Oil and Gas Pipeline City
NASA Astrophysics Data System (ADS)
Chen, Y. L.; Han, L.
2017-12-01
Most oil and gas pipelines in our country have been in service for more than 25 years. These pipes are buried underground and are difficult to inspect routinely. In addition, they are vulnerable to environmental effects, corrosion and natural disasters, so accident hazards remain hidden. Rapid urbanization, population accumulation, dense building and insufficient safety clearances are all reasons for the frequent accidents of oil and gas pipelines. Therefore, appraising and understanding the safety condition of oil and gas pipelines across a city's various regions is vitally important. In order to ensure the safety of the oil and gas pipeline city, this paper defines the connotation of oil and gas pipeline city vulnerability in light of previous research on vulnerability. Then, from the three perspectives of environment, structure and behavior, and based on the analytical paradigm of “structure—vulnerability conduct—performance”, the indicators influencing the vulnerability of oil and gas pipelines are analysed, and the vulnerability mechanism framework of the oil and gas pipeline city is constructed. Finally, the paper proposes a regulating strategy to decrease the city’s vulnerability index, which enables the city’s vulnerability evaluation and provides new ideas for the sustainable development of the city.
Chery, Joyce G; Sass, Chodon; Specht, Chelsea D
2017-09-01
We developed a bioinformatic pipeline that leverages a publicly available genome and published transcriptomes to design primers in conserved coding sequences flanking targeted introns of single-copy nuclear loci. Paullinieae (Sapindaceae) is used to demonstrate the pipeline. Transcriptome reads phylogenetically closer to the lineage of interest are aligned to the closest genome. Single-nucleotide polymorphisms are called, generating a "pseudoreference" closer to the lineage of interest. Several filters are applied to meet the criteria of single-copy nuclear loci with introns of a desired size. Primers are designed in conserved coding sequences flanking introns. Using this pipeline, we developed nine single-copy nuclear intron markers for Paullinieae. This pipeline is highly flexible and can be used for any group with available genomic and transcriptomic resources. This pipeline led to the development of nine variable markers for phylogenetic study without generating sequence data de novo.
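The pipeline's filtering stage reduces to selecting single-copy loci whose targeted intron falls in an amplifiable size window, then taking the conserved exon flanks where primers are designed. The toy sketch below illustrates that logic; the locus records, the size window, and the 20-bp flank length are invented for illustration and are not the published pipeline's parameters.

```python
# Illustrative locus filtering: single-copy loci with introns in a size
# window, reporting the conserved exon flanks for primer design.

def filter_loci(loci, min_intron=250, max_intron=1200, flank=20):
    """Select single-copy loci with introns inside the target size range."""
    selected = []
    for locus in loci:
        if locus["copy_number"] != 1:
            continue                          # require single-copy nuclear loci
        if not (min_intron <= locus["intron_len"] <= max_intron):
            continue                          # intron must be amplifiable
        # conserved coding sequence immediately flanking the intron
        selected.append((locus["name"],
                         locus["left_exon"][-flank:],
                         locus["right_exon"][:flank]))
    return selected

loci = [
    {"name": "locA", "copy_number": 1, "intron_len": 600,
     "left_exon": "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA",
     "right_exon": "TACGATCGATCGGGATCCAAGCTTGCATGC"},
    {"name": "locB", "copy_number": 2, "intron_len": 500,   # paralogous, rejected
     "left_exon": "A" * 30, "right_exon": "C" * 30},
    {"name": "locC", "copy_number": 1, "intron_len": 3000,  # intron too long
     "left_exon": "G" * 30, "right_exon": "T" * 30},
]
markers = filter_loci(loci)
```

In the published workflow the flanks would then go to a primer design tool; here they are simply reported.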
A quantitative non-destructive residual stress assessment tool for pipelines.
DOT National Transportation Integrated Search
2014-09-01
G2MT successfully demonstrated the eStress system, a powerful new nondestructive evaluation : system for analyzing through-thickness residual stresses in mechanical damaged areas of steel : pipelines. The eStress system is designed to help pipe...
Reducing vibration transfer from power plants by active methods
NASA Astrophysics Data System (ADS)
Kiryukhin, A. V.; Milman, O. O.; Ptakhin, A. V.
2017-12-01
The possibility of applying active damping of vibration and pressure pulsations to reduce their transfer from power plants into the environment, the seating, and the industrial premises is considered. The authors present experimental results on active broadband damping: of vibration and dynamic forces after shock absorption, up to 15 dB in the frequency band up to 150 Hz; of water pressure pulsations in the pipeline, up to 20 dB in the frequency band up to 600 Hz; and of spatial low-frequency air noise indoors of a diesel generator at a discrete frequency, up to 20 dB. It is shown that reducing vibration transfer through a vibration-isolating junction (expansion joint) of liquid-filled pipelines is the most complicated task and has hardly been developed so far. This problem is essential for vibration isolation of power equipment from the seating and the environment through pipelines with water and steam in power and transport engineering, shipbuilding, and in oil and gas pipelines at pumping stations. For improving efficiency, reducing energy consumption, and decreasing the overall dimensions of equipment, it is advisable to combine an active system with passive damping means, which alone are not always sufficient. The executive component of an active damping system should be placed behind the vibration isolators (expansion joints). It is shown that the presence of the working medium and the coupling of vibration with pressure pulsations in existing pipeline expansion joint designs increase the vibration stiffness of the joint by two or more orders of magnitude compared with the static stiffness, which makes it difficult to apply active methods.
For active damping of vibration transfer through expansion joints of pipelines with a liquid, it is necessary to develop expansion joint structures with minimal connection of vibrations and pulsations and minimal vibration stiffness in the specified frequency range. The example of structure of such expansion joint and its test results are presented.
New Yumurtalik to Kirikkale crude-oil pipeline would boost Turkish industrial area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonnet, G.
1982-12-13
Plans for a crude oil pipeline linking the 101 cm (40 in.) Iraq to Turkey pipeline terminal located in Yumurtalik to the site of a future refinery to be situated near Ankara are described. Designed for fully unattended operation, the "brain" of the system will be a telecom/telecontrol telemetry system. Support for data information exchanged between the master and local outstations will be a microwave radio carrier system, also permitting the transfer of telephone and telegraph traffic as well as facsimiles.
The High Level Data Reduction Library
NASA Astrophysics Data System (ADS)
Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.
2015-09-01
The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments, and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience has shown that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using datasets from several instruments. Once the best algorithm and error model have been chosen, a design and implementation phase begins. HDRL is coded in plain C using Common Pipeline Library (CPL) functionality. It adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.
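The error propagation mentioned above is a core idea in such libraries: each pixel carries both a value and an error estimate, and arithmetic combines the errors, in quadrature for independent Gaussian errors. The tiny class below sketches this in Python for clarity; it is an invented stand-in, not the HDRL/CPL API, which is written in C.

```python
# Illustrative value-plus-error image arithmetic with quadrature propagation.

class ImageWithErrors:
    def __init__(self, values, errors):
        self.values = values    # pixel values (flat list for simplicity)
        self.errors = errors    # one-sigma error per pixel

    def add(self, other):
        """Pixel-wise sum; independent errors add in quadrature."""
        vals = [a + b for a, b in zip(self.values, other.values)]
        errs = [(ea ** 2 + eb ** 2) ** 0.5
                for ea, eb in zip(self.errors, other.errors)]
        return ImageWithErrors(vals, errs)

a = ImageWithErrors([10.0, 20.0], [3.0, 0.5])
b = ImageWithErrors([1.0, 2.0], [4.0, 0.5])
c = a.add(b)   # values [11.0, 22.0]; errors [5.0, sqrt(0.5)]
```

A production library would also carry a bad-pixel mask alongside the value and error planes, matching the "pixel quality information" the abstract mentions.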
Mortise terrorism on the main pipelines
NASA Astrophysics Data System (ADS)
Komarov, V. A.; Nigrey, N. N.; Bronnikov, D. A.; Nigrey, A. A.
2018-01-01
The aim of this work is to analyze the effectiveness of the proposed methods of physical protection of main pipelines against "mortise terrorism" (unauthorized tapping into the line). A mathematical model has been developed that makes it possible to predict the dynamics of "mortise terrorism" in the short term. The effectiveness of the proposed physical protection methods in preventing unauthorized impacts on the objects under investigation is analyzed. A video analytics system variant has been developed that detects intruders and recognizes the types of work they perform at a distance of 150 meters against complex natural backgrounds and in precipitation. The probability of detection is 0.959.
Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline
NASA Astrophysics Data System (ADS)
Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.
2017-05-01
In the oil and gas industry, pipelines are a major component of the transmission and distribution process for oil and gas. Distribution routes sometimes carry pipelines across various types of environmental conditions, so a pipeline should operate safely and not harm the surrounding environment. Corrosion is still a major cause of failure in equipment components of a production facility. In pipeline systems, corrosion can cause wall failures and damage to the pipeline; the pipeline system therefore requires care and periodic inspection. Every production facility in an industry has a level of risk for damage, determined by the likelihood and consequences of that damage. The purpose of this research is to analyze the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection per API 581, which combines the likelihood of failure and the consequences of failure for each equipment component; the result is then used to determine the next inspection plan. Nine pipeline components were observed, including straight inlet pipes, connection tees, and straight outlet pipes. The assessed risk levels of the nine pipeline components are presented in a risk matrix; the components fall at medium risk levels. The failure mechanism considered in this research is thinning. Based on corrosion rate calculations, the remaining age of the pipeline components can be obtained, so their remaining lifetimes are known; the calculated remaining lifetime varies for each component. The next step is to plan inspection of the pipeline components by external NDT methods.
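The thinning calculation behind such an assessment can be sketched in two lines: a linear corrosion rate estimated from two wall-thickness measurements, and a remaining life equal to the remaining corrosion allowance divided by that rate. The numbers below are illustrative, not values from the study.

```python
# Worked sketch of the thinning mechanism calculation (illustrative values).

def corrosion_rate(t_initial, t_current, years_in_service):
    """Average wall-loss rate in mm/year from two thickness measurements."""
    return (t_initial - t_current) / years_in_service

def remaining_life(t_current, t_minimum, rate):
    """Years until the wall thins to the minimum allowable thickness."""
    if rate <= 0:
        return float("inf")    # no measurable thinning
    return (t_current - t_minimum) / rate

# e.g. a wall that thinned from 12.7 mm to 11.7 mm over 10 years of service,
# with a minimum required thickness of 9.7 mm:
rate = corrosion_rate(t_initial=12.7, t_current=11.7, years_in_service=10)
life = remaining_life(t_current=11.7, t_minimum=9.7, rate=rate)
```

The inspection interval is then typically set as a fraction of the remaining life, per the operator's risk-based inspection policy.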
Hybrid Laser/GMAW of High Strength Steel Gas Transmission Pipelines
DOT National Transportation Integrated Search
2008-07-01
Pipelines will be an integral part of our energy distribution systems for the foreseeable future. Operators are currently considering the installation of tens of billions of dollars of pipeline infrastructure. In a number of cases, the cost of export...
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
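The "lazy processing" concept behind BDS can be illustrated without BDS syntax: a task re-executes only when an output is missing or older than its newest input, which is what lets a pipeline resume after a failure without redoing finished work. The sketch below simulates timestamps in a dict for clarity; it is a conceptual illustration, not BDS's implementation.

```python
# Illustrative make-style dependency check, the core of "lazy processing".

def needs_run(inputs, outputs, mtimes):
    """True if any output is absent or older than the newest input."""
    if any(out not in mtimes for out in outputs):
        return True
    newest_in = max(mtimes[i] for i in inputs)
    oldest_out = min(mtimes[o] for o in outputs)
    return oldest_out < newest_in

# simulated modification times (seconds); aln.bam was produced at t=120
mtimes = {"reads.fq": 100, "ref.fa": 90, "aln.bam": 120}

# aln.bam is newer than both inputs: the alignment step is skipped
rerun_align = needs_run(["reads.fq", "ref.fa"], ["aln.bam"], mtimes)

# variants.vcf does not exist yet: the variant-calling step must run
rerun_call = needs_run(["aln.bam"], ["variants.vcf"], mtimes)
```

BDS combines this with serialization of pipeline state, so a crashed run restarts from the last completed task rather than from scratch.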
BigDataScript: a scripting language for data pipelines
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778
NASA Astrophysics Data System (ADS)
Jones, Christopher F.
2009-12-01
Coal canals, oil pipelines, and electricity transmission wires transformed the built environment of the American mid-Atlantic region between 1820 and 1930. By transporting coal, oil, and electrons cheaply, reliably, and in great quantities, these technologies reshaped the energy choices available to mid-Atlantic residents. In particular, canals, pipelines, and wires created new energy landscapes: systems of transport infrastructure that enabled the ever-increasing consumption of fossil fuels. Energy Landscapes integrates history of technology, environmental history, and business history to provide new perspectives on how Americans began to use fossil fuels and the social implications of these practices. First, I argue that the development of transport infrastructure played critical, and underappreciated, roles in shaping social energy choices. Rather than simply responding passively to the needs of producers and consumers, canals, pipelines, and wires structured how, when, where, and in what quantities energy was used. Second, I analyze the ways fossil fuel consumption transformed the society, economy, and environment of the mid-Atlantic. I link the consumption of coal, oil, and electricity to the development of an urban and industrialized region, the transition from an organic to a mineral economy, and the creation of a society dependent on fossil fuel energy.
Blending Hydrogen into Natural Gas Pipeline Networks. A Review of Key Issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, M. W.; Antonia, O.; Penev, M.
2013-03-01
This study assesses the potential to deliver hydrogen through the existing natural gas pipeline network as a hydrogen and natural gas mixture to defray the cost of building dedicated hydrogen pipelines. Blending hydrogen into the existing natural gas pipeline network has also been proposed as a means of increasing the output of renewable energy systems such as large wind farms.
Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.
2014-01-01
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads. PMID:25342933
Madduri, Ravi K; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J; Foster, Ian T
2014-09-10
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.
Application of the actor model to large scale NDE data analysis
NASA Astrophysics Data System (ADS)
Coughlin, Chris
2018-03-01
The Actor model of concurrent computation discretizes a problem into a series of independent units or actors that interact only through the exchange of messages. Without direct coupling between individual components, an Actor-based system is inherently concurrent and fault-tolerant. These traits lend themselves to so-called "Big Data" applications in which the volume of data to analyze requires a distributed multi-system design. For a practical demonstration of the Actor computational model, a system was developed to assist with the automated analysis of Nondestructive Evaluation (NDE) datasets using the open source Myriad Data Reduction Framework. A machine learning model trained to detect damage in two-dimensional slices of C-Scan data was deployed in a streaming data processing pipeline. To demonstrate the flexibility of the Actor model, the pipeline was deployed on a local system and re-deployed as a distributed system without recompiling, reconfiguring, or restarting the running application.
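The Actor model described above can be sketched minimally: each actor owns a mailbox (queue) and interacts with other components only by receiving messages, so there is no shared mutable state between units. The toy below runs one analysis actor in a thread; the message format and the amplitude-threshold "damage check" are invented stand-ins for the trained model in the Myriad-based system.

```python
# Minimal actor: a thread draining a mailbox and handling each message.
import queue
import threading

class Actor:
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        """The only way to interact with an actor: put a message in its mailbox."""
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill: stop the actor
                break
            self.handler(msg)

results = []
# "Analyzer" actor: flags scan slices whose peak amplitude exceeds a threshold.
analyzer = Actor(lambda s: results.append((s["id"], max(s["data"]) > 0.8)))

for i, data in enumerate([[0.1, 0.9, 0.2], [0.3, 0.4, 0.2]]):
    analyzer.send({"id": i, "data": data})
analyzer.send(None)                  # shut down after the queued messages
analyzer.thread.join()
```

Because actors only exchange messages, the same design can be redeployed across machines by replacing the in-process queue with a network transport, which is the portability property the abstract highlights.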
77 FR 11520 - Commission Information Collection Activities; Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
..., Gas Pipeline Certificates: Annual Reports of System Flow Diagrams and System Capacity. DATES: Comments... Certificates: Annual Reports of System Flow Diagrams and System Capacity. OMB Control No.: 1902-0005. Type of... June 1 of each year, diagrams reflecting operating conditions on the pipeline's main transmission...
Financial and environmental impacts of new technologies in the energy sector
NASA Astrophysics Data System (ADS)
Duthu, Ray Charles, III
Energy industries (generation, transmission and distribution of fuels and electricity) have a long history as key elements of the US energy economy and have operated within a mostly consistent niche in our society for the past century. However, a variety of interrelated drivers are forcing changes to these industries' business practices, relationships with their customers, and function in society. In the electric utility industry, the customer is moving toward acting as a fuller partner in the energy economy: buying, selling, and dispatching its demand according to its own incentives. Natural gas exploration and production has long operated in rural areas far from public concerns or regulations, but now, due to hydraulic fracturing, new exploration is occurring in more urbanized, developed regions of the country and is creating significant public concern. For these industries, the challenges to economic development and to improvements in the energy sector are not necessarily technological; they are social, business, and policy problems. This dissertation seeks to understand and address these issues by building economic and life cycle assessment models that quantify value, potential monetization, and the potential difference between the two for two new technologies: customer-owned distributed generation systems and integrated development plans with pipeline water transport in hydraulically fractured oil and gas fields. An inclusive business model of a generic customer in Fort Collins, CO and its surrounding utilities demonstrates that traditional utility rates provide customers with incentives that encourage over-monetization of a customer's distributed generation resource at the expense of the utilities. Another model, which compares customer behavior incented by traditional rates in three New England cities with the behavior incented through a real-time pricing market, corroborates this conclusion.
Daily customer load peak-shaving is shown to have a negligible and unreliable value in reducing the average cost of electricity and in some cases can increase these costs. These models support the hypothesis that distributed generation systems provide much greater value when operated during a few significant electricity price events than according to a daily cycle. New business practices which foster greater cooperation between customers and utilities, such as a real-time price market with a higher fidelity price signal, reconnect distributed generation's potential monetization to its value in the marketplace. These new business models are required to ensure that these new technologies are integrated into the electric grid and into the energy market in such a way that all of the market participants are interested and invested stakeholders. The truck transport of water associated with hydraulic fracturing creates significant local costs. A life cycle analysis of a hypothetical oil and gas field generic to the northern Colorado Denver-Julesburg basin quantifies the economic, environmental, and social costs associated with truck transport and compares these results with water pipeline systems. A literature review of incident data demonstrates that pipelines historically have spilled less hazardous material and caused fewer injuries and fatalities than truck transport systems. The life cycle analysis demonstrates that pipeline systems also emit fewer pollutants and cause less local road damage than comparable trucking systems. Pipeline systems are shown to be superior to trucking systems across all the metrics considered in this project. In each of these domains, this research has developed expanded-scope models of these new technologies and systems to quantify the tradeoffs that are present between monetization, environment, and economic value.
The results point towards those business models, policies, and management practices that enable the development of more equitable, efficient, and sustainable energy systems.
Closha: bioinformatics workflow system for the analysis of massive sequencing data.
Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook
2018-02-19
While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in biomedical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. Integrating data and analytic resources into workflow systems addresses this problem by simplifying the task of data analysis. To meet this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows that make optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses with drag-and-drop functionality and to modify the parameters of pipeline tools. Users can also import Galaxy pipelines into Closha. Closha is a hybrid system that lets users run both traditional analysis tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit large amounts of data at a fast rate. KoDS achieves file transfer speeds up to 10 times those of normal FTP and HTTP. The computer hardware for Closha comprises 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. It supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner and provides a user-friendly interface that helps genomic scientists derive accurate results from NGS platform data. The Closha cloud server is freely available for use from http://closha.kobic.re.kr/ .
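The drag-and-drop workflows described above reduce, at their core, to an ordered chain of tool invocations applied to data. A minimal sketch of that idea in Python (the step names and functions are hypothetical illustrations, not Closha's actual API):

```python
def run_pipeline(data, steps):
    """Run data through an ordered list of (name, function) steps."""
    for name, fn in steps:
        data = fn(data)
    return data

# Toy "reads" and a two-step pipeline: trim trailing bases, normalize case.
reads = ["acgttgcann", "ttgcaacgnn"]
pipeline = [
    ("trim", lambda rs: [r[:8] for r in rs]),           # drop last 2 bases
    ("uppercase", lambda rs: [r.upper() for r in rs]),  # normalize case
]
result = run_pipeline(reads, pipeline)  # ["ACGTTGCA", "TTGCAACG"]
```

A real workflow manager adds scheduling, provenance, and parallel execution on top of this sequential core; here the chain itself is the point.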
Bennett, David A.; Yu, Lei; De Jager, Philip L.
2014-01-01
Cognitive decline, Alzheimer's disease (AD) and other causes are major public health problems worldwide. With changing demographics, the number of persons with dementia will increase rapidly. The treatment and prevention of AD and other dementias, therefore, is an urgent unmet need. There have been considerable advances in understanding the biology of many age-related disorders that cause dementia. Gains in understanding AD have led to the development of ante-mortem biomarkers of traditional neuropathology and the conduct of several phase III interventions in the amyloid-β cascade early in the disease process. Many other intervention strategies are in various stages of development. However, efforts to date have met with limited success. A recent National Institute on Aging Research Summit led to a number of requests for applications. One was to establish multi-disciplinary teams of investigators who use systems biology approaches and stem cell technology to identify a new generation of AD targets. We were recently awarded one of three such grants to build a pipeline that integrates epidemiology, systems biology, and stem cell technology to discover and validate novel therapeutic targets and lead compounds for AD treatment and prevention. Here we describe the two cohorts that provide the data and biospecimens being exploited for our pipeline and describe the available unique datasets. Second, we present evidence in support of a chronic disease model of AD that informs our choice of phenotypes as the target outcome. Third, we provide an overview of our approach. Finally, we present the details of our planned drug discovery pipeline. PMID:24508835
Argentine gas system underway for Gas del Estado
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosch, H.
Gas del Estado's giant 1074-mile Centro-Oeste pipeline project - designed to ultimately transport over 350 million CF/day of natural gas from the Neuquen basin to the Campo Duran-Buenos Aires pipeline system - is now underway. The COGASCO consortium of Dutch and Argentine companies awarded the construction project will also operate and maintain the system for 15 years after its completion. In addition to the 30-in. pipelines, the agreement calls for a major compressor station at the gas field, three intermediate compressor stations, a gas-treatment plant, liquids-recovery facilities, and the metering, control, communications, and maintenance equipment for the system. Fabricated in Holland, the internally and externally coated pipe will be double-jointed to 80-ft lengths after shipment to Argentina; welders will use conventional manual-arc techniques to weld the pipeline in the field.
Extending the Fermi-LAT Data Processing Pipeline to the Grid
NASA Astrophysics Data System (ADS)
Zimmer, S.; Arrabito, L.; Glanzman, T.; Johnson, T.; Lavalley, C.; Tsaregorodtsev, A.
2012-12-01
The Data Handling Pipeline (“Pipeline”) has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been used to completely automate the production of data quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses also place heavy loads on the pipeline and computing resources; unlike Level 1, these can run continuously for weeks or months at a time. In addition, the Pipeline receives heavy use in performing production Monte Carlo tasks. In daily use it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing before the next download arrives. The need for manual intervention has been reduced to less than 0.01% of submitted jobs. The Pipeline software is written almost entirely in Java and comprises several modules, including web services that allow online monitoring and provide charts summarizing workflow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS and, more recently, Sun Grid Engine and Condor. This is accomplished through dedicated job control services that, for Fermi, run at SLAC and at the other computing site involved in this large-scale framework, the Lyon computing center of IN2P3. Although the task logic differs, we are also evaluating a separate interface to the DIRAC system for communicating with EGI sites to utilize Grid resources, relying on dedicated Grid-optimized systems rather than developing our own.
More recently the Pipeline and its associated data catalog have been generalized for use by other experiments, and are currently being used by the Enriched Xenon Observatory (EXO), Cryogenic Dark Matter Search (CDMS) experiments as well as for Monte Carlo simulations for the future Cherenkov Telescope Array (CTA).
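The throughput figures quoted in the abstract imply a useful back-of-envelope check on the operations load. A sketch using only the numbers stated above:

```python
# Daily job volume implied by the Fermi-LAT Pipeline's stated cadence.
downloads_per_day = 24 // 3           # one data download every 3 hours
jobs_per_download = 2000              # "about 2000 jobs" per download
jobs_per_day = downloads_per_day * jobs_per_download   # 16000

# "Less than 0.01% of submitted jobs" need manual intervention,
# so on a typical day fewer than ~2 jobs require a human.
manual_fraction = 0.0001
manual_jobs_per_day = jobs_per_day * manual_fraction   # 1.6
```

This is why the reduction to 0.01% matters operationally: at 16,000 jobs per day, even a 1% intervention rate would mean 160 manual fixes daily.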
The Brackets Design and Stress Analysis of a Refinery's Hot Water Pipeline
NASA Astrophysics Data System (ADS)
Zhou, San-Ping; He, Yan-Lin
2016-05-01
The reconstruction project, which reroutes the hot water pipeline from a power station to a heat exchange station, requires the new hot water pipeline to share the existing pipe racks. Taking into account the allowable span calculated per GB50316 and the design philosophy of pipeline supports, the types and locations of the brackets are determined. The pipeline stresses are then analyzed in AutoPIPE, the supports at dangerous segments are adjusted, and the model is recalculated until the types, locations, and numbers of supports are finalized, so that the overall pipeline system satisfies the requirements of ASME B31.3.
The CHARIS Integral Field Spectrograph with SCExAO: Data Reduction and Performance
NASA Astrophysics Data System (ADS)
Kasdin, N. Jeremy; Groff, Tyler; Brandt, Timothy; Currie, Thayne; Rizzo, Maxime; Chilcote, Jeffrey K.; Guyon, Olivier; Jovanovic, Nemanja; Lozi, Julien; Norris, Barnaby; Tamura, Motohide
2018-01-01
We summarize the data reduction pipeline and on-sky performance of the CHARIS Integral Field Spectrograph behind the SCExAO Adaptive Optics system on the Subaru Telescope. The open-source pipeline produces data cubes from raw detector reads using a χ²-based spectral extraction technique. It implements a number of advances, including a fit to the full nonlinear pixel response, suppression of up to a factor of ~2 in read noise, and deconvolution of the spectra with the line-spread function. The CHARIS team is currently developing the calibration and postprocessing software that will comprise the second component of the data reduction pipeline. Here, we show a range of CHARIS images, spectra, and contrast curves produced using provisional routines. CHARIS is now characterizing exoplanets simultaneously across the J, H, and K bands.
Crack detection and leakage monitoring on reinforced concrete pipe
NASA Astrophysics Data System (ADS)
Feng, Qian; Kong, Qingzhao; Huo, Linsheng; Song, Gangbing
2015-11-01
Reinforced concrete underground pipelines are among the most widely used structures in water transportation systems. Cracks and leakage are the leading causes of pipeline structural failures, which directly result in economic losses and environmental hazards. In this paper, the authors propose a piezoceramic-based active sensing approach to detect cracks and subsequent leakage in concrete pipelines. Due to its piezoelectric properties, piezoceramic material can serve as both the actuator and the sensor in the active sensing approach. The piezoceramic patch, sandwiched between protective materials to form a ‘smart aggregate,’ can be safely embedded into concrete structures. Circumferential and axial cracks were investigated. A wavelet packet-based energy analysis was developed to distinguish the type of crack and detect subsequent leakage based on the different stress wave energy attenuation as waves propagate through the cracks.
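The core of the energy-attenuation idea is that a crack absorbs and scatters part of the stress wave, so the energy received on the far side drops; an energy ratio then serves as a damage index. A sketch with a one-level Haar wavelet split standing in for the full wavelet-packet transform (illustrative only, not the authors' exact algorithm):

```python
def haar_split(x):
    """One Haar decomposition level: (approximation, detail) half-bands."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    return a, d

def band_energies(x):
    """Energy (sum of squares) in each half-band of the signal."""
    a, d = haar_split(x)
    return sum(v * v for v in a), sum(v * v for v in d)

def damage_index(healthy, cracked):
    """Relative drop in received stress-wave energy (0 = intact)."""
    e_h = sum(band_energies(healthy))
    e_c = sum(band_energies(cracked))
    return 1.0 - e_c / e_h

# A crack that halves the received amplitude quarters the energy:
di = damage_index([1.0, -1.0, 2.0, -2.0], [0.5, -0.5, 1.0, -1.0])  # 0.75
```

The full method decomposes into many frequency bands and compares the energy vector band by band, which is what allows crack types to be distinguished, not just detected.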
Geolocation Support for Water Supply and Sewerage Projects in Azerbaijan
NASA Astrophysics Data System (ADS)
Qocamanov, M. H.; Gurbanov, Ch. Z.
2016-10-01
Drinking water supply and sewerage system design and reconstruction projects are being conducted extensively in the Republic of Azerbaijan. Implementing such projects requires collecting a large amount of information about the area and detailed investigation. Joint use of aerospace monitoring and GIS plays an essential role in studying the impact of environmental factors and in developing analytical information systems, while ensuring the reliable performance of existing and planned major water supply pipelines as well as the construction and operation of technical installations. With our participation, a GIS has been created at "Azersu" OJSC that includes a systematic database of the drinking water supply, sewerage, and storm water networks for carrying out the necessary geoinformation analysis. The GIS was created on the "Microstation" platform using aerospace data. It should be noted that in large cities of the country (Baku, Ganja, Sumqait, etc.) drinking water supply pipelines cross regions with different physico-geographical conditions, geomorphological compositions, and seismotectonics. Many accidents occur on main water supply lines during operation, which also creates problems for drinking water consumers; in some cases large-scale accidents cause extensive damage. Long-term experience shows that eliminating the consequences of such accidents is a major cost. Constant control of the plan-height position of the pipelines, detailed geodetic examination of their dynamics, and repetition of the geodetic measurements at certain time intervals, in other words regular monitoring, are therefore very important, and the use of GIS during geodetic monitoring has special significance.
Given that, collecting geodetic monitoring measurements of the main pipelines in the same coordinate system and processing these data in a single GIS allows an overall assessment of the plan-height state of major water supply pipeline facilities, as well as study of the impact of the water supply network on the environment and, conversely, of natural processes on the major pipelines.
49 CFR 192.11 - Petroleum gas systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Petroleum gas systems. 192.11 Section 192.11... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS General § 192.11 Petroleum gas systems. (a) Each plant that supplies petroleum gas by pipeline to a natural gas distribution system must meet the requirements...
Planning bioinformatics workflows using an expert system.
Chen, Xiaoling; Chang, Jeffrey T
2017-04-15
Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY), which includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu.
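Backward chaining works from the requested result toward available raw data: each rule says which inputs and tool can produce a given data type, and the engine recurses until it reaches data the user already has. A minimal sketch in that spirit (the rule and data-type names are hypothetical, not BETSY's knowledge base):

```python
# Each rule: goal data type -> (required input types, tool that produces it).
RULES = {
    "trimmed_reads": (["raw_reads"], "trim"),
    "aligned_reads": (["trimmed_reads"], "align"),
    "expression_matrix": (["aligned_reads"], "count"),
}

def plan(goal, available, rules=RULES):
    """Backward-chain from `goal` to `available` data; return ordered tool steps."""
    if goal in available:
        return []                      # nothing to do, data already exists
    inputs, tool = rules[goal]         # rule that can produce the goal
    steps = []
    for inp in inputs:
        steps += plan(inp, available, rules)   # first derive each input
    return steps + [tool]              # then run the tool itself

workflow = plan("expression_matrix", {"raw_reads"})  # ["trim", "align", "count"]
```

A real engine like BETSY additionally reasons over data attributes and alternative rules for the same goal; this sketch shows only the chaining skeleton.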
Characterization of Stress Corrosion Cracking Using Laser Ultrasonics
DOT National Transportation Integrated Search
2007-02-15
Stress Corrosion Cracking (SCC) is a phenomenon where metals, when subjected to a combination of suitable loads, corrosive environment, and susceptible metallurgy, develop crack clusters that may lead to failure. Pipeline systems all over the world ...
Natural Gas Compressor Stations on the Interstate Pipeline Network: Developments Since 1996
2007-01-01
This special report looks at the use of natural gas pipeline compressor stations on the interstate natural gas pipeline network that serves the lower 48 states. It examines the compression facilities added over the past 10 years and how the expansions have supported pipeline capacity growth intended to meet the increasing demand for natural gas.
Real time software tools and methodologies
NASA Technical Reports Server (NTRS)
Christofferson, M. J.
1981-01-01
Real time systems are characterized by high-speed processing and throughput as well as asynchronous event processing requirements. These requirements give rise to particular implementations of parallel or pipeline multitasking structures, of intertask or interprocess communications mechanisms, and finally of message (buffer) routing or switching mechanisms. These mechanisms and structures, along with the data structure, describe the essential character of the system. The common structural elements and mechanisms are identified, and their implementations in the form of routines, tasks, or macros (in other words, tools) are formalized. The tools developed support or make available the following: reentrant task creation, generalized message routing techniques, generalized task structures/task families, standardized intertask communications mechanisms, and pipeline and parallel processing architectures in a multitasking environment. Tools development raises some interesting prospects in the areas of software instrumentation and software portability. These issues are discussed following the description of the tools themselves.
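The pipeline multitasking structure described above, stages connected by message queues, each consuming from an input queue and producing to an output queue, can be sketched with standard-library primitives (illustrative; the original tools were real-time tasks, not Python threads):

```python
import queue
import threading

def stage(fn, q_in, q_out):
    """A pipeline stage: consume items, transform them, pass them on."""
    while True:
        item = q_in.get()
        if item is None:        # sentinel message: propagate shutdown
            q_out.put(None)
            return
        q_out.put(fn(item))

# Two stages wired by intertask message queues: double, then add one.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)).start()
threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3)).start()

for v in [1, 2, 3]:
    q1.put(v)
q1.put(None)                    # signal end of input

results = []
while (item := q3.get()) is not None:
    results.append(item)
# results == [3, 5, 7]
```

The queues are the intertask communications mechanism; because each stage runs concurrently, a new item can enter stage 1 while stage 2 is still processing its predecessor.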
Deep ocean corrosion research in support of Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, F.W.; McKeehan, D.S.
1995-12-01
The increasing interest in deepwater exploration and production has motivated the development of technologies required to accomplish tasks heretofore possible only onshore and in shallow water. The tremendous expense of technology development and the cost of specialized equipment have created concerns that the design life of these facilities may be compromised by corrosion. The requirement to develop and prove design parameters to meet these demands will require an ongoing environmental testing and materials evaluation and development program. This paper describes a two-fold corrosion testing program involving: (1) the installation of two corrosion test devices in-situ, and (2) a laboratory test conducted in simulated site-specific seawater. These tests are expected to qualify key parameters necessary to design a cathodic protection system to protect the Oman-to-India pipeline.
Santillán, Moisés
2003-07-21
A simple model of an oxygen exchanging network is presented and studied. This network's task is to transfer a given oxygen rate from a source to an oxygen consuming system. It consists of a pipeline that interconnects the oxygen consuming system and the reservoir, and of a fluid, the active oxygen transporting element, moving through the pipeline. The network's optimal design (total pipeline surface) and dynamics (volumetric flow of the oxygen transporting fluid), which minimize the energy rate expended in moving the fluid, are calculated in terms of the oxygen exchange rate, the pipeline length, and the pipeline cross-section. After the oxygen exchanging network is optimized, the energy converting system is shown to satisfy a 3/4-like allometric scaling law, based on the assumption that its performance regime is scale invariant as well as on some feasible geometric scaling assumptions. Finally, the possible implications of this result for the allometric scaling properties observed elsewhere in living beings are discussed.
49 CFR 195.444 - CPM leak detection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false CPM leak detection. 195.444 Section 195.444... PIPELINE Operation and Maintenance § 195.444 CPM leak detection. Each computational pipeline monitoring (CPM) leak detection system installed on a hazardous liquid pipeline transporting liquid in single...
NASA Technical Reports Server (NTRS)
Dowler, W. L.
1979-01-01
High strength steel pipeline carries a hot mixture of powdered coal and coal-derived oil to an electric-power-generating station. The slurry is processed along the way to remove sulfur, ash, and nitrogen and to recycle part of the oil. The system eliminates hazards and limitations associated with anticipated coal/water-slurry pipelines.
The ORAC-DR data reduction pipeline
NASA Astrophysics Data System (ADS)
Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.
2008-03-01
The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.
OS friendly microprocessor architecture: Hardware level computer security
NASA Astrophysics Data System (ADS)
Jungwirth, Patrick; La Fratta, Patrick
2016-05-01
We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware-level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time; they have depended on the operating system for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a high-performance and secure microprocessor and OS system, and we invite cyber security, information technology (IT), and SCADA control professionals to review its hardware-level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For light-weight threads, the memory pipeline configuration provides near-instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline allow background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and the microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware: by extending Unix file permission bits to each cache memory bank and memory address, the OSFA provides hardware-level computer security.
Designing integrated computational biology pipelines visually.
Jamil, Hasan M
2013-01-01
The long-term cost of developing and maintaining a computational pipeline that depends upon data integration and sophisticated workflow logic is too high to even contemplate "what if" or ad hoc type queries. In this paper, we introduce a novel application building interface for computational biology research, called VizBuilder, by leveraging a recent query language called BioFlow for life sciences databases. Using VizBuilder, it is now possible to develop ad hoc complex computational biology applications at throw away costs. The underlying query language supports data integration and workflow construction almost transparently and fully automatically, using a best effort approach. Users express their application by drawing it with VizBuilder icons and connecting them in a meaningful way. Completed applications are compiled and translated as BioFlow queries for execution by the data management system LifeDB, for which VizBuilder serves as a front end. We discuss VizBuilder features and functionalities in the context of a real life application after we briefly introduce BioFlow. The architecture and design principles of VizBuilder are also discussed. Finally, we outline future extensions of VizBuilder. To our knowledge, VizBuilder is a unique system that allows visually designing computational biology pipelines involving distributed and heterogeneous resources in an ad hoc manner.
Code of Federal Regulations, 2011 CFR
2011-10-01
... when identifying response systems and equipment to be deployed in accordance with a response plan... which those systems or equipment are intended to function. Barrel means 42 United States gallons (159... oil pipeline system or (2) Receive and store oil transported by a pipeline for reinjection and...
Stability of subsea pipelines during large storms
Draper, Scott; An, Hongwei; Cheng, Liang; White, David J.; Griffiths, Terry
2015-01-01
On-bottom stability design of subsea pipelines transporting hydrocarbons is important to ensure safety and reliability but is challenging to achieve in the onerous metocean (meteorological and oceanographic) conditions typical of large storms (such as tropical cyclones, hurricanes or typhoons). This challenge is increased by the fact that industry design guidelines presently give no guidance on how to incorporate the potential benefits of seabed mobility, which can lead to lowering and self-burial of the pipeline on a sandy seabed. In this paper, we demonstrate recent advances in experimental modelling of pipeline scour and present results investigating how pipeline stability can change in a large storm. An emphasis is placed on the initial development of the storm, where scour is inevitable on an erodible bed as the storm velocities build up to peak conditions. During this initial development, we compare the rate at which peak near-bed velocities increase in a large storm (typically less than 10⁻³ m s⁻²) to the rate at which a pipeline scours and subsequently lowers (which is dependent not only on the storm velocities, but also on the mechanism of lowering and the pipeline properties). We show that the relative magnitude of these rates influences pipeline embedment during a storm and the stability of the pipeline. PMID:25512592
Pipeline scada upgrade uses satellite terminal system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conrad, W.; Skovrinski, J.R.
In the recent automation of its supervisory control and data acquisition (scada) system, Transwestern Pipeline Co. has become the first to use very small aperture satellite terminals (VSAT's) for scada. A subsidiary of Enron Interstate Pipeline, Houston, Transwestern moves natural gas through a 4,400-mile system from West Texas, New Mexico, and Oklahoma to southern California markets. Transwestern's modernization, begun in November 1985, addressed problems associated with its aging control equipment, which had been installed when the compressor stations were built in 1960. Over the years a combination of three different systems had been added. All were cumbersome to maintain and utilized outdated technology. Problems with reliability, high maintenance time, and difficulty in getting new parts were determining factors in Transwestern's decision to modernize its scada system. In addition, the pipeline was anticipating moving its control center from Roswell, N.M., to Houston and believed it would be impossible to marry the old system with the new computer equipment in Houston.
Method for oil pipeline leak detection based on distributed fiber optic technology
NASA Astrophysics Data System (ADS)
Chen, Huabo; Tu, Yaqing; Luo, Ting
1998-08-01
Pipeline leak detection is a difficult problem that remains unsolved. Traditional leak detection methods suffer from high rates of false alarm or missed detection and poor location-estimate capability. To address these problems, a method for oil pipeline leak detection based on a distributed optical fiber sensor with a special coating is presented. The fiber's coating interacts with hydrocarbon molecules in oil, which alters the refractive index of the coating and thereby modifies the light-guiding properties of the fiber. Thus the pipeline leak location can be determined by OTDR. An oil pipeline leak detection system was designed based on this principle. The system offers real-time operation, simultaneous multi-point detection, and high location accuracy. Finally, some factors that may influence detection are analyzed and preliminary improving actions are given.
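OTDR locates an event along the fiber from the round-trip time of backscattered light: the light travels down and back at speed c/n, so the event distance is z = c·t / (2n). A short sketch of that calculation (the group index value is a typical assumed figure for silica fiber, not taken from the paper):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # assumed group index of silica fiber

def otdr_distance(round_trip_s):
    """Distance from the OTDR to the event, in metres: z = c * t / (2 n)."""
    return C * round_trip_s / (2.0 * N_GROUP)

# A return arriving 10 microseconds after the probe pulse corresponds
# to an event roughly 1 km down the fiber.
d = otdr_distance(10e-6)
```

Because the whole fiber is interrogated in one trace, multiple leak-induced loss points appear as separate events, which is what gives the system its simultaneous multi-point detection capability.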
40 CFR 52.987 - Control of hydrocarbon emissions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... control systems on a 37,500 barrel capacity crude oil storage tank at Cities Service Pipeline Company, Oil... a 25,000 barrel capacity crude oil storage tank at Cities Service Pipeline Company, Haynesville... barrel capacity crude oil storage tank at Cities Service Pipeline Company, Summerfield, Louisiana with...
49 CFR 195.104 - Variations in pressure.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Variations in pressure. 195.104 Section 195.104... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Design Requirements § 195.104 Variations in pressure. If, within a pipeline system, two or more...
Research on airborne infrared leakage detection of natural gas pipeline
NASA Astrophysics Data System (ADS)
Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie
2011-12-01
An airborne laser remote sensing technology is proposed for detecting natural gas pipeline leakage from a helicopter carrying a detector that senses traces of methane on the ground with high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, a camera, a helicopter mount, an electronic unit with a DGPS antenna, a notebook computer and a pilot monitor, all mounted on a helicopter. The principle and the architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting pipeline gas leaks over plains, deserts and hills, but unfit for areas with large altitude variation.
NASA Astrophysics Data System (ADS)
Karpov, S.; Beskin, G.; Biryukov, A.; Bondar, S.; Ivanov, E.; Katkova, E.; Perkov, A.; Sasyuk, V.
2016-12-01
Here we present a summary of the first years of operation and the first results of Mini-Mega-TORTORA (MMT-9), a novel 9-channel wide-field optical monitoring system with sub-second temporal resolution, now in operation at the Special Astrophysical Observatory in the Russian Caucasus. The system is able to observe the sky simultaneously in either a wide (˜900 square degrees) or narrow (˜100 square degrees) field of view, either in clear light or with any combination of color (Johnson-Cousins B, V or R) and polarimetric filters installed, with exposure times ranging from 0.1 s to hundreds of seconds. The real-time data analysis pipeline performs automatic detection of rapid transient events, both near-Earth and extragalactic. The objects routinely detected by MMT include faint meteors and artificial satellites. The pipeline for longer-timescale variability analysis is still in development.
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
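As a concrete, much simplified instance of the chain-partitioning problems the abstract addresses, the sketch below uses dynamic programming to split a chain of module weights into contiguous blocks, one per processor, minimizing the bottleneck (heaviest) load. This is an illustrative formulation, not Bokhari's Sum-Bottleneck path algorithm itself, and the module weights are invented.

```python
def min_bottleneck_partition(weights, k):
    """dp[i][j]: minimal bottleneck when the first i modules occupy j processors,
    each processor receiving one contiguous block of the chain."""
    n = len(weights)
    prefix = [0] * (n + 1)
    for i, w in enumerate(weights):
        prefix[i + 1] = prefix[i] + w
    INF = float("inf")
    dp = [[INF] * (k + 1) for _ in range(n + 1)]
    dp[0][0] = 0
    for i in range(1, n + 1):
        for j in range(1, min(i, k) + 1):
            for split in range(j - 1, i):
                block = prefix[i] - prefix[split]  # load of the last block
                dp[i][j] = min(dp[i][j], max(dp[split][j - 1], block))
    return dp[n][k]

# Four modules on two processors: best split is [4, 3] | [2, 5], bottleneck 7.
print(min_bottleneck_partition([4, 3, 2, 5], 2))
```

The contiguity constraint is what makes the chain-structured case tractable; arbitrary (non-contiguous) assignment is a much harder problem.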
This project was initiated with the overall objective of developing organized information pertaining to the costs of various sewage sludge transport systems. Transport of liquid and dewatered sludge by truck and rail and liquid sludge by barge and pipeline is included. The report...
Almazyad, Abdulaziz S.; Seddiq, Yasser M.; Alotaibi, Ahmed M.; Al-Nasheri, Ahmed Y.; BenSaleh, Mohammed S.; Obeid, Abdulfattah M.; Qasim, Syed Manzoor
2014-01-01
Anomalies such as leakage and bursts in water pipelines have severe consequences for the environment and the economy. To ensure the reliability of water pipelines, they must be monitored effectively. Wireless Sensor Networks (WSNs) have emerged as an effective technology for monitoring critical infrastructure such as water, oil and gas pipelines. In this paper, we present a scalable design and simulation of a water pipeline leakage monitoring system using Radio Frequency IDentification (RFID) and WSN technology. The proposed design targets long-distance aboveground water pipelines that have special considerations for maintenance, energy consumption and cost. The design is based on deploying a group of mobile wireless sensor nodes inside the pipeline and allowing them to work cooperatively according to a prescheduled order. Under this mechanism, only one node is active at a time, while the other nodes are sleeping. The node whose turn is next wakes up according to one of three wakeup techniques: location-based, time-based and interrupt-driven. In this paper, mathematical models are derived for each technique to estimate the corresponding energy consumption and memory size requirements. The proposed equations are analyzed and the results are validated using simulation. PMID:24561404
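The energy benefit of the one-active-node-at-a-time schedule described above can be sketched with a toy duty-cycling model. The power figures below are generic radio-node assumptions, not values from the paper.

```python
# Back-of-the-envelope sketch: with n in-pipe nodes taking turns, each node
# is active only 1/n of the mission and sleeps the rest of the time.

def node_energy_joules(mission_time_s, n_nodes,
                       active_power_w=0.06, sleep_power_w=0.0003):
    """Energy one node spends when sensing duty is shared equally by n nodes."""
    active_time = mission_time_s / n_nodes
    sleep_time = mission_time_s - active_time
    return active_time * active_power_w + sleep_time * sleep_power_w

# One-hour mission: ten cooperating nodes vs. a single always-on node.
shared = node_energy_joules(3600, 10)
alone = node_energy_joules(3600, 1)
print(round(shared, 3), round(alone, 3))
```

The three wakeup techniques in the paper differ in how the "whose turn is next" decision is made (position, timer, or interrupt), which changes the bookkeeping but not this basic duty-cycling arithmetic.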
Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A
2006-11-23
Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". 
However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from http://genoma.unab.cl/juice_system/ or http://www.genomavegetal.cl/juice_system/.
Toward practical 3D radiography of pipeline girth welds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wassink, Casper, E-mail: casper.wassink@applusrtd.com; Hol, Martijn, E-mail: martijn.hol@applusrtd.com; Flikweert, Arjan, E-mail: martijn.hol@applusrtd.com
2015-03-31
Digital radiography has made its way into in-the-field girth weld testing. With recent generations of detectors and x-ray tubes it is possible to reach the image quality required by standards as well as an inspection speed competitive with film radiography and automated ultrasonic testing. This paper will show the application of these technologies in the RTD Rayscan system. The method for achieving an image quality that complies with or even exceeds prevailing industrial standards will be presented, as well as the application to pipeline girth welds with CRA layers. A next step in development will be to also achieve a measurement of weld flaw height to allow for performing an Engineering Critical Assessment on the weld. This will allow for similar acceptance limits as currently used with Automated Ultrasonic Testing of pipeline girth welds. Although a sufficient sizing accuracy was already demonstrated and qualified in the TomoCAR system, testing in some applications is restricted to time limits. The paper will present some experiments that were performed to achieve flaw height approximation within these time limits.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
... facilities 486210 Pipeline transportation of natural gas. Petroleum and Natural Gas Systems. 221210 Natural... and Budget PHMSA Pipeline and Hazardous Material Safety Administration QA/QC quality assurance/quality... distribution pipelines, but also into liquefied natural gas storage or into underground storage. We are...
Thermal interaction of underground pipeline with freezing heaving soil
NASA Astrophysics Data System (ADS)
Podorozhnikov, S. Y.; Mikhailov, P.; Puldas, L.; Shabarov, A.
2018-05-01
A mathematical model and a method for calculating the stress-strain state of a pipeline are offered, describing the heat-power interaction in the "underground pipeline - soil" system under negative temperatures in freezing, frost-heaving soils. Some results of computational-parametric research are presented.
A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm
NASA Astrophysics Data System (ADS)
Thirer, Nonel
2013-05-01
With the evolution of digital data storage and exchange, it is essential to protect confidential information from unauthorized access. High-performance encryption algorithms have been developed and implemented in software and hardware, and many methods of attacking cipher texts have been developed as well. In recent years, the genetic algorithm has gained much interest in the cryptanalysis of cipher texts and also in encryption ciphers. This paper analyses the possibility of using the genetic algorithm as a multiple-key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and of using a three-stage pipeline (with four main blocks: input data, AES core, key generator, output data) to provide fast encryption and storage/transmission of a large amount of data.
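A toy sketch of the key-generator idea follows: a genetic algorithm evolves a population of candidate 128-bit keys and emits the fittest member as the next key in the sequence. The fitness function (bit balance) and all parameters are illustrative assumptions; a real design would need far stronger statistical criteria and would feed the keys to the AES core stage of the pipeline.

```python
import random

KEY_BITS = 128

def fitness(key):
    """Toy criterion: prefer keys with a balanced count of 0 and 1 bits."""
    return -abs(bin(key).count("1") - KEY_BITS // 2)

def next_key(rng, pop_size=20, generations=25):
    """Evolve a population of candidate keys and return the fittest one."""
    pop = [rng.getrandbits(KEY_BITS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            mask = rng.getrandbits(KEY_BITS)        # uniform crossover
            child = (a & mask) | (b & ~mask)
            child ^= 1 << rng.randrange(KEY_BITS)   # single-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

rng = random.Random(42)
keys = [next_key(rng) for _ in range(3)]  # a short key sequence
print(len(keys), all(0 <= k < 2 ** KEY_BITS for k in keys))
```

In a hardware (FPGA) realization, each of these generations would map naturally onto a pipeline stage feeding the key schedule of the AES core.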
Langlois, Lillie A; Drohan, Patrick J; Brittingham, Margaret C
2017-07-15
Large, continuous forest provides critical habitat for some species of forest-dependent wildlife. The rapid expansion of shale gas development within the northern Appalachians results in direct loss of such habitat at well sites, pipelines, and access roads; however, the resulting habitat fragmentation surrounding such areas may be of greater importance. Previous research has suggested that infrastructure supporting gas development is the driver for habitat loss, but knowledge of what specific infrastructure affects habitat is limited by a lack of spatial tracking of infrastructure development in different land uses. We used high-resolution aerial imagery, land cover data, and well point data to quantify shale gas development across four time periods (2010, 2012, 2014, 2016), including: the number of wells permitted, drilled, and producing gas (a measure of pipeline development); land use change; and forest fragmentation on both private and public land. As of April 2016, the majority of shale gas development was located on private land (74% of constructed well pads); however, the number of wells drilled per pad was lower on private compared to public land (3.5 and 5.4, respectively). Loss of core forest was more than double on private compared to public land (4.3 and 2.0%, respectively), which likely results from better management practices implemented on public land. Pipelines were by far the largest contributor to the fragmentation of core forest due to shale gas development. Forecasting future land use change resulting from gas development suggests that the greatest loss of core forest will occur with pads constructed farthest from pre-existing pipelines (new pipelines must be built to connect pads) and in areas with greater amounts of core forest. To reduce future fragmentation, our results suggest new pads should be placed near pre-existing pipelines and methods to consolidate pipelines with other infrastructure should be used.
Without these mitigation practices, we will continue to lose core forest as a result of new pipelines and infrastructure particularly on private land. Copyright © 2017 Elsevier Ltd. All rights reserved.
Supply Support of Air Force 463L Equipment: An Analysis of the 463L equipment Spare Parts Pipeline
1989-09-01
service; and 4) the order processing system created inherent delays in the pipeline because of outdated and indirect information systems and technology. Keywords: Materials handling equipment, Theses. (AW)
49 CFR 193.2609 - Support systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Support systems. 193.2609 Section 193.2609 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY LIQUEFIED NATURAL GAS FACILITIES...
49 CFR 193.2519 - Communication systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Communication systems. 193.2519 Section 193.2519 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY LIQUEFIED NATURAL GAS FACILITIES...
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running a pipeline is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
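The single-endpoint container pattern can be sketched as below. The image name and mount points are hypothetical, not the actual Bio-Docklets artifacts, and the real meta-script drives execution through BioBlend and the Galaxy API rather than a bare docker invocation.

```python
# Minimal sketch: a multi-step pipeline collapses to one command by mounting
# one input directory and one output directory into a preconfigured container.

def docker_run_command(image, input_dir, output_dir):
    """Assemble the docker invocation for a single-endpoint pipeline run."""
    return [
        "docker", "run", "--rm",
        "-v", f"{input_dir}:/pipeline/input:ro",  # the single data input
        "-v", f"{output_dir}:/pipeline/output",   # the single data output
        image,
    ]

cmd = docker_run_command("example/rnaseq-docklet", "/data/reads", "/data/results")
print(" ".join(cmd))
```

The returned list could be handed to `subprocess.run(cmd)` to launch the container; building the argument list separately keeps the wrapper testable without Docker installed.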
Training System Device Certification and Qualification Process
2013-09-01
Engineering IPT Integrated Product Team ISD Instructional Systems Development ISEO In-Service Engineering Office KSAs Knowledge, Skills, and Attributes...Plan TES Tactical Engagement Simulation TPM Training Pipeline Managers T&R Training and Readiness TRR Test Readiness Review TS Training System...NAWCTSD) is the Navy’s source for a full range of innovative products and services that provide complete training solutions. This includes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-05-17
The U.S. Materials Transportation Bureau (MTB) withdraws an advance notice of proposed rulemaking (ANPR) which requested advice, recommendations, and information relating to the issuance of additional occupational safety and health standards for the protection of employees engaged in the construction, operation, and maintenance of pipeline systems and facilities used in the transportation of hazardous materials. Comments submitted in response to the ANPR indicated that the issuance of additional occupational safety and health standards by the MTB would duplicate the U.S. Occupational Safety and Health Administration's efforts and would increase the possibility of jurisdictional disputes. Since the MTB's present standards development efforts are primarily directed at public safety (as opposed to occupational safety) by regulating pipeline design, construction, operation, and maintenance activities, the MTB withdraws the ANPR.
A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.
Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y
2016-01-01
PTSD is distressing and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the prediction of the model and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.
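The quality-gated recommendation logic can be illustrated with a minimal sketch; the threshold values, quality metric, and recommendation strings are invented for illustration, not taken from the paper.

```python
# Sketch: a per-patient model only yields a clinical recommendation when its
# own quality metric clears a threshold; otherwise the pipeline abstains
# rather than risk a recommendation built on an unreliable model.

def recommend(risk_score, model_auc, auc_threshold=0.75, risk_threshold=0.5):
    """Abstain unless the personalized model's quality is acceptable."""
    if model_auc < auc_threshold:
        return "no recommendation: model quality insufficient"
    if risk_score >= risk_threshold:
        return "refer for early intervention"
    return "routine follow-up"

print(recommend(0.8, 0.9))  # reliable model, high predicted risk
print(recommend(0.8, 0.6))  # model too weak: abstain
```

Abstention is the key design choice: a weak model's prediction is withheld entirely instead of being reported with a caveat.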
Evaluation of fishing gear induced pipeline damage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellinas, C.P.; King, B.; Davies, R.
1995-12-31
Impact and damage to pipelines due to fishing activities is one of the hazards faced by North Sea pipelines during their operating lives. Available data indicate that about one in ten of reported incidents are due to fishing activities. This paper is concerned with one such occurrence, the assessment of the resulting damage, the methods used to confirm pipeline integrity and the approaches developed for its repair.
Effect of Pseudomonas fluorescens on Buried Steel Pipeline Corrosion.
Spark, Amy J; Law, David W; Ward, Liam P; Cole, Ivan S; Best, Adam S
2017-08-01
Buried steel infrastructure can be a source of iron ions for bacterial species, leading to microbiologically influenced corrosion (MIC). Localized corrosion due to MIC is one of the key failure mechanisms of buried steel pipelines. In order to better understand the mechanisms of localized corrosion in soil, semisolid agar has been developed as an analogue for soil. Here, Pseudomonas fluorescens has been introduced to the system to understand how bacteria interact with steel. Through electrochemical testing, including open circuit potentials, potentiodynamic scans, anodic potential holds, and electrochemical impedance spectroscopy, it has been shown that P. fluorescens increases the rate of corrosion. The time allowed for oxide layers and biofilms to develop was shown not to affect the rate of corrosion, but it did alter the consistency of the biofilm present and the viability of P. fluorescens following electrochemical testing. The proposed mechanism for the increased corrosion rate of carbon steel involves the interaction of pyoverdine with the steel, preventing the formation of a cohesive passive layer after initial cell attachment, followed by the formation of a metal concentration gradient on the steel surface.
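For context on how such electrochemical measurements translate into engineering terms, the standard Faraday-law relation (the ASTM G102 form) converts a corrosion current density into a penetration rate. The constants below are textbook values for iron, not measurements from this study.

```python
def corrosion_rate_mm_per_year(i_corr_uA_cm2, equiv_weight_g=27.92,
                               density_g_cm3=7.87):
    """ASTM G102 form: CR (mm/yr) = 3.27e-3 * i_corr(uA/cm^2) * EW / rho.

    equiv_weight_g: equivalent weight (Fe -> Fe2+ gives 55.85/2 = 27.92 g).
    density_g_cm3:  density of iron.
    """
    return 3.27e-3 * i_corr_uA_cm2 * equiv_weight_g / density_g_cm3

# 10 uA/cm^2 on iron corresponds to roughly 0.12 mm of penetration per year.
print(round(corrosion_rate_mm_per_year(10.0), 3))
```

A bacteria-induced increase in corrosion current density therefore maps linearly onto metal loss per year, which is what makes MIC a pipeline-integrity concern.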
The CWF pipeline system from Shen mu to the Yellow Sea
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ercolani, D.
1993-12-31
A feasibility study on the applicability of coal-water fuel (CWF) technology in the People's Republic of China (PRC) is in progress. This study, awarded to Snamprogetti by the International Centre for Scientific Culture (World Laboratory) of Geneva, Switzerland, is performed on behalf of Chinese organizations led by the Ministry of Energy Resources and the Academy of Sciences of the People's Republic of China. Slurry pipelines appear to be a solution for the logistic problems created by progressively increasing coal consumption and the limited availability of conventional transport infrastructure in the PRC. Within this framework, CWF pipelines are the most innovative technological option in consideration of the various advantages the technology offers with respect to conventional slurry pipelines. The PRC CWF pipeline system study evaluates two alternative transport streams, both originating from the same slurry production plant, located at Shachuanguo, about 100 km from Sheng Mu.
Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system
Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.
1972-01-01
The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
Pipeline leak detection and leak location have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline, with emphasis on a LabVIEW-based method for processing the distributed optical fiber sensing data. The hardware includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software, developed in LabVIEW, applies wavelet denoising to the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and storage and retrieval of measurement signals. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measurement system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
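The leak-location step can be sketched as a simple anomaly search over the distributed temperature profile; the sampling interval and profile values below are made-up illustration data, not measurements from the system described.

```python
# Simplified sketch: a leak changes the local temperature, so the sample
# that deviates most from the baseline profile marks the leak, and its
# index maps to a distance along the sensing fiber.

def locate_leak(temps, baseline, sample_spacing_m):
    """Return (distance_m, deviation) of the strongest temperature anomaly."""
    deviations = [abs(t - b) for t, b in zip(temps, baseline)]
    idx = max(range(len(deviations)), key=deviations.__getitem__)
    return idx * sample_spacing_m, deviations[idx]

baseline = [20.0] * 8                     # undisturbed profile, deg C
measured = [20.1, 20.0, 19.9, 24.5, 20.2, 20.0, 20.1, 19.8]
distance, deviation = locate_leak(measured, baseline, sample_spacing_m=5.0)
print(distance, deviation)
```

In the real system, wavelet denoising would be applied to `measured` before this search so that noise spikes are not mistaken for leak signatures.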
Interdependency Assessment of Coupled Natural Gas and Power Systems in Energy Market
NASA Astrophysics Data System (ADS)
Yang, Hongzhao; Qiu, Jing; Zhang, Sanhua; Lai, Mingyong; Dong, Zhao Yang
2015-12-01
Owing to the technological development of natural gas exploration and the increasing penetration of gas-fired power generation, gas and power systems inevitably interact with each other from both physical and economic points of view. In order to effectively assess the two systems' interdependency, this paper proposes a systematic modeling framework and constructs simulation platforms for coupled gas and power systems in an energy market environment. By applying the proposed approach to the Australian national electricity market (NEM) and gas market, the impacts of six types of market and system factors are quantitatively analyzed, including power transmission limits, gas pipeline contingencies, gas pipeline flow constraints, carbon emission constraints, power load variations, and non-electric gas load variations. The important interdependency and infrastructure weakness for the two systems are well studied and identified. Our work provides a quantitative basis for grid operators and policy makers to support and guide operation and investment decisions for electric power and natural gas industries.
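One elementary form of the physical coupling the abstract studies is that the electrical output of a gas-fired plant is capped by the fuel its pipeline can deliver. The numbers below are generic engineering assumptions, not data from the Australian NEM study.

```python
def max_electric_output_mw(gas_flow_kg_s, lower_heating_value_mj_kg=50.0,
                           thermal_efficiency=0.5):
    """Electric power (MW) supportable by a given pipeline gas delivery rate."""
    fuel_power_mw = gas_flow_kg_s * lower_heating_value_mj_kg  # MJ/s equals MW
    return fuel_power_mw * thermal_efficiency

# A 10 kg/s pipeline delivery limit caps this plant at 250 MW electric.
print(max_electric_output_mw(10.0))
```

A pipeline contingency or flow constraint thus propagates directly into the power system as a generation limit, which is the interdependency the simulation platforms quantify.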
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Ian Harris
2004-04-12
The two broad categories of deposited weld metal repair and fiber-reinforced composite liner repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repair and for fiber-reinforced composite liner repair. Evaluation trials have been conducted using a modified fiber-reinforced composite liner provided by RolaTube and pipe sections without liners. All pipe section specimens failed in areas of simulated damage. Pipe sections containing fiber-reinforced composite liners failed at pressures marginally greater than the pipe sections without liners. The next step is to evaluate a liner material with a modulus of elasticity approximately 95% of the modulus of elasticity for steel. Preliminary welding parameters were developed for deposited weld metal repair in preparation for the receipt of Pacific Gas & Electric's internal pipeline welding repair system (designed specifically for 559 mm (22 in.) diameter pipe) and of 559 mm (22 in.) pipe sections from Panhandle Eastern. The next steps are to transfer welding parameters to the PG&E system and to pressure test repaired pipe sections to failure. A survey of pipeline operators was conducted to better understand the needs and performance requirements of the natural gas transmission industry regarding internal repair. Completed surveys support the following principal conclusions: (1) Use of internal weld repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings.
(2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling (HDD) when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) Pipe diameter sizes range from 50.8 mm (2 in.) through 1,219.2 mm (48 in.). The most common size range for 80% to 90% of operators surveyed is 508 mm to 762 mm (20 in. to 30 in.), with 95% using 558.8 mm (22 in.) pipe. An evaluation of potential repair methods clearly indicates that the project should continue to focus on the development of a repair process involving the use of GMAW welding and on the development of a repair process involving the use of fiber-reinforced composite liners.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-16
... DEPARTMENT OF HOMELAND SECURITY Transportation Security Administration New Agency Information Collection Activity Under OMB Review: Pipeline System Operator Security Information AGENCY: Transportation... INFORMATION CONTACT: Joanna Johnson, Office of Information Technology, TSA-11, Transportation Security...
Safety and integrity of pipeline systems - philosophy and experience in Germany
DOT National Transportation Integrated Search
1997-01-01
The design, construction and operation of gas pipeline systems in Germany are subject to the Energy Act and associated regulations. This legal structure is based on a deterministic rather than a probabilistic safety philosophy, consisting of technica...
Automated Laser Ultrasonic Testing (ALUT) of Hybrid Arc Welds for Pipeline Construction, #272
DOT National Transportation Integrated Search
2009-12-22
One challenge in developing new gas reserves is the high cost of pipeline construction. Welding costs are a major component of overall construction costs. Industry continues to seek advanced pipeline welding technologies to improve productivity and s...
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4 m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15 TB per night and the requirements both to issue alerts on transient sources within 60 seconds of observing and to create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
Natural Gas Pipeline and System Expansions
1997-01-01
This special report examines recent expansions to the North American natural gas pipeline network and the nature and type of proposed pipeline projects announced or approved for construction during the next several years in the United States. It includes those projects in Canada and Mexico that tie in with U.S. markets or projects.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
... that would permit gas to be received by pipeline at the terminal and liquefied for subsequent export... the natural gas proposed for export will come from the United States natural gas pipeline system... authorization to construct additional pipeline facilities necessary to provide feed gas to the proposed...
Algeria LPG pipeline is built by Bechtel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horner, C.
1984-08-01
The construction of the 313 mile long, 24 in. LPG pipeline from Hassi R'Mel to Arzew, Algeria is described. The pipeline was designed to deliver 6 million tons of LPG annually using one pumping station. Eventually an additional pumping station will be added to raise the system capacity to 9 million tons annually.
Testing the School-to-Prison Pipeline
ERIC Educational Resources Information Center
Owens, Emily G.
2017-01-01
The School-to-Prison Pipeline is a social phenomenon where students become formally involved with the criminal justice system as a result of school policies that use law enforcement, rather than discipline, to address behavioral problems. A potentially important part of the School-to-Prison Pipeline is the use of sworn School Resource Officers…
NASA Technical Reports Server (NTRS)
1998-01-01
NASA has transferred the improved portable leak detector technology to UE Systems, Inc. This instrument was developed to detect leaks in fluid systems of critical launch and ground support equipment. The system incorporates innovative electronic circuitry, improved transducers, collecting horns, and contact sensors that provide a much higher degree of reliability, sensitivity, and versatility than previously used systems. Potential commercial uses are pipelines, underground utilities, air-conditioning systems, petrochemical systems, aerospace, power transmission lines, and medical devices.
Reproducibility of neuroimaging analyses across operating systems
Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.
2015-01-01
Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
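The root cause this abstract identifies, single-precision arithmetic whose library implementations vary across operating systems, is easy to demonstrate. The sketch below is illustrative, not from the paper; it uses NumPy's `float32` to emulate a C routine accumulating in `float`.

```python
import math
import numpy as np

# float32 cannot hold 0.1 exactly, and its nearest representable value differs
# from the float64 one, so code linked against single-precision math libraries
# starts from slightly different numbers than double-precision code.
def naive_sum_f32(values):
    """Accumulate in single precision, the way a C routine using 'float' would."""
    acc = np.float32(0.0)
    for v in values:
        acc = np.float32(acc + np.float32(v))
    return float(acc)

def naive_sum_f64(values):
    """Correctly rounded double-precision reference."""
    return math.fsum(values)
```

Summing 0.1 one hundred thousand times this way produces a single-precision error many orders of magnitude larger than the double-precision one; long pipelines accumulate exactly this kind of drift, which is why the authors recommend more precise floating-point representations in critical sections.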
NASA Astrophysics Data System (ADS)
Osland, Anna Christine
Hazardous liquid and natural gas transmission pipelines have received limited attention by planning scholars even though local development decisions can have broad consequences if a rupture occurs. In this dissertation, I evaluated the implications of land-use planning for reducing risk to transmission pipeline hazards in North Carolina via three investigations. First, using a survey of planning directors in jurisdictions with transmission pipeline hazards, I investigated the land use planning tools used to mitigate pipeline hazards and the factors associated with tool adoption. Planning scholars have documented the difficulty of inducing planning in hazardous areas, yet there remain gaps in knowledge about the factors associated with tool adoption. Despite the risks associated with pipeline ruptures, I found most localities use few mitigation tools, and the adoption of regulatory and informational tools appear to be influenced by divergent factors. Whereas risk perception, commitment, capacity, and community context were associated with total tool and information tool use, only risk perception and capacity factors were associated with regulatory tool use. Second, using interviews of emergency managers and planning directors, I examined the role of agency collaboration for building mitigation capacity. Scholars have highlighted the potential of technical collaboration, yet less research has investigated how inter-agency collaboration shapes mitigation capacity. I identify three categories of technical collaboration, discuss how collaborative spillovers can occur from one planning area to another, and challenge the notion that all technical collaborations result in equal mitigation outcomes. Third, I evaluated characteristics of the population near pipelines to address equity concerns. Surprisingly, I did not find broad support for differences in exposure of vulnerable populations. 
Nonetheless, my analyses uncovered statistically significant clusters of vulnerable groups within the hazard area. Interestingly, development closer to pipelines was newer than areas farther away, illustrating the failure of land-use planning to reduce development encroachment. Collectively, these results highlight the potential of land-use planning to keep people and development from encroaching on pipeline hazards. While this study indicates that planners in many areas address pipeline hazards, it also illustrates how changes to local practices can further reduce risks to human health, homeland security, and the environment.
76 FR 30197 - Notice of Lodging of Consent Decree Under The Clean Air Act
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... Pipeline System, LLC, et al., Civil Action No. 11-CV-1188RPM-CBS was lodged with the United States District... System, LLC, Western Convenience Stores, Inc., and Offen Petroleum, Inc. (collectively, the ``Defendants... environmental mitigation project requires Rocky Mountain Pipeline System to install a domed cover on an...
Third-Generation Partnerships for P-16 Pipelines and Cradle-through-Career Education Systems
ERIC Educational Resources Information Center
Lawson, Hal A.
2013-01-01
Amid unprecedented novelty, complexity, turbulence, and conflict, it is apparent that a new education system is needed. Focused on a new outcome--postsecondary education completion with advanced competence--heretofore separate systems for early childhood, K-12 schools, and postsecondary education are being joined in P-16 pipelines and…
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and turret-and-hull... Platform Verification Program: (i) Drilling, production, and pipeline risers, and riser tensioning systems...
DOT National Transportation Integrated Search
1978-12-01
This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...
Component-based control of oil-gas-water mixture composition in pipelines
NASA Astrophysics Data System (ADS)
Voytyuk, I. N.
2018-03-01
The article theoretically substantiates a method for measuring changes in the content of oil, gas, and water in pipelines, and discusses the design of a measurement system implementing it. An assessment of the random and systematic errors of the future system is presented, together with recommendations for its optimization.
Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report
DOT National Transportation Integrated Search
2008-11-26
The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...
Accident Prevention and Diagnostics of Underground Pipeline Systems
NASA Astrophysics Data System (ADS)
Trokhimchuk, M.; Bakhracheva, Y.
2017-11-01
Up to forty thousand accidents occur annually on underground pipelines due to corrosion. A comparison of methods for assessing the quality of anti-corrosion coatings is provided. It is proposed to use a tie-in device for existing pipelines that offers higher functionality than other device types because it can be tied in to pipelines of different diameters. Existing technologies and available materials allow industrial production of the proposed device.
The ALMA Science Pipeline: Current Status
NASA Astrophysics Data System (ADS)
Humphreys, Elizabeth; Miura, Rie; Brogan, Crystal L.; Hibbard, John; Hunter, Todd R.; Indebetouw, Remy
2016-09-01
The ALMA Science Pipeline is being developed for the automated calibration and imaging of ALMA interferometric and single-dish data. The calibration pipeline for interferometric data was accepted for use by ALMA Science Operations in 2014, and end-to-end processing for single-dish data in 2015. However, work is ongoing to expand the use cases for which the Pipeline can be used, e.g., for higher-frequency and lower signal-to-noise datasets and for new observing modes. A current focus is the commissioning of science target imaging for interferometric data. For the Single Dish Pipeline, the line-finding algorithm used in baseline subtraction and the baseline-flagging heuristics have been greatly improved since the prototype used for data from the previous cycle. These algorithms, unique to the Pipeline, produce better results than standard manual processing in many cases. In this poster, we report on the current status of the Pipeline capabilities, present initial results from the Imaging Pipeline, and describe the smart line-finding and flagging algorithms used in the Single Dish Pipeline. The Pipeline is released as part of CASA (the Common Astronomy Software Applications package).
Fisher, Jill A; Cottingham, Marci D; Kalbaugh, Corey A
2015-04-01
In spite of a growing literature on pharmaceuticalization, little is known about the pharmaceutical industry's investments in research and development (R&D). Information about the drugs being developed can provide important context for existing case studies detailing the expanding--and often problematic--role of pharmaceuticals in society. To access the pharmaceutical industry's pipeline, we constructed a database of drugs for which pharmaceutical companies reported initiating clinical trials over a five-year period (July 2006-June 2011), capturing 2477 different drugs in 4182 clinical trials. Comparing drugs in the pipeline that target diseases in high-income and low-income countries, we found that the number of drugs for diseases prevalent in high-income countries was 3.46 times higher than drugs for diseases prevalent in low-income countries. We also found that the plurality of drugs in the pipeline was being developed to treat cancers (26.2%). Interpreting our findings through the lens of pharmaceuticalization, we illustrate how investigating the entire drug development pipeline provides important information about patterns of pharmaceuticalization that are invisible when only marketed drugs are considered.
TESS Data Processing and Quick-look Pipeline
NASA Astrophysics Data System (ADS)
Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office
2018-01-01
We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Nancy Porter
2003-05-01
The two broad categories of deposited weld metal repair and fiber-reinforced composite repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repair and for fiber-reinforced composite repair. To date, all of the experimental work pertaining to the evaluation of potential repair methods has focused on fiber-reinforced composite repairs. Hydrostatic testing was also conducted on four pipeline sections with simulated corrosion damage: two with composite liners and two without.
Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A
2006-01-01
Background Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. Results In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. Conclusion JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". 
However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from or . PMID:17123449
NASA Astrophysics Data System (ADS)
Rui, Zhenhua
This study analyzes historical cost data of 412 pipelines and 220 compressor stations. On the basis of this analysis, the study also evaluates the feasibility of an Alaska in-state gas pipeline using Monte Carlo simulation techniques. Analysis of pipeline construction costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary by diameter, length, volume, year, and location. Overall average learning rates for pipeline material and labor costs are 6.1% and 12.4%, respectively. Overall average cost shares for pipeline material, labor, miscellaneous, and right of way (ROW) are 31%, 40%, 23%, and 7%, respectively. Regression models are developed to estimate pipeline component costs for different lengths, cross-sectional areas, and locations. An analysis of inaccuracy in pipeline cost estimation demonstrates that the cost estimation of pipeline cost components is biased except for in the case of total costs. Overall overrun rates for pipeline material, labor, miscellaneous, ROW, and total costs are 4.9%, 22.4%, -0.9%, 9.1%, and 6.5%, respectively, and project size, capacity, diameter, location, and year of completion have different degrees of impacts on cost overruns of pipeline cost components. Analysis of compressor station costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary in terms of capacity, year, and location. Average learning rates for compressor station material and labor costs are 12.1% and 7.48%, respectively. Overall average cost shares of material, labor, miscellaneous, and ROW are 50.6%, 27.2%, 21.5%, and 0.8%, respectively. Regression models are developed to estimate compressor station component costs in different capacities and locations. An investigation into inaccuracies in compressor station cost estimation demonstrates that the cost estimation for compressor stations is biased except for in the case of material costs. 
Overall average overrun rates for compressor station material, labor, miscellaneous, land, and total costs are 3%, 60%, 2%, -14%, and 11%, respectively, and cost overruns for cost components are influenced by location and year of completion to different degrees. Monte Carlo models are developed and simulated to evaluate the feasibility of an Alaska in-state gas pipeline by assigning triangular distributions to the values of economic parameters. Simulated results show that the construction of an Alaska in-state natural gas pipeline is feasible under three scenarios: 500 million cubic feet per day (mmcfd), 750 mmcfd, and 1000 mmcfd.
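A Monte Carlo feasibility model with triangular input distributions, as the study describes, can be sketched in a few lines using the standard library's `random.triangular(low, high, mode)`. The cost, tariff, throughput, and discount figures below are invented placeholders, not the dissertation's calibrated values.

```python
import random

def simulate_npv(n_trials, seed=1):
    """Toy NPV simulation for a gas pipeline with triangular distributions
    on the key economic inputs. All numbers are illustrative assumptions."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        capex = rng.triangular(8e9, 12e9, 9.5e9)          # $ (low, high, mode)
        tariff = rng.triangular(2.0, 4.0, 3.0)            # $/mcf
        throughput = rng.triangular(450.0, 550.0, 500.0)  # mmcfd
        annual = tariff * throughput * 1000 * 365         # $/year of revenue
        # 20 years of revenue, crudely discounted at 8%
        pv = sum(annual / (1.08 ** t) for t in range(1, 21))
        results.append(pv - capex)
    return results
```

Feasibility is then judged from the simulated NPV distribution, e.g. the fraction of trials with positive NPV, rather than from a single deterministic estimate.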
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thaule, S.B.; Postvoll, W.
Installation by den norske stats oljeselskap A.S. (Statoil) of a powerful pipeline-modeling system on Zeepipe has allowed this major North Sea gas pipeline to meet the growing demands and seasonal variations of the European gas market. The Troll gas-sales agreement (TGSA) in 1986 called for large volumes of Norwegian gas to begin arriving from the North Sea Sleipner East field in October 1993. It is important to Statoil to maintain regular gas deliveries from its integrated transport network. In addition, high utilization of transport capacity maximizes profits. In advance of operations, Statoil realized that state-of-the-art supervisory control and data acquisition (SCADA) and pipeline-modeling systems (PMS) would be necessary to meet its goals and to remain the most efficient North Sea operator. The paper describes the linking of Troll and Zeebrugge, contractual issues, the supervisory system, the SCADA module, pipeline modeling, the real-time model, the look-ahead model, the predictive model, and model performance.
Hardware/software codesign for embedded RISC core
NASA Astrophysics Data System (ADS)
Liu, Peng
2001-12-01
This paper describes a hardware/software codesign method for the extensible embedded RISC core VIRGO, which is based on the MIPS-I instruction set architecture. VIRGO is described in the Verilog hardware description language, has a five-stage pipeline with a shared 32-bit cache/memory interface, and is controlled by a distributed control scheme: every pipeline stage has a small controller that manages the stage's status and the cooperation between pipeline phases. Because the description uses a high-level language and the structure is distributed, the VIRGO core is highly extensible and can meet application requirements. Taking the high-definition television MPEG2 MPHL decoder chip as an example, we constructed a hardware/software codesign virtual prototyping machine on which the VIRGO instruction set architecture, system-on-chip memory size requirements, system-on-chip software, and related issues can be studied. The system-on-chip design and the RISC instruction set can also be evaluated on the virtual prototyping platform.
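The timing behavior of a five-stage in-order pipeline like VIRGO's can be sketched with a small cycle model: instruction j occupies stage s during cycle j + s + 1, so N instructions retire in N + 4 cycles. This Python sketch is a codesign-style behavioral model for illustration only, ignoring hazards, stalls, and the per-stage controllers of the real Verilog design.

```python
STAGES = ["IF", "ID", "EX", "MEM", "WB"]

def run_pipeline(program):
    """Cycle model of an ideal 5-stage pipeline: no hazards or stalls.
    Returns the retirement order and the total cycle count (N + 4)."""
    n = len(program)
    cycles = n + len(STAGES) - 1 if n else 0
    timeline = {}
    for cycle in range(1, cycles + 1):
        for s, stage in enumerate(STAGES):
            j = cycle - 1 - s            # which instruction occupies this stage now
            if 0 <= j < n:
                timeline.setdefault(cycle, {})[stage] = program[j]
    retired = [timeline[c]["WB"] for c in sorted(timeline) if "WB" in timeline[c]]
    return retired, cycles
```

Such behavioral models are the software half of a codesign flow: they let instruction-set and memory-size questions be explored long before the RTL is stable.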
The Chandra Source Catalog 2.0: Data Processing Pipelines
NASA Astrophysics Data System (ADS)
Miller, Joseph; Allen, Christopher E.; Budynkiewicz, Jamie A.; Gibbs, Danny G., II; Paxson, Charles; Chen, Judy C.; Anderson, Craig S.; Burke, Douglas; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
With the construction of the Second Chandra Source Catalog (CSC2.0) came new requirements and new techniques to create a software system that can process 10,000 observations and identify nearly 320,000 point and compact X-ray sources. A new series of processing pipelines has been developed to allow deeper, more complete exploration of the Chandra observations. In CSC1.0 there were 4 general pipelines, whereas in CSC2.0 there are 20 data processing pipelines organized into 3 distinct phases of operation: detection, master matching, and source property characterization. With CSC2.0, observations within one arcminute of each other are stacked before searching for sources. The detection phase of processing combines the data, adjusts for shifts in fine astrometry, detects sources, and assesses the likelihood that sources are real. During the master source phase, detections across stacks of observations are analyzed for coverage of the same source to produce a master source. Finally, in the source property phase, each source is characterized with aperture photometry, spectrometry, variability, and other properties at the observation, stack, and master levels over several energy bands. We present how these pipelines were constructed and the challenges we faced in processing data ranging from virtually no counts to millions of counts, how pipelines were tuned to work optimally on a computational cluster, and how we ensured the data produced were correct through various quality assurance steps. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
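The aperture photometry step of the source property phase reduces to a simple operation: summing counts inside a circular region around each detection. The sketch below is a minimal NumPy illustration of that idea, not the CSC2.0 pipeline code, which also handles background subtraction, PSF corrections, and energy bands.

```python
import numpy as np

def aperture_photometry(image, x0, y0, radius):
    """Sum counts inside a circular aperture centered on pixel (x0, y0).
    x0 indexes columns, y0 indexes rows; a pixel is inside if its center
    lies within 'radius' of the aperture center."""
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return image[mask].sum()
```

In a real catalog pipeline this sum would be repeated per energy band and per observation, with the same source measured at the observation, stack, and master levels.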
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norsworthy, R.
A rating system was developed for several coating types used for underground pipeline systems. Consideration included soil stress, adhesion, surface preparation, cathodic protection (CP) shielding, CP requirements, handling and construction, repair, field joint system, bends and other components, and the application process. Polyethylene- and polyvinyl chloride-backed tapes, woven polyolefin geotextile fabric (WGF)-backed tapes, hot-applied tapes, petrolatum- and wax-based tapes, and shrink sleeves were evaluated. WGF-backed tapes had the highest rating.
IJspeert, Hanna; van Schouwenburg, Pauline A.; van Zessen, David; Pico-Knijnenburg, Ingrid
2017-01-01
Antigen Receptor Galaxy (ARGalaxy) is a Web-based tool for analyses and visualization of TCR and BCR sequencing data of 13 species. ARGalaxy consists of four parts: the demultiplex tool, the international ImMunoGeneTics information system (IMGT) concatenate tool, the immune repertoire pipeline, and the somatic hypermutation (SHM) and class switch recombination (CSR) pipeline. Together they allow the analysis of all different aspects of the immune repertoire. All pipelines can be run independently or combined, depending on the available data and the question of interest. The demultiplex tool allows data trimming and demultiplexing, whereas with the concatenate tool multiple IMGT/HighV-QUEST output files can be merged into a single file. The immune repertoire pipeline is an extended version of our previously published ImmunoGlobulin Galaxy (IGGalaxy) virtual machine that was developed to visualize V(D)J gene usage. It allows analysis of both BCR and TCR rearrangements, visualizes CDR3 characteristics (length and amino acid usage) and junction characteristics, and calculates the diversity of the immune repertoire. Finally, ARGalaxy includes the newly developed SHM and CSR pipeline to analyze SHM and/or CSR in BCR rearrangements. It analyzes the frequency and patterns of SHM, Ag selection (including BASELINe), clonality (Change-O), and CSR. The functionality of the ARGalaxy tool is illustrated in several clinical examples of patients with primary immunodeficiencies. In conclusion, ARGalaxy is a novel tool for the analysis of the complete immune repertoire, which is applicable to many patient groups with disturbances in the immune repertoire such as autoimmune diseases, allergy, and leukemia, but it can also be used to address basic research questions in repertoire formation and selection. PMID:28416602
1983-10-24
Gas Transport System Examined (R. Verdiyan; VYSHKA, 26 Jul 83). National Pipeline Transport System Examined (L. Korenev; SOVETSKAYA LATVIYA in Russian, 9 Aug 83, p 2). Article by L. Korenev: "The Country's Pipeline Transport." The Soviet Union produces more steel pipes
The Chandra Source Catalog 2.0: Building The Catalog
NASA Astrophysics Data System (ADS)
Grier, John D.; Plummer, David A.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
To build release 2.0 of the Chandra Source Catalog (CSC2), we require scientific software tools and processing pipelines to evaluate and analyze the data. Additionally, software and hardware infrastructure is needed to coordinate and distribute pipeline execution, manage data I/O, and handle data for Quality Assurance (QA) intervention. We also provide data product staging for archive ingestion. Release 2 utilizes a database-driven system for integration and production. Included are four distinct instances of the Automatic Processing (AP) system (Source Detection, Master Match, Source Properties, and Convex Hulls) and a high performance computing (HPC) cluster that is managed to provide efficient catalog processing. In this poster we highlight the internal systems developed to meet the CSC2 challenge. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-05
... integrated U.S. natural gas pipeline system. GLLC notes that due to the Gulf LNG Terminal's direct access to multiple major interstate pipelines and indirect access to the national gas pipeline grid, the Project's... possible impacts that the Export Project might have on natural gas supply and pricing. Navigant's analysis...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-27
..., of pipeline- quality natural gas. Pangea states that such gas will be delivered to the ST LNG Project... systems \\4\\ via the South Texas Pipeline, thereby allowing natural gas to be supplied through displacement... the siting, construction, and operation of an affiliated natural gas pipeline that will bring feed gas...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-03
..., Regulatory Certainty, and Job Creation Act of 2011 (PL112-90), have imposed additional demands on their... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket ID PHMSA-2011-0009] RIN 2137-AE71 Pipeline Safety: Expanding the Use of Excess Flow Valves...
(Un)Doing Hegemony in Education: Disrupting School-to-Prison Pipelines for Black Males
ERIC Educational Resources Information Center
Dancy, T. Elon, II
2014-01-01
The school-to-prison pipeline refers to the disturbing national trend in which children are funneled out of public schools and into juvenile and criminal justice systems. The purpose of this article is to theorize how this pipeline fulfills societal commitments to black male over-incarceration. First, the author reviews the troublesome perceptions…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-02
... integrated pipeline system which connects producers and shippers of crude oil and natural gas liquids in....enbridgepartners.com ). Enbridge Partners provides pipeline transportation of petroleum and natural gas in the mid... or importation of liquid petroleum, petroleum products, or other non-gaseous fuels to or from a...
77 FR 68115 - Millennium Pipeline Company, L.L.C.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
...] Millennium Pipeline Company, L.L.C.; Notice of Application Take notice that on November 1, 2012, Millennium Pipeline Company, L.L.C. (Millennium), One Blue Hill Plaza, Seventh Floor, P.O. Box 1565, Pearl River, New... system to the existing interconnection with Algonquin Gas Transmission, L.L.C. in Ramapo, New York and...
Song, Yan; Dhodda, Raj; Zhang, Jun; Sydor, Jens
2014-05-01
In the recent past, we have seen an increase in the outsourcing of bioanalysis in pharmaceutical companies in support of their drug development pipeline. This trend is largely driven by the effort to reduce internal cost, especially in support of late-stage pipeline assets where established bioanalytical assays are used to analyze a large volume of samples. This article will highlight our perspective of how bioanalytical laboratories within pharmaceutical companies can be developed into the best partner in the advancement of drug development pipelines with high-quality support at competitive cost.
Developing a Comprehensive Risk Assessment Framework for Geological Storage CO 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duncan, Ian
2014-08-31
The operational risks for CCS projects include: risks of capturing, compressing, transporting and injecting CO₂; risks of well blowouts; risk that CO₂ will leak into shallow aquifers and contaminate potable water; and risk that sequestered CO₂ will leak into the atmosphere. This report examines these risks by using information on the risks associated with analogue activities such as CO₂-based enhanced oil recovery (CO₂-EOR), natural gas storage, and acid gas disposal. We have developed a new analysis of pipeline risk based on Bayesian statistical analysis. In Bayesian theory, probabilities may describe states of partial knowledge, even those related to non-repeatable events. The Bayesian approach enables utilizing existing data while retaining the capability to absorb new information, thus lowering uncertainty in our understanding of complex systems. Incident rates for both natural gas and CO₂ pipelines have been widely used in papers and reports on the risk of CO₂ pipelines as proxies for the individual risk created by such pipelines. Published risk studies of CO₂ pipelines suggest that the individual risk associated with CO₂ pipelines is between 10⁻³ and 10⁻⁴, which reflects risk levels approaching those of mountain climbing, which many would find unacceptably high. This report concludes, based on a careful analysis of natural gas pipeline failures, that the individual risk of CO₂ pipelines is likely in the range of 10⁻⁶ to 10⁻⁷, a risk range considered acceptable to negligible in most countries. If, as is commonly thought, pipelines represent the highest-risk component of CCS outside of the capture plant, then this conclusion suggests that most (if not all) previous quantitative risk assessments of components of CCS may be orders of magnitude too high.
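The report's Bayesian updating of incident rates from historical pipeline data is not specified in detail here, but the general idea can be sketched with a standard Gamma-Poisson conjugate update on an incident rate per kilometer-year. The prior parameters and the counts below are illustrative assumptions, not figures from the report.

```python
def gamma_poisson_posterior(alpha0, beta0, incidents, exposure_km_yr):
    """Conjugate Bayesian update: a Gamma(alpha0, beta0) prior on the
    incident rate (incidents per km-year) combined with a Poisson
    likelihood for the observed count yields a Gamma posterior whose
    mean shrinks toward the data as exposure accumulates."""
    alpha = alpha0 + incidents
    beta = beta0 + exposure_km_yr
    mean_rate = alpha / beta  # posterior mean incidents per km-year
    return alpha, beta, mean_rate
```

With a weakly informative prior (alpha0 = 0.5, beta0 = 1.0) and, say, 2 incidents over 50,000 km-years of hypothetical exposure, the posterior mean rate is (0.5 + 2) / (1.0 + 50,000), on the order of 5 × 10⁻⁵ per km-year; new incident data can be folded in by repeating the same update.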
The potential lethality of unexpected CO₂ releases from pipelines or wells is arguably the highest-risk aspect of CO₂ enhanced oil recovery (CO₂-EOR) and carbon capture and storage (CCS). Assertions in the CCS literature that CO₂ levels of 10% for ten minutes, or 20 to 30% for a few minutes, are lethal to humans are not supported by the available evidence. The results of published experiments with animals exposed to CO₂, from mice to monkeys, at both normal and depleted oxygen levels, suggest that lethal levels of CO₂ toxicity are in the range of 50 to 60%. These experiments demonstrate that CO₂ does not kill by asphyxia, but rather is toxic at high concentrations. It is concluded that quantitative risk assessments of CCS have overestimated the risk of fatalities by using values of lethality a factor of two to six lower than the values estimated in this paper. In many dispersion models of CO₂ releases from pipelines, no fatalities would be predicted if appropriate levels of lethality for CO₂ had been used in the analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higgins, G.L.; Bates, C.R.
A new procedure for testing elevated-temperature cathodic disbondment (C.D.) in fusion-bonded epoxy (FBE) pipeline coatings appears consistent and reliable. Further, its results question C.D. theories that fail to account for effects at above-ambient temperatures. The work to develop this procedure also included experiments that demonstrated how the relative performance of coating systems - especially FBE line-pipe coatings operated at elevated temperature - could not be predicted from ambient-temperature assessment. Data reported in this third in a series on pipeline-protection technology confirm and expand on these aspects and introduce more recent results on the behavior of FBE coatings subjected to elevated-temperature C.D. testing.
Building a common pipeline for rule-based document classification.
Patterson, Olga V; Ginter, Thomas; DuVall, Scott L
2013-01-01
Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA-AS-based pipeline for classification. Our proposed methodology, coupled with the general-purpose solution, provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
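The concept-identification-plus-context-analysis approach can be illustrated with a minimal stand-in: match concept terms per sentence, then suppress hits in sentences containing negation cues (in the spirit of NegEx-style context rules). The concept list, cue list, and class labels below are hypothetical, not drawn from the authors' system.

```python
import re

# Hypothetical concept dictionary and negation cues; a real rule-based
# system would use curated terminologies and a full context algorithm.
CONCEPTS = {"pneumonia", "fracture"}
NEGATION_CUES = re.compile(r"\b(no|denies|without|negative for)\b", re.I)

def classify_document(text):
    """Return 'positive' if any concept appears in a non-negated
    sentence, else 'negative' -- a toy rule-based document classifier."""
    for sentence in re.split(r"[.!?]", text.lower()):
        hits = [c for c in CONCEPTS if c in sentence]
        if hits and not NEGATION_CUES.search(sentence):
            return "positive"
    return "negative"
```

For example, "Patient denies pneumonia." classifies as negative because the concept hit is negated, while "Findings consistent with pneumonia." classifies as positive.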
NASA Astrophysics Data System (ADS)
Tejedor, J.; Macias-Guarasa, J.; Martins, H. F.; Piote, D.; Pastor-Graells, J.; Martin-Lopez, S.; Corredera, P.; De Pauw, G.; De Smet, F.; Postvoll, W.; Ahlen, C. H.; Gonzalez-Herraez, M.
2017-04-01
This paper presents the first report on on-line and final blind field test results of a pipeline integrity threat surveillance system. The system integrates a machine+activity identification mode and a threat detection mode. Two different pipeline sections were selected for the blind tests: one close to the sensor position, and the other 35 km away from it. Results of the machine+activity identification mode showed that about 46% of the time the machine, the activity, or both were correctly identified. For the threat detection mode, 8 out of 10 threats were correctly detected, with 1 false alarm.
General-Purpose Electronic System Tests Aircraft
NASA Technical Reports Server (NTRS)
Glover, Richard D.
1989-01-01
Versatile digital equipment supports research, development, and maintenance. Extended aircraft interrogation and display system is general-purpose assembly of digital electronic equipment on ground for testing of digital electronic systems on advanced aircraft. Many advanced features, including multiple 16-bit microprocessors, pipeline data-flow architecture, advanced operating system, and resident software-development tools. Basic collection of software includes program for handling many types of data and for displays in various formats. User easily extends basic software library. Hardware and software interfaces to subsystems provided by user designed for flexibility in configuration to meet user's requirements.
An integrated database-pipeline system for studying single nucleotide polymorphisms and diseases.
Yang, Jin Ok; Hwang, Sohyun; Oh, Jeongsu; Bhak, Jong; Sohn, Tae-Kwon
2008-12-12
Studies on the relationship between disease and genetic variations such as single nucleotide polymorphisms (SNPs) are important. Genetic variations can cause disease by influencing important biological regulation processes. Despite the needs for analyzing SNP and disease correlation, most existing databases provide information only on functional variants at specific locations on the genome, or deal with only a few genes associated with disease. There is no combined resource to widely support gene-, SNP-, and disease-related information, and to capture relationships among such data. Therefore, we developed an integrated database-pipeline system for studying SNPs and diseases. To implement the pipeline system for the integrated database, we first unified complicated and redundant disease terms and gene names using the Unified Medical Language System (UMLS) for classification and noun modification, and the HUGO Gene Nomenclature Committee (HGNC) and NCBI gene databases. Next, we collected and integrated representative databases for three categories of information. For genes and proteins, we examined the NCBI mRNA, UniProt, UCSC Table Track and MitoDat databases. For genetic variants we used the dbSNP, JSNP, ALFRED, and HGVbase databases. For disease, we employed OMIM, GAD, and HGMD databases. The database-pipeline system provides a disease thesaurus, including genes and SNPs associated with disease. The search results for these categories are available on the web page http://diseasome.kobic.re.kr/, and a genome browser is also available to highlight findings, as well as to permit the convenient review of potentially deleterious SNPs among genes strongly associated with specific diseases and clinical phenotypes. Our system is designed to capture the relationships between SNPs associated with disease and disease-causing genes. 
The integrated database-pipeline provides a list of candidate genes and SNP markers for evaluation in both epidemiological and molecular biological approaches to disease-gene association studies. Furthermore, researchers can then semi-automatically decide on the data set for association studies while considering the relationships between genetic variation and disease. The database is also economical for disease-association studies and facilitates an understanding of the processes that cause disease. Currently, the database contains 14,674 SNP records and 109,715 gene records associated with human diseases, and it is updated at regular intervals.
19. PIPELINE INTERSECTION AT THE MOUTH OF WAIKOLU VALLEY ON ...
19. PIPELINE INTERSECTION AT THE MOUTH OF WAIKOLU VALLEY ON THE BEACH. VALVE AT RIGHT (WITH WRENCH NEARBY) OPENS TO FLUSH VALLEY SYSTEM OUT. VALVE AT LEFT CLOSES TO KEEP WATER FROM ENTERING SYSTEM ALONG THE PALI DURING REPAIRS. - Kalaupapa Water Supply System, Waikolu Valley to Kalaupapa Settlement, Island of Molokai, Kalaupapa, Kalawao County, HI
Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P; Zijdenbos, Alex P; Evans, Alan C
2012-01-01
The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources.
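The smart-restart behavior (service 3 above) can be sketched as a dependency-ordered walk that reruns a job only when it is out of date or any of its dependencies reran. This is a toy illustration in Python, not PSOM's Matlab/Octave implementation; the job-table shape and the `done` bookkeeping are assumptions, and real PSOM also parallelizes independent jobs and logs execution history.

```python
def run_pipeline(jobs, done):
    """Execute jobs in dependency order, skipping up-to-date ones.
    `jobs` maps name -> (list_of_dependency_names, command callable);
    `done` is the set of jobs whose outputs are already current.
    Returns the list of jobs actually (re)executed, in order."""
    visited, reran = set(), []

    def run(name):
        if name in visited:
            return
        deps, command = jobs[name]
        for d in deps:
            run(d)  # ensure dependencies are resolved first
        # Rerun if the job is stale, or if any dependency just reran.
        if name not in done or any(d in reran for d in deps):
            command()
            reran.append(name)
        visited.add(name)

    for name in jobs:
        run(name)
    return reran
```

With a chain a -> b -> c where only a is up to date, b reruns (stale) and c reruns (its dependency changed), while a is skipped; if everything is up to date, nothing reruns.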
Next Generation Models for Storage and Representation of Microbial Biological Annotation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quest, Daniel J; Land, Miriam L; Brettin, Thomas S
2010-01-01
Background Traditional genome annotation systems were developed in a very different computing era, one where the World Wide Web was just emerging. Consequently, these systems are built as centralized black boxes focused on generating high quality annotation submissions to GenBank/EMBL supported by expert manual curation. The exponential growth of sequence data drives a growing need for increasingly higher quality and automatically generated annotation. Typical annotation pipelines utilize traditional database technologies, clustered computing resources, Perl, C, and UNIX file systems to process raw sequence data, identify genes, and predict and categorize gene function. These technologies tightly couple the annotation software system to hardware and third-party software (e.g. relational database systems and schemas). This makes annotation systems hard to reproduce, inflexible to modification over time, difficult to assess, difficult to partition across multiple geographic sites, and difficult to understand for those who are not domain experts. These systems are not readily open to scrutiny and therefore not scientifically tractable. The advent of Semantic Web standards such as Resource Description Framework (RDF) and OWL Web Ontology Language (OWL) enables us to construct systems that address these challenges in a new comprehensive way. Results Here, we develop a framework for linking traditional data to OWL-based ontologies in genome annotation. We show how data standards can decouple hardware and third-party software tools from annotation pipelines, thereby making annotation pipelines easier to reproduce and assess. An illustrative example shows how TURTLE (Terse RDF Triple Language) can be used as a human-readable, but also semantically-aware, equivalent to GenBank/EMBL files.
Conclusions The power of this approach lies in its ability to assemble annotation data from multiple databases across multiple locations into a representation that is understandable to researchers. In this way, all researchers, experimental and computational, will more easily understand the informatics processes constructing genome annotation and ultimately be able to help improve the systems that produce them.
49 CFR 195.402 - Procedural manual for operations, maintenance, and emergencies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... effective. This manual shall be prepared before initial operations of a pipeline system commence, and... in a timely and effective manner. (3) Operating, maintaining, and repairing the pipeline system in... public officials to learn the responsibility and resources of each government organization that may...
The Environmental Technology Verification report discusses the technology and performance of the Parametric Emissions Monitoring System (PEMS) manufactured by ANR Pipeline Company, a subsidiary of Coastal Corporation, now El Paso Corporation. The PEMS predicts carbon dioxide (CO2...
78 FR 39717 - Iroquois Gas Transmission System, LP; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
... associated with these new and modified facilities to Constitution Pipeline Company, LLC (Constitution), a... to establish a new receipt interconnection with Constitution and create an incremental 650,000... Constitution to interconnections with Iroquois' mainline system as well as Tennessee Gas Pipeline Company, LLC...
49 CFR 191.12 - Distribution Systems: Mechanical Fitting Failure Reports
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Distribution Systems: Mechanical Fitting Failure Reports 191.12 Section 191.12 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER...
Turning Noise into Signal: Utilizing Impressed Pipeline Currents for EM Exploration
NASA Astrophysics Data System (ADS)
Lindau, Tobias; Becken, Michael
2017-04-01
Impressed Current Cathodic Protection (ICCP) systems are extensively used for the protection of central Europe's dense network of oil, gas, and water pipelines against destruction by electrochemical corrosion. While ICCP systems usually provide protection by injecting a DC current into the pipeline, mandatory pipeline integrity surveys demand a periodical switching of the current. Consequently, the resulting time-varying pipe currents induce secondary electric and magnetic fields in the surrounding earth. While these fields are usually considered to be unwanted cultural noise in electromagnetic exploration, this work aims at utilizing the fields generated by the ICCP system for determining the electrical resistivity of the subsurface. The fundamental period of the switching cycles typically amounts to 15 seconds in Germany and thereby roughly corresponds to periods used in controlled source EM applications (CSEM). For detailed studies we chose an approximately 30 km-long pipeline segment near Herford, Germany as a test site. The segment is located close to the southern margin of the Lower Saxony Basin (LSB) and is part of a larger gas pipeline composed of multiple segments. The current injected into the pipeline segment originates in a rectified 50 Hz AC signal that is periodically switched on and off. In contrast to the usual dipole sources used in CSEM surveys, the current distribution along the pipeline is unknown and expected to be non-uniform due to coating defects that cause current to leak into the surrounding soil. However, an accurate current distribution is needed to model the fields generated by the pipeline source. We measured the magnetic fields at several locations above the pipeline and used the Biot-Savart law to estimate the current's decay function.
The resulting frequency-dependent current distribution shows a current decay away from the injection point as well as a frequency-dependent phase shift that increases with distance from the injection point. Electric field data were recorded at 45 stations located in an area of about 60 square kilometers in the vicinity of the pipeline. Additionally, the injected source current was recorded directly at the injection point. Transfer functions between the local electric fields and the injected source current are estimated for frequencies ranging from 0.03 Hz to 15 Hz using robust time series processing techniques. The resulting transfer functions are inverted for a 3D conductivity model of the subsurface using an elaborate pipeline model. We interpret the model with regard to the local geologic setting, demonstrating the method's capability to image the subsurface.
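The Biot-Savart relation used to connect magnetometer readings above the pipe to the current flowing in it reduces, for a long straight conductor, to B = μ₀I / (2πr). A minimal sketch of that forward relation and its inversion follows; the infinite-wire approximation and the example numbers are illustrative, and the study's actual estimation of a spatially varying current distribution is more involved.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def line_current_field(current_a, distance_m):
    """Magnetic flux density at distance r from a long straight
    conductor: B = mu0 * I / (2 * pi * r)."""
    return MU0 * current_a / (2 * math.pi * distance_m)

def infer_current(b_tesla, distance_m):
    """Invert the same expression to estimate the pipe current from a
    field measurement taken at a known distance above the pipe."""
    return 2 * math.pi * distance_m * b_tesla / MU0
```

A 2 A pipe current at 1.5 m depth produces a field of roughly 0.27 microtesla at the surface, and inverting that reading recovers the 2 A estimate; repeating this at several stations along the pipe yields the decay of current with distance from the injection point.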
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
Internal Corrosion Detection in Liquids Pipelines
DOT National Transportation Integrated Search
2012-01-01
PHMSA project DTRS56-05-T-0005 "Development of ICDA for Liquid Petroleum Pipelines" led to the development of a Direct Assessment (DA) protocol to prioritize locations of possible internal corrosion. The underlying basis LP-ICDA is simple; corrosion ...
The Calibration Reference Data System
NASA Astrophysics Data System (ADS)
Greenfield, P.; Miller, T.
2016-07-01
We describe a software architecture and implementation for using rules to determine which calibration files are appropriate for calibrating a given observation. This new system, the Calibration Reference Data System (CRDS), replaces what had been previously used for the Hubble Space Telescope (HST) calibration pipelines, the Calibration Database System (CDBS). CRDS will be used for the James Webb Space Telescope (JWST) calibration pipelines, and is currently being used for HST calibration pipelines. CRDS can be easily generalized for use in similar applications that need a rules-based system for selecting the appropriate item for a given dataset; we give some examples of such generalizations that will likely be used for JWST. The core functionality of the Calibration Reference Data System is available under an Open Source license. CRDS is briefly contrasted with a sampling of other similar systems used at other observatories.
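The rules-based selection idea can be sketched as matching dataset metadata against per-reference-file criteria and preferring the most specific matching rule. This is a sketch of the concept only, not the CRDS API; the keyword names, file names, and most-specific-wins tiebreak are assumptions for illustration.

```python
def select_reference(rules, header):
    """Pick the reference file whose criteria all match the dataset
    header; among matches, prefer the rule that constrains the most
    keywords (i.e., the most specific rule). Returns None if no rule
    matches."""
    best = None
    for criteria, ref_file in rules:
        if all(header.get(k) == v for k, v in criteria.items()):
            if best is None or len(criteria) > len(best[0]):
                best = (criteria, ref_file)
    return best[1] if best else None
```

A generic instrument-wide rule then acts as a fallback whenever no filter-specific rule matches the observation's header.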
CFHT's SkyProbe: a real-time sky-transparency monitor
NASA Astrophysics Data System (ADS)
Cuillandre, Jean-Charles; Magnier, Eugene A.; Isani, Sidik; Sabin, Daniel; Knight, Wiley; Kras, Simon; Lai, Kamson
2002-12-01
We have developed a system at the Canada-France-Hawaii Telescope (CFHT), SkyProbe, which allows for the direct measurement of the true attenuation by clouds once per minute, to within a percent, directly on the field pointed at by the telescope. It has been possible to make this system relatively inexpensive thanks to low-cost CCD cameras from the amateur market. A crucial addition to this hardware is the quite recent availability of a full-sky photometry catalog at the appropriate depth: the Tycho catalog, from the Hipparcos mission. The central element is the automatic data analysis pipeline developed at CFHT, Elixir, for the improved operation of the CFHT wide-field imagers, CFH12K and MegaCam. SkyProbe's FITS images are processed in real time, and the pipeline output (a zero-point attenuation) provides the current sky transmission to the observers and helps immediate decision making. These measurements are also attached to the archived data, adding a key criterion for future use by other astronomers.
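The zero-point attenuation measurement can be sketched as comparing the photometric zero point derived from catalog stars in each frame with the nominal clear-sky zero point. The function below is an illustrative assumption about how such a quantity could be computed, not the Elixir pipeline itself; the input magnitudes stand in for matched Tycho stars.

```python
import statistics

def attenuation_mag(catalog_mags, instrumental_mags, nominal_zp):
    """Cloud attenuation in magnitudes: the nominal clear-sky zero
    point minus the measured zero point, where the measured zero point
    is the median of (catalog - instrumental) magnitude over matched
    stars. Clouds dim the stars, lowering the measured zero point, so
    the returned value grows with extinction."""
    zp = statistics.median(c - i for c, i in zip(catalog_mags, instrumental_mags))
    return nominal_zp - zp
```

For instance, if matched stars consistently yield a measured zero point of 25.0 mag against a nominal clear-sky value of 25.3 mag, the frame suffered about 0.3 mag of cloud attenuation.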
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.
2017-12-01
AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. 
Through the development of these and other pipeline features, we present solutions to challenges that include balancing automated and manual processing, bridging legacy data-management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
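As a minimal illustration of the automated data QA/QC checks described above (flagging observations with unexpected values), here is a hedged sketch; the variable codes and plausibility limits are assumptions for the example, not the AmeriFlux FP-In specification:

```python
def range_check(values, var_name, limits):
    """Return indices of observations outside physically plausible limits."""
    lo, hi = limits[var_name]
    return [i for i, v in enumerate(values) if v is not None and not (lo <= v <= hi)]

# Illustrative plausibility limits (assumed, not the network standard)
LIMITS = {"TA": (-50.0, 60.0),   # air temperature, deg C
          "RH": (0.0, 100.0)}    # relative humidity, percent

flags = range_check([21.3, 19.8, 150.0, None, -2.1], "TA", LIMITS)
# index 2 (150.0 deg C) is flagged as a likely unit or sensor error
```

In the real pipeline such flags would feed the graphical quick views and the issue-tracking system for data-provider review rather than being returned directly.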
Commanding Constellations (Pipeline Architecture)
NASA Technical Reports Server (NTRS)
Ray, Tim; Condron, Jeff
2003-01-01
Providing ground command software for constellations of spacecraft is a challenging problem. Reliable command delivery requires a feedback loop; for a constellation there will likely be an independent feedback loop for each constellation member. Each command must be sent via the proper Ground Station, which may change from one contact to the next (and may be different for different members). Dynamic configuration of the ground command software is usually required (e.g. directives to configure each member's feedback loop and assign the appropriate Ground Station). For testing purposes, there must be a way to insert command data at any level in the protocol stack. The Pipeline architecture described in this paper can support all these capabilities with a sequence of software modules (the pipeline), and a single self-identifying message format (for all types of command data and configuration directives). The Pipeline architecture is quite simple, yet it can solve some complex problems. The resulting solutions are conceptually simple, and therefore, reliable. They are also modular, and therefore, easy to distribute and extend. We first used the Pipeline architecture to design a CCSDS (Consultative Committee for Space Data Systems) Ground Telecommand system (to command one spacecraft at a time with a fixed Ground Station interface). This pipeline was later extended to include gateways to any of several Ground Stations. The resulting pipeline was then extended to handle a small constellation of spacecraft. The use of the Pipeline architecture allowed us to easily handle the increasing complexity. This paper will describe the Pipeline architecture, show how it was used to solve each of the above commanding situations, and how it can easily be extended to handle larger constellations.
Finite-Element Modeling of a Damaged Pipeline Repaired Using the Wrap of a Composite Material
NASA Astrophysics Data System (ADS)
Lyapin, A. A.; Chebakov, M. I.; Dumitrescu, A.; Zecheru, G.
2015-07-01
The nonlinear static problem of FEM modeling of a damaged pipeline repaired by a composite material and subjected to internal pressure is considered. The calculation is carried out using plasticity theory for the pipeline material and considering the polymeric filler and the composite wrap. The level of stresses in various zones of the structure is analyzed. The most widespread alloy used for oil pipelines is selected as pipe material. The contribution of each component of the pipeline-filler-wrap system to the level of stresses is investigated. The effect of the number of composite wrap layers is estimated. The results obtained allow one to decrease the costs needed for producing test specimens.
A bipolar population counter using wave pipelining to achieve 2.5 x normal clock frequency
NASA Technical Reports Server (NTRS)
Wong, Derek C.; De Micheli, Giovanni; Flynn, Michael J.; Huston, Robert E.
1992-01-01
Wave pipelining is a technique for pipelining digital systems that can increase clock frequency in practical circuits without increasing the number of storage elements. In wave pipelining, multiple coherent waves of data are sent through a block of combinational logic by applying new inputs faster than the delay through the logic. The throughput of a 63-b CML population counter was increased from 97 to 250 MHz using wave pipelining. The internal circuit is flowthrough combinational logic. Novel CAD methods have balanced all input-to-output paths to about the same delay. This allows multiple data waves to propagate in sequence when the circuit is clocked faster than its propagation delay.
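The timing constraint that makes wave pipelining work can be illustrated with a small calculation; the delay numbers below are invented for illustration and are not taken from the CML counter itself:

```python
import math

def wave_pipeline_limits(d_max_ns, d_min_ns, t_overhead_ns):
    """Minimum clock period and number of in-flight waves for wave pipelining.

    The period is bounded by the path-delay *spread* plus register/skew
    overhead, not by the total logic delay, so several data 'waves'
    coexist inside the combinational logic at once.
    """
    t_clk = (d_max_ns - d_min_ns) + t_overhead_ns
    waves = math.ceil(d_max_ns / t_clk)
    return t_clk, waves

# Illustrative numbers: 10 ns logic depth, paths balanced to within
# 2.5 ns of each other, 1.5 ns of setup-plus-skew overhead.
t_clk, waves = wave_pipeline_limits(10.0, 7.5, 1.5)
# 4.0 ns period (250 MHz) with 3 waves propagating concurrently
```

Conventional clocking bounds the period by the full logic depth; wave pipelining bounds it only by the delay spread plus overhead, which is why balancing all input-to-output paths to nearly the same delay is the central CAD problem the paper addresses.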
2013-12-01
... a. Siberian Pipeline Explosion (1982); b. Chevron Emergency Alert ... In 1982, intruders planted a Trojan horse in the SCADA system that controls the Siberian Pipeline. This is the ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-05
... is a new natural gas pipeline system that would transport natural gas produced on the Alaska North... provisions of section 7(c) of the Natural Gas Act (NGA) and the Alaska Natural Gas Pipeline Act of 2004... pipeline to Valdez, Alaska for delivery into a liquefied natural gas (LNG) plant for liquefaction and...
ERIC Educational Resources Information Center
Boeke, Marianne; Zis, Stacey; Ewell, Peter
2011-01-01
With support from the Bill and Melinda Gates Foundation, the National Center for Higher Education Management Systems (NCHEMS) is engaged in a two year project centered on state policies that foster student progression and success in the "adult re-entry pipeline." The adult re-entry pipeline consists of the many alternative pathways to…
Pipelined CPU Design with FPGA in Teaching Computer Architecture
ERIC Educational Resources Information Center
Lee, Jong Hyuk; Lee, Seung Eun; Yu, Heon Chang; Suh, Taeweon
2012-01-01
This paper presents a pipelined CPU design project with a field programmable gate array (FPGA) system in a computer architecture course. The class project is a five-stage pipelined 32-bit MIPS design with experiments on the Altera DE2 board. For proper scheduling, milestones were set every one or two weeks to help students complete the project on…
77 FR 76024 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings...: Natural Gas Pipeline Company of America. Description: Removal of Agreements to be effective 1/19/2013... Natural Gas System, L.L.C. Description: Gulfstream Natural Gas System, L.L.C. submits tariff filing per...
NASA Astrophysics Data System (ADS)
Lewis, J. R.; Irwin, M.; Bunclark, P.
2010-12-01
The VISTA telescope is a 4-metre instrument which has recently been commissioned at Paranal, Chile. Equipped with an infrared camera comprising sixteen 2K×2K Raytheon detectors and covering a 1.7-square-degree field of view, VISTA represents a huge leap in infrared survey capability in the southern hemisphere. Pipeline processing of IR data is far more technically challenging than for optical data: IR detectors are inherently more unstable, while the sky emission is over 100 times brighter than most objects of interest and varies in a complex spatial and temporal manner. To compensate for this, exposure times are kept short, leading to high nightly data rates. VISTA is expected to generate an average of 250 GB of data per night over the next 5-10 years, which far exceeds the current total data rate of all 8m-class telescopes. In this presentation we discuss the pipelines that have been developed to deal with IR imaging data from VISTA and the primary issues involved in an end-to-end system capable of: robustly removing instrument and night-sky signatures; monitoring data quality and system integrity; providing astrometric and photometric calibration; and generating photon-noise-limited images and science-ready astronomical catalogues.
ATI SAA Annex 3 Button Tensile Test Report I
NASA Technical Reports Server (NTRS)
Tang, Henry H.
2013-01-01
This report documents the results of a study carried out under Space Act Agreement SAA-EA-10-004 between the National Aeronautics and Space Administration (NASA) and Astro Technology Incorporated (ATI). NASA and ATI entered into this agreement to collaborate on the development of technologies that can benefit both US government space programs and the oil and gas industry. The report documents the results of a test of an adhesive system for attaching new monitoring sensor devices to pipelines under Annex III of SAA-EA-10-004, "Proof-of-Concept Design and Testing of a Post Installed Sensing Device on Subsea Risers and Pipelines". The tasks of Annex III are to design and test a proof-of-concept sensing device for in-situ installation on pipelines, risers, or other structures deployed in deep water. The function of the sensor device is to measure various signals such as strain, stress, and temperature. This study complements the work done in Annex I of the SAA on attaching a fiber-optic sensing device to pipe via adhesive bonding. Both the Annex I and Annex III studies were conducted in the Crew and Thermal Systems Division (CTSD) at the Johnson Space Center (JSC) in collaboration with ATI.
Landslide and Land Subsidence Hazards to Pipelines
Baum, Rex L.; Galloway, Devin L.; Harp, Edwin L.
2008-01-01
Landslides and land subsidence pose serious hazards to pipelines throughout the world. Many existing pipeline corridors and more and more new pipelines cross terrain that is affected by either landslides, land subsidence, or both. Consequently the pipeline industry recognizes a need for increased awareness of methods for identifying and evaluating landslide and subsidence hazard for pipeline corridors. This report was prepared in cooperation with the U.S. Department of Transportation Pipeline and Hazardous Materials Safety Administration, and Pipeline Research Council International through a cooperative research and development agreement (CRADA) with DGH Consulting, Inc., to address the need for up-to-date information about current methods to identify and assess these hazards. Chapters in this report (1) describe methods for evaluating landslide hazard on a regional basis, (2) describe the various types of land subsidence hazard in the United States and available methods for identifying and quantifying subsidence, and (3) summarize current methods for investigating individual landslides. In addition to the descriptions, this report provides information about the relative costs, limitations and reliability of various methods.
Iturbe, Rosario; Flores, Carlos; Castro, Alejandrina; Torres, Luis G
2007-10-01
Oil spills from pipelines are a very frequent problem in Mexico. Petroleos Mexicanos (PEMEX), very concerned with the environmental agenda, has been developing inspection and correction plans for zones around oil pipeline pumping stations and pipeline rights-of-way. These stations are located at regular intervals of kilometres along the pipelines. In this study, two sections of an oil pipeline and two pipeline pumping station zones are characterized in terms of the presence of Total Petroleum Hydrocarbons (TPHs) and Polycyclic Aromatic Hydrocarbons (PAHs). The study comprises sampling of the areas, delimitation of the vertical and horizontal extent of the contamination, analysis of the sampled soils for TPH content and, in some cases, for the 16 PAHs considered priority pollutants by the USEPA, calculation of the contaminated areas and volumes (according to Mexican legislation, specifically NOM-EM-138-ECOL-2002) and, finally, a proposal for the remediation techniques best suited to the contamination levels and the localization of the contaminants.
Toward cognitive pipelines of medical assistance algorithms.
Philipp, Patrick; Maleshkova, Maria; Katic, Darko; Weber, Christian; Götz, Michael; Rettinger, Achim; Speidel, Stefanie; Kämpgen, Benedikt; Nolden, Marco; Wekerle, Anna-Laura; Dillmann, Rüdiger; Kenngott, Hannes; Müller, Beat; Studer, Rudi
2016-09-01
Assistance algorithms for medical tasks have great potential to support physicians in their daily work. However, medicine is also one of the most demanding domains for computer-based support systems, since medical assistance tasks are complex and the practical experience of the physician is crucial. Recent developments in the area of cognitive computing appear to be well suited to tackle medicine as an application domain. We propose a system based on the idea of cognitive computing, consisting of auto-configurable medical assistance algorithms and their self-adapting combination. The system enables automatic execution of new algorithms, provided they are made available as Medical Cognitive Apps and are registered in a central semantic repository. Learning components can be added to the system to optimize the results in cases where numerous Medical Cognitive Apps are available for the same task. Our prototypical implementation is applied to the areas of surgical phase recognition based on sensor data and image processing for tumor progression mapping. Our results suggest that such assistance algorithms can be automatically configured in execution pipelines, that candidate results can be automatically scored and combined, and that the system can learn from experience. Furthermore, our evaluation shows that the Medical Cognitive Apps provide the same correct results as they did when executed locally, and run in a reasonable amount of time. The proposed solution is applicable to a variety of medical use cases and effectively supports the automated and self-adaptive configuration of cognitive pipelines based on medical interpretation algorithms.
Prime the Pipeline Project (P[cube]): Putting Knowledge to Work
ERIC Educational Resources Information Center
Greenes, Carole; Wolfe, Susan; Weight, Stephanie; Cavanagh, Mary; Zehring, Julie
2011-01-01
With funding from NSF, the Prime the Pipeline Project (P[cube]) is responding to the need to strengthen the science, technology, engineering, and mathematics (STEM) pipeline from high school to college by developing and evaluating the scientific village strategy and the culture it creates. The scientific village, a community of high school…
Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai
2017-11-23
The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines the available programs and the ad hoc scripts based on an original algorithm of the search for the conserved targets in the deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against the possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
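A minimal sketch of the kind of statistical threshold the pipeline's criteria imply (a variant count is called reliable only if sequencing error alone is unlikely to produce it) might look as follows; the binomial error model and all numbers are illustrative assumptions, not VirMut's published criteria:

```python
import math

def binom_sf(k, n, p):
    """P[X >= k] for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def detection_threshold(coverage, error_rate, alpha=0.001):
    """Smallest variant read count unlikely (< alpha) to arise from sequencing
    error alone at a site with the given coverage."""
    for k in range(coverage + 1):
        if binom_sf(k, coverage, error_rate) < alpha:
            return k
    return coverage + 1

k = detection_threshold(1000, 0.001)
```

At 1000x coverage and a 0.1% per-base error rate, at least 6 reads must carry a variant before it can be distinguished from sequencing noise at the 0.001 significance level; counts below the threshold are treated as sequencing error.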
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing the corrosion rate on the external side of underground pipelines is determined. Principles of constructing a corrosion monitoring system are described, and the system's performance algorithm and program are elaborated. A comparative analysis of methods for calculating the corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce the amount of calculation while considering a wider range of corrosion factors.
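A toy example of the fuzzy-logic approach (not the authors' actual rule base; the membership functions, rules, and rate values are invented for illustration) could look like this:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def corrosion_rate(soil_resistivity_ohm_m, moisture_pct):
    """Tiny Mamdani-style sketch: two inputs, three rules, weighted-average
    defuzzification over rate singletons (mm/yr)."""
    low_res = tri(soil_resistivity_ohm_m, 0, 0, 50)   # low resistivity -> aggressive soil
    wet = tri(moisture_pct, 20, 60, 100)
    # rule strengths (min as fuzzy AND) mapped to output singletons
    rules = [(min(low_res, wet), 0.5),                # aggressive and wet -> high rate
             (min(1 - low_res, wet), 0.2),            # benign but wet -> medium rate
             (1 - wet, 0.05)]                         # dry soil -> low rate
    num = sum(w * r for w, r in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

The appeal for a monitoring system is that each environmental factor enters through a cheap membership evaluation, so a wider set of corrosion factors can be considered without a full electrochemical model.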
Pipeline oil fire detection with MODIS active fire products
NASA Astrophysics Data System (ADS)
Ogungbuyi, M. G.; Martinez, P.; Eckardt, F. D.
2017-12-01
We investigate 85 129 MODIS satellite active fire events from 2007 to 2015 in the Niger Delta of Nigeria. The region is the base of the Nigerian oil economy and the hub of oil exploration, where oil facilities (i.e. flowlines, flow stations, trunklines, oil wells and oil fields) are domiciled, and from where crude oil and refined products are transported to different Nigerian locations through a network of pipeline systems. Pipelines and other oil facilities are consistently susceptible to oil leaks due to operational or maintenance error, and to acts of deliberate sabotage of pipeline equipment, which often result in explosions and fire outbreaks. We used ground oil-spill reports obtained from the National Oil Spill Detection and Response Agency (NOSDRA) database (see www.oilspillmonitor.ng) to validate the MODIS satellite data. The NOSDRA database shows an estimated 10 000 spill events from 2007 to 2015. These were filtered to include the largest spills by volume and events occurring only in the Niger Delta (i.e. 386 spills). By projecting both the MODIS fires and the spills as `input vector' layers with `Points' geometry, and the Nigerian pipeline networks as `from vector' layers with `LineString' geometry in a geographical information system, we extracted the MODIS events nearest to the pipelines (i.e. 2192 events within a 1000 m distance) by spatial vector analysis. The extraction distance is based on the global Right of Way (ROW) practice in pipeline management, which earmarks a 30 m strip of land along the pipeline. KML files of the extracted fires overlaid on a Google map confirmed that they originated from oil facilities, and land cover mapping confirmed the fire anomalies. The aim of the study is to propose near-real-time monitoring of spill events along pipeline routes using the 250 m spatial resolution of the MODIS active fire detection sensor when such spills are accompanied by fire events in the study location.
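The nearest-distance extraction step can be sketched in plain Python. This is a hedged illustration in planar metre coordinates; the actual analysis used GIS vector layers, and the coordinates below are invented:

```python
import math

def dist_point_segment(p, a, b):
    """Euclidean distance from point p to segment a-b (planar coordinates, metres)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # parameter of the closest point on the segment, clamped to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def fires_near_pipeline(fires, pipeline, buffer_m=1000.0):
    """Keep fire points lying within buffer_m of any pipeline segment."""
    segs = list(zip(pipeline, pipeline[1:]))
    return [f for f in fires
            if min(dist_point_segment(f, a, b) for a, b in segs) <= buffer_m]

pipeline = [(0.0, 0.0), (5000.0, 0.0)]                   # one straight 5 km segment
fires = [(2500.0, 800.0), (2500.0, 3000.0), (6500.0, 0.0)]
near = fires_near_pipeline(fires, pipeline)
# only (2500.0, 800.0) falls within the 1000 m corridor
```

A production GIS performs the same point-to-polyline distance test, projected into a metric coordinate system, over every fire detection and every pipeline segment.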
Assessing fugitive emissions of CH4 from high-pressure gas pipelines
NASA Astrophysics Data System (ADS)
Worrall, Fred; Boothroyd, Ian; Davies, Richard
2017-04-01
The impact of unconventional natural gas production using hydraulic fracturing methods in shale gas basins has been assessed using life-cycle emissions inventories covering areas such as pre-production, production and transmission processes. The transmission of natural gas from well pad to processing plants, and its transport to domestic sites, is an important source of fugitive CH4, yet emission factors and fluxes for transmission processes are often based upon very outdated measurements. Accurate measurement of natural gas losses during compression and transport between production and processing facilities is therefore needed to determine life-cycle CH4 emissions accurately. This study considers CH4 emissions from the UK National Transmission System (NTS) of high-pressure natural gas pipelines. Mobile surveys of CH4 emissions using a Picarro Surveyor cavity ring-down spectrometer were conducted across four areas in the UK, with routes bisecting high-pressure pipelines and separate control routes away from the pipelines. A manual survey of soil gas measurements was also conducted along one of the high-pressure pipelines using a tunable diode laser. In total, 92 km of high-pressure pipeline route and 72 km of control route were driven over a 10-day period. When adjusted for wind and distance, CH4 fluxes were significantly greater on routes with a pipeline than on those without. The smallest detectable leak was 3% above ambient (1.03 relative concentration); any signal below this was treated as ambient. The number of leaks detected along the pipelines correlates with the number of pipe joints estimated from pipeline length, implying constant fugitive CH4 emissions from these joints. Scaling up to the National Transmission System's total length of 7600 km gives a fugitive CH4 flux of 4700 ± 2864 kt CH4/yr; this fugitive emission from high-pressure pipelines is 0.016% of the annual gas supply.
Real-time inspection by submarine images
NASA Astrophysics Data System (ADS)
Tascini, Guido; Zingaretti, Primo; Conte, Giuseppe
1996-10-01
A real-time application of computer vision concerning the tracking and inspection of a submarine pipeline is described. The objective is to develop automatic procedures for supporting human operators in the real-time analysis of images acquired by cameras mounted on underwater remotely operated vehicles (ROVs). Implementation of such procedures gives rise to a human-machine system for underwater pipeline inspection that can automatically detect and signal the presence of the pipe, of its structural or accessory elements, and of dangerous or alien objects in its neighborhood. The possibility of modifying the image acquisition rate in simulations performed on video-recorded images is used to prove that the system performs all necessary processing with acceptable robustness while working in real time at speeds up to about 2.5 kn, well above what actual ROVs and safety requirements allow.
Salata, Robert A; Geraci, Mark W; Rockey, Don C; Blanchard, Melvin; Brown, Nancy J; Cardinal, Lucien J; Garcia, Maria; Madaio, Michael P; Marsh, James D; Todd, Robert F
2017-10-03
The U.S. physician-scientist (PS) workforce is invaluable to the nation's biomedical research effort. It is through biomedical research that certain diseases have been eliminated, cures for others have been discovered, and medical procedures and therapies that save lives have been developed. Yet the U.S. PS workforce has both declined and aged over the last several years. The resulting decreased inflow to and outflow from the PS pipeline renders the system vulnerable to collapsing suddenly as the senior workforce retires. In November 2015, the Alliance for Academic Internal Medicine hosted a consensus conference on the PS workforce to address issues impacting academic medical schools, with input from early-career PSs based on their individual experiences and concerns. One of the goals of the conference was to identify current impediments to attracting and supporting PSs and to develop a new set of recommendations for sustaining the PS workforce in 2016 and beyond. This Perspective reports on the opportunities and factors identified at the conference and presents five recommendations designed to increase entry into the PS pipeline and nine recommendations designed to decrease attrition from the PS workforce.
Structural health monitoring of pipelines rehabilitated with lining technology
NASA Astrophysics Data System (ADS)
Farhidzadeh, Alireza; Dehghan-Niri, Ehsan; Salamone, Salvatore
2014-03-01
Damage detection in pipeline systems is a tedious and time-consuming job owing to digging requirements, limited accessibility, interference with other facilities, and the extremely widespread extent of these systems in metropolitan areas. A real-time, automated monitoring system can therefore substantially reduce labor, time, and expenditure. This paper presents the results of an experimental study aimed at monitoring the performance of full-scale pipe lining systems, subjected to static and dynamic (seismic) loading, using the Acoustic Emission (AE) technique and Guided Ultrasonic Waves (GUWs). In particular, two damage mechanisms are investigated: 1) delamination between pipeline and liner, as an early indicator of damage, and 2) onset of nonlinearity and incipient failure of the liner, as a critical damage state.
An acceleration system for Laplacian image fusion based on SoC
NASA Astrophysics Data System (ADS)
Gao, Liwen; Zhao, Hongtu; Qu, Xiujie; Wei, Tianbo; Du, Peng
2018-04-01
Based on an analysis of the Laplacian image fusion algorithm, this paper proposes a partially pipelined, modular processing architecture, and an SoC-based acceleration system is implemented accordingly. Full pipelining is used in the design of each module, and modules in series form the partial pipeline with a unified data format, which is easy to manage and reuse. Integrated with an ARM processor, DMA, and an embedded bare-metal program, the system achieves a 4-level Laplacian pyramid on a Zynq-7000 board. Experiments show that, with small resource consumption, a pair of 256×256 images can be fused within 1 ms while maintaining a good fusion effect.
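For readers unfamiliar with the underlying algorithm, here is a minimal 1-D software sketch of Laplacian pyramid fusion. It illustrates the principle only; the paper's hardware operates on 2-D images with its own filter kernels, and the blur/expand filters below are simplified assumptions:

```python
def downsample(x):
    """Blur with a 3-tap [1, 2, 1]/4 kernel, then decimate by 2."""
    pad = [x[0]] + x + [x[-1]]
    blurred = [(pad[i - 1] + 2 * pad[i] + pad[i + 1]) / 4 for i in range(1, len(x) + 1)]
    return blurred[::2]

def upsample(x, n):
    """Nearest-neighbour expansion back to length n."""
    return [x[min(i // 2, len(x) - 1)] for i in range(n)]

def laplacian_pyramid(x, levels):
    """Detail bands (signal minus its blurred version) plus a low-pass residual."""
    pyr = []
    for _ in range(levels):
        low = downsample(x)
        up = upsample(low, len(x))
        pyr.append([a - b for a, b in zip(x, up)])   # detail band
        x = low
    pyr.append(x)                                     # residual low-pass
    return pyr

def fuse(pyr_a, pyr_b):
    """Pick the larger-magnitude detail coefficient; average the residuals."""
    fused = [[a if abs(a) >= abs(b) else b for a, b in zip(la, lb)]
             for la, lb in zip(pyr_a[:-1], pyr_b[:-1])]
    fused.append([(a + b) / 2 for a, b in zip(pyr_a[-1], pyr_b[-1])])
    return fused

def reconstruct(pyr):
    """Invert the pyramid: add each detail band back onto the expanded residual."""
    x = pyr[-1]
    for detail in reversed(pyr[:-1]):
        x = [d + u for d, u in zip(detail, upsample(x, len(detail)))]
    return x

signal = [1.0, 4.0, 2.0, 8.0, 5.0, 3.0, 7.0, 6.0]
pyr = laplacian_pyramid(signal, 2)
recon = reconstruct(fuse(pyr, pyr))   # fusing a signal with itself recovers it
```

Each stage (downsample, subtract, fuse, reconstruct) is a fixed dataflow step, which is what makes the algorithm amenable to the per-module full pipelining the paper describes.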
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
NASA Technical Reports Server (NTRS)
Brownston, Lee; Jenkins, Jon M.
2015-01-01
The Kepler Mission was launched in 2009 as NASA's first mission capable of finding Earth-size planets in the habitable zone of Sun-like stars. Its telescope consists of a 1.5-m primary mirror and a 0.95-m aperture. The 42 charge-coupled devices in its focal plane are read out every half hour, compressed, and then downlinked monthly. After four years, the second of four reaction wheels failed, ending the original mission. Back on Earth, the Science Operations Center developed the Science Pipeline to analyze about 200,000 target stars in Kepler's field of view, looking for evidence of periodic dimming suggesting that one or more planets had crossed the face of its host star. The Pipeline comprises several steps, from pixel-level calibration, through noise and artifact removal, to detection of transit-like signals and the construction of a suite of diagnostic tests to guard against false positives. The Kepler Science Pipeline consists of a pipeline infrastructure written in the Java programming language, which marshals data input to and output from MATLAB applications that are executed as external processes. The pipeline modules, which underwent continuous development and refinement even after data started arriving, employ several analytic techniques, many developed for the Kepler Project. Because of the large number of targets, the large amount of data per target, and the complexity of the pipeline algorithms, the processing demands are daunting. Some pipeline modules require days to weeks to process all of their targets, even when run on NASA's 128-node Pleiades supercomputer. The software developers are still seeking ways to increase the throughput. To date, the Kepler project has discovered more than 4000 planetary candidates, of which more than 1000 have been independently confirmed or validated to be exoplanets. Funding for this mission is provided by NASA's Science Mission Directorate.
A design of endoscopic imaging system for hyper long pipeline based on wheeled pipe robot
NASA Astrophysics Data System (ADS)
Zheng, Dongtian; Tan, Haishu; Zhou, Fuqiang
2017-03-01
An endoscopic imaging system for hyper-long pipelines is designed to acquire images of the inner surface in advance of defect measurement in such pipelines. The system consists of structured-light sensors, a pipe robot, and a control system. The pipe robot has a wheeled structure, with the sensor mounted at the front of the vehicle body; the control system, in the form of upper and lower computers, is at the tail of the vehicle body. The sensor can be translated and scanned in three steps (walking, lifting, and scanning), so that inner-surface images can be acquired at multiple positions and from different angles. Imaging experiments show that the system's transmission distance is longer, its acquisition angles are more diverse, and its results are more comprehensive than those of traditional imaging systems, which lays an important foundation for subsequent inner-surface vision measurement.
Status of Natural Gas Pipeline System Capacity Entering the 2000-2001 Heating Season
2000-01-01
This special report looks at the capabilities of the national natural gas pipeline network in 2000 and provides an assessment of the current levels of available capacity to transport supplies from production areas to markets throughout the United States during the upcoming heating season. It also examines how completion of currently planned expansion projects and proposed new pipelines would affect the network.
Code of Federal Regulations, 2010 CFR
2010-10-01
... PIPELINE Design Requirements § 195.116 Valves. Each valve installed in a pipeline system must comply with the following: (a) The valve must be of a sound engineering design. (b) Materials subject to the...
Detail of new rain shed (Building No. 241). Note pipeline ...
Detail of new rain shed (Building No. 241). Note pipeline connection from collection trough. - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI
Optimum Design and Development of High Strength and Toughness Welding Wire for Pipeline Steel
NASA Astrophysics Data System (ADS)
Chen, Cuixin; Xue, Haitao; Yin, Fuxing; Peng, Huifen; Zhi, Lei; Wang, Sixu
Pipeline steel with higher strength (>800 MPa) has gradually come into use in recent years, so achieving a good match between base metal and weld deposit is very important for its practical application. Based on an alloy system of 0.02-0.04% C, 2.0% Mn, and 0.5% Si, four different welding wires were designed and produced, and the effects of the alloying elements on phase transformation and mechanical properties were analyzed. Experimental results show that the designed steels, with additions of 2-4% Ni+Cr+Mo and <0.2% Nb+V+Ti, have high strength (>800 MPa) and good elongation (>15%). The microstructure of the deposited metal is mainly granular bainite, with M-A constituents of mean size 0.2-0.7 μm dispersed on the ferritic matrix. The deposited metals combine strength (>800 MPa) and impact toughness (>130 J) in a match that meets the requirements of pipeline welding well.
Neural-Fuzzy model Based Steel Pipeline Multiple Cracks Classification
NASA Astrophysics Data System (ADS)
Elwalwal, Hatem Mostafa; Mahzan, Shahruddin Bin Hj.; Abdalla, Ahmed N.
2017-10-01
While pipes are cheaper than other means of transportation, this cost saving comes at a price: pipes are subject to cracks, corrosion, and similar damage, which in turn can cause leakage and environmental harm. In this paper, a neural-fuzzy model for multiple-crack classification based on guided Lamb waves is presented. Simulation results for 42 samples were collected using ANSYS software. The research combines numerical simulation and experimental study, aiming at an effective way to detect and localize crack and hole defects in the main body of a pipeline and, considering the forms of multiple cracks and holes that may exist, to determine their respective positions in the steel pipe. The technique is a guided-Lamb-wave-based structural health monitoring method in which piezoelectric transducers serve as exciting and receiving sensors in a pitch-catch arrangement. A simple learning mechanism was developed specifically for the ANN that represents the fuzzy system.
Sinha, S K; Karray, F
2002-01-01
Pipeline surface defects such as holes and cracks cause major problems for utility managers, particularly when the pipeline is buried under the ground. Manual inspection for surface defects in the pipeline has a number of drawbacks, including subjectivity, varying standards, and high costs. An automatic inspection system using image processing and artificial intelligence techniques can overcome many of these disadvantages and offer utility managers an opportunity to significantly improve quality and reduce costs. Recognition and classification of pipe cracks using image analysis and a neuro-fuzzy algorithm is proposed. In the preprocessing step, the scanned images of the pipe are analyzed and crack features are extracted. In the classification step, a neuro-fuzzy algorithm is developed that employs a fuzzy membership function and the error backpropagation algorithm. The idea behind the proposed approach is that the fuzzy membership function absorbs variation in feature values, while the backpropagation network, with its learning ability, provides good classification efficiency.
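The role of the fuzzy membership function, absorbing small variations in a crack feature before classification, can be illustrated with a Gaussian membership sketch. The class centers and sigma below are hypothetical, not values from the paper:

```python
import numpy as np

def gaussian_membership(x, centers, sigma):
    """Degree of membership of feature value x in each linguistic class.
    Nearby feature values map to nearly identical membership vectors,
    which is what makes the downstream classifier tolerant of noise."""
    return np.exp(-0.5 * ((x - centers) / sigma) ** 2)

# Hypothetical crack-width classes: "hairline", "moderate", "severe"
centers = np.array([1.0, 3.0, 5.0])
mu_a = gaussian_membership(2.9, centers, sigma=1.0)  # two noisy measurements
mu_b = gaussian_membership(3.1, centers, sigma=1.0)  # of the same crack
```

Both measurements land firmly in the "moderate" class with almost the same degree, so the variation between them is absorbed before the backpropagation network ever sees it.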
Leak detection by mass balance effective for Norman Wells line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liou, J.C.P.
Mass-balance calculations for leak detection have been shown to be as effective as a leading software system in a comparison based on a major Canadian crude-oil pipeline. The calculations and NovaCorp's Leakstop software each detected leaks of approximately 4% or greater on Interprovincial Pipe Line (IPL) Inc.'s Norman Wells pipeline; insufficient data exist to assess the performance of the two methods for smaller leaks. Pipeline leak detection using such software-based systems is common. Their effectiveness is measured by how small a leak can be detected and how quickly; the algorithms used and measurement uncertainties determine leak detectability.
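The mass-balance principle above reduces to comparing integrated inlet and outlet flow against a detection threshold. A minimal sketch, with made-up flow samples (real systems also correct for line-pack, i.e. pressure- and temperature-driven changes in pipeline inventory):

```python
def mass_balance_alarm(inflow, outflow, threshold_frac=0.04):
    """Flag a leak when the fractional inlet/outlet mass imbalance over a
    window of equally spaced mass-flow samples (kg/s) exceeds the
    threshold.  The 4% default mirrors the detectability limit reported
    in the Norman Wells comparison."""
    m_in, m_out = sum(inflow), sum(outflow)
    imbalance = (m_in - m_out) / m_in
    return imbalance > threshold_frac, imbalance

# 5% of a 100 kg/s stream lost over a 60-sample window: trips the alarm
leak, frac = mass_balance_alarm([100.0] * 60, [95.0] * 60)
```

Shrinking the detectable imbalance below the measurement uncertainty of the flow meters is exactly why leaks smaller than a few percent are hard for this class of method.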
Fiber glass reinforcement wrap gets DOT nod for gas-line use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-12-13
Panhandle Eastern Corp.'s Texas Eastern Transmission Corp. has become the first US natural-gas pipeline company to install, under federal waiver, a fiber glass reinforcement on an in-service gas pipeline. The Clock Spring repair system was installed in August on six segments of Texas Eastern's 20-in. gas pipeline in Fayette County, Ohio, after the company had received a US Department of Transportation (DOT) waiver to use the system in place of conventional DOT-mandated repair methods. The paper describes the conventional methods, as well as comparing costs of both methods.
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2013 CFR
2013-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2012 CFR
2012-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2014 CFR
2014-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
49 CFR 198.39 - Qualifications for operation of one-call notification system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Qualifications for operation of one-call...) PIPELINE SAFETY REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS Adoption of One-Call Damage Prevention Program § 198.39 Qualifications for operation of one-call notification system. A one-call...
THE DEVELOPMENT AND THE STRATEGY OF THE OIL AND GAS PIPELINES OF RUSSIA
NASA Astrophysics Data System (ADS)
Motomura, Masumi
The Russian oil and gas industry earns more than half of Russian tax revenue and foreign currency, and has played the role of the backbone of the state economy through the eras of the Soviet Union and the Russian Federation. As the distance to the European market lengthened from the oil-producing regions, starting from Baku in the era of Imperial Russia to the Second Baku (Volga-Ural) and the Third Baku (West Siberia) in turn, the role of the oil pipeline system as transportation infrastructure became more and more important, and the deployment of pipelines became one of the indispensable pillars of oil strategy. Now the oil pipeline network is to reach the Pacific Ocean, which will enable Northeast Asia to be added as a destination for Russian oil, expanding Russia's influence in these regions. On the other hand, gas exports from the Soviet Union to Eastern Europe started in 1967 with the construction of a trunk pipeline from Ukraine, which was extended to West Germany in 1973, overcoming the confrontation between East and West and becoming a regional stabilizer. The United States considered this pipeline an energy weapon and criticized the deal, saying that when Soviet gas flows to Western Europe, its political influence must flow like the gas itself. However, the Soviet Union collapsed in 1991 while gas transportation continued without any disruption, evidence that the gas pipeline from the Soviet Union served a purely business purpose and was not politicized. Recently, Russia has been aiming to export gas to Northeast Asia, which is expected to be a new stabilizer in this region, although various difficulties (especially concerning the method of determining the gas price) still need to be resolved.
Social cost impact assessment of pipeline infrastructure projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, John C., E-mail: matthewsj@battelle.org; Allouche, Erez N., E-mail: allouche@latech.edu; Sterling, Raymond L., E-mail: sterling@latech.edu
A key advantage of trenchless construction methods compared with traditional open-cut methods is their ability to install or rehabilitate underground utility systems with limited disruption to the surrounding built and natural environments. The equivalent monetary values of these disruptions are commonly called social costs. Social costs are often ignored by engineers or project managers during project planning and design phases, partially because they cannot be calculated using standard estimating methods. In recent years some approaches for estimating social costs have been presented; nevertheless, the cost data needed for validation of these estimating methods is lacking. Such social cost databases can be developed by compiling relevant information reported in various case histories. This paper identifies the eight most important social cost categories, presents mathematical methods for calculating them, and summarizes the social cost impacts for two pipeline construction projects. The case histories are analyzed to identify trends for the various social cost categories, and the effectiveness of the methods used to estimate these values is discussed. These findings are valuable for pipeline infrastructure engineers making renewal technology selection decisions by providing a more accurate process for the assessment of social costs and impacts. - Highlights: • Identified the eight most important social cost factors for pipeline construction • Presented mathematical methods for calculating those social cost factors • Summarized social cost impacts for two pipeline construction projects • Analyzed those projects to identify trends for the social cost factors.
Channel erosion surveys along TAPS route, Alaska, 1974
Childers, Joseph; Jones, Stanley H.
1975-01-01
Repeated site surveys and aerial photographs at 26 stream crossings along the trans-Alaska pipeline system (TAPS) route during the period 1969-74 provide chronologic records of channel changes that predate pipeline-related construction at the sites. The 1974 surveys and photographs show some of the channel changes wrought by construction of the haul road from the Yukon River to Prudhoe Bay and by construction of camps and working pads all along the pipeline route. No pipeline crossings were constructed before 1975. These records of channel changes, together with flood and icing measurements, are part of the United States Department of the Interior's continuing surveillance program to document the hydrologic aspects of the trans-Alaska pipeline and its environmental impacts.
Consent Decree Magellan Pipeline Company, L.P.
This is the consent decree for the settlement under which Magellan has agreed to complete approximately $16 million of injunctive relief across its 11,000-mile pipeline system and pay a $2 million civil penalty.
Rooftop view of old rain shed (Building No. 43), pipeline ...
Rooftop view of old rain shed (Building No. 43), pipeline on trestle, and water tanks. - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI
Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case
NASA Astrophysics Data System (ADS)
Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.
2013-10-01
Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling, and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have robust logic so as to perform all operations in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to the overriding of different sources, formats, management fields, and storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files) but can be easily updated by a planned Archiving System Configuration Interface (ASCI).
NASA Astrophysics Data System (ADS)
Leporini, M.; Terenzi, A.; Marchetti, B.; Giacchetta, G.; Polonara, F.; Corvaro, F.; Cocci Grifoni, R.
2017-11-01
Pipelining Liquefied Petroleum Gas (LPG) is a mode of LPG transportation that is more environmentally friendly than others because of its lower energy consumption and exhaust emissions. Worldwide, there are over 20,000 kilometers of LPG pipelines, and a number of industry codes govern the design, fabrication, construction, and operation of liquid LPG pipelines. However, no standards exist for modelling certain critical phenomena that can occur on these lines due to external environmental conditions, such as pressurization by solar radiation. Solar radiation can expose above-ground pipeline sections to pressures above the maximum design pressure, with resulting risks and problems. The present work introduces an innovative practice suitable for the oil and gas industry for modelling the pressurization induced by solar radiation on above-ground LPG pipeline sections, with application to a real case.
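The driver behind this pressurization can be illustrated with the standard blocked-in-liquid estimate: for a liquid-full, isolated pipe section, the pressure rise per kelvin is roughly the ratio of volumetric thermal expansion to isothermal compressibility. The sketch below is this textbook estimate, not the paper's model, and the LPG-like property values are assumed round numbers:

```python
def blocked_in_pressure_rise(delta_t_k, beta=3.0e-3, kappa=4.0e-9):
    """Pressure rise (Pa) of a liquid-full, blocked-in pipe section
    heated by delta_t_k kelvin: dP ~ (beta / kappa) * dT, ignoring
    pipe-wall dilation.  beta (1/K, volumetric expansion) and kappa
    (1/Pa, compressibility) are rough LPG-like values, not data from
    the paper."""
    return beta / kappa * delta_t_k

dp = blocked_in_pressure_rise(10.0)   # 10 K of solar heating, in Pa
```

Even with wall dilation and vapor cushions neglected, a few kelvin of solar heating yields tens of bar, which is why above-ground sections can exceed the design pressure.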
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Plan for the American Burying Beetle for Pipelines and Well Field Development in Oklahoma and Texas..., operation, and repair of oil and gas pipelines, and related well field activities. Individual oil and gas... pipelines and related well field activities, and will include measures necessary to minimize and mitigate...
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Shipkov, A. A.; Lovchev, V. N.; Gutsev, D. F.
2016-10-01
Problems of metal flow-accelerated corrosion (FAC) in the pipelines and equipment of the condensate-feeding and wet-steam paths of NPP power-generating units (PGUs) are examined. The goals, objectives, and main principles of the methodology for implementing an integrated program of AO Concern Rosenergoatom for preventing unacceptable FAC thinning and for increasing the operational flow-accelerated corrosion resistance of NPP equipment and pipelines (hereinafter, the Program) are formulated. The role and potential of Russian software packages in evaluating and predicting FAC rates are shown for the practical problem of timely detection of unacceptable FAC thinning in elements of pipelines and equipment of the secondary circuit of NPP PGUs. Information is given concerning the structure, properties, and functions of the software systems for supporting plant personnel in monitoring and planning the in-service inspection of FAC-thinned elements of secondary-circuit pipelines and equipment, which have been created and implemented at Russian NPPs equipped with VVER-1000, VVER-440, and BN-600 reactors. One of the most important practical results of these personnel-support packages is the identification of elements at risk of intense local FAC thinning, and examples are given of their successful use at Russian NPPs in the early detection of secondary-circuit pipeline elements with FAC thinning close to an unacceptable level. Intermediate results of the work on the Program are presented, new tasks set in 2012 as part of the updated Program are described, and the prospects of the developed methods and tools at the design and construction stages of NPP PGUs are discussed. The main directions of the work on solving the problems of flow-accelerated corrosion of pipelines and equipment in Russian NPP PGUs are defined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
New fields are being added even as recent finds are brought on line using floating production systems and gas pipelines, and intensive workover/redrilling continues in older onshore provinces. The paper discusses exploration, development, drilling, and production in China, Indonesia, India, Malaysia, Thailand, Viet Nam, Pakistan, Myanmar, Brunei, and the Philippines; Cambodia, Bangladesh, Japan, Mongolia, and Taiwan are briefly mentioned.
Leadership Characteristics of Workforce Development Administrators in Community Colleges
ERIC Educational Resources Information Center
Lebesch, Anna Marie
2011-01-01
The community college environment is a complex and ever-changing system that requires effective leadership. The leadership characteristics in community colleges have been investigated substantially with studies primarily focused on the presidency and the pathway of the traditional academic pipeline. But as community colleges have struggled to do…
OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping
2017-02-01
The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
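The scatter-gather pattern that frameworks like OpenCluster provide, farming independent per-observation tasks out to workers and collecting the results, can be sketched with the standard library. The task body and names here are ours, standing in for a real pipeline step; a local thread pool replaces the framework's remote worker nodes purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame_id):
    """Stand-in for one per-observation pipeline task (calibration,
    source finding, ...).  In a distributed framework this body would
    execute on a remote worker; only the scatter-gather shape is the
    point here."""
    return frame_id, frame_id ** 2   # (input id, pretend data product)

# Scatter 8 independent frames over 4 workers, gather results by id
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(reduce_frame, range(8)))
```

Because each frame is independent, throughput scales with the number of workers, which is the property that makes this class of pipeline embarrassingly parallel.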
Automatic Measuring System for Oil Stream Paraffin Deposits Parameters
NASA Astrophysics Data System (ADS)
Kopteva, A. V.; Koptev, V. Yu
2018-03-01
This paper describes a new method for monitoring oil pipelines and a highly efficient, automated method for monitoring paraffin deposits. When oil pipelines are operated, paraffin, resin, and salts carried with the oil stream are deposited on the pipeline walls. This ultimately results in frequent suspensions of transportation to clean or even replace pipes and other equipment, shortening the periods between repairs, creating emergency situations, and increasing production expenses; it also harms the environment, damaging ecosystems, spoiling ground water, and killing animals and birds, while oil spills contaminate rivers, lakes, and ground waters. Oil transportation monitoring is still a subject of study, and there is a need for a radically new automated process control and management system together with intelligent measurement means. The measurement principle is based on the Lambert-Beer law, which relates the transmitted gamma-radiation intensity to the density and linear attenuation coefficient of a substance. Using the measuring system, with a relative accuracy of ±0.2%, the thickness of paraffin deposits can be measured with an absolute accuracy of ±5 mm, which is sufficient to ensure reliable operation of the pipeline system. Safety is a key advantage of the proposed control system.
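The Lambert-Beer relation the method rests on, I = I0·exp(−μx), can be inverted directly for the deposit thickness. A minimal sketch; the attenuation coefficient below is an illustrative placeholder, not a value from the paper:

```python
import math

def deposit_thickness(i0, i, mu):
    """Thickness x (m) of an absorbing layer from the Lambert-Beer law
    I = I0 * exp(-mu * x)  =>  x = ln(I0 / I) / mu,
    where mu is the linear attenuation coefficient (1/m) of the deposit."""
    return math.log(i0 / i) / mu

mu_paraffin = 12.0   # 1/m at the source energy -- hypothetical value
# Detector counts for a clean pipe vs. one with a 5 mm paraffin layer
i_clean = 1000.0
i_coated = i_clean * math.exp(-mu_paraffin * 0.005)
x = deposit_thickness(i_clean, i_coated, mu_paraffin)   # recovers 0.005 m
```

In practice the pipe wall and the oil itself also attenuate, so a real system calibrates against the clean-pipe baseline rather than the bare source intensity.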
Parallel processing in a host plus multiple array processor system for radar
NASA Technical Reports Server (NTRS)
Barkan, B. Z.
1983-01-01
Host plus multiple array processor architecture is demonstrated to yield a modular, fast, and cost-effective system for radar processing. Software methodology for programming such a system is developed. Parallel processing with pipelined data flow among the host, array processors, and discs is implemented. Theoretical analysis of performance is made and experimentally verified. The broad class of problems to which the architecture and methodology can be applied is indicated.
Study of stress-strain state of pipeline under permafrost conditions
NASA Astrophysics Data System (ADS)
Tarasenko, A. A.; Redutinskiy, M. N.; Chepur, P. V.; Gruchenkova, A. A.
2018-05-01
In this paper, the dependences of the stress-strain state (SSS) and subsidence of pipelines on the dimensions of the subsidence zone are obtained for the pipe sizes most widespread in the construction of main oil pipelines (530x10, 820x12, 1020x12, 1020x14, 1020x16, 1220x14, 1220x16, and 1220x18 mm). True stress values in the pipeline wall, as well as the exact location of maximum stress, are determined for subsidence zones from 5 to 60 meters. For this purpose, the authors developed a finite-element model of the pipeline that takes into account the actual interaction of the pipeline with the subgrade and allows the SSS of the structure to be calculated for a variable subsidence zone. Based on the obtained dependences, it is proposed that, for underground oil pipelines in permafrost areas, the zone of possible subsidence be artificially limited by separation supports resting on soil with better building properties and physical-mechanical parameters. This technical solution would significantly reduce costs when constructing new oil pipelines in permafrost areas.
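A rough feel for why stress grows with the subsidence-zone length comes from the textbook built-in-beam bound: a pipe bridging a void of length L under its own distributed weight sees a maximum moment of wL²/12. This is only an order-of-magnitude check, not the paper's finite-element model, and the 1000 kg/m line density (steel plus contents) is an assumed round number:

```python
import math

def span_bending_stress(d_outer, wall, span, line_density=1000.0):
    """Max bending stress (Pa) of a pipe bridging a subsidence void of
    length `span` (m), modelled as a built-in beam under distributed
    self-weight w: M_max = w L^2 / 12, sigma = M_max / Z."""
    w = line_density * 9.81                                    # N/m
    i_sec = math.pi / 64 * (d_outer**4 - (d_outer - 2 * wall)**4)
    z_sec = i_sec / (d_outer / 2)                              # section modulus
    return w * span**2 / 12 / z_sec

# 1020x14 mm pipe over a 40 m subsidence zone: stress on the order of 100 MPa
sigma = span_bending_stress(1.020, 0.014, span=40.0)
```

The quadratic growth in span is what makes limiting the subsidence zone with separation supports, as the paper proposes, so effective.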
Urban Underground Pipelines Mapping Using Ground Penetrating Radar
NASA Astrophysics Data System (ADS)
Jaw, S. W.; Hashim, M.
2014-02-01
Underground space is now receiving attention for exploitation for transportation, utilities, and public use, and the underground has become a spider's web of utility networks. Mapping underground utility pipelines has therefore become a challenging and difficult task; in practice it is often a "hit-and-miss" affair, resulting in many catastrophic damages, particularly in urban areas. This study was conducted to extract locational information on urban underground utility pipelines using a trenchless measuring tool, namely ground penetrating radar (GPR). The focus was the mapping of underground utility pipelines for retrieval of their geometric properties. A series of tests was first conducted at the preferred test site and in a real-life experiment, followed by modeling of a field-based model using the Finite-Difference Time-Domain (FDTD) method. The results provide the locational information of underground utility pipelines together with its mapping accuracy. This locational information benefits civil infrastructure management and maintenance, which in the long term is time-saving and critically important for the development of metropolitan areas.
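The basic depth retrieval behind GPR pipe mapping converts a two-way echo travel time into depth using the soil's wave speed. A minimal sketch; the relative permittivity is an assumed calibration input (it is not measured by the radar itself), and the example numbers are illustrative:

```python
def gpr_depth(two_way_time_ns, rel_permittivity):
    """Depth (m) of a buried reflector from GPR two-way travel time:
    wave speed v = c / sqrt(eps_r), depth = v * t / 2."""
    c = 0.299792458                       # speed of light, m/ns
    v = c / rel_permittivity ** 0.5
    return v * two_way_time_ns / 2.0

# A 20 ns echo in moist soil (assumed eps_r ~ 9): pipe at about 1 m depth
d = gpr_depth(20.0, 9.0)
```

Misjudging the soil permittivity scales every depth by sqrt(eps_r), which is one reason unverified GPR maps remain "hit-and-miss" without ground-truthing.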
Detail of pipeline on trestle with redwood tank and old ...
Detail of pipeline on trestle with redwood tank and old rain shed (Building No. 43) on either side. - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI
Steel tanks T5 and T4 with overhead pipeline between. Redwood ...
Steel tanks T5 and T4 with overhead pipeline between. Redwood tanks seen in background - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI
78 FR 27217 - Combined Notice of Filings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-09
...: RP13-845-000. Applicants: ETC Tiger Pipeline, LLC. Description: ETC Tiger 2013--System Map Filing to be...-859-000. Applicants: ETC Tiger Pipeline, LLC. Description: ETC Tiger 2013 Semi-Annual Fuel Filing 4/30...
Research on prognostics and health management of underground pipeline
NASA Astrophysics Data System (ADS)
Zhang, Guangdi; Yang, Meng; Yang, Fan; Ni, Na
2018-04-01
With the development of cities, the construction of underground pipelines has become more and more complex; these pipelines bear on the safety and normal operation of the city and are known as "the lifeline of the city". This paper first introduces the principles of PHM (Prognostics and Health Management) technology, then proposes a fault diagnosis, prognostics, and health management approach for underground pipelines: faults appearing during operation are diagnosed and predicted, and a health assessment of the whole underground pipe network is made in order to ensure safe operation of the pipeline. Finally, future research directions are summarized.
The Chandra Source Catalog: Processing and Infrastructure
NASA Astrophysics Data System (ADS)
Evans, Janet; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Hall, Diane M.; Miller, Joseph B.; Plummer, David A.; Zografou, Panagoula; Primini, Francis A.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.
2009-09-01
Chandra Source Catalog processing recalibrates each observation using the latest available calibration data, and employs a wavelet-based source detection algorithm to identify all the X-ray sources in the field of view. Source properties are then extracted from each detected source that is a candidate for inclusion in the catalog. Catalog processing is completed by matching sources across multiple observations, merging common detections, and applying quality assurance checks. The Chandra Source Catalog processing system shares a common processing infrastructure and utilizes much of the functionality that is built into the Standard Data Processing (SDP) pipeline system that provides calibrated Chandra data to end-users. Other key components of the catalog processing system have been assembled from the portable CIAO data analysis package. Minimal new software tool development has been required to support the science algorithms needed for catalog production. Since processing pipelines must be instantiated for each detected source, the number of pipelines that are run during catalog construction is a factor of order 100 times larger than for SDP. The increased computational load, and inherent parallel nature of the processing, is handled by distributing the workload across a multi-node Beowulf cluster. Modifications to the SDP automated processing application to support catalog processing, and extensions to Chandra Data Archive software to ingest and retrieve catalog products, complete the upgrades to the infrastructure to support catalog processing.
Leak detection in gas pipeline by acoustic and signal processing - A review
NASA Astrophysics Data System (ADS)
Adnan, N. F.; Ghazali, M. F.; Amin, M. M.; Hamat, A. M. A.
2015-12-01
The pipeline system is the most important transport medium for delivering fluid from one station to another, and weak maintenance and poor safety contribute to financial losses in terms of wasted fluid and environmental impacts. Many classifications of techniques exist, each suited to a specific method and application; this paper discusses gas leak detection in pipeline systems using the acoustic method. Wave propagation in the pipeline is a key parameter in the acoustic method: when a leak occurs, the pressure balance of the pipe is disturbed and acoustic waves are generated by friction with the pipe wall. Signal processing is used to decompose the raw signal and present it in time-frequency form. The findings on the acoustic method can serve as a basis for comparative study in the future. Acoustic signals combined with the Hilbert-Huang transform (HHT) are found to be the best approach to detecting leaks in gas pipelines; more experiments and simulations need to be carried out to obtain fast detection of leaks and estimation of their location.
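The time-frequency decomposition step can be illustrated with a crude short-time Fourier sketch on a synthetic pipe signal: a leak onset appears as a new dominant frequency partway through the record. This uses plain DFTs as a portable stand-in for the HHT the review favors, and all signal parameters are made up:

```python
import numpy as np

def stft_peak_band(signal, fs, nwin=256):
    """Dominant frequency (Hz) in each non-overlapping analysis window:
    a crude time-frequency decomposition via windowed DFTs."""
    peaks = []
    for h in range(0, len(signal) - nwin + 1, nwin):
        seg = signal[h:h + nwin] * np.hanning(nwin)
        spec = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(nwin, 1.0 / fs)
        peaks.append(freqs[np.argmax(spec[1:]) + 1])   # skip the DC bin
    return peaks

# Synthetic signal: 500 Hz flow noise; a "leak" adds a strong 2 kHz
# component halfway through the 1 s record.
fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 500 * t)
sig[fs // 2:] += 3.0 * np.sin(2 * np.pi * 2000 * t[fs // 2:])
peaks = stft_peak_band(sig, fs)
```

The window index at which the dominant frequency jumps gives the leak onset time; with sensors at both ends of a pipe segment, the arrival-time difference of that jump localizes the leak.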
Concept of an advanced hyperspectral remote sensing system for pipeline monitoring
NASA Astrophysics Data System (ADS)
Keskin, Göksu; Teutsch, Caroline D.; Lenz, Andreas; Middelmann, Wolfgang
2015-10-01
Areas occupied by oil pipelines and storage facilities are prone to severe contamination due to leaks caused by natural forces, poor maintenance or third parties. These threats have to be detected as quickly as possible in order to prevent serious environmental damage. Periodical and emergency monitoring activities need to be carried out for successful disaster management and pollution minimization. Airborne remote sensing stands out as an appropriate choice to operate either in an emergency or periodically. The Hydrocarbon Index (HI) and Hydrocarbon Detection Index (HDI) utilize the unique absorption features of hydrocarbon-based materials in the SWIR spectral region. These band-ratio-based methods require no a priori knowledge of the reference spectrum and can be calculated in real time. This work introduces a flexible airborne pipeline monitoring system based on the online quasi-operational hyperspectral remote sensing system developed at Fraunhofer IOSB, utilizing HI and HDI for oil leak detection on data acquired by an SWIR imaging sensor. The robustness of HI and HDI compared to state-of-the-art detection algorithms is evaluated in an experimental setup using a synthetic dataset, which was prepared in a systematic way to simulate linear mixtures of selected background and oil spectra consisting of gradually decreasing percentages of oil content. Real airborne measurements in Ettlingen, Germany are used to gather background data, while the crude oil spectrum was measured with a field spectrometer. The results indicate that the system can be utilized for online and offline monitoring activities.
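The band-ratio idea behind the Hydrocarbon Index can be sketched as below: the index measures how far the reflectance at the hydrocarbon absorption band near 1729 nm dips below the straight line between two shoulder bands. The band centers used here are typical published values, not necessarily those of the Fraunhofer system.

```python
# Shoulder bands A and C bracket the hydrocarbon C-H absorption band B.
LAMBDA_A, LAMBDA_B, LAMBDA_C = 1705.0, 1729.0, 1741.0  # wavelengths in nm

def hydrocarbon_index(r_a, r_b, r_c):
    # r_a, r_b, r_c: reflectances at the three bands for one pixel.
    # Interpolate the shoulder line at lambda_B and subtract the measured
    # reflectance; a positive value indicates a hydrocarbon-like dip.
    w = (LAMBDA_B - LAMBDA_A) / (LAMBDA_C - LAMBDA_A)
    return w * (r_c - r_a) + r_a - r_b

# A pixel with no absorption dip lies on the shoulder line (HI near 0);
# an oil-bearing pixel dips at 1729 nm (HI clearly positive).
clean = hydrocarbon_index(0.40, 0.41, 0.415)
oily = hydrocarbon_index(0.40, 0.33, 0.415)
```

Because the computation uses only three bands per pixel and no reference library, it maps naturally onto the real-time, onboard processing the abstract describes.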
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
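The bit-level pipeline simulation idea can be illustrated with a minimal sketch (Python here for brevity; the framework itself is C++). Every register is masked to a fixed bit width, so results emerge with the pipeline's latency exactly as firmware would produce them. The two-stage multiply-then-bias design and all names are illustrative.

```python
WIDTH = 16
MASK = (1 << WIDTH) - 1  # truncate to WIDTH bits, as hardware registers do

class Pipeline:
    # Two register stages: stage1 holds the product, stage2 the biased sum.
    def __init__(self, bias):
        self.bias = bias
        self.stage1 = 0
        self.stage2 = 0

    def clock(self, a, b):
        # One clock tick: the output register is read, then each stage
        # captures the previous stage's value, and new inputs enter.
        out = self.stage2
        self.stage2 = (self.stage1 + self.bias) & MASK
        self.stage1 = (a * b) & MASK
        return out

p = Pipeline(bias=7)
inputs = [(2, 3), (4, 5), (10, 10), (0, 0), (0, 0)]
outputs = [p.clock(a, b) for a, b in inputs]
# The first two outputs are pipeline fill; after the 2-tick latency the
# results a*b + 7 emerge in order.
```

A framework like the one described would generate the equivalent HDL (two registered stages) from the same source that produces this bit-exact software model.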
NASA Astrophysics Data System (ADS)
Ding, Wenhua; Li, Shaopo; Li, Jiading; Li, Qun; Chen, Tieqiang; Zhang, Hai
In recent years, several significant pipeline projects have been developed for the transmission of oil and gas from deep-water environments. Gas transmission pipelines for such applications demand heavy wall thickness, high strength, good low-temperature toughness, and good weldability. To overcome the difficulty of producing consistent mechanical properties in heavy-wall pipe, research was conducted by Shougang Steel Research in cooperation with the Shouqin 4.3 m heavy wide plate mill at Shougang Steel Qinhuangdao, China.
Pipeline enhances Norman Wells potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Approval of an oil pipeline from halfway down Canada's MacKenzie River Valley at Norman Wells to N. Alberta has raised the potential for development of large reserves along with controversy over native claims. The project involves 2 closely related proposals. One, by Esso Resources, the exploration and production unit of Imperial Oil, will increase oil production from the Norman Wells field from 3000 bpd currently to 25,000 bpd. The other proposal, by Interprovincial Pipeline (N.W.) Ltd., calls for construction of an underground pipeline to transport the additional production from Norman Wells to Alberta. The 560-mile, 12-in. pipeline will extend from Norman Wells, which is 90 miles south of the Arctic Circle on the north shore of the Mackenzie River, south to the end of an existing line at Zama in N. Alberta. There will be 3 pumping stations en route. This work also discusses recovery, potential, drilling limitations, the processing plant, positive impact, and further development of the Norman Wells project.
Education or Incarceration: Zero Tolerance Policies and the School to Prison Pipeline
ERIC Educational Resources Information Center
Heitzeg, Nancy A.
2009-01-01
In the past decade, there has been a growing convergence between schools and legal systems. The school to prison pipeline refers to this growing pattern of tracking students out of educational institutions, primarily via "zero tolerance" policies, and, directly and/or indirectly, into the juvenile and adult criminal justice systems. The school to…
A Critique of the STEM Pipeline: Young People's Identities in Sweden and Science Education Policy
ERIC Educational Resources Information Center
Mendick, Heather; Berge, Maria; Danielsson, Anna
2017-01-01
In this article, we develop critiques of the pipeline model which dominates Western science education policy, using discourse analysis of interviews with two Swedish young women focused on "identity work". We argue that it is important to unpack the ways that the pipeline model fails to engage with intersections of gender, ethnicity,…
Minimum separation distances for natural gas pipeline and boilers in the 300 area, Hanford Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daling, P.M.; Graham, T.M.
1997-08-01
The U.S. Department of Energy (DOE) is proposing actions to reduce energy expenditures and improve energy system reliability at the 300 Area of the Hanford Site. These actions include replacing the centralized heating system with heating units for individual buildings or groups of buildings, constructing a new natural gas distribution system to provide a fuel source for many of these units, and constructing a central control building to operate and maintain the system. The individual heating units will include steam boilers that are to be housed in individual annex buildings located at some distance away from nearby 300 Area nuclear facilities. This analysis develops the basis for siting the package boilers and natural gas distribution systems to be used to supply steam to 300 Area nuclear facilities. The effects of four potential fire and explosion scenarios involving the boiler and natural gas pipeline were quantified to determine minimum separation distances that would reduce the risks to nearby nuclear facilities. The resulting minimum separation distances are shown in Table ES.1.
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules, that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
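The harness-and-modules pattern described above can be sketched in a few lines. This is a generic illustration of the design, not OPERA's actual API: the class, method, and module names are hypothetical, and the configuration stands in for the site and instrument parameters.

```python
class Harness:
    # Runs an ordered set of processing modules, each parametrized by a
    # shared configuration (standing in for site/instrument parameters).
    def __init__(self, config):
        self.config = config
        self.modules = []

    def register(self, module):
        # Used as a decorator so users can extend the pipeline.
        self.modules.append(module)
        return module

    def run(self, data):
        for module in self.modules:
            data = module(data, self.config)
        return data

harness = Harness({"bias": 1.0, "gain": 2.0})

@harness.register
def subtract_bias(data, cfg):
    return [x - cfg["bias"] for x in data]

@harness.register
def apply_gain(data, cfg):
    return [x * cfg["gain"] for x in data]

result = harness.run([2.0, 3.0])
```

Keeping modules ignorant of each other and fully driven by configuration is what makes such a pipeline portable across instruments, as the abstract emphasizes.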
ERIC Educational Resources Information Center
Hitt, Dallas Hambrick; Tucker, Pamela D.; Young, Michelle D.
2012-01-01
The professional pipeline represents a developmental perspective for fostering leadership capacity in schools and districts, from identification of potential talent during the recruitment phase to ensuring career-long learning through professional development. An intentional and mindful approach to supporting the development of educational leaders…
Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline
NASA Astrophysics Data System (ADS)
Ruiz, M.; Mujica, L. E.; Quintero, M.; Florez, J.; Quintero, S.
2015-07-01
Safety and reliability of hydrocarbon transportation pipelines represent a critical aspect for the Oil and Gas industry. Pipeline failures caused by corrosion, external agents, among others, can develop into leaks or even rupture, which can negatively impact the population, natural environment, infrastructure, and economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose its integrity. Accordingly, over the last few years, different techniques under the concept of structural health monitoring (SHM) have continuously been in development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Component Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls. The data are recorded by sensors measuring the leakage magnetic field in segments with loss of metal, such as cracking, corrosion, among others. The data provide information on a pipeline with approximately 15 years of operation, which transports gas, has a diameter of 20 inches and a total length of 110 km (with several changes in the topography). On the other hand, PCA is a well-known technique that compresses the information and extracts the most relevant features, facilitating the detection of damage in several structures. The goal of this work is to detect and localize critical metal loss in a pipeline currently in operation.
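A PCA-based screen for MFL records can be sketched as follows. This is a generic illustration, not the authors' exact method: principal components are fitted on baseline signal windows, and windows with a large reconstruction residual (poorly explained by normal variation) are flagged as candidate metal loss. The data here are synthetic.

```python
import numpy as np

def fit_pca(X, n_comp):
    # Fit mean and leading principal components via SVD.
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_comp]

def residual(X, mean, comps):
    # Project onto the components, back-project, and measure what the
    # PCA model fails to explain for each window.
    Z = (X - mean) @ comps.T
    recon = Z @ comps + mean
    return np.linalg.norm(X - recon, axis=1)

rng = np.random.default_rng(1)
# 200 baseline windows of 16 MFL channels: a smooth trend plus noise.
baseline = rng.standard_normal((200, 16)) * 0.1 + np.linspace(0, 1, 16)
mean, comps = fit_pca(baseline, n_comp=3)

test = baseline[:5].copy()
test[2, 6:9] += 2.0  # simulated local metal-loss signature
r = residual(test, mean, comps)
flagged = int(np.argmax(r))
```

In practice the residual would be compared against a statistical control limit rather than a simple argmax, but the compression-and-residual structure is the same.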
77 FR 3762 - Magellan Pipeline Company, L.P.; Notice of Petition for Declaratory Order
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-25
... declaratory order that approves priority committed space and the overall rate structure involving the proposed partial reversal and expansion of Magellan's refined petroleum products pipeline system in Texas to move...
U. K. to resume natural gas imports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-02-17
This paper reports that the U.K. government has opened the way for resuming gas imports into Britain by approving a contract signed by U.K. electric power utility National Power to buy gas from Norway. A new joint marketing venture of BP Exploration, Den norske stats oljeselskap AS (Statoil), and Norsk Hydro AS also will be allowed to import gas for electric power plant fuel once it has a contract. National Power and the BP/Statoil/Norsk Hydro group will use the Frigg pipeline from Norwegian waters into St. Fergus, north of Aberdeen, the only existing link between the British transmission system and foreign supplies of gas. Meantime, progress is under way toward a second pipeline to link the U.K. with foreign natural gas supplies, calling for a pipeline across the English Channel joining the continental European pipeline system to the U.K. network.
NASA Astrophysics Data System (ADS)
Smits, K. M.; Mitton, M.; Moradi, A.; Chamindu, D. K.
2017-12-01
Reducing the amount of leaked natural gas (NG) from pipelines from production to use has become a high priority in efforts to cut anthropogenic emissions of methane. In addition to environmental impacts, NG leakage can cause significant economic losses and safety failures such as fires and explosions. However, tracking and evaluating NG pipeline leaks requires a better understanding of the leak from the source to the detector as well as more robust quantification methods. Although recent measurement-based approaches continue to make progress towards this end, efforts are hampered by the complexity of leakage scenarios. Subsurface transport of leaked NG from pipelines occurs through complex transport pathways due to soil heterogeneities and changes in soil moisture. Furthermore, it is affected by variable atmospheric conditions such as winds, frontal passages, and rain. To better understand fugitive emissions from NG pipelines, we developed a field-scale testbed that simulates low-pressure gas leaks from pipe buried in soil. The system is equipped with subsurface and surface sensors to continuously monitor changes in soil and atmospheric conditions (e.g. moisture, pressure, temperature) and methane concentrations. Using this testbed, we are currently conducting a series of gas leakage experiments to study the impact of subsurface conditions (e.g. soil moisture, heterogeneity) and atmospheric conditions (near-surface wind and temperature) on the detected gas signals, and to establish the relative importance of the many pathways for methane migration between the source and the sensor location. Accompanying numerical modeling of the system using the multiphase transport simulator TOUGH2-EOS7CA demonstrates the influence of leak location and direction on gas migration. These findings will better inform leak detectors of the leak severity before excavation, aiding with safety precautions and work order categorization for improved efficiency.
Pipeline transport and simultaneous saccharification of corn stover.
Kumar, Amit; Cameron, Jay B; Flynn, Peter C
2005-05-01
Pipeline transport of corn stover delivered by truck from the field is evaluated against a range of truck transport costs. Corn stover transported by pipeline at 20% solids concentration (wet basis) or higher could directly enter an ethanol fermentation plant, and hence the investment in the pipeline inlet end processing facilities displaces comparable investment in the plant. At 20% solids, pipeline transport of corn stover costs less than trucking at capacities in excess of 1.4 M dry tonnes/yr when compared to a mid-range of truck transport cost (excluding any credit for economies of scale achieved in the ethanol fermentation plant from larger scale due to multiple pipelines). Pipelining of corn stover gives the opportunity to conduct simultaneous transport and saccharification (STS). If current enzymes are used, this would require elevated temperature. Heating of the slurry for STS, which in a fermentation plant is achieved from waste heat, is a significant cost element (more than 5 cents/l of ethanol) if done at the pipeline inlet unless waste heat is available, for example from an electric power plant located adjacent to the pipeline inlet. Heat loss in a 1.26 m pipeline carrying 2 M dry tonnes/yr is about 5 degrees C at a distance of 400 km in typical prairie clay soils, and would not likely require insulation; smaller pipelines or different soil conditions might require insulation for STS. Saccharification in the pipeline would reduce the need for investment in the fermentation plant, saving about 0.2 cents/l of ethanol. Transport of corn stover in multiple pipelines offers the opportunity to develop a large ethanol fermentation plant, avoiding some of the diseconomies of scale that arise from smaller plants whose capacities are limited by issues of truck congestion.
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates, i.e., sequence, parallel, split, merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
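The template style the abstract names (sequence, parallel, split, merge) can be sketched as follows. The function names and signatures are illustrative, not the actual Tigres API.

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(data, tasks):
    # Run tasks one after another, feeding each the previous result.
    for task in tasks:
        data = task(data)
    return data

def parallel(items, task, workers=4):
    # Apply one task to many independent inputs concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, items))

def split_merge(data, splitter, task, merger):
    # Split one input into chunks, process them in parallel, then merge.
    return merger(parallel(splitter(data), task))

# Example: sum the squares of 0..7 by processing two chunks in parallel.
total = split_merge(
    list(range(8)),
    splitter=lambda xs: [xs[:4], xs[4:]],
    task=lambda chunk: sum(x * x for x in chunk),
    merger=sum,
)
```

The appeal of the template approach is that the same composition can run unchanged on a laptop thread pool or an HPC execution backend, which is the iterative desktop-to-cluster cycle the paper targets.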
NMRPipe: a multidimensional spectral processing system based on UNIX pipes.
Delaglio, F; Grzesiek, S; Vuister, G W; Zhu, G; Pfeifer, J; Bax, A
1995-11-01
The NMRPipe system is a UNIX software environment of processing, graphics, and analysis tools designed to meet current routine and research-oriented multidimensional processing requirements, and to anticipate and accommodate future demands and developments. The system is based on UNIX pipes, which allow programs running simultaneously to exchange streams of data under user control. In an NMRPipe processing scheme, a stream of spectral data flows through a pipeline of processing programs, each of which performs one component of the overall scheme, such as Fourier transformation or linear prediction. Complete multidimensional processing schemes are constructed as simple UNIX shell scripts. The processing modules themselves maintain and exploit accurate records of data sizes, detection modes, and calibration information in all dimensions, so that schemes can be constructed without the need to explicitly define or anticipate data sizes or storage details of real and imaginary channels during processing. The asynchronous pipeline scheme provides other substantial advantages, including high flexibility, favorable processing speeds, choice of both all-in-memory and disk-bound processing, easy adaptation to different data formats, simpler software development and maintenance, and the ability to distribute processing tasks on multi-CPU computers and computer networks.
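The pipe architecture described above can be illustrated with Python's subprocess module: each stage runs as a separate process, and data streams from one stage's stdout into the next stage's stdin, just as spectral data flows between NMRPipe modules. The commands below are generic text filters standing in for NMRPipe programs.

```python
import subprocess

# Stage 1 emits data; stage 2 transforms it. The OS pipe between them
# lets both processes run concurrently, exchanging a stream of bytes.
p1 = subprocess.Popen(["printf", "3\n1\n2\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sort", "-n"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
p1.stdout.close()  # so p1 sees SIGPIPE if p2 exits early
out = p2.communicate()[0].decode()
```

In NMRPipe the analogous shell script would chain modules such as a Fourier transform and phasing stage; the asynchronous, streaming nature of the pipe is what gives the system its flexibility and speed.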
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2014 CFR
2014-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...