Oman India Pipeline: An operational repair strategy based on a rational assessment of risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
German, P.
1996-12-31
This paper describes the development of a repair strategy for the operational phase of the Oman India Pipeline based upon the probability and consequences of a pipeline failure. The risk and cost-benefit analyses performed provide guidance on the level of deepwater repair development effort appropriate for the Oman India Pipeline project and identify critical areas toward which more intense development effort should be directed. The risk analysis results indicate that the likelihood of a failure of the Oman India Pipeline during its 40-year life is low. Furthermore, the probability of operational failure of the pipeline in deepwater regions is extremely low, the major proportion of operational failure risk being associated with the shallow water regions.
Hydrocarbons pipeline transportation risk assessment
NASA Astrophysics Data System (ADS)
Zanin, A. V.; Milke, A. A.; Kvasov, I. N.
2018-04-01
The paper addresses the assessment of risks in pipeline transportation under Arctic conditions. Pipeline quality characteristics in this environment were assessed. To achieve the stated objective, a mathematical model of the pipeline was designed and visualized using the software product SOLIDWORKS. The results obtained while developing the mathematical model made it possible to define the optimal characteristics for pipelines designed for the Arctic sea bottom. In the course of the research, the risks of avalanche pipe collapse were examined, the internal longitudinal and circumferential loads acting on the pipeline were analyzed, and the hydrodynamic force of water impact was taken into consideration. The calculations performed can contribute to the further development of pipeline transport under the harsh climate conditions of the Russian Federation's Arctic shelf.
Risk analysis of urban gas pipeline network based on improved bow-tie model
NASA Astrophysics Data System (ADS)
Hao, M. J.; You, Q. J.; Yue, Z.
2017-11-01
Gas pipeline networks are a major hazard source in urban areas, and in the event of an accident there can be grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the authors propose applying an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects: human, materials, environment, and management; it also analyzes consequences from four aspects: casualties, property loss, environment, and society. It then quantifies the causes and consequences. Risk identification, risk analysis, risk assessment, risk control, and risk management are clearly shown in the model figures, from which prevention and mitigation measures can be suggested to help reduce the accident rate of gas pipeline networks. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also identify the reasons for, and predict the consequences of, an unfortunate event. This is of great significance for analyzing leakage failures of gas pipeline networks.
NASA Astrophysics Data System (ADS)
Osland, Anna Christine
Hazardous liquid and natural gas transmission pipelines have received limited attention by planning scholars even though local development decisions can have broad consequences if a rupture occurs. In this dissertation, I evaluated the implications of land-use planning for reducing risk to transmission pipeline hazards in North Carolina via three investigations. First, using a survey of planning directors in jurisdictions with transmission pipeline hazards, I investigated the land use planning tools used to mitigate pipeline hazards and the factors associated with tool adoption. Planning scholars have documented the difficulty of inducing planning in hazardous areas, yet there remain gaps in knowledge about the factors associated with tool adoption. Despite the risks associated with pipeline ruptures, I found most localities use few mitigation tools, and the adoption of regulatory and informational tools appear to be influenced by divergent factors. Whereas risk perception, commitment, capacity, and community context were associated with total tool and information tool use, only risk perception and capacity factors were associated with regulatory tool use. Second, using interviews of emergency managers and planning directors, I examined the role of agency collaboration for building mitigation capacity. Scholars have highlighted the potential of technical collaboration, yet less research has investigated how inter-agency collaboration shapes mitigation capacity. I identify three categories of technical collaboration, discuss how collaborative spillovers can occur from one planning area to another, and challenge the notion that all technical collaborations result in equal mitigation outcomes. Third, I evaluated characteristics of the population near pipelines to address equity concerns. Surprisingly, I did not find broad support for differences in exposure of vulnerable populations. 
Nonetheless, my analyses uncovered statistically significant clusters of vulnerable groups within the hazard area. Interestingly, development closer to pipelines was newer than areas farther away, illustrating the failure of land-use planning to reduce development encroachment. Collectively, these results highlight the potential of land-use planning to keep people and development from encroaching on pipeline hazards. While this study indicates that planners in many areas address pipeline hazards, it also illustrates how changes to local practices can further reduce risks to human health, homeland security, and the environment.
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
Developing a Comprehensive Risk Assessment Framework for Geological Storage CO 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duncan, Ian
2014-08-31
The operational risks for CCS projects include: the risks of capturing, compressing, transporting and injecting CO2; the risk of well blowouts; the risk that CO2 will leak into shallow aquifers and contaminate potable water; and the risk that sequestered CO2 will leak into the atmosphere. This report examines these risks by using information on the risks associated with analogue activities such as CO2-based enhanced oil recovery (CO2-EOR), natural gas storage and acid gas disposal. We have developed a new analysis of pipeline risk based on Bayesian statistical analysis. Bayesian probabilities can describe states of partial knowledge, even perhaps those related to non-repeatable events. The Bayesian approach makes it possible to utilize existing data while absorbing new information, thereby lowering the uncertainty in our understanding of complex systems. Incident rates for both natural gas and CO2 pipelines have been widely used in papers and reports on the risk of CO2 pipelines as proxies for the individual risk created by such pipelines. Published risk studies of CO2 pipelines suggest that the individual risk associated with CO2 pipelines is between 10^-3 and 10^-4, which reflects risk levels approaching those of mountain climbing, levels many would find unacceptably high. Based on a careful analysis of natural gas pipeline failures, this report concludes that the individual risk of CO2 pipelines is likely in the range of 10^-6 to 10^-7, a range considered acceptable to negligible in most countries. If, as is commonly thought, pipelines represent the highest-risk component of CCS outside of the capture plant, then this conclusion suggests that most (if not all) previous quantitative risk assessments of components of CCS may be orders of magnitude too high.
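The Gamma-Poisson (conjugate) update behind this kind of Bayesian rate estimation can be sketched in a few lines; the prior parameters and incident counts below are illustrative assumptions, not values from the report:

```python
# Conjugate Bayesian update of a pipeline incident rate (per km-year).
# Prior: rate ~ Gamma(alpha, beta); likelihood: incidents ~ Poisson(rate * exposure).
# Posterior: Gamma(alpha + incidents, beta + exposure). All numbers are assumed.

def posterior_rate(prior_alpha, prior_beta, incidents, exposure_km_yr):
    """Posterior mean incident rate after observing `incidents` over `exposure_km_yr`."""
    alpha = prior_alpha + incidents
    beta = prior_beta + exposure_km_yr
    return alpha / beta

# Vague prior (assumed), then an update with hypothetical operating experience.
prior_mean = posterior_rate(1.0, 5000.0, 0, 0.0)       # prior mean only
updated = posterior_rate(1.0, 5000.0, 2, 80000.0)      # 2 incidents over 80,000 km-yr

print(f"prior mean rate: {prior_mean:.2e} per km-yr")
print(f"posterior mean:  {updated:.2e} per km-yr")
```

Observed incident data dominate the prior as exposure grows, which is how the approach "absorbs new information" to narrow the uncertainty band.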
The potential lethality of unexpected CO2 releases from pipelines or wells is arguably the highest-risk aspect of CO2-enhanced oil recovery (CO2-EOR) and carbon capture and storage (CCS). Assertions in the CCS literature that CO2 levels of 10% for ten minutes, or 20 to 30% for a few minutes, are lethal to humans are not supported by the available evidence. The results of published experiments with animals exposed to CO2, from mice to monkeys, at both normal and depleted oxygen levels, suggest that lethal levels of CO2 toxicity are in the range of 50 to 60%. These experiments demonstrate that CO2 does not kill by asphyxia but rather is toxic at high concentrations. It is concluded that quantitative risk assessments of CCS have overestimated the risk of fatalities by using lethality values a factor of two to six lower than the values estimated in this paper. In many dispersion models of CO2 releases from pipelines, no fatalities would be predicted if appropriate levels of CO2 lethality had been used in the analysis.
Liquid Pipeline Operator's Control Room Human Factors Risk Assessment and Management Guide
DOT National Transportation Integrated Search
2008-11-26
The purpose of this guide is to document methodologies, tools, procedures, guidance, and instructions that have been developed to provide liquid pipeline operators with an efficient and effective means of managing the human factors risks in their con...
Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report
DOT National Transportation Integrated Search
2008-11-26
The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...
Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André
2010-05-15
A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainty that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO2. A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on the assumptions made. In existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO2 pipelines. Copyright (c) 2009 Elsevier B.V. All rights reserved.
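As a toy illustration of the one-at-a-time sensitivity idea (not the authors' release and dispersion models), one can vary a single QRA input around a base case and watch an assumed risk-distance proxy respond; the proxy formula and all numbers below are assumptions for demonstration only:

```python
# One-at-a-time sensitivity sweep on an assumed toy risk-distance proxy.
# The functional form is a placeholder, not a validated dispersion model.

def risk_distance(failure_rate, pressure_mpa, diameter_m):
    """Assumed toy proxy: hazard distance grows with pressure and diameter."""
    return 1000.0 * failure_rate ** 0.5 * pressure_mpa ** 0.5 * diameter_m

base = {"failure_rate": 1e-4, "pressure_mpa": 10.0, "diameter_m": 0.4}

# Double one input at a time and report the ratio to the base case.
for name in ("pressure_mpa", "diameter_m"):
    varied = dict(base, **{name: base[name] * 2.0})
    ratio = risk_distance(**varied) / risk_distance(**base)
    print(f"doubling {name}: risk distance x{ratio:.2f}")
```

Even this crude sweep shows why the review stresses input uncertainty: the contour distance responds very differently depending on which parameter is uncertain.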
Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline
NASA Astrophysics Data System (ADS)
Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.
2017-05-01
In the oil and gas industry, pipelines are a major component of the oil and gas transmission and distribution process, and the distribution routes sometimes pass through various types of environmental conditions. A pipeline should therefore operate safely so that it does not harm the surrounding environment. Corrosion is still a major cause of failure in some equipment components of a production facility; in pipeline systems, corrosion can cause wall failures and damage to the pipeline, so the pipeline system requires care and periodic inspection. Every production facility in an industry has a level of risk of damage determined by the likelihood and consequences of that damage. The purpose of this research is to analyze the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection per API 581, considering the likelihood of failure and the consequences of failure of an equipment component; the result is then used to determine the next inspection plan. Nine pipeline components were observed, including the straight inlet pipes, connection tees, and straight outlet pipes. The risk assessment levels of the nine components are presented in a risk matrix; the components are found to be at medium risk levels. The failure mechanism considered in this research is thinning. Based on the corrosion rate calculation, the remaining age of the pipeline components can be obtained, so the remaining lifetime of each component is known; the calculated remaining lifetimes vary from component to component. The next step is planning the inspection of the pipeline components by external NDT methods.
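The remaining-life arithmetic for the thinning mechanism can be sketched in a few lines, in the general spirit of semi-quantitative RBI; the wall thicknesses and service times below are illustrative assumptions rather than values from the paper:

```python
# Remaining-life estimate for the thinning (wall-loss) damage mechanism.
# All thicknesses (mm) and service durations (years) are assumed examples.

def corrosion_rate(t_initial_mm, t_measured_mm, years_in_service):
    """Average metal-loss rate (mm/yr) between two wall-thickness measurements."""
    return (t_initial_mm - t_measured_mm) / years_in_service

def remaining_life(t_measured_mm, t_required_mm, rate_mm_per_yr):
    """Years until the wall thins to the minimum required thickness."""
    return (t_measured_mm - t_required_mm) / rate_mm_per_yr

rate = corrosion_rate(12.7, 11.9, 8.0)    # nominal vs. measured wall after 8 yr
life = remaining_life(11.9, 9.5, rate)    # minimum required thickness assumed 9.5 mm
print(f"corrosion rate: {rate:.2f} mm/yr, remaining life: {life:.0f} yr")
```

Per-component rates computed this way are what make the remaining lifetimes, and hence the inspection intervals, differ across the nine components.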
Quantitative Risk Mapping of Urban Gas Pipeline Networks Using GIS
NASA Astrophysics Data System (ADS)
Azari, P.; Karimi, M.
2017-09-01
Natural gas is considered an important source of energy in the world. With the growth of urbanization, urban gas pipelines, which carry natural gas from transmission pipelines to consumers, will become a dense network. The increase in the density of urban pipelines will raise the probability of serious accidents in urban areas. These accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in the urban planning and management of large cities, in order to decrease the probability of accidents and to control them. Therefore, it is important to assess risk values and locate them on an urban map using an appropriate method. In the history of risk analysis of urban natural gas pipeline networks, the pipelines have always been considered one by one, and their density in urban areas has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analyzing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, where the EGIG historical data are used, and fatal length calculation, which involves calculating the gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is an individual risk, shown as a risk map.
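The core idea that several nearby pipelines each add to the individual risk at one grid point can be sketched as a sum of per-pipeline contributions (failure rate times fatal length times lethality); the rates, lengths and lethalities below are assumptions for illustration, not the paper's EGIG-derived values:

```python
# Individual risk at a grid point as the sum of contributions from all
# nearby pipelines: IR = sum(failure_rate * fatal_length * lethality).
# All input numbers are illustrative assumptions.

def individual_risk(pipelines):
    """Total individual risk (per year) at a grid point.

    `pipelines` is a list of (failure_rate_per_km_yr, fatal_length_km, lethality).
    """
    return sum(rate * fatal_km * p_death for rate, fatal_km, p_death in pipelines)

nearby = [
    (2.0e-4, 0.05, 1.0),   # large-diameter main close to the point
    (2.0e-4, 0.01, 0.5),   # smaller line farther away, lower lethality
]
print(f"IR = {individual_risk(nearby):.1e} per year")
```

Evaluating this sum over every grid cell, with distance-dependent fatal lengths, is what turns a dense network into a continuous risk map rather than a set of per-pipeline corridors.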
Genetically engineered plants in the product development pipeline in India.
Warrier, Ranjini; Pande, Hem
2016-01-02
In order to proactively identify emerging issues that may impact the risk assessment and risk management functions of the Indian biosafety regulatory system, the Ministry of Environment, Forests and Climate Change sought to understand the nature and diversity of genetically engineered crops that may move to product commercialization within the next 10 years. This paper describes the findings from a questionnaire designed to solicit information about public and private sector research and development (R&D) activities in plant biotechnology. It is the first comprehensive overview of the R&D pipeline for GE crops in India.
ERIC Educational Resources Information Center
Shippen, Margaret E.; Patterson, DaShaunda; Green, Kemeche L.; Smitherman, Tracy
2012-01-01
Youth at risk for school failure need community and school supports to reduce the likelihood of developing delinquent behavior. This article provides an overview of community and school approaches aimed at intervening on the school-to-prison pipeline. Community and school efforts are emerging that take into account empirical evidence demonstrating…
Building a genome analysis pipeline to predict disease risk and prevent disease.
Bromberg, Y
2013-11-01
Reduced costs and increased speed and accuracy of sequencing can bring the genome-based evaluation of individual disease risk to the bedside. While past efforts have identified a number of actionable mutations, the bulk of genetic risk remains hidden in sequence data. The biggest challenge facing genomic medicine today is the development of new techniques to predict the specifics of a given human phenome (set of all expressed phenotypes) encoded by each individual variome (full set of genome variants) in the context of the given environment. Numerous tools exist for the computational identification of the functional effects of a single variant. However, the pipelines taking advantage of full genomic, exomic, transcriptomic (and other) sequences have only recently become a reality. This review looks at the building of methodologies for predicting "variome"-defined disease risk. It also discusses some of the challenges for incorporating such a pipeline into everyday medical practice. © 2013. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.; Budnitz, Robert J.
If carbon dioxide capture and storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject it underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, external forces such as impacts by vehicles or digging equipment, defects in construction, or the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The consequence of greatest concern is a CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between the risks for pipelines and wells is that the former are spatially distributed along the pipe, whereas the latter are confined to the vicinity of the well.
Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar, and it provides an analysis both of the annual probabilities of the accident sequences of concern and of their consequences. Crucially, the methodology provides insights into what measures might be taken to mitigate those accident sequences identified as being of concern. Mitigating strategies could address reducing the likelihood of an accident sequence of concern, or reducing its consequences, or some combination. The methodology elucidates both local and integrated risks along the pipeline or at the well, providing information useful to decision makers at various levels, including local (e.g., property owners and town councils), regional (e.g., county and state representatives), and national (federal regulators and corporate proponents).
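The step-by-step probabilistic idea can be sketched as an accident sequence's annual frequency: an initiating-event frequency multiplied by the conditional probabilities of each subsequent failure; the events and numbers below are illustrative assumptions, not values from the project:

```python
# Accident-sequence frequency as initiator frequency times conditional
# probabilities of downstream failures. All numbers are assumed examples.

def sequence_frequency(initiator_per_yr, *conditional_probs):
    """Annual frequency of one accident sequence of concern."""
    freq = initiator_per_yr
    for p in conditional_probs:
        freq *= p
    return freq

# Hypothetical sequence: rupture -> release not isolated -> population exposed
f = sequence_frequency(1e-4, 0.1, 0.05)
print(f"accident sequence frequency: {f:.1e} per year")
```

A decomposition like this also points directly at mitigation: each factor is a candidate target, whether for reducing the initiator frequency or cutting a conditional probability further down the chain.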
Satellite Radar Interferometry For Risk Management Of Gas Pipeline Networks
NASA Astrophysics Data System (ADS)
Ianoschi, Raluca; Schouten, Mathijs; Bas Leezenberg, Pieter; Dheenathayalan, Prabu; Hanssen, Ramon
2013-12-01
InSAR time series analyses can be fine-tuned for specific applications, yielding a potential increase in benchmark density, precision and reliability. Here we demonstrate the algorithms developed for gas pipeline monitoring, enabling operators to precisely pinpoint unstable locations. This helps asset management in planning, prioritizing and focusing in-situ inspections, thus reducing maintenance costs. In unconsolidated Quaternary soils, ground settlement contributes to possible failure of brittle cast iron gas pipes and their connections to houses. Other risk factors include the age and material of the pipe. The soil dynamics have led to a catastrophic explosion in the city of Amsterdam, which triggered an increased awareness for the significance of this problem. As the extent of the networks can be very wide, InSAR is shown to be a valuable source of information for identifying the hazard regions. We monitor subsidence affecting an urban gas transportation network in the Netherlands using both medium and high resolution SAR data. Results for the 2003-2010 period provide clear insights on the differential subsidence rates in the area. This enables characterization of underground motion that affects the integrity of the pipeline. High resolution SAR data add extra detail of door-to-door pipeline connections, which are vulnerable due to different settlements between house connections and main pipelines. The rates which we measure represent important input in planning of maintenance works. Managers can decide the priority and timing for inspecting the pipelines. The service helps manage the risk and reduce operational cost in gas transportation networks.
Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli
2016-02-01
Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian networks (FBNs) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, the FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. In terms of the fuzzification process, an expert confidence indicator is proposed to reveal the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of the fuzzy Bayesian inference, the approach proposed in this article is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior knowledge and given evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment. © 2015 Society for Risk Analysis.
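Setting the fuzzification layer aside, the underlying cause-effect inference is ordinary discrete Bayesian-network reasoning, which a toy two-cause model can illustrate; the structure and all probabilities below are assumptions for demonstration, not the article's calibrated model:

```python
# Toy discrete Bayesian-network inference by enumeration: P(damage) under
# prior knowledge, and the posterior of a cause given observed damage.
# All probabilities are illustrative assumptions.

P_settle = {True: 0.1, False: 0.9}    # excessive tunnel-induced settlement
P_old = {True: 0.3, False: 0.7}       # aged/degraded buried pipeline
# Conditional probability table: P(damage | settlement, old pipe)
P_damage = {(True, True): 0.8, (True, False): 0.3,
            (False, True): 0.05, (False, False): 0.01}

def p_damage():
    """Marginal probability of pipeline damage (prior knowledge only)."""
    return sum(P_settle[s] * P_old[o] * P_damage[(s, o)]
               for s in (True, False) for o in (True, False))

def p_settle_given_damage():
    """Posterior probability that settlement was a cause, given damage occurred."""
    joint = sum(P_settle[True] * P_old[o] * P_damage[(True, o)]
                for o in (True, False))
    return joint / p_damage()

print(f"P(damage) = {p_damage():.4f}")
print(f"P(settlement | damage) = {p_settle_given_damage():.2f}")
```

The "given evidence" mode in the article works the same way: observing an outcome node and propagating backward ranks the most likely potential causes, which is what makes the network useful for diagnosis as well as prediction.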
Pharmaceutical new product development: the increasing role of in-licensing.
Edwards, Nancy V
2008-12-01
Many pharmaceutical companies are facing a pipeline gap because of the increasing economic burden and uncertainty associated with internal research and development programs designed to develop new pharmaceutical products. To fill this pipeline gap, pharmaceutical companies are increasingly relying on in-licensing opportunities. New business development identifies new pharmaceuticals that satisfy unmet needs and are a good strategic fit for the company, completes valuation models and forecasts, evaluates the ability of the company to develop and launch products, and pursues in-licensing agreements for pharmaceuticals that cannot be developed internally on a timely basis. These agreements involve the transfer of access rights for patents, trademarks, or similar intellectual property from an outside company in exchange for payments. Despite the risks, in-licensing is increasingly becoming the preferred method for pharmaceutical companies with pipeline gaps to bring new pharmaceuticals to the clinician.
Shi, Peng; Xiao, Jun; Wang, Yafeng; Chen, Liding
2014-02-28
The construction of large-scale infrastructure such as natural gas/oil pipelines involves extensive disturbance to regional ecosystems. Few studies have documented the soil degradation and heavy metal contamination caused by pipeline construction. In this study, chromium (Cr), cadmium (Cd), copper (Cu), nickel (Ni), lead (Pb) and zinc (Zn) levels were evaluated using Index of Geo-accumulation (Igeo) and Potential Ecological Risk Index (RI) values, and human health risk assessments were used to elucidate the level and spatial variation of heavy metal pollution risks. The results showed that the impact zone of pipeline installation on soil heavy metal contamination was restricted to the pipeline right-of-way (RoW), which had higher Igeo values for Cd, Cu, Ni and Pb than the zones 20 m and 50 m away. RI showed a declining tendency across the zones as follows: trench > working zone > piling area > 20 m > 50 m. The pipeline RoW carried higher human health risks than the 20 m and 50 m zones, and children were more susceptible to non-carcinogenic hazard risk. Cluster analysis showed that Cu, Ni, Pb and Cd had similar sources, drawing attention to anthropogenic activity. The findings of this study should help in better understanding the type, degree, scope and sources of heavy metal pollution from pipeline construction so as to reduce pollutant emissions, and they provide a scientific basis for future risk management.
Health, safety and environmental risk of a gas pipeline in an oil exploring area of Gachsaran.
Kalatpoor, Omid; Goshtasp, Kambiz; Khavaji, Solieman
2011-01-01
The purpose of this study was to assess the health, safety and environmental risk of a gas transfer pipeline in an oil-producing area of Gachsaran. We used Kent's pipeline risk assessment method, except that some changes were made to Kent's method to make it more practical to apply. A pipeline 16 kilometers in length was selected in view of the nature of its surroundings, and it was divided into two sections. As in Kent's method, the parameters included third-party activity and injuries, corrosion, the design factor, the incorrect-operation index, and consequence scoring. The difference here was that for consequence scoring we used the ALOHA 5.6 software instead of Kent's pattern. The results showed that the health, safety and environmental risks of section 2 (the 13 kilometers of outgoing pipeline beyond the first 3 kilometers from the gas station) were greater. The main cause of the larger risk number appears to be the greater activity of interested parties around section 2, because all of the index scores are almost equal to one another except for third-party activity.
Rights, Bunche, Rose and the "pipeline".
Marks, Steven R.; Wilkinson-Lee, Ada M.
2006-01-01
We address education "pipelines" and their social ecology, drawing on the 1930s writings of Ralph J. Bunche, a Nobel peacemaker whose war against systematic second-class education for the poor, minority and nonminority alike, is nearly forgotten; and of the epidemiologist Geoffrey Rose, whose 1985 paper spotlighted the difficulty of shifting health status and risks in a "sick society." From the perspective of human rights and human development, we offer suggestions toward the paired "ends" of the pipeline: equality of opportunity for individuals, and equality of health for populations. We offer a national "to do" list to improve pipeline flow and then reconsider the merits of the "pipeline" metaphor, which neither matches the reality of lived education pathways nor supports notions of human rights, freedoms and capabilities, but rather reflects a commoditizing stance toward free persons. PMID:17019927
75 FR 45591 - Pipeline Safety: Notice of Technical Pipeline Safety Advisory Committee Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-03
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Committee Meetings AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION... safety standards, risk assessments, and safety policies for natural gas pipelines and for hazardous...
Liu, Wenbin; Liu, Aimin
2018-01-01
With the exploitation of offshore oil and gas gradually moving into deep water, larger temperature and pressure differences are applied to the pipeline system, making global buckling of the pipeline more serious. For unburied deep-water pipelines, lateral buckling is the major buckling form. Initial imperfections exist widely in pipeline systems owing to manufacturing defects or the influence of an uneven seabed, and the distribution and geometric features of the initial imperfections are random. Based on shape, they can be divided into two kinds: single-arch imperfections and double-arch imperfections. This paper analyzed the global buckling process of a pipeline with two initial imperfections using a numerical simulation method and revealed how the ratio of the spacing between the initial imperfections to the imperfection wavelength, and the combination of imperfections, affect the buckling process. The results show that a pipeline with two initial imperfections may suffer a superposition of global buckling. The growth ratios of the buckling displacement, axial force and bending moment in the superposition zone are several times larger than in a pipeline without buckling superposition. The ratio of the imperfection spacing to the imperfection wavelength determines whether a pipeline suffers buckling superposition. The potential failure point of a pipeline exhibiting buckling superposition is the same as for a pipeline without it, but the failure risk is much higher. The shape and direction of two nearby imperfections also affect the failure risk of a pipeline exhibiting global buckling superposition: the failure risk of a pipeline with two double-arch imperfections is higher than that of a pipeline with two single-arch imperfections. PMID:29554123
Han, Z Y; Weng, W G
2011-05-15
In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method is built on an index system comprising a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision required of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
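The weighted index aggregation that the qualitative method describes can be sketched as follows. All sub-index scores and weights below are hypothetical placeholders, not values from the paper:

```python
# Sketch of a weighted qualitative risk index: causation, inherent-risk and
# consequence sub-indices combined with weights that sum to 1.
def qualitative_risk(scores, weights):
    """Weighted sum of sub-index scores (all on a common 0-1 scale)."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(s * w for s, w in zip(scores, weights))

# Illustrative only: causation=0.6, inherent=0.4, consequence=0.8,
# weighted 0.3 / 0.3 / 0.4.
risk_value = qualitative_risk([0.6, 0.4, 0.8], [0.3, 0.3, 0.4])
```

A pipeline segment would then be ranked against others by this scalar value.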
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology is still lacking, especially for natural gas stations. The present work develops a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident-scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a bow-tie diagram and a Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicate that failure of the regulator system is the worst-case accident scenario, with human error as the most contributing factor. Thus, in the risk management plan of natural gas stations, priority should be given to the most probable root events and main contributing factors identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
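The core of Bayesian-network updating in a bow-tie setting is Bayes' rule over a root cause. A minimal sketch with invented probabilities (the study's conditional probability tables are not reproduced here):

```python
# Posterior belief in a binary root cause (e.g. human error) given that a
# top event (e.g. regulator failure) has been observed. All numbers invented.
def posterior(prior, likelihood, likelihood_complement):
    """P(cause | failure) via Bayes' rule for a binary cause."""
    evidence = likelihood * prior + likelihood_complement * (1.0 - prior)
    return likelihood * prior / evidence

# Hypothetical: P(human error)=0.1, P(failure|human error)=0.5,
# P(failure|no human error)=0.05.
p_cause = posterior(0.1, 0.5, 0.05)
```

Observing the failure raises the belief in human error from 10% to above 50% in this toy example, which is the kind of diagnostic reasoning a full BN automates across many nodes.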
NASA Astrophysics Data System (ADS)
Astisiasari; Van Westen, Cees; Jetten, Victor; van der Meer, Freek; Rahmawati Hizbaron, Dyah
2017-12-01
An operating geothermal power plant consists of installation units that work systematically in a network. The pipeline network connects various engineering structures, e.g. well pads, separator, scrubber and power station, in the process of transferring geothermal fluids to generate electricity. A pipeline infrastructure also delivers the brine back to earth through the injection well pads. Despite its important functions, a geothermal pipeline may pose a threat to its vicinity through pipeline failure. The pipeline can be impacted by perilous events such as landslides, earthquakes and subsidence, while pipeline failure itself may relate to physical deterioration over time, e.g. due to corrosion and fatigue. Geothermal reservoirs are usually located in mountainous areas associated with steep slopes, complex geology and weathered soil, and geothermal areas record a noteworthy number of disasters, especially due to landslides and subsidence. A proper multi-risk assessment along the geothermal pipeline is therefore required, particularly for these two types of hazard; impacts in terms of human fatality and injury are not discussed here. This paper aims to give a basic overview of existing approaches for multi-risk assessment along geothermal pipelines. It presents basic principles for the analysis of risks and their contributing variables in order to model the loss consequences. By considering the loss consequences, as well as the alternatives for mitigation measures, environmental safety in the geothermal working area can be strengthened.
A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.
Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y
2016-01-01
PTSD is distressful and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the model's prediction and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.
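The quality-gated recommendation logic described above can be sketched as a simple decision rule. Function name, metric choice (AUC) and thresholds are all assumptions for illustration, not the pipeline's actual interface:

```python
# Gate a clinical recommendation on model quality: a weak personalized model
# should yield "insufficient information" rather than a risky recommendation.
def recommend(prob_ptsd, model_auc, risk_cut=0.5, auc_cut=0.7):
    """Return a recommendation only when the model quality clears auc_cut."""
    if model_auc < auc_cut:
        return "insufficient information"  # do not act on a poor model
    return "refer for intervention" if prob_ptsd >= risk_cut else "monitor"
```

The key design point is that the same predicted probability can lead to different actions depending on how trustworthy the individual model is.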
76 FR 35130 - Pipeline Safety: Control Room Management/Human Factors
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-16
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts...: Control Room Management/Human Factors AGENCY: Pipeline and Hazardous Materials Safety Administration... safety standards, risk assessments, and safety policies for natural gas pipelines and for hazardous...
Airborne LIDAR Pipeline Inspection System (ALPIS) Mapping Tests
DOT National Transportation Integrated Search
2003-06-06
Natural gas and hazardous liquid pipeline operators have a need to identify where leaks are occurring along their pipelines in order to lower the risks the pipelines pose to people and the environment. Current methods of locating natural gas and haza...
Sanderson, Hans; Fauser, Patrik; Rahbek, Malene; Larsen, Jørn Bo
2014-08-30
This paper compiles all the measured chemical warfare agent (CWA) concentrations found in relation to the Nord Stream pipeline work in Danish waters over the past 5 years. Sediment and biota sampling were performed along the pipeline route in four campaigns: prior to (2008 and 2010), during (2011) and after (2012) the construction work. No parent CWAs were detected in the sediments. Patchy residues of CWA degradation products of Adamsite, Clark I, phenyldichloroarsine, trichloroarsine and Lewisite II were detected in a total of 29 of the 391 sediment samples collected and analyzed over the past 5 years. The cumulative fish-community risk quotient for the different locations, calculated as the sum of background and added risk, ranged between 0 and 0.017, suggesting a negligible acute CWA risk toward the fish community. The added risk from sediment disturbance during construction of the pipelines represents less than 2% of the total risk in the areas with the highest calculated risk. Analyses of benthic infauna corroborate the finding of low CWA-related risk across the years. There was no significant difference in CWA risk before (2008) and after (2012) the pipeline construction. Copyright © 2014 Elsevier B.V. All rights reserved.
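A cumulative risk quotient of this kind is conventionally the sum of exposure-to-no-effect ratios (PEC/PNEC) across substances; the concentrations and PNECs below are invented for illustration and are not the paper's data:

```python
# Cumulative risk quotient as the sum of PEC/PNEC ratios; a total well
# below 1 indicates negligible acute risk.
def cumulative_rq(pairs):
    """pairs: iterable of (PEC, PNEC) tuples in consistent units."""
    return sum(pec / pnec for pec, pnec in pairs)

# Hypothetical residues: two degradation products with low exposure.
rq = cumulative_rq([(0.002, 1.0), (0.005, 0.5)])
```

Values like the paper's 0-0.017 range sit far below the conventional concern threshold of 1.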
49 CFR 192.921 - How is the baseline assessment to be conducted?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas... the covered pipeline segments for the baseline assessment according to a risk analysis that considers...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazzoldi, A.; Oldenburg, C. M.
The Illinois Basin Decatur Project (IBDP) is designed to confirm the ability of the Mt. Simon Sandstone, a major regional saline-water-bearing formation in the Illinois Basin, to store 1 million tons of carbon dioxide (CO{sub 2}) injected over a period of three years. The CO{sub 2} will be provided by Archer Daniels Midland (ADM) from its Decatur, Illinois, ethanol plant. In order to transport CO{sub 2} from the capture facility to the injection well (also located within the ADM plant boundaries), a high-pressure pipeline of length 3,200 ft (975 m) has been constructed, running above the ground surface within the ADM plant footprint. We have qualitatively evaluated risks associated with possible pipeline failure scenarios that lead to discharge of CO{sub 2} within the real-world environment of the ADM plant, in which workers and visitors are often in the vicinity of the pipeline. Several aspects of CO{sub 2} make its transportation and potential leakage somewhat different from other substances, most notably its non-flammability and its propensity to change to solid (dry ice) upon strong decompression. In this study, we present numerical simulations, using Computational Fluid Dynamics (CFD) methods, of the release and dispersion of CO{sub 2} from individual hypothetical pipeline failures (i.e., leaks). Failure frequencies of the various components of a pipeline transportation system over time are taken from prior work on general pipeline safety and leakage modeling and suggest a 4.65% chance of some kind of pipeline failure over the three years of operation. Following the Precautionary Principle, we accounted for full-bore leakage scenarios, where the temporal evolution of the mass release rate from the high-pressure pipeline leak locations was simulated using a state-of-the-art pipe model which considers the thermodynamic effects of decompression in the entire pipeline.
Failures have been simulated at four representative locations along the pipeline route within the ADM plant. Leakage scenarios at sites where plant operations (e.g., vehicular and train transportation) present a higher likelihood of accidental failure, for example due to vehicles or equipment crashing into the pipeline and completely severing it, were modeled with a double source, consistent with the pipeline releasing high-pressure CO{sub 2} from both ends of the broken pipe after a full-bore offset rupture. Simulation results show that the built environment of the plant plays a significant role in the dispersion of the gas, as leaking CO{sub 2} can impinge upon buildings and other infrastructure. In all scenarios simulated, the region of very high CO{sub 2} concentration is limited to a small area around the pipeline failure, suggesting that the likelihood of widespread harmful CO{sub 2} exposure to plant personnel from pipeline leakage is low. An additional risk is posed by the blast wave that emanates from a high-pressure pipeline when it is breached quickly. We estimate the blast-wave risk as low because it occurs only for a short time in the immediate vicinity of the rupture and requires an instantaneous large-scale rupture. We recommend consideration of signage, guard rails and posts to mitigate the likelihood of vehicles crashing into the pipeline. A standardized emergency response plan applicable to capture plants within industrial sites could be developed based on the IBDP and would be useful for other capture plants. Finally, we recommend carrying out coupled wellbore-reservoir blowout scenario modeling to understand the potential for hazardous conditions arising from an unexpected blowout at the wellhead.
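The quoted 4.65% three-year failure chance can be converted to and from a constant annual failure rate under a Poisson (exponential) assumption. That assumption is ours for illustration; the report's own derivation may differ:

```python
import math

# Relate a failure probability over a period to a constant annual rate:
# P(failure in T years) = 1 - exp(-lambda * T).
def rate_from_probability(p_fail, years):
    """Annual rate lambda implied by a failure probability over 'years'."""
    return -math.log(1.0 - p_fail) / years

def failure_probability(rate, years):
    """Probability of at least one failure in 'years' at constant 'rate'."""
    return 1.0 - math.exp(-rate * years)

lam = rate_from_probability(0.0465, 3.0)  # implied per-year failure rate
p_one_year = failure_probability(lam, 1.0)
```

The round trip recovers the original probability, and the one-year figure is slightly below one third of the three-year figure, as expected for small rates.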
Historical analysis of US pipeline accidents triggered by natural hazards
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2015-04-01
Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural-hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, automatically classified in the same way as the PHMSA records.
This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
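An automated triage pass of the kind that could seed such a Natech classification might simply flag incident narratives mentioning natural-hazard terms, leaving confirmation to peer review. The keyword list and function are illustrative only, not the study's actual data-mining logic:

```python
# First-pass keyword flagging of candidate Natech incident reports; records
# flagged here would still go to human review, as in the described workflow.
NATURAL_HAZARD_TERMS = ("flood", "earthquake", "landslide", "lightning", "hurricane")

def is_candidate_natech(narrative):
    """True if the incident narrative mentions any natural-hazard keyword."""
    text = narrative.lower()
    return any(term in text for term in NATURAL_HAZARD_TERMS)
```

Such a cheap filter shrinks 800,000+ records to a reviewable candidate set before the manual categorization step.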
NASA Astrophysics Data System (ADS)
Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan
2018-06-01
Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although qualitative models of risk factors' effects exist, a quantitative analysis model still needs to be developed. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. The incident evolution diagram is built to identify the risk factors, and the BN model is built based on the deployment rule for factor nodes and on expert knowledge combined via Dempster-Shafer evidence theory. The probabilities of incident consequences and risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for taking optimal risk-treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
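Dempster-Shafer combination of expert knowledge, as invoked above, fuses two basic probability assignments via Dempster's rule. The frame and masses below are invented for illustration, not taken from the case study:

```python
# Dempster's rule of combination over a two-element frame {A, B}, with mass
# functions represented as dicts mapping frozenset -> mass.
def combine(m1, m2):
    """Combine two basic probability assignments, renormalizing conflict."""
    combined = {}
    conflict = 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass assigned to disjoint hypotheses
    # Normalize by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset("A"), frozenset("B")
# Two hypothetical experts' assignments over causes A and B.
m = combine({A: 0.6, A | B: 0.4}, {A: 0.5, B: 0.3, A | B: 0.2})
```

The combined assignment concentrates mass on the hypothesis both experts support, which is how expert judgments feed the BN's prior probabilities.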
49 CFR 192.1007 - What are the required elements of an integrity management plan?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas... threats and risks to its gas distribution pipeline. (2) Consider the information gained from past design...
NASA Astrophysics Data System (ADS)
Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng
2017-05-01
As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third-party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third-party damage in order to reduce the number of gas pipeline operation accidents. Because third-party damage accidents are characterized by diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third-party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, risk sources of third-party damage are identified; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third-party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
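The AHP weighting step can be sketched with the row geometric-mean approximation of the principal eigenvector. The 3x3 pairwise comparison matrix below is invented (Saaty-scale judgments), not the paper's:

```python
import math

# Derive factor weights from a pairwise comparison matrix by taking each
# row's geometric mean and normalizing (a standard eigenvector approximation).
def ahp_weights(matrix):
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments: factor 1 > factor 2 > factor 3 in importance.
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 3.0],
            [1/5, 1/3, 1.0]]
weights = ahp_weights(pairwise)
```

The weights then feed the fuzzy comprehensive evaluation as the importance vector over risk factors.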
Sanderson, Hans; Fauser, Patrik; Thomsen, Marianne; Larsen, Jørn Bo
2012-05-15
In connection with installation of two natural gas pipelines through the Baltic Sea between Russia and Germany, there has been concern regarding potential re-suspension of historically dumped chemical warfare agents (CWA) in a nearby dump site and the potential environmental risks associated. 192 sediment and 11 porewater samples were analyzed for CWA residues, both parent and metabolites in 2008 and 2010 along the pipeline corridor next to the dump site. Macrozoobenthos and background variables were also collected and compared to the observed CWA levels and predicted potential risks. Detection frequencies and levels of intact CWA found were low, whereas CWA metabolites were more frequently found. Re-suspension of CWA residue-containing sediment from installation of the pipelines contributes marginally to the overall background CWA residue exposure and risk along the pipeline route. The multivariate weight-of-evidence analysis showed that physical and background parameters of the sediment were of higher importance for the biota than observed CWA levels. Copyright © 2012 Elsevier B.V. All rights reserved.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) (Propane, butane, Natural Gas Liquid (NGL), ammonia) Highly toxic (Benzene, high Hydrogen Sulfide content... Hazardous Liquid and Carbon Dioxide Pipelines B Appendix B to Part 195 Transportation Other Regulations... OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Pt. 195...
2014-06-01
SCADA / ICS Cyber Test Lab initiated in 2013. Psychosocial: academic research exists; opportunity for sharing and developing impact assessment... ecosystems and species at risk), accidents / system failure (rail; pipelines; ferries). CSSP strategy for the North: focus on regional (and local) problem... Guidance; business planning; environmental scan; proposal evaluation; and performance measurement. Program Risk Management: guidelines for project
Caspian games: A dynamic bargaining game
NASA Astrophysics Data System (ADS)
Michaud, Dennis Wright
This dissertation was written under the direction of Professor P. Terrence Hopmann. In this work, the author seeks to identify the independent variables affecting the outcome of three key decisions required of the international consortiums constructing Caspian oil export pipelines. The first involves whether the enterprises developing the pipelines to export Kazakh oil, the Caspian Pipeline Consortium ("CPC"), and Azeri oil, the Azerbaijan International Operating Consortium ("AIOC"), cooperate by utilizing the same route or use separate energy export corridors. Second, I analyzed how the actual Main Export Pipeline ("MEP") route for Azeri oil was selected by the AIOC. Finally, I tried to understand the factors driving the residual equity positions in each consortium, with particular interest in the equity position of Russian state and commercial interests. I approached the puzzle as a multilevel bargaining problem; hence, the preferences of each relevant actor (at state and corporate levels) were assessed. The covering theory utilized was rational choice. An application of game-theoretic modeling, particularly Bayesian analysis (used as a metaphor), accounted for the learning process resulting from the strategic interaction between actors, refining each actor's perception of counterpart preferences. Additionally, the Gordon Constant Growth Model ("CGM") and Sharpe's Capital Asset Pricing Model ("CAPM") were utilized to relate multinational actors' preferences, expressed as a cost-of-capital-based hurdle rate, to political risk. My findings demonstrate this interrelationship and provide a clear argument for great-power states to persuade newly developing Caspian states to adopt a more transparent and credible approach to corporate governance.
This revised state strategy will reduce multinationals' perception of political risk, lower firms' cost of capital (hurdle rate), and increase the funding of major energy development projects, which will stimulate economic and political development.
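The CAPM-to-Gordon link the dissertation relies on can be made concrete. Treating political risk as an additional premium on the CAPM required return is our illustrative simplification; all numbers are invented:

```python
# CAPM required return with an added country/political risk premium, fed into
# a Gordon constant-growth valuation. Lower perceived risk -> lower hurdle
# rate -> higher project value.
def capm(rf, beta, market_premium, political_risk_premium=0.0):
    """Required return = risk-free rate + beta * equity premium + country risk."""
    return rf + beta * market_premium + political_risk_premium

def gordon_value(next_cash_flow, required_return, growth):
    """Constant-growth present value; requires required_return > growth."""
    if required_return <= growth:
        raise ValueError("required return must exceed growth rate")
    return next_cash_flow / (required_return - growth)

r = capm(0.05, 1.2, 0.06, political_risk_premium=0.04)  # hypothetical inputs
v = gordon_value(100.0, r, 0.03)
```

Dropping the political risk premium from 4% to 0% in this toy case cuts the hurdle rate and roughly doubles the valuation, which is the mechanism behind the governance argument above.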
Use of FBG sensors for health monitoring of pipelines
NASA Astrophysics Data System (ADS)
Felli, Ferdinando; Paolozzi, Antonio; Vendittozzi, Cristian; Paris, Claudio; Asanuma, Hiroshi
2016-04-01
The infrastructures for oil and gas production and distribution need reliable monitoring systems. The risks for pipelines, in particular, are not limited to natural disasters (landslides, earthquakes, extreme environmental conditions) and accidents, but also involve damage related to criminal activities such as oil theft. Existing monitoring systems are not adequate for detecting damage from oil theft, and on several occasions the illegal activities have resulted in leakage of oil and catastrophic environmental pollution. Systems based on fiber-optic FBG (Fiber Bragg Grating) sensors present a number of advantages for pipeline monitoring. FBG sensors can withstand harsh environments, are immune to electromagnetic interference, and can be used to develop a smart system that monitors several physical quantities at the same time, such as strain, temperature, acceleration, pressure and vibration. The monitoring station can be positioned tens of kilometers away from the measuring points, lowering the cost and complexity of the system. This paper describes tests on a sensor, based on FBG technology, developed specifically for detecting damage to pipelines due to illegal activities (drilling of the pipes), which can be integrated into a smart monitoring chain.
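FBG strain sensing rests on the standard first-order relation that the Bragg wavelength shifts in proportion to axial strain, scaled by the photo-elastic coefficient. The coefficient value below is a typical textbook figure for silica fiber, not a value from this paper:

```python
# First-order FBG strain response: delta_lambda / lambda = (1 - p_e) * strain,
# with p_e ~ 0.22 for silica fiber (typical value; temperature effects ignored).
def bragg_shift_nm(center_nm, strain, p_e=0.22):
    """Wavelength shift in nm of a grating at center_nm under axial strain."""
    return center_nm * (1.0 - p_e) * strain

# 100 microstrain on a 1550 nm grating -> roughly a 0.12 nm shift.
shift = bragg_shift_nm(1550.0, 1e-4)
```

Shifts of this size are easily resolved by standard interrogators, which is what makes FBGs practical for detecting drilling-induced deformation.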
NASA Astrophysics Data System (ADS)
Branch, B. D.; Raskin, R. G.; Rock, B.; Gagnon, M.; Lecompte, M. A.; Hayden, L. B.
2009-12-01
With the nation challenged to comply with Executive Order 12906 and its need to augment the Science, Technology, Engineering and Mathematics (STEM) pipeline, applied focus on the geosciences pipeline may be at risk. The geosciences pipeline may require intentional consideration in the K-12 standard course of study, in the form of project-based, science-based and evidence-based learning. Thus, the K-12-to-geosciences-to-informatics pipeline may benefit from an earth science experience that uses a community-based "learning by doing" approach. Terms such as Community GIS, Community Remote Sensing, and Community-Based Ontology development are collectively termed Community Informatics. Here, approaches to interdisciplinary work that promote earth science literacy are affordable, consisting of low-cost equipment that builds the GIS/remote-sensing data-processing skills needed in the workforce. Informal community ontology development may thus evolve or mature from a local community toward formal scientific collaboration. Such consideration may become a means to engage educational policy with earth science paradigms and needs, specifically linking synergy among the Math, Computer Science, and Earth Science disciplines.
Building a virtual ligand screening pipeline using free software: a survey.
Glaab, Enrico
2016-03-01
Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
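An early stage of many such screening pipelines is a cheap property filter; a common example is Lipinski's rule of five. Descriptor values would normally come from a cheminformatics toolkit (e.g. RDKit, one of the free tools the survey covers); here they are supplied directly, and the compound names and numbers are invented:

```python
# Lipinski rule-of-five pre-filter: molecular weight, logP, hydrogen-bond
# donors and acceptors; at most one violation is commonly tolerated.
def passes_rule_of_five(mw, logp, h_donors, h_acceptors):
    violations = sum([mw > 500, logp > 5, h_donors > 5, h_acceptors > 10])
    return violations <= 1

# Hypothetical compound library with precomputed descriptors.
candidates = {
    "cpd_a": (320.0, 2.1, 2, 5),
    "cpd_b": (612.0, 6.3, 1, 9),   # two violations: MW and logP
}
hits = [name for name, desc in candidates.items() if passes_rule_of_five(*desc)]
```

Filters like this cut a large library down before the expensive docking stages of the pipeline run.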
Oil pipeline geohazard monitoring using optical fiber FBG strain sensors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Salazar-Ferro, Andres; Mendez, Alexis
2016-04-01
Pipelines are naturally vulnerable to operational, environmental and man-made effects such as internal erosion and corrosion; mechanical deformation due to geophysical risks and ground movements; leaks from neglect and vandalism; as well as encroachments from nearby excavations or illegal intrusions. The actual detection and localization of incipient and advanced faults in pipelines is a very difficult, expensive and inexact task. Anything that operators can do to mitigate the effects of these faults will provide increased reliability, reduced downtime and maintenance costs, as well as increased revenues. This talk will review the on-line monitoring of an extensive network of oil pipelines in service in Colombia using optical fiber Bragg grating (FBG) strain sensors for the measurement of strains and bending caused by geohazard risks such as soil movements, landslides, settlements, flooding and seismic activity. The FBG sensors were mounted on the outside of the pipelines at discrete locations where geohazard risk was expected. The system has been in service for the past 3 years with over 1,000 strain sensors mounted. The technique has been reliable and effective in giving advanced warning of accumulated pipeline strains as well as possible ruptures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Pressure Testing § 195.303 Risk-based alternative to pressure testing older hazardous liquid and carbon... 49 Transportation 3 2014-10-01 2014-10-01 false Risk-based alternative to pressure testing older hazardous liquid and carbon dioxide pipelines. 195.303 Section 195.303 Transportation Other Regulations...
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2011 CFR
2011-10-01
... pipeline integrity risk to public safety, property, or the environment, the Associate Administrator may... existence of a condition that poses a pipeline integrity risk to public safety, property, or the environment... public safety, property, or the environment. (5) Post-hearing action. Following a hearing under this...
Nelson, W G; Wilding, G
2001-04-01
Epidemiologic data suggest that prostate cancer morbidity and mortality ought to be preventable. New insights into the molecular pathogenesis of prostate cancer offer new opportunities for the discovery of prostate cancer chemoprevention drugs and new challenges for their development. Established pathways that lead to US Food and Drug Administration (FDA) approval of drugs for advanced prostate cancer may not be appropriate for the development of drugs for prostate cancer chemoprevention. For example, large randomized clinical trials designed to test the efficacy of new chemoprevention drugs on prostate cancer survival in the general population are likely to be conducted at great expense and take many years, threatening to increase commercial development risks while decreasing exclusive marketing revenues. As a consequence, to accelerate progress in research, new validated surrogate and strategic clinical trial endpoints, and new clinical trial designs featuring more precisely defined high-risk clinical trial cohorts, are needed. In this review, 10 criteria for prostate cancer chemoprevention agent development are offered and the pipeline of new prostate cancer chemoprevention drug candidates is considered.
Development and Applications of Pipeline Steel in Long-Distance Gas Pipeline of China
NASA Astrophysics Data System (ADS)
Chunyong, Huo; Yang, Li; Lingkang, Ji
Over the past decades, with the wide use of microalloying and Thermal Mechanical Control Processing (TMCP) technology, a good balance of strength, toughness, plasticity and weldability has been achieved in pipeline steel, so that oil and gas pipelines have developed rapidly in China to meet strong domestic energy demand. In this paper, the development history of pipeline steel and gas pipelines in China is briefly reviewed. The microstructural characteristics and mechanical performance of pipeline steels used in representative Chinese gas pipelines built at different stages are summarized. Through analysis of the evolution of the pipeline service environment, prospective development trends for the application of pipeline steel in China are also presented.
World Bank oil-pipeline project designed to prevent HIV transmission.
Kigotho, A W
1997-11-29
A World Bank-funded oil pipeline project, in Chad and Cameroon, is the first large-scale construction project in sub-Saharan Africa to incorporate an HIV/AIDS prevention component. The project entails the development of oil fields in southern Chad and construction of 1100 km of pipeline to port facilities on Cameroon's Atlantic coast. 3000 construction workers from the two countries will be employed between 1998 and 2001, including about 600 truck drivers. In some areas along the pipeline route, 50% of the prostitutes (who are frequented by truck drivers) are HIV-infected. The HIV/AIDS intervention aims to prevent HIV and sexually transmitted diseases (STDs) among project workers through social marketing of condoms, treatment of STDs in prostitutes along the route, and health education to modify high-risk behaviors. The program is considered a test case for African governments and donors interested in whether the integration of a health component in major construction projects can avoid AIDS epidemics in affected countries.
NASA Astrophysics Data System (ADS)
Jurman, Elisabeth Antonie
1997-08-01
The natural gas shortages in the 1970s focused considerable attention on the federal government's role in altering energy consumption. For the natural gas industry these shortages eventually led to the passage of the Natural Gas Policy Act (NGPA) in 1978 as part of the National Energy Plan. A series of events in the 1980s brought about the restructuring of interstate natural gas pipelines, which were transformed by regulators and the courts from monopolies into competitive entities. This transformation also changed their relationship with their downstream customers, the LDCs, who no longer had to deal with pipelines as the only merchants of gas. Regulatory reform made it possible for LDCs to buy directly from producers, using the pipelines only for delivery of their purchases. This study tests for the existence of monopoly rents by analyzing the daily returns of natural gas pipeline and utility industry stock price data from 1982 to 1990, a period of regulatory reform for the natural gas industry. The study's main objective is to investigate the degree of empirical support for claims that regulatory reforms increase profits in the affected industry, as the normative theory of regulation expects, or decrease profits, as advocates of the positive theory of regulation believe. I also test Norton's theory of risk, which predicts that systematic risk will increase for firms undergoing deregulation. Based on a sample of twelve natural gas pipelines and 25 utilities, an event study design was employed to measure the impact of regulatory event announcements on daily natural gas pipeline or utility industry stock price data using a market model regression equation.
The results of this study provide some evidence that regulatory reforms did not increase the profits of pipeline firms, confirming the expectations of those who claim that excess profits result from regulation and will disappear, once that protection is removed and the firms are operating in competitive markets. The study's empirical findings support the claims of Norton's risk theory that systematic risk is higher in unregulated firms.
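The market-model event study described above regresses a stock's return on the market return over an estimation window, then measures abnormal returns around each regulatory announcement. A minimal sketch of that mechanic, with hypothetical window choices (the study's actual sample, windows and announcement dates are not reproduced here):

```python
import numpy as np

def market_model_car(stock_ret, market_ret, est_slice, event_slice):
    """Estimate a market model on the estimation window and return the
    cumulative abnormal return (CAR) over the event window.
    Illustrative sketch only; window choices are hypothetical."""
    x = market_ret[est_slice]
    y = stock_ret[est_slice]
    beta, alpha = np.polyfit(x, y, 1)           # OLS: R_stock = alpha + beta * R_mkt
    expected = alpha + beta * market_ret[event_slice]
    abnormal = stock_ret[event_slice] - expected
    return abnormal.sum()                        # CAR around the announcement
```

A significantly negative CAR around a deregulation announcement would be read as evidence that the reform eroded monopoly rents.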
Creation and Implementation of a Workforce Development Pipeline Program at MSFC
NASA Technical Reports Server (NTRS)
Hix, Billy
2003-01-01
Within the context of NASA's Education Programs, this Workforce Development Pipeline guide describes the goals and objectives of MSFC's Workforce Development Pipeline Program as well as the principles and strategies for guiding implementation. It is designed to support the initiatives described in the NASA Implementation Plan for Education, 1999-2003 (EP-1998-12-383-HQ) and represents the vision of the members of the Education Programs office at MSFC. This document: 1) Outlines NASA's contribution to national priorities; 2) Sets the context for the Workforce Development Pipeline Program; 3) Describes Workforce Development Pipeline Program strategies; 4) Articulates the Workforce Development Pipeline Program goals and aims; 5) Lists the actions to build a unified approach; 6) Outlines the Workforce Development Pipeline Program's guiding principles; and 7) Presents the results of implementation.
Pipeline repair development in support of the Oman to India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abadie, W.; Carlson, W.
1995-12-01
This paper provides a summary of development which has been conducted to date for the ultra deep, diverless pipeline repair system for the proposed Oman to India Gas Pipeline. The work has addressed critical development areas involving testing and/or prototype development of tools and procedures required to perform a diverless pipeline repair in water depths of up to 3,525 m.
Code of Federal Regulations, 2013 CFR
2013-10-01
... reasonable to foresee fault currents or an unusual risk of lightning, you must protect the pipeline against... metallic structures, unless you electrically interconnect and cathodically protect the pipeline and the... isolation of a portion of a pipeline is necessary to facilitate the application of corrosion control. (c...
Code of Federal Regulations, 2014 CFR
2014-10-01
... reasonable to foresee fault currents or an unusual risk of lightning, you must protect the pipeline against... metallic structures, unless you electrically interconnect and cathodically protect the pipeline and the... isolation of a portion of a pipeline is necessary to facilitate the application of corrosion control. (c...
Code of Federal Regulations, 2012 CFR
2012-10-01
... reasonable to foresee fault currents or an unusual risk of lightning, you must protect the pipeline against... metallic structures, unless you electrically interconnect and cathodically protect the pipeline and the... isolation of a portion of a pipeline is necessary to facilitate the application of corrosion control. (c...
Geohazard assessment lifecycle for a natural gas pipeline project
NASA Astrophysics Data System (ADS)
Lekkakis, D.; Boone, M. D.; Strassburger, E.; Li, Z.; Duffy, W. P.
2015-09-01
This paper is a walkthrough of the geohazard risk assessment performed for the Front End Engineering Design (FEED) of a planned large-diameter natural gas pipeline extending from Eastern Europe to Western Asia, with a total length of approximately 1,850 km. The geohazards discussed herein include liquefaction-induced pipe buoyancy, cyclic softening, lateral spreading, slope instability, groundwater rise-induced pipe buoyancy, and karst. The geohazard risk assessment lifecycle comprised four stages. Initially, a desktop study was carried out to describe the geologic setting along the alignment and to conduct a preliminary assessment of the geohazards; the development of a comprehensive Digital Terrain Model from topography and aerial photography data was fundamental in this process. Subsequently, field geohazard mapping was conducted with the deployment of 8 teams of geoprofessionals to investigate the proposed major reroutes and delve into areas of poor or questionable data. During the third stage, a geotechnical subsurface site investigation was executed based on the results of the desktop study and mapping efforts, in order to obtain sufficient data tailored for risk quantification. Lastly, all gathered and processed information was overlaid in a Geographic Information System database for a final determination of the critical reaches of the pipeline alignment. Input from Subject Matter Experts (SMEs) in the fields of landslides, karst and fluvial geomorphology was incorporated during the second and fourth stages of the assessment; their experience in that particular geographical region was key to making appropriate decisions based on engineering judgment. As the design evolved through these stages, the pipeline corridor was narrowed from a 2-km wide corridor, to a 500-m corridor, and finally to a fixed alignment. Where the geohazard risk was high, rerouting of the pipeline was generally selected as a mitigation measure. In some cases of high uncertainty in the assessment, further exploration was proposed; where rerouting was constrained, mitigation via structural measures was proposed instead. This paper further discusses the cost, schedule and resource challenges of planning and executing such a large-scale geotechnical investigation; the interfaces between the various disciplines involved; the innovative tools employed for field mapping; the classifications developed for mapping landslides, karst geology and trench excavatability and for determining liquefaction stretches; and the process for siting the Above Ground Installations (AGIs). It finally discusses the objectives of the FEED study in terms of providing a route, a ±20% project cost estimate and a schedule, as well as the additional engineering work foreseen for the detailed engineering phase of the project.
2009-01-01
preliminary assessment , that no longer pose a significant risk or require further activity under CERCLA. Resource Conservation and Recovery Act...facts regarding, or prediction or forecast of, any environmental risk for any property. Only a Phase I Environmental Site Assessment performed by an... Assessments (E 1527-05) or custom requirements developed for the evaluation of environmental risk associated with a parcel of real estate. TARGET PROPERTY
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.
2017-01-01
The experience acquired through development, implementation and operation of the Kepler/K2 science pipelines can provide lessons learned for the development of science pipelines for other missions such as NASA's Transiting Exoplanet Survey Satellite, and ESA's PLATO mission.
Oil and gas pipeline construction cost analysis and developing regression models for cost estimation
NASA Astrophysics Data System (ADS)
Thaduri, Ravi Kiran
In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, right-of-way (ROW) and miscellaneous costs make up the total cost of pipeline construction. The pipelines are analyzed based on different pipeline lengths, diameters, locations, pipeline volumes and years of completion. In pipeline construction, labor costs dominate the total cost, with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths and locations. The compressor stations are analyzed based on capacity, year of completion and location. Unlike pipeline costs, material costs dominate the total cost in the construction of compressor stations, with an average share of about 50.6%. Land costs have very little influence on the total cost. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
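Component costs of this kind are often modeled as power laws, which become linear in log space. As an illustration of how such a regression could be fit (the functional form, variable names and coefficients below are assumptions for demonstration, not the study's actual models):

```python
import numpy as np

def fit_cost_model(length_km, diameter_in, cost):
    """Fit a hypothetical power-law cost model cost = a * L^b * D^c by
    ordinary least squares in log space. Illustrative stand-in for the
    multiple non-linear regression models described above."""
    X = np.column_stack([np.ones_like(length_km),
                         np.log(length_km),
                         np.log(diameter_in)])
    coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    a, b, c = np.exp(coef[0]), coef[1], coef[2]
    return a, b, c
```

Fitting in log space keeps the model linear in its parameters while capturing the multiplicative scaling of cost with length and diameter.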
MetaCompare: A computational pipeline for prioritizing environmental resistome risk.
Oh, Min; Pruden, Amy; Chen, Chaoqi; Heath, Lenwood S; Xia, Kang; Zhang, Liqing
2018-04-26
The spread of antibiotic resistance is a growing public health concern. While numerous studies have highlighted the importance of environmental sources and pathways of the spread of antibiotic resistance, a systematic means of comparing and prioritizing risks represented by various environmental compartments is lacking. Here we introduce MetaCompare, a publicly-available tool for ranking 'resistome risk,' which we define as the potential for antibiotic resistance genes (ARGs) to be associated with mobile genetic elements (MGEs) and mobilize to pathogens based on metagenomic data. A computational pipeline was developed in which each ARG is evaluated based on relative abundance, mobility, and presence within a pathogen. This is determined through assembly of shotgun sequencing data and analysis of contigs containing ARGs to determine if they contain sequence similarity to MGEs or human pathogens. Based on the assembled metagenomes, samples are projected into a 3-D hazard space and assigned resistome risk scores. To validate, we tested previously published metagenomic data derived from distinct aquatic environments. Based on unsupervised machine learning, the test samples clustered in the hazard space in a manner consistent with their origin. The derived scores produced a well-resolved ascending resistome risk ranking of: wastewater treatment plant effluent, dairy lagoon, hospital sewage.
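The scoring idea can be illustrated with a toy version: classify each assembled contig by whether it carries an ARG, an MGE, and pathogen-like sequence; use abundance-weighted fractions as coordinates in the 3-D hazard space; and collapse them into one score. This sketch is loosely modeled on the description above; the actual MetaCompare weighting and score formula are not reproduced here:

```python
from dataclasses import dataclass

@dataclass
class Contig:
    abundance: float
    has_arg: bool
    has_mge: bool
    has_pathogen: bool

def resistome_hazard(contigs):
    """Project a sample into a 3-D hazard space (ARG, ARG+MGE,
    ARG+MGE+pathogen abundance fractions) and collapse it to one score.
    The simple averaging used here is an illustrative assumption."""
    total = sum(c.abundance for c in contigs) or 1.0
    arg = sum(c.abundance for c in contigs if c.has_arg) / total
    arg_mge = sum(c.abundance for c in contigs
                  if c.has_arg and c.has_mge) / total
    arg_mge_path = sum(c.abundance for c in contigs
                       if c.has_arg and c.has_mge and c.has_pathogen) / total
    point = (arg, arg_mge, arg_mge_path)
    score = (arg + arg_mge + arg_mge_path) / 3  # stand-in aggregate score
    return point, score
```

Samples from different environments would then cluster in this space by origin, as the validation on aquatic metagenomes suggests.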
NASA Astrophysics Data System (ADS)
Duan, Yanzhi
2017-01-01
The gas pipeline networks in the Sichuan and Chongqing (Sichuan-Chongqing) region form a fully fledged gas pipeline transportation system in China, which supports and promotes the rapid development of the gas market in the region. As the market-oriented economy develops further, it is necessary to carry the pipeline system reform forward in the areas of the investment/financing system, the operation system and the pricing system, to lay a solid foundation for improving future gas production and marketing capability, to adapt to the national gas system reform, and to achieve the objectives of multiparty participation in pipeline construction, improved pipeline transportation efficiency, and fair and rational pipeline transportation prices. This article addresses the main thinking on reform in these three areas and the major deployment, and recommends corresponding measures on developing a shared pipeline economy, providing financial support for pipeline construction, setting up an independent regulatory agency to strengthen supervision of gas pipeline transportation, and promoting the construction of a regional gas trading market.
Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong
2016-01-01
Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to an urban gas pipeline was evaluated using analytic hierarchy process theory and fuzzy mathematics. A fault tree of third-party damage containing 56 basic events was built through hazard identification. Fuzzy evaluation of the basic event probabilities was conducted by expert judgment using membership functions of fuzzy sets. The weight of each expert was determined, and the evaluation opinions were modified, using an improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a large provincial capital city as an example, the method's risk assessment results were shown to conform to the actual situation, providing a basis for safety risk prevention.
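The expert-aggregation step can be sketched as follows: each expert supplies a triangular fuzzy number (l, m, u) for a basic event's probability, the opinions are combined using AHP-derived expert weights, and the result is defuzzified by the centroid rule. The weights and numbers below are illustrative assumptions, not values from the study:

```python
def aggregate_expert_opinion(opinions, weights):
    """Aggregate expert judgments given as triangular fuzzy numbers
    (l, m, u) using normalized expert weights (e.g. from an AHP pairwise
    comparison), then defuzzify with the centroid rule."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    l = sum(w * o[0] for o, w in zip(opinions, weights))
    m = sum(w * o[1] for o, w in zip(opinions, weights))
    u = sum(w * o[2] for o, w in zip(opinions, weights))
    return (l + m + u) / 3.0  # centroid of a triangular fuzzy number
```

The defuzzified values for the 56 basic events would then propagate up the fault tree through its AND/OR gates to yield the top-event failure probability.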
Maloney, Kelly O.; Young, John A.; Faulkner, Stephen; Hailegiorgis, Atesmachew; Slonecker, E. Terrence; Milheim, Lesley
2018-01-01
The development of unconventional oil and gas (UOG) involves infrastructure development (well pads, roads and pipelines), well drilling and stimulation (hydraulic fracturing), and production; all of which have the potential to affect stream ecosystems. Here, we developed a fine-scaled (1:24,000) catchment-level disturbance intensity index (DII) that included 17 measures of UOG capturing all steps in the development process (infrastructure, water withdrawals, probabilistic spills) that could affect headwater streams (< 200 km2 in upstream catchment) in the Upper Susquehanna River Basin in Pennsylvania, U.S.A. The DII ranged from 0 (no UOG disturbance) to 100 (the catchment with the highest UOG disturbance in the study area) and it was most sensitive to removal of pipeline cover, road cover and well pad cover metrics. We related this DII to three measures of high quality streams: Pennsylvania State Exceptional Value (EV) streams, Class A brook trout streams and Eastern Brook Trout Joint Venture brook trout patches. Overall only 3.8% of all catchments and 2.7% of EV stream length, 1.9% of Class A streams and 1.2% of patches were classified as having medium to high level DII scores (> 50). Well density, often used as a proxy for development, only correlated strongly with well pad coverage and produced materials, and therefore may miss potential effects associated with roads and pipelines, water withdrawals and spills. When analyzed with a future development scenario, 91.1% of EV stream length, 68.7% of Class A streams and 80.0% of patches were in catchments with a moderate to high probability of development. Our method incorporated the cumulative effects of UOG on streams and can be used to identify catchments and reaches at risk to existing stressors or future development.
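A catchment-level index of this shape can be sketched as a weighted sum of disturbance metrics, rescaled so the most disturbed catchment scores 100 and an undisturbed one scores 0. The metric names and weights below are hypothetical stand-ins for the study's 17 UOG measures:

```python
def disturbance_index(metrics_by_catchment, weights):
    """Combine per-catchment disturbance metrics into a weighted raw
    score, then min-max rescale to 0-100 relative to the most disturbed
    catchment, mirroring the DII described above. Illustrative only."""
    raw = {cid: sum(weights[k] * v for k, v in m.items())
           for cid, m in metrics_by_catchment.items()}
    peak = max(raw.values()) or 1.0  # guard against an all-zero study area
    return {cid: 100.0 * r / peak for cid, r in raw.items()}
```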
75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), Department of... Gas Pipeline Safety Act of 1968, Public Law 90-481, delegated to DOT the authority to develop...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
State of art of seismic design and seismic hazard analysis for oil and gas pipeline system
NASA Astrophysics Data System (ADS)
Liu, Aiwen; Chen, Kun; Wu, Jian
2010-06-01
The purpose of this paper is to adopt a uniform-confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Unlike a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.
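The 2%-, 5%- and 10%-in-50-years probabilities that define the pipe classes above correspond to mean return periods under the common assumption of Poisson earthquake occurrence. A minimal sketch of the conversion (the Poisson assumption is a standard hazard-analysis convention, not necessarily the paper's own derivation):

```python
import math

def return_period(p_exceed, t_years=50.0):
    """Convert an exceedance probability over an exposure time into a
    mean return period, assuming Poisson occurrence:
    P = 1 - exp(-T / RP)  =>  RP = -T / ln(1 - P)."""
    return -t_years / math.log(1.0 - p_exceed)
```

For example, the 2%-in-50-years class corresponds to roughly a 2,475-year return period, 5% to about 975 years, and 10% to about 475 years.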
Physical and numerical modeling of hydrophysical processes on the site of underwater pipelines
NASA Astrophysics Data System (ADS)
Garmakova, M. E.; Degtyarev, V. V.; Fedorova, N. N.; Shlychkov, V. A.
2018-03-01
The paper outlines issues related to ensuring the safe operation of underwater pipelines that are at risk of accidents. The research is based on physical and mathematical modeling of local bottom erosion in the area of the pipeline location. The experimental studies were performed at the Hydraulics Laboratory of the Department of Hydraulic Engineering Construction, Safety and Ecology of NSUACE (Sibstrin). The physical experiments revealed that the intensity of bottom soil reforming depends on the burial depth of the pipeline. The ANSYS software was used for numerical modeling of the erosion of the sandy bottom under the pipeline, and computational results at various mass flow rates were compared.
Tsunami Early Warning via a Physics-Based Simulation Pipeline
NASA Astrophysics Data System (ADS)
Wilson, J. M.; Rundle, J. B.; Donnellan, A.; Ward, S. N.; Komjathy, A.
2017-12-01
Through independent efforts, physics-based simulations of earthquakes, tsunamis, and atmospheric signatures of these phenomena have been developed. With the goal of producing tsunami forecasts and early warning tools for at-risk regions, we join these three spheres to create a simulation pipeline. The Virtual Quake simulator can produce thousands of years of synthetic seismicity on large, complex fault geometries, as well as the expected surface displacement in tsunamigenic regions. These displacements are used as initial conditions for tsunami simulators, such as Tsunami Squares, to produce catalogs of potential tsunami scenarios with probabilities. Finally, these tsunami scenarios can act as input for simulations of the associated ionospheric total electron content, signals which can be detected by GNSS satellites for purposes of early warning in the event of a real tsunami. We present the most recent developments in this project.
77 FR 51848 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Program for Gas Distribution Pipelines. DATES: Interested persons are invited to submit comments on or.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and...
77 FR 74275 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and... control room. Affected Public: Operators of both natural gas and hazardous liquid pipeline systems. Annual...
NASA Astrophysics Data System (ADS)
Leporini, M.; Terenzi, A.; Marchetti, B.; Giacchetta, G.; Polonara, F.; Corvaro, F.; Cocci Grifoni, R.
2017-11-01
Pipelining Liquefied Petroleum Gas (LPG) is a mode of LPG transportation that is more environmentally friendly than others due to its lower energy consumption and exhaust emissions. Worldwide, there are over 20,000 kilometers of LPG pipelines, and a number of industry codes govern the design, fabrication, construction and operation of liquid LPG pipelines. However, no standards exist for modelling certain critical phenomena that can occur on these lines due to external environmental conditions, such as pressurization by solar radiation. In fact, solar radiation can expose above-ground pipeline sections to pressures above the maximum design pressure, with resulting risks and problems. The present work presents an innovative practice suitable for the oil and gas industry for modelling the pressurization induced by solar radiation on above-ground LPG pipeline sections, with application to a real case.
NASA Astrophysics Data System (ADS)
Vetrov, A.
2009-05-01
The condition of underground constructions, communication and supply systems in cities has to be monitored and controlled periodically in order to prevent breakage, which can result in serious accidents, especially in urban areas. At greatest risk of damage are underground structures made of steel, such as the pipelines widely used for water, gas and heat supply. To ensure pipeline survivability it is necessary to carry out rapid and inexpensive monitoring of pipeline condition, and induced electromagnetic methods of geophysics can provide such diagnostics. The highly developed surface in urban areas is one of the factors hampering the application of electromagnetic diagnostic methods. The main problem is finding an appropriate place for the source line and electrodes on a limited surface area, and their optimal position relative to the observation path, so as to minimize their influence on the observed data. The author performed a number of experiments on diagnostics of an underground heating-system pipeline using different positions of the source line and electrodes. The experiments were made on a 200-meter section over a pipeline buried 2 meters deep. The admissible length of the source line and the angle between the source line and the observation path were determined: for the experimental conditions and accuracy, the minimal length of the source line was 30 meters, and the maximum admissible angular departure from the perpendicular position was 30 degrees. The work was undertaken in cooperation with the diagnostics company DIsSO, Saint Petersburg, Russia.
Korneeva, Ia A; Simonova, N N
2015-01-01
The article is devoted to the study of character accentuations as a criterion of psychological risk in the professional activity of builders of main gas pipelines in Arctic conditions. The aim was to study the severity of character accentuations in rotation-employed builders of main gas pipelines, conditioned by their professional activity, as well as the personal resources available to overcome these destructions. The study involved 70 rotation-employed builders of trunk pipelines working in the Tyumen Region (shift duration 52 days), aged from 23 to 59 years (mean age 34.9 ± 8.1), with work experience from 0.5 to 14 years (average 4.42 ± 3.1). Study methods: questionnaires, psychological testing and participant observation, analyzed with the one-sample Student's t-test, multiple regression analysis and stepwise analysis. The work revealed differences in the expression of character accentuations between builders of trunk pipelines with less and more than five years of rotation work experience. It was determined that builders of main gas pipelines working on rotation in the Arctic who have more pronounced character accentuations mainly use the psychological defenses of compensation, substitution and denial, and show an average level of flexibility as a regulatory process.
Implications of deregulation in natural gas industry on utility risks and returns
NASA Astrophysics Data System (ADS)
Addepalli, Rajendra P.
This thesis examines the changes in risk and required return on capital for local distribution utility companies in the increasingly competitive natural gas industry. The deregulation in the industry impacts the LDCs in several ways. First, with the introduction of competition consumers have been given choices among suppliers besides the traditional monopoly, the local utility, for purchasing their natural gas supply needs. Second, with the introduction of competition, some of the interstate pipelines were stuck with 'Take Or Pay' contracts and other costs that resulted in 'stranded costs', which have been passed on to customers of the pipeline including the LDCs. Third, the new obligation for the LDCs to purchase gas from the market, as opposed to buying it from pipelines and passing on the costs to its customers, brought opportunities and risks as well. Finally, with the introduction of competition, in some states LDCs have been allowed to enter into unregulated ventures to increase their profits. In the thesis we first develop a multifactor model (MFM) to explain historical common stock returns of individual utilities and of utility portfolios. We use 'rolling regression' analysis to analyze how different variables explain the variation in stock returns over time. Second, we conduct event studies to analyze the events in the deregulation process that had significant impacts on the LDC returns. Finally we assess the changes in risk and required return on capital for the LDCs over a 15 year time frame, covering the deregulation period. We employ four aspects in the examination of risk and return profile of the utilities: measuring (a) changes in required return on common equity and Weighted Average Cost of Capital, (b) changes in risk premium (WACC less an interest rate proxy), (c) changes in utility bond ratings, and (d) changes in dividend payments, new debt and equity issuances. 
We perform regression analysis to explain the changes in the required WACC using new security issuances, dividend payments and revenues of the companies.
PipelineDog: a simple and flexible graphic pipeline construction and maintenance tool.
Zhou, Anbo; Zhang, Yeting; Sun, Yazhou; Xing, Jinchuan
2018-05-01
Analysis pipelines are an essential part of bioinformatics research, and ad hoc pipelines are frequently created by researchers for prototyping and proof-of-concept purposes. However, most existing pipeline management systems or workflow engines are too complex for rapid prototyping or for learning the pipeline concept. A lightweight, user-friendly and flexible solution is thus desirable. In this study, we developed a new pipeline construction and maintenance tool, PipelineDog. This is a web-based integrated development environment with a modern web graphical user interface. It offers cross-platform compatibility, project management capabilities, code formatting and error checking functions, and an online repository. It uses an easy-to-read/write script system that encourages code reuse. With the online repository, it also encourages sharing of pipelines, which enhances analysis reproducibility and accountability. For most users, PipelineDog requires no software installation. Overall, this web application provides a way to rapidly create and easily manage pipelines. The PipelineDog web app is freely available at http://web.pipeline.dog. The command line version is available at http://www.npmjs.com/package/pipelinedog and the online repository at http://repo.pipeline.dog. ysun@kean.edu or xing@biology.rutgers.edu or ysun@diagnoa.com. Supplementary data are available at Bioinformatics online.
Development of Protective Coatings for Co-Sequestration Processes and Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwagen, Gordon; Huang, Yaping
2011-11-30
The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted on pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research was intended to ensure safety and reliability for a pipeline transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for internal pipeline coatings for transporting SCCO2.
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is daunting for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline, Integrated SNP Mining and Utilization (ISMU), has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface for SNP discovery and their utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC), and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. It is very useful for the plant genetics and breeding community without computational expertise for discovering SNPs and utilizing them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in Java and is available as free standalone software at http://hpc.icrisat.cgiar.org/ISMU.
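The pipeline's core output, SNPs between all pairwise combinations of genotypes, can be illustrated with a toy comparison of aligned consensus sequences. This is a simplified sketch of the idea, not ISMU's actual implementation; the names and the '-' missing-call convention are assumptions:

```python
from itertools import combinations

def pairwise_snps(consensus):
    """Report SNP positions between all pairwise combinations of genotypes.

    consensus: dict mapping genotype name -> aligned consensus sequence
               (equal length, '-' marking positions with no call).
    Returns a dict mapping (genoA, genoB) -> list of
    (0-based position, baseA, baseB) tuples where the calls differ.
    """
    snps = {}
    for a, b in combinations(sorted(consensus), 2):
        seq_a, seq_b = consensus[a], consensus[b]
        # A SNP requires a confident, differing call in both genotypes
        diffs = [(i, x, y) for i, (x, y) in enumerate(zip(seq_a, seq_b))
                 if x != y and x != '-' and y != '-']
        snps[(a, b)] = diffs
    return snps
```

A real pipeline would additionally filter on read depth, base quality and flanking-sequence uniqueness before reporting a SNP.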
The Hyper Suprime-Cam software pipeline
Bosch, James; Armstrong, Robert; Bickerton, Steven; ...
2017-10-12
Here in this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
The Very Large Array Data Processing Pipeline
NASA Astrophysics Data System (ADS)
Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako
2018-01-01
We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline.
Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomy Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
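The "context" structure described above can be sketched as a small object that each task writes its heuristic decisions and results into, so that later stages and a weblog-style report can inspect them. The class and task names below are illustrative assumptions, not the actual CASA/VLA pipeline API:

```python
class Context:
    """Minimal sketch of a pipeline 'context': accumulates each task's
    heuristic decisions and results for later stages to inspect."""
    def __init__(self):
        self.decisions = []   # ordered (task name, decision text) pairs
        self.results = {}     # task name -> result payload

    def record(self, task, decision, result):
        self.decisions.append((task, decision))
        self.results[task] = result

def run_pipeline(tasks, context):
    """Execute atomic tasks in order, threading the shared context through."""
    for task in tasks:
        task(context)
    return context

# Hypothetical tasks: each records the heuristic choice it made.
def calibrate(ctx):
    ctx.record("calibrate", "reference antenna = ea01", {"gain": 1.02})

def image(ctx):
    # A later stage reads an earlier stage's result from the context
    gain = ctx.results["calibrate"]["gain"]
    ctx.record("image", f"cell size chosen using gain {gain}", {"rms": 0.1})
```

Rendering `context.decisions` as a report is essentially what the weblog interface does for quality assurance.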
75 FR 76077 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
.... ADDRESSES: Comments may be submitted in the following ways: E-Gov Web Site: http://www.regulations.gov....regulations.gov , including any personal information provided. You should know that anyone is able to search... meters) deep as measured from mean low water that are at risk of being an exposed underwater pipeline or...
About U.S. Natural Gas Pipelines
2007-01-01
This information product provides the interested reader with a broad and non-technical overview of how the U.S. natural gas pipeline network operates, along with some insights into the many individual pipeline systems that make up the network. While the focus of the presentation is the transportation of natural gas over the interstate and intrastate pipeline systems, information on subjects related to pipeline development, such as system design and pipeline expansion, is also included.
Design and Operation of the World's First Long Distance Bauxite Slurry Pipeline
NASA Astrophysics Data System (ADS)
Gandhi, Ramesh; Weston, Mike; Talavera, Maru; Brittes, Geraldo Pereira; Barbosa, Eder
Mineração Bauxita Paragominas (MBP) is the first long distance slurry pipeline transporting bauxite slurry. Bauxite had developed a reputation for being difficult to transport hydraulically over long distance pipelines; this myth has now been proven wrong. The 245-km-long, 13.5 MTPY capacity MBP pipeline was designed and commissioned by PSI for CVRD. The pipeline is located in the State of Pará, Brazil. The Miltonia bauxite mine is in a remote location with no other efficient means of transport. The bauxite slurry is delivered to the Alunorte alumina refinery located near Barcarena. This first-of-its-kind pipeline required significant development work to assure technical and economic feasibility. This paper describes the technical aspects of the pipeline design and summarizes the operating experience gained during the first year of operation.
Whiley, H; Keegan, A; Fallowfield, H; Bentham, R
2015-06-01
Water reuse has become increasingly important for sustainable water management. Currently, its application is primarily constrained by the potential health risks. Presently there is limited knowledge regarding the presence and fate of opportunistic pathogens along reuse water distribution pipelines. In this study opportunistic human pathogens Legionella spp., L. pneumophila and Mycobacterium avium complex were detected using real-time polymerase chain reaction along two South Australian reuse water distribution pipelines at maximum concentrations of 10⁵, 10³ and 10⁵ copies/mL, respectively. During the summer period of sampling the concentration of all three organisms significantly increased (P < 0.05) along the pipeline, suggesting multiplication and hence viability. No seasonality in the decrease in chlorine residual along the pipelines was observed. This suggests that the combination of reduced chlorine residual and increased water temperature promoted the presence of these opportunistic pathogens.
NASA Astrophysics Data System (ADS)
Goldoni, P.
2011-03-01
The X-shooter data reduction pipeline is an integral part of the X-shooter project, it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium with contributions from France, The Netherlands and ESO and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first function is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for a quick-look control, and in Garching, for a more thorough evaluation. The second function is to allow an optimized data reduction for a scientific user. In the following I will first outline the main steps of data reduction with the pipeline then I will briefly show two examples of optimization of the results for science reduction.
NASA Astrophysics Data System (ADS)
Wen, Shipeng; Xu, Jishang; Hu, Guanghai; Dong, Ping; Shen, Hong
2015-08-01
The safety of submarine pipelines is largely influenced by free spans and corrosion. Previous studies of free spans caused by seabed scour are mainly based on stable environments, where the background seabed scour is in equilibrium and the soil is homogeneous. To study the effects of background erosion on the free span development of subsea pipelines, a submarine pipeline located at the abandoned Yellow River subaqueous delta lobe was investigated with an integrated surveying system that included a multibeam bathymetric system, a dual-frequency side-scan sonar, a high resolution sub-bottom profiler, and a Magnetic Flux Leakage (MFL) sensor. We found that seabed homogeneity has a great influence on the free span development of the pipeline. More specifically, for homogeneous background scour, the morphology of the scour hole below the pipeline is quite similar to that without background scour, whereas for inhomogeneous background scour, the nature of spanning depends mainly on the evolution of the seabed morphology near the pipeline. MFL detection results also reveal a possible connection between long free spans and accelerated corrosion of the pipeline.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-02-01
Upper East Fork Poplar Creek Operable Unit 2 consists of the Abandoned Nitric Acid Pipeline (ANAP). This pipeline was installed in 1951 to transport liquid wastes approximately 4800 ft from Buildings 9212, 9215, and 9206 to the S-3 Ponds. Materials known to have been discharged through the pipeline include nitric acid, depleted and enriched uranium, various metal nitrates, salts, and lead skimmings. During the mid-1980s, sections of the pipeline were removed during various construction projects. A total of 19 locations along the pipeline were chosen for investigation in the first phase of this Remedial Investigation. Sampling consisted of drilling down to obtain a soil sample at a depth immediately below the pipeline. Additional samples were obtained deeper in the subsurface depending upon the depth of the pipeline, the depth of the water table, and the point of auger refusal. The 19 samples collected below the pipeline were analyzed by the Oak Ridge Y-12 Plant's laboratory for metals, nitrate/nitrite, and isotopic uranium. Samples collected from three boreholes were also analyzed for volatile organic compounds because these samples produced a response with organic vapor monitoring equipment. Uranium activities in the soil samples ranged from 0.53 to 13.0 pCi/g for {sup 238}U, from 0.075 to 0.75 pCi/g for {sup 235}U, and from 0.71 to 5.0 pCi/g for {sup 234}U. Maximum total values for lead, chromium, and nickel were 75.1 mg/kg, 56.3 mg/kg, and 53.0 mg/kg, respectively. The maximum nitrate/nitrite value detected was 32.0 mg-N/kg. One sample obtained adjacent to a sewer line contained various organic compounds, at least some of which were tentatively identified as fragrance chemicals commonly associated with soaps and cleaning solutions. The results of the baseline human health risk assessment for the ANAP contaminants of potential concern show no unacceptable risks to human health.
An approach for estimating toxic releases of H2S-containing natural gas.
Jianwen, Zhang; Da, Lei; Wenxing, Feng
2014-01-15
China is well known to be rich in sulfurous natural gas, with huge deposits widely distributed across the country. Because of its toxicity, the release of hydrogen sulfide-containing natural gas from pipelines poses serious threats to people, society and the environment around the release source. A CFD algorithm is adopted to simulate the gas dispersion process, and the results show that the Gaussian plume model is suitable for determining the region affected by a well blowout of hydrogen sulfide-containing natural gas. Based on an analysis of release scenarios, the present study proposes a new approach for estimating the risk of hydrogen sulfide poisoning hazards caused by releases of hydrogen sulfide-containing natural gas. Historical accident statistics from the EGIG (European Gas Pipeline Incident Data Group) and British Gas Transco are integrated into the approach. The dose-load effect is also introduced to characterize the hazards through two essential parameters: toxic concentration and exposure time. The approach was applied to three release scenarios on the East-Sichuan Gas Transportation Project, and the individual and societal risks are classified and discussed. Results show that societal risk varies significantly with factors including population density, distance from the pipeline, and operating conditions. Concerning the dispersion of the hazardous gas, available safe egress time was studied from the perspective of individual fatality risk. The present approach can provide reliable support for the safety management and maintenance of natural gas pipelines, as well as for evacuations after release incidents. Copyright © 2013 Elsevier B.V. All rights reserved.
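The Gaussian plume model the authors validate against CFD has a standard textbook form: a crosswind Gaussian profile multiplied by a vertical term with ground reflection. A sketch of that generic formula (the dispersion coefficients and all numeric inputs here are illustrative assumptions, not the paper's parameterization):

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (mass per volume).

    Q: release rate (kg/s), u: mean wind speed (m/s),
    y: crosswind offset (m), z: receptor height (m),
    H: effective release height (m),
    sigma_y, sigma_z: dispersion coefficients (m) at the downwind distance.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Image-source term accounts for reflection at the ground (z = 0)
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

Combining such a concentration field with an exposure time yields the dose-load input to the toxic-effect analysis described above.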
Tuberculosis vaccines: time to think about the next generation.
Kaufmann, Stefan H E
2013-04-01
Efforts over the last two decades have led to a rich research and development pipeline of tuberculosis (TB) vaccines. Although none of the candidates has successfully completed the clinical trial pipeline, many are under advanced clinical assessment. These vaccines aim at prevention of active TB; most are being considered for preexposure use, with recent additions for postexposure or multistage administration. A few therapeutic vaccines are under clinical assessment as well. Preexposure vaccination with the licensed TB vaccine BCG prevents severe forms of TB in children but not in adolescents and adults. The current vaccine pipeline does not include strategies that prevent or eliminate infection with the causative agent Mycobacterium tuberculosis (Mtb). Rather, in a best-case scenario, they are quantitatively superior to BCG in preventing active TB over prolonged periods of time, ideally lifelong, in the face of latent Mtb infection. Qualitatively superior vaccines should be capable of preventing or eliminating Mtb infection, thereby eliminating the risk of TB reactivation. The time is now ripe to exploit radically new strategies to achieve this goal. Copyright © 2013. Published by Elsevier Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This study, conducted by Bechtel, was funded by the U.S. Trade and Development Agency. The report specifically addresses an LNG terminal and associated gas pipeline, the crude oil pipeline component of the Southern Seaboard project, in addition to a power plant which uses a portion of the gas. Volume II contains the Appendix and is divided into the following sections: (1.0) PTT Data; (2.0) Design Criteria; (3.0) Khao Bo Ya Soils Data; (4.0) Khao Bo Ya Oceanographic Data; (5.0) Thailand Seismic Data; (6.0) Risk Assessment; (7.0) Equipment Lists; (8.0) Equipment Data Sheets; (9.0) Drawings; (10.0) Cost Data; (11.0) Calculations; (12.0) Terms of Reference.
More than a Metaphor: The Contribution of Exclusionary Discipline to a School-to-Prison Pipeline
ERIC Educational Resources Information Center
Skiba, Russell J.; Arredondo, Mariella I.; Williams, Natasha T.
2014-01-01
The term and construct "school-to-prison" pipeline has been widely used by advocates, researchers, and policymakers to describe the relationship between school disciplinary practices and increased risk of juvenile justice contact. It has been unclear whether the construct is a useful heuristic or a descriptor of empirically validated…
Deng, Yajun; Hu, Hongbing; Yu, Bo; Sun, Dongliang; Hou, Lei; Liang, Yongtu
2018-01-15
The rupture of a high-pressure natural gas pipeline can pose a serious threat to human life and environment. In this research, a method has been proposed to simulate the release of natural gas from the rupture of high-pressure pipelines in any terrain. The process of gas releases from the rupture of a high-pressure pipeline is divided into three stages, namely the discharge, jet, and dispersion stages. Firstly, a discharge model is established to calculate the release rate of the orifice. Secondly, an improved jet model is proposed to obtain the parameters of the pseudo source. Thirdly, a fast-modeling method applicable to any terrain is introduced. Finally, based upon these three steps, a dispersion model, which can take any terrain into account, is established. Then, the dispersion scenarios of released gas in four different terrains are studied. Moreover, the effects of pipeline pressure, pipeline diameter, wind speed and concentration of hydrogen sulfide on the dispersion scenario in real terrain are systematically analyzed. The results provide significant guidance for risk assessment and contingency planning of a ruptured natural gas pipeline. Copyright © 2017. Published by Elsevier B.V.
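For the discharge stage described above, a common simplification is the choked (sonic) orifice equation for an ideal gas, since pipeline pressures far exceed the critical pressure ratio at the rupture. A sketch under that assumption (the default coefficients are illustrative textbook values, not the paper's discharge model):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def choked_release_rate(p0, T0, area, gamma=1.31, M=0.016, Cd=0.62):
    """Sonic (choked) orifice mass flow for an ideal gas, in kg/s.

    p0: stagnation pressure in the pipeline (Pa)
    T0: gas temperature (K)
    area: rupture orifice area (m^2)
    gamma: heat-capacity ratio (~1.31 for methane)
    M: molar mass (kg/mol, ~0.016 for methane)
    Cd: discharge coefficient for the orifice
    """
    term = gamma * M / (R * T0) * (2 / (gamma + 1)) ** ((gamma + 1) / (gamma - 1))
    return Cd * area * p0 * math.sqrt(term)
```

This orifice rate would feed the jet stage as the source strength; the rate falls over time as the pipeline depressurizes, which a full model tracks with an inventory balance.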
The SCUBA Data Reduction Pipeline: ORAC-DR at the JCMT
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
The ORAC data reduction pipeline, developed for UKIRT, has been designed to be a completely general approach to writing data reduction pipelines. This generality has enabled the JCMT to adapt the system for use with SCUBA with minimal development time using the existing SCUBA data reduction algorithms (Surf).
Report to Congress on Sustainable Ranges
2014-02-01
with the potential to impact Army training and testing. These energy initiatives include wind turbines, new energy corridors for gas/oil pipelines and...the capability to effectively test and train inside the range boundaries. This is particularly evident when the Doppler Effect from wind turbines...adverse impacts from wind turbine installation. These "High Risk of Adverse Impact Zones" will provide developers with advance information on
A graph-based approach for designing extensible pipelines
2012-01-01
Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities requiring different combinations of steps in each execution. Here, pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where multiple software tools are needed to perform comprehensive analyses, such as gene expression and proteomics analyses.
The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
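The graph formulation above, with formats as nodes, tools as edges, and pipelines as paths, can be sketched with a breadth-first search that compiles the shortest chain of tools on demand. The tool and format names are hypothetical, and this is a sketch of the idea rather than the project's actual algorithm:

```python
from collections import deque

def find_pipeline(edges, source_fmt, target_fmt):
    """Compose a pipeline as a path through a directed graph.

    edges: dict mapping tool name -> (input format, output format).
    Formats are graph nodes, tools are graph edges, and a path is a
    pipeline. Returns the shortest list of tools converting source_fmt
    to target_fmt (breadth-first search), or None if no chain exists.
    """
    adj = {}
    for tool, (src, dst) in edges.items():
        adj.setdefault(src, []).append((tool, dst))
    queue = deque([(source_fmt, [])])
    seen = {source_fmt}
    while queue:
        fmt, path = queue.popleft()
        if fmt == target_fmt:
            return path
        for tool, nxt in adj.get(fmt, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [tool]))
    return None
```

Adding a new conversion tool then only requires registering one edge; every pipeline that can use it is discovered automatically, which is the extensibility property the paper emphasizes.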
Diagnostic Inspection of Pipelines for Estimating the State of Stress in Them
NASA Astrophysics Data System (ADS)
Subbotin, V. A.; Kolotilov, Yu. V.; Smirnova, V. Yu.; Ivashko, S. K.
2017-12-01
The diagnostic inspection used to estimate the technical state of a pipeline is described. The problems of inspection works are listed, and a functional-structural scheme is developed to estimate the state of stress in a pipeline. Final conclusions regarding the actual loading of a pipeline section are drawn from a cross-analysis of all the information obtained during pipeline inspection.
DOT National Transportation Integrated Search
2010-08-01
Significant financial and environmental consequences often result from line leakage in oil product pipelines. Product can escape into the surrounding soil, and even the smallest leak can develop into a rupture of the pipeline. From a health perspective, water...
Freight pipelines: Current status and anticipated future use
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-07-01
This report is issued by the Task Committee on Freight Pipelines, Pipeline Division, ASCE. Freight pipelines of various types (including slurry pipelines, pneumatic pipelines, and capsule pipelines) have been used throughout the world for over a century for transporting solids and sometimes even packaged products. Recent advancements in pipeline technology, aided by advanced computer control systems and trenchless technologies, have greatly facilitated the transportation of solids by pipelines. Today, in many situations, freight pipelines are not only the most economical and practical means for transporting solids, they are also the most reliable, safest and most environmentally friendly transportation mode. Increased use of underground pipelines to transport freight is anticipated in the future, especially as the technology continues to improve and surface transportation modes such as highways become more congested. This paper describes the state of the art and expected future uses of various types of freight pipelines. Obstacles hindering the development and use of the most advanced freight pipeline systems, such as the pneumatic capsule pipeline for interstate transport of freight, are discussed.
The Vulnerability Formation Mechanism and Control Strategy of the Oil and Gas Pipeline City
NASA Astrophysics Data System (ADS)
Chen, Y. L.; Han, L.
2017-12-01
Most oil and gas pipelines in our country have been in service for more than 25 years. These pipes are buried underground and are difficult to inspect routinely; in addition, they are vulnerable to environmental degradation, corrosion and natural disasters, so accidents have a hidden character. Rapid urbanization, population accumulation, dense building and insufficient safety clearances are all reasons for the frequent accidents involving oil and gas pipelines. Therefore, appraising and understanding the safety condition of oil and gas pipelines across a city's various regions is vitally important. To ensure the safety of the oil and gas pipeline city, this paper first defines the connotation of oil and gas pipeline city vulnerability based on previous vulnerability research. Then, from the three perspectives of environment, structure and behavior, and based on the "structure-vulnerability conduct-performance" analytical paradigm, the indicators influencing the vulnerability of oil and gas pipelines are analyzed and a vulnerability formation mechanism framework for the oil and gas pipeline city is constructed. Finally, the paper proposes a regulating strategy to decrease the city's vulnerability index, enabling vulnerability evaluation of the city and providing new ideas for its sustainable development.
Chery, Joyce G; Sass, Chodon; Specht, Chelsea D
2017-09-01
We developed a bioinformatic pipeline that leverages a publicly available genome and published transcriptomes to design primers in conserved coding sequences flanking targeted introns of single-copy nuclear loci. Paullinieae (Sapindaceae) is used to demonstrate the pipeline. Transcriptome reads phylogenetically closer to the lineage of interest are aligned to the closest genome. Single-nucleotide polymorphisms are called, generating a "pseudoreference" closer to the lineage of interest. Several filters are applied to meet the criteria of single-copy nuclear loci with introns of a desired size. Primers are designed in conserved coding sequences flanking introns. Using this pipeline, we developed nine single-copy nuclear intron markers for Paullinieae. This pipeline is highly flexible and can be used for any group with available genomic and transcriptomic resources. This pipeline led to the development of nine variable markers for phylogenetic study without generating sequence data de novo.
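The locus-filtering step described above can be sketched as a simple predicate over candidate loci. This Python sketch is an illustration under assumed criteria (single-copy status, an intron-size window, and conservation of the coding flanks); the `Locus` fields, thresholds, and locus names are invented, not the authors' actual pipeline code.

```python
# Hypothetical sketch of the single-copy intron-marker filtering step.
# Fields, thresholds, and locus names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Locus:
    name: str
    copy_number: int       # estimated copies in the reference genome
    intron_length: int     # bp, from the genome annotation
    flank_identity: float  # conservation of coding flanks across transcriptomes

def filter_loci(loci, min_intron=500, max_intron=1500, min_identity=0.90):
    """Keep single-copy loci whose intron size and flank conservation
    meet the primer-design criteria."""
    return [l for l in loci
            if l.copy_number == 1
            and min_intron <= l.intron_length <= max_intron
            and l.flank_identity >= min_identity]

candidates = [
    Locus("locusA", 1, 800, 0.95),
    Locus("locusB", 2, 700, 0.97),   # multi-copy: rejected
    Locus("locusC", 1, 200, 0.99),   # intron too short: rejected
]
print([l.name for l in filter_loci(candidates)])  # ['locusA']
```

Primers would then be designed within the conserved flanks of the loci that survive the filter.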
Torching the Haystack: modelling fast-fail strategies in drug development.
Lendrem, Dennis W; Lendrem, B Clare
2013-04-01
By quickly clearing the development pipeline of failing or marginal products, fast-fail strategies release resources to focus on more promising molecules. The Quick-Kill model of drug development demonstrates that fast-fail strategies will: (1) reduce the expected time to market; (2) reduce expected R&D costs; and (3) increase R&D productivity. This paper outlines the model and demonstrates the impact of fast-fail strategies. The model is illustrated with costs and risks data from pharmaceutical and biopharmaceutical companies. Copyright © 2012 Elsevier Ltd. All rights reserved.
Data as a Service: A Seismic Web Service Pipeline
NASA Astrophysics Data System (ADS)
Martinez, E.
2016-12-01
Publishing data as a service pipeline provides an improved, dynamic approach over static data archives. A service pipeline is a collection of micro web services that each perform a specific task and expose the results of that task. Structured request/response formats allow micro web services to be chained together into a service pipeline to provide more complex results. The U.S. Geological Survey adopted service pipelines to publish seismic hazard and design data supporting both specific and generalized audiences. The seismic web service pipeline starts at source data and exposes probabilistic and deterministic hazard curves, response spectra, risk-targeted ground motions, and seismic design provision metadata. This pipeline supports public/private organizations and individual engineers/researchers. Publishing data as a service pipeline provides a variety of benefits. Exposing the component services enables advanced users to inspect or use the data at each processing step. Exposing a composite service gives new users quick access to published data with a very low barrier to entry. Advanced users may re-use micro web services by chaining them in new ways or injecting new micro services into the pipeline. This allows users to test hypotheses and compare their results to published results. Exposing data at each step in the pipeline enables users to review and validate the data and process more quickly and accurately. Making the source code open source, per USGS policy, further enables this transparency. Each micro service may be scaled independently of any other micro service. This ensures data remain available and timely in a cost-effective manner regardless of load. Additionally, if a new or more efficient approach to processing the data is discovered, the new approach may replace the old one at any time, keeping the pipeline running while not affecting other micro services.
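The chaining of micro services via structured request/response formats can be sketched as follows. This is a minimal in-process Python analogue of the idea; real service pipelines exchange structured JSON over HTTP, and the service names, payload fields, and placeholder arithmetic here are assumptions for illustration only.

```python
# Toy service pipeline: each "micro service" takes a request dict and
# returns a response dict, so services chain by feeding one response
# into the next request. Names and values are illustrative.
def source_data_service(request):
    # stand-in for the source-data step: attach a hazard curve to the site
    return {**request, "curve": [0.9, 0.5, 0.1]}

def hazard_curve_service(request):
    # placeholder reduction of the curve to a single hazard value
    return {**request, "hazard": max(request["curve"])}

def design_value_service(request):
    # placeholder design provision derived from the hazard value
    return {**request, "design_value": request["hazard"] * 2}

def run_pipeline(request, services):
    """Chain services: each consumes the previous response as its request."""
    response = request
    for service in services:
        response = service(response)
    return response

result = run_pipeline({"site": "34.05,-118.25"},
                      [source_data_service, hazard_curve_service,
                       design_value_service])
print(result["design_value"])  # 1.8
```

Because each step is exposed separately, an advanced user can call an intermediate service directly or splice a replacement into the list, which is the re-chaining benefit the abstract describes.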
Zisapel, Nava
2012-09-01
Sleep is a vital neurochemical process involving sleep-promoting and arousal centers in the brain. Insomnia is a pervasive disorder characterized by difficulties in initiating or maintaining sleep, or by non-refreshing (poor-quality) sleep, with clinically significant daytime distress. Insomnia is more prevalent in women and in old age and puts sufferers at significant physical and mental health risk. This review summarizes published data on the current and emerging insomnia drug classes, the rationale for their development and the associated risks/benefits (Summary of Product Characteristics and Medline search on "hypnotic" or specific drug names and "insomnia"). GABA(A) receptor modulators facilitate sleep onset and some improve maintenance, but increase risks of dependence, memory, cognitive and psychomotor impairments, falls, accidents and mortality. Melatonin receptor agonists improve quality of sleep and/or sleep onset, but the response may develop over several days. They have more benign safety profiles and are indicated for milder insomnia, longer usage and (prolonged-release melatonin) older patients. Histamine H1 receptor antagonists improve sleep maintenance, but their effects on cognition, memory and falls remain to be demonstrated. Late-stage pipeline orexin OX1/OX2 and serotonin 5-HT2A receptor antagonists may hold the potential to address several unmet needs in insomnia pharmacotherapy, but safety issues cast some doubt over their future. Current and new insomnia drugs in the pipeline target different sleep-regulating mechanisms and symptoms and have different tolerability profiles. Drug selection would ideally be based on improvement in the quality of patients' sleep, overall quality of life and functional status, weighed against risk to the individual and to public health.
The High Level Data Reduction Library
NASA Astrophysics Data System (ADS)
Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.
2015-09-01
The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience showed that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions that are to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C using the Common Pipeline Library (CPL) functionality. HDRL adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.
NASA Astrophysics Data System (ADS)
Pérez-López, F.; Vallejo, J. C.; Martínez, S.; Ortiz, I.; Macfarlane, A.; Osuna, P.; Gill, R.; Casale, M.
2015-09-01
BepiColombo is an interdisciplinary ESA mission to explore the planet Mercury in cooperation with JAXA. The mission consists of two separate orbiters: ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (MMO), which are dedicated to the detailed study of the planet and its magnetosphere. The MPO scientific payload comprises eleven instrument packages covering different disciplines, developed by several European teams. This paper describes the design and development approach of the framework required to support the operation of the distributed BepiColombo MPO instrument pipelines, developed and operated from different locations but designed as a single entity. An architecture based on a primary-redundant configuration, fully integrated into the BepiColombo Science Operations Control System (BSCS), has been selected: some instrument pipelines will be operated from the instrument teams' data processing centres, with a pipeline replica that can be run from the Science Ground Segment (SGS), while others will be executed as primary pipelines from the SGS, with the SGS adopting the pipeline orchestration role.
Underwater Adhesives Retrofit Pipelines with Advanced Sensors
NASA Technical Reports Server (NTRS)
2015-01-01
Houston-based Astro Technology Inc. used a partnership with Johnson Space Center to pioneer an advanced fiber-optic monitoring system for offshore oil pipelines. The company's underwater adhesives allow it to retrofit older deepwater systems in order to measure pressure, temperature, strain, and flow properties, giving energy companies crucial data in real time and significantly decreasing the risk of a catastrophe.
Development of a robotic system of nonstripping pipeline repair by reinforced polymeric compositions
NASA Astrophysics Data System (ADS)
Rybalkin, LA
2018-03-01
The article considers the possibility of creating a robotic system for pipeline repair. The repair is performed by forming an inner layer of special polyurethane compositions reinforced with short glass-fiber strands. This approach makes it possible to repair pipelines without excavation work or pipe replacement.
Expansion of the U.S. Natural Gas Pipeline Network
2009-01-01
Additions in 2008 and Projects through 2011. This report examines new natural gas pipeline capacity added to the U.S. natural gas pipeline system during 2008. In addition, it discusses and analyzes proposed natural gas pipeline projects that may be developed between 2009 and 2011, and the market factors supporting these initiatives.
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
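The "lazy processing" concept the abstract attributes to BDS can be illustrated generically: a step re-runs only when its output is missing or older than its inputs, so a restarted pipeline skips already-completed work. The Python sketch below is an assumption-laden illustration of that idea, not BDS syntax or its actual implementation.

```python
# Generic illustration of lazy, restart-safe pipeline steps: a task is
# re-executed only if its output file is missing or stale relative to
# its inputs. File names here are illustrative.
import os

def needs_run(inputs, output):
    """True if output is missing or older than any input."""
    if not os.path.exists(output):
        return True
    out_mtime = os.path.getmtime(output)
    return any(os.path.getmtime(i) > out_mtime for i in inputs)

def task(inputs, output, action):
    if needs_run(inputs, output):
        action()            # do the work
        return "ran"
    return "skipped"        # lazy: up-to-date results are reused

# Example: a pipeline step that concatenates two files.
with open("a.txt", "w") as f: f.write("A")
with open("b.txt", "w") as f: f.write("B")

def concat():
    with open("ab.txt", "w") as out:
        for name in ("a.txt", "b.txt"):
            out.write(open(name).read())

print(task(["a.txt", "b.txt"], "ab.txt", concat))  # 'ran' on first execution
print(task(["a.txt", "b.txt"], "ab.txt", concat))  # 'skipped' (up to date)
```

On a crash and restart, re-invoking the same script repeats only the unfinished steps, which is the robustness-by-recovery behavior described above.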
Seismic hazard evaluation of the Oman India pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, K.W.; Thenhaus, P.C.; Mullee, J.E.
1996-12-31
The proposed Oman India pipeline will traverse approximately 1,135 km of the northern Arabian Sea floor and adjacent continental shelves at depths of over 3 km on its route from Ra`s al Jifan, Oman, to Rapar Gadhwali, India. The western part of the route crosses active faults that form the transform boundary between the Arabian and Indian tectonic plates. The eastern terminus of the route lies in the vicinity of the great (M ~ 8) 1829 Kutch, India earthquake. A probabilistic seismic hazard analysis was used to estimate the values of peak ground acceleration (PGA) with return periods of 200, 500 and 1,000 years at selected locations along the pipeline route and the submarine Indus Canyon, a possible source of large turbidity flows. The results defined the ground-shaking hazard along the pipeline route and Indus Canyon for evaluation of risks to the pipeline from potential earthquake-induced geologic hazards such as liquefaction, slope instability, and turbidity flows. 44 refs.
Monitoring of pipeline ruptures by means of a Robust Satellite Technique (RST)
NASA Astrophysics Data System (ADS)
Filizzola, C.; Baldassarre, G.; Corrado, R.; Mazzeo, G.; Marchese, F.; Paciello, R.; Pergola, N.; Tramutoli, V.
2009-04-01
Pipeline ruptures have deep economic and ecological consequences, so pipeline networks represent critical infrastructures to be carefully monitored, particularly in areas frequently affected by natural disasters such as earthquakes, hurricanes and landslides. In order to minimize damage, the detection of harmful events along pipelines should be as rapid as possible and, at the same time, what is detected should be an actual incident and not a false alarm. In this work, a Robust Satellite Technique (RST), already applied to the prediction and NRT (Near Real Time) monitoring of major natural and environmental hazards (such as seismically active areas, volcanic activity, hydrological risk, forest fires and oil spills), has been employed to automatically identify from satellite anomalous Thermal Infrared (TIR) transients related to explosions of oil/gas pipelines. In this context, the combination of the RST approach with the high temporal resolution offered by geostationary satellites seems to assure both reliable and timely detection of such events. The potential of the technique (applied to MSG-SEVIRI data) was tested over Iraq, a region sadly known for numerous (mainly man-made) pipeline accidents, in order to simulate the effects of natural disasters (such as fires or explosions near or directly involving a pipeline facility).
Natural Gas Compressor Stations on the Interstate Pipeline Network: Developments Since 1996
2007-01-01
This special report looks at the use of natural gas pipeline compressor stations on the interstate natural gas pipeline network that serves the lower 48 states. It examines the compression facilities added over the past 10 years and how the expansions have supported pipeline capacity growth intended to meet the increasing demand for natural gas.
Data-driven risk models could help target pipeline safety inspections
DOT National Transportation Integrated Search
2008-07-01
Federal safety agencies share a common problem: the need to target resources effectively to reduce risk. One way this targeting is commonly done is with a risk model that uses safety data along with expert judgment to identify and weight risks ...
The ORAC-DR data reduction pipeline
NASA Astrophysics Data System (ADS)
Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.
2008-03-01
The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.
NASA Astrophysics Data System (ADS)
Pogue, Brian W.; Paulsen, Keith D.; Hull, Sally M.; Samkoe, Kimberley S.; Gunn, Jason; Hoopes, Jack; Roberts, David W.; Strong, Theresa V.; Draney, Daniel; Feldwisch, Joachim
2015-03-01
Molecular guided oncology surgery has the potential to transform the way decisions about resection are made, and can be critically important in areas such as neurosurgery where the margins of tumor relative to critical normal tissues are not readily apparent from visual or palpable guidance. Yet there are major financial barriers to advancing agents into clinical trials with commercial backing. We observe that development of these agents in the standard biological therapeutic paradigm is not viable, due to the high up-front financial investment needed and the limitations in the revenue models of contrast agents for imaging. The hypothesized solution to this problem is to develop small molecular biologicals tagged with an established fluorescent reporter, through the chemical agent approval pathway, targeting phase 0 trials initially, such that the initial startup phase can be completely funded by a single NIH grant. In this way, fast trials can be completed to de-risk the development pipeline and advance the idea of fluorescence-guided surgery (FGS) reporters into human testing. As with biological therapies, the potential successes of each agent are still moderate, but this process will allow the field to advance in a more stable and productive manner, rather than relying upon isolated molecules developed at high cost and risk. The pathway proposed and tested here uses peptide synthesis of an epidermal growth factor receptor (EGFR)-binding Affibody molecule, uniquely conjugated to IRDye 800CW, developed and tested in academic and industrial laboratories with well-established records for GMP production, fill and finish, toxicity testing, and early-phase clinical trials with image guidance.
ERIC Educational Resources Information Center
Chase, Lance Montieth
2012-01-01
Prior studies establish that Black males follow a disproportionate trajectory from school to prison when compared to other groups. This same research has documented that multiple risk factors operating within schools may contribute to this phenomenon, commonly known as the "school to prison pipeline." The specific focus of the present…
Stability of subsea pipelines during large storms
Draper, Scott; An, Hongwei; Cheng, Liang; White, David J.; Griffiths, Terry
2015-01-01
On-bottom stability design of subsea pipelines transporting hydrocarbons is important to ensure safety and reliability but is challenging to achieve in the onerous metocean (meteorological and oceanographic) conditions typical of large storms (such as tropical cyclones, hurricanes or typhoons). This challenge is increased by the fact that industry design guidelines presently give no guidance on how to incorporate the potential benefits of seabed mobility, which can lead to lowering and self-burial of the pipeline on a sandy seabed. In this paper, we demonstrate recent advances in experimental modelling of pipeline scour and present results investigating how pipeline stability can change in a large storm. An emphasis is placed on the initial development of the storm, where scour is inevitable on an erodible bed as the storm velocities build up to peak conditions. During this initial development, we compare the rate at which peak near-bed velocities increase in a large storm (typically less than 10⁻³ m s⁻²) to the rate at which a pipeline scours and subsequently lowers (which is dependent not only on the storm velocities, but also on the mechanism of lowering and the pipeline properties). We show that the relative magnitude of these rates influences pipeline embedment during a storm and the stability of the pipeline. PMID:25512592
Bioinformatic pipelines in Python with Leaf
2013-01-01
Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce pipeline formality on top of a dynamic development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user’s Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
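The kind of pipeline formality Leaf layers over ordinary Python code can be illustrated with a toy dependency graph: steps declare what they depend on, and results are cached so re-running a step reuses upstream work (a simple form of session persistence). The decorator and graph structure below are an invented illustration, not Leaf's actual syntax.

```python
# Toy pipeline layer: @step registers a function and its dependencies;
# run() executes a node after its dependencies, caching every result.
results = {}
graph = {}

def step(*deps):
    """Register a function as a pipeline node depending on `deps`."""
    def register(fn):
        graph[fn] = deps
        return fn
    return register

def run(node):
    """Run a node, computing (and caching) its dependencies first."""
    if node not in results:
        results[node] = node(*(run(d) for d in graph[node]))
    return results[node]

@step()
def load_reads():
    return ["ACGT", "ACGA"]

@step(load_reads)
def count_reads(reads):
    return len(reads)

print(run(count_reads))  # 2
```

Making the graph explicit is what enables the consistency checks and protocol publication features the abstract describes: the structure of the analysis exists as data, not just as control flow.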
Viability of using different types of main oil pipelines pump drives
NASA Astrophysics Data System (ADS)
Zakirzakov, A. G.; Zemenkov, Yu D.; Akulov, K. A.
2018-05-01
The choice of drive for the pumping units of main oil pipelines is of great importance both for the design of new pipelines and for the modernization of existing ones. At the beginning of the development of oil pipeline transport, the choice was not difficult owing to the limited number and types of energy sources. The combustion energy of the pumped product was often the only energy resource available for its transportation. In this regard, pipelines with autonomous energy sources compared favorably with other energy consumers in the sector. Over time, with the development of the country's electricity supply system, the electric drive became the dominant type of pumping station drive. Nowadays, tradition remains an essential factor when choosing the type of drive. For many years, oil companies have used electric drives for pumps, while gas transport enterprises prefer self-contained gas turbines.
Drive Control System for Pipeline Crawl Robot Based on CAN Bus
NASA Astrophysics Data System (ADS)
Chen, H. J.; Gao, B. T.; Zhang, X. H.; Deng, Z. Q.
2006-10-01
The drive control system plays an important role in a pipeline robot. In order to inspect flaws and corrosion in seabed crude oil pipelines, an original mobile pipeline robot with a crawler drive unit, a power and monitoring unit, a central control unit, and an ultrasonic inspection device has been developed. A CAN bus connects these function units and provides a reliable information channel. Considering the limited space, a compact hardware system is designed around an ARM processor with two CAN controllers. With a made-to-order CAN protocol for the crawl robot, an intelligent drive control system is developed. The implementation of the crawl robot demonstrates that the presented drive control scheme can meet the motion control requirements of the underwater pipeline crawl robot.
Study of a risk-based piping inspection guideline system.
Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung
2007-02-01
A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the Rational Unified Process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, and hence to enable effective prediction of potential piping risks and to enhance the degree of safety that petrochemical plants can be expected to achieve in their operations. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risks resulted from a small number of pipelines.
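The core of a risk-based inspection model is ranking equipment by a risk score that combines failure likelihood and consequence, so inspection effort concentrates on the few high-risk lines of the kind the study identifies. The sketch below is a hypothetical illustration: the segment names and 1-5 category scores are invented, and real RBI models use far richer damage-factor calculations.

```python
# Hypothetical risk ranking: each segment has (likelihood, consequence)
# category scores from 1 (low) to 5 (high); risk = likelihood x consequence.
segments = {
    "crude transfer line": (4, 5),
    "cooling water line":  (2, 2),
    "flare header":        (3, 4),
}

def risk(lc):
    likelihood, consequence = lc
    return likelihood * consequence

# Inspect the highest-risk segments first.
ranked = sorted(segments, key=lambda s: risk(segments[s]), reverse=True)
print(ranked[0])  # crude transfer line (risk score 20)
```

Ranking like this reproduces, in miniature, the study's finding that a small number of pipelines carry most of the risk.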
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
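The quantitative side of fault-tree (or FET) analysis reduces to propagating basic-event probabilities through AND/OR gates, assuming independent events. The event names and probabilities below are invented for illustration; they are not from the paper's subsea pipeline case study.

```python
# Minimal fault-tree quantification with independent basic events.
def gate_or(probs):
    # P(A or B or ...) = 1 - prod(1 - p_i)
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def gate_and(probs):
    # P(A and B and ...) = prod(p_i)
    q = 1.0
    for p in probs:
        q *= p
    return q

# Illustrative tree: leak requires coating failure AND cathodic-protection
# failure; damage is anchor drag OR trawl impact; either causes the top event.
corrosion_leak = gate_and([0.10, 0.20])
mechanical_damage = gate_or([0.01, 0.02])
top_event = gate_or([corrosion_leak, mechanical_damage])
print(round(top_event, 4))  # 0.0492
```

The FET approach described above aims to make the discovery of these nodes less subjective; once the tree exists, the arithmetic is exactly this gate propagation.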
Langlois, Lillie A; Drohan, Patrick J; Brittingham, Margaret C
2017-07-15
Large, continuous forest provides critical habitat for some species of forest-dependent wildlife. The rapid expansion of shale gas development within the northern Appalachians results in direct loss of such habitat at well sites, pipelines, and access roads; however, the resulting habitat fragmentation surrounding such areas may be of greater importance. Previous research has suggested that infrastructure supporting gas development is the driver for habitat loss, but knowledge of what specific infrastructure affects habitat is limited by a lack of spatial tracking of infrastructure development in different land uses. We used high-resolution aerial imagery, land cover data, and well point data to quantify shale gas development across four time periods (2010, 2012, 2014, 2016), including: the number of wells permitted, drilled, and producing gas (a measure of pipeline development); land use change; and forest fragmentation on both private and public land. As of April 2016, the majority of shale gas development was located on private land (74% of constructed well pads); however, the number of wells drilled per pad was lower on private than on public land (3.5 and 5.4, respectively). Loss of core forest was more than double on private land compared with public land (4.3% and 2.0%, respectively), which likely results from better management practices implemented on public land. Pipelines were by far the largest contributor to the fragmentation of core forest due to shale gas development. Forecasting future land use change resulting from gas development suggests that the greatest loss of core forest will occur when pads are constructed farthest from pre-existing pipelines (new pipelines must be built to connect pads) and in areas with greater amounts of core forest. To reduce future fragmentation, our results suggest new pads should be placed near pre-existing pipelines and methods to consolidate pipelines with other infrastructure should be used.
Without these mitigation practices, we will continue to lose core forest as a result of new pipelines and infrastructure, particularly on private land. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized post-analysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep bioinformatics pipelines for NGS data analysis. As examples, we have deployed two pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint, and from a user perspective, running the pipelines is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for non-bioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, Bio-Docklets also enables developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
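The single-endpoint pattern the meta-script implements can be sketched as a thin wrapper over the docker CLI: mount one input directory (read-only) and one output directory, then run the container. The image name and mount paths below are hypothetical; only standard `docker run` flags (`--rm`, `-v`) are assumed, and this is not the actual Bio-Docklets meta-script, which drives Galaxy through BioBlend rather than raw docker calls.

```python
# Hedged sketch of a single input/output container launcher.
# Image name and paths are hypothetical illustrations.
import subprocess

def build_cmd(image, input_dir, output_dir):
    """Assemble the docker command: one read-only input mount, one output mount."""
    return [
        "docker", "run", "--rm",
        "-v", f"{input_dir}:/data/input:ro",
        "-v", f"{output_dir}:/data/output",
        image,
    ]

def run_docklet(image, input_dir, output_dir):
    """Launch the pipeline container and wait for completion."""
    return subprocess.run(build_cmd(image, input_dir, output_dir), check=True)

# Usage (hypothetical image name; requires a local Docker daemon):
# run_docklet("example/rnaseq-pipeline:latest", "/data/reads", "/data/results")
```

From the user's perspective this is one command with one input and one output, which is exactly the abstraction the abstract describes; scaling to many datasets means calling `run_docklet` once per dataset.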
Evaluation of fishing gear induced pipeline damage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellinas, C.P.; King, B.; Davies, R.
1995-12-31
Impact and damage to pipelines caused by fishing activities are among the hazards faced by North Sea pipelines during their operating lives. Available data indicate that about one in ten reported incidents is due to fishing activities. This paper is concerned with one such occurrence, the assessment of the resulting damage, the methods used to confirm pipeline integrity, and the approaches developed for its repair.
Managing risks in the project pipeline.
DOT National Transportation Integrated Search
2013-08-01
This research focuses on how to manage the risks of project costs and revenue uncertainties over the long-term, and identifies significant : process improvements to ensure projects are delivered on time and as intended, thus maximizing the miles pave...
Application of the API/NPRA SVA methodology to transportation security issues.
Moore, David A
2006-03-17
Security vulnerability analysis (SVA) is becoming more prevalent as the issue of chemical process security is of greater concern. The American Petroleum Institute (API) and the National Petrochemical and Refiners Association (NPRA) developed a guideline for conducting SVAs of petroleum and petrochemical facilities in May 2003. In 2004, the same organizations enhanced the guidelines by adding the ability to evaluate transportation security risks (pipeline, truck, and rail). The importance of including transportation and value chain security, in addition to fixed facility security, in an SVA is that these issues may be critically important to understanding the total risk of the operation. Most of the SVAs done using the API/NPRA and other SVA methods have centered on the fixed facility and the operations within the plant fence. Transportation interfaces alone are normally studied as a part of the facility SVA, and the entire transportation route impacts and value chain disruption are not commonly considered. Particularly from a national, regional, or local infrastructure analysis standpoint, understanding the interdependencies is critical to the risk assessment. Transportation risks may include weaponization of the asset by direct attack en route, sabotage, or a Trojan Horse style attack into a facility. The risks differ in the level of access control and the degree of public exposure, as well as the dynamic nature of the assets. The public exposures along the transportation route need to be carefully considered. Risks may be mitigated by one of many strategies including internment, staging, prioritization, conscription, or prohibition, as well as by administrative security measures and technology for monitoring and isolating the assets. This paper illustrates how these risks can be analyzed by the API/NPRA SVA methodology. Examples are given of a pipeline operation, and other examples are found in the guidelines.
Automated Laser Ultrasonic Testing (ALUT) of Hybrid Arc Welds for Pipeline Construction, #272
DOT National Transportation Integrated Search
2009-12-22
One challenge in developing new gas reserves is the high cost of pipeline construction. Welding costs are a major component of overall construction costs. Industry continues to seek advanced pipeline welding technologies to improve productivity and s...
Why the poor pay with their lives: oil pipeline vandalisation, fires and human security in Nigeria.
Onuoha, Freedom C
2009-07-01
Since its discovery in Nigeria in 1956 crude oil has been a source of mixed blessing to the country. It is believed to have generated enormous wealth, but it has also claimed a great many lives. Scholarly attention on the impact of oil on security in Nigeria has largely focused on internal conflicts rather than on how disasters associated with oil pipeline vandalisation have impacted on human security in terms of causing bodily injuries and death, destroying livelihoods and fracturing families. This paper examines how pipeline vandalisation affects human security in these ways. It identifies women and children as those who are hardest hit and questions why the poor are the most vulnerable in oil pipeline disasters in this country. It recommends the adoption of a comprehensive and integrated framework of disaster management that will ensure prompt response to key early warning signs, risk-reduction and appropriate mitigation and management strategies.
CCDLAB: A Graphical User Interface FITS Image Data Reducer, Viewer, and Canadian UVIT Data Pipeline
NASA Astrophysics Data System (ADS)
Postma, Joseph E.; Leahy, Denis
2017-11-01
CCDLAB was originally developed as a FITS image data reducer and viewer; development then continued to provide ground support for the UVIT detector system supplied by the Canadian Space Agency to the Indian Space Research Organization's ASTROSAT satellite and UVIT telescopes. After the launch of ASTROSAT, and during UVIT's first-light and performance-verification (PV) phase starting in 2015 December, it became necessary to develop a data pipeline to produce scientific images from the Level 1 format data produced for UVIT by ISRO. Given the previous development of CCDLAB for UVIT ground support, the author provided a pipeline in which the new Level 1 format data are run through CCDLAB with the additional satellite-dependent reduction operations required to produce scientific data. Features of the pipeline are discussed, with focus on the data-reduction challenges intrinsic to UVIT data.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
... is necessitated in order to avoid active and historic landslides, and reduce risk to the pipeline and... and historic landslides, and reduce risk from geologic faults. The purpose of this Environmental...
The ALMA Science Pipeline: Current Status
NASA Astrophysics Data System (ADS)
Humphreys, Elizabeth; Miura, Rie; Brogan, Crystal L.; Hibbard, John; Hunter, Todd R.; Indebetouw, Remy
2016-09-01
The ALMA Science Pipeline is being developed for the automated calibration and imaging of ALMA interferometric and single-dish data. The calibration Pipeline for interferometric data was accepted for use by ALMA Science Operations in 2014, and for single-dish data end-to-end processing in 2015. However, work is ongoing to expand the use cases for which the Pipeline can be used, e.g., for higher-frequency and lower signal-to-noise datasets, and for new observing modes. A current focus includes the commissioning of science target imaging for interferometric data. For the Single Dish Pipeline, the line-finding algorithm used in baseline subtraction and the baseline-flagging heuristics have been greatly improved since the prototype used for data from the previous cycle. These algorithms, unique to the Pipeline, produce better results than standard manual processing in many cases. In this poster, we report on the current status of the Pipeline capabilities, present initial results from the Imaging Pipeline, and describe the smart line-finding and flagging algorithm used in the Single Dish Pipeline. The Pipeline is released as part of CASA (the Common Astronomy Software Applications package).
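A sigma-clipping baseline estimate of the kind that underlies such line-finding heuristics can be sketched as follows. This is a generic textbook version, not the ALMA Pipeline's algorithm.

```python
from statistics import mean, stdev

def find_lines(spectrum, n_sigma=3.0, iterations=3):
    # Iteratively clip outliers to estimate the line-free baseline level...
    baseline = list(spectrum)
    for _ in range(iterations):
        m, s = mean(baseline), stdev(baseline)
        baseline = [v for v in baseline if abs(v - m) <= n_sigma * s]
    # ...then flag channels rising n_sigma above that baseline as line candidates.
    m, s = mean(baseline), stdev(baseline)
    return [i for i, v in enumerate(spectrum) if v > m + n_sigma * s]
```

On a flat synthetic spectrum with one strong channel, the clipping removes the line from the baseline estimate, so the threshold tracks the noise rather than the line.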
Fisher, Jill A; Cottingham, Marci D; Kalbaugh, Corey A
2015-04-01
In spite of a growing literature on pharmaceuticalization, little is known about the pharmaceutical industry's investments in research and development (R&D). Information about the drugs being developed can provide important context for existing case studies detailing the expanding--and often problematic--role of pharmaceuticals in society. To access the pharmaceutical industry's pipeline, we constructed a database of drugs for which pharmaceutical companies reported initiating clinical trials over a five-year period (July 2006-June 2011), capturing 2477 different drugs in 4182 clinical trials. Comparing drugs in the pipeline that target diseases in high-income and low-income countries, we found that the number of drugs for diseases prevalent in high-income countries was 3.46 times higher than drugs for diseases prevalent in low-income countries. We also found that the plurality of drugs in the pipeline was being developed to treat cancers (26.2%). Interpreting our findings through the lens of pharmaceuticalization, we illustrate how investigating the entire drug development pipeline provides important information about patterns of pharmaceuticalization that are invisible when only marketed drugs are considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
TESS Data Processing and Quick-look Pipeline
NASA Astrophysics Data System (ADS)
Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office
2018-01-01
We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Nancy Porter
2003-05-01
The two broad categories of deposited weld metal repair and fiber-reinforced composite repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repairs and for fiber-reinforced composite repair. To date, all of the experimental work pertaining to the evaluation of potential repair methods has focused on fiber-reinforced composite repairs. Hydrostatic testing was also conducted on four pipeline sections with simulated corrosion damage: two with composite liners and two without.
Guidelines for riser splash zone design and repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-02-01
The many years of offshore oil and gas development have established the subsea pipeline as a reliable and cost effective means of transportation for produced hydrocarbons. The requirement for subsea pipeline systems will continue to move into deeper water and more remote locations with the future development of oil and gas exploration. The integrity of subsea pipeline and riser systems, throughout their operating lifetime, is an important area for operators to consider in maximizing reliability and serviceability for economic, contractual and environmental reasons. Adequate design and installation are the basis for ensuring the integrity of any subsea pipeline and riser systems. In the event of system damage, from any source, quick and accurate repair and reinstatement of the pipeline system is essential. This report has been developed to provide guidelines for riser and splash zone design, to perform a detailed overview of existing riser repair techniques and products, and to prepare comprehensive guidelines identifying the capabilities and limits of riser reinstatement systems.
Development of the updated system of city underground pipelines based on Visual Studio
NASA Astrophysics Data System (ADS)
Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong
2009-10-01
Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the development platform, with Oracle9i as the underlying database for data storage and ArcGIS SDE 9.1 as the spatial data engine; the management software was developed with Visual Studio visualization development tools. Because the system's pipeline update function suffered from slow updates and occasional data loss, and to ensure that the underground pipeline data can be updated conveniently and frequently while remaining current and complete, we developed and added a new update module to the system. The module provides powerful data update functions, including data input and output and rapid updating of large data volumes. The new module was likewise developed with Visual Studio visualization tools and uses Microsoft Access as its underlying database. Graphics can be edited in AutoCAD software, and the database is updated through the link between the graphics and the system. Practice shows that the update module has good compatibility with the original system and updates the database reliably and efficiently.
A Spatial Risk Analysis of Oil Refineries within the United States
2012-03-01
regulator and consumer. This is especially true within the energy sector, which is composed of electrical power, oil, and gas infrastructure. [Cited: J. Simonoff, C. Restrepo, R. Zimmerman, and Z. Naphtali, "Analysis of Electrical Power and Oil and Gas Pipeline Failures," in International Federation for Information Processing, E. Goetz and S. Shenoi, eds.]
NASA Astrophysics Data System (ADS)
Rui, Zhenhua
This study analyzes historical cost data of 412 pipelines and 220 compressor stations. On the basis of this analysis, the study also evaluates the feasibility of an Alaska in-state gas pipeline using Monte Carlo simulation techniques. Analysis of pipeline construction costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary by diameter, length, volume, year, and location. Overall average learning rates for pipeline material and labor costs are 6.1% and 12.4%, respectively. Overall average cost shares for pipeline material, labor, miscellaneous, and right of way (ROW) are 31%, 40%, 23%, and 7%, respectively. Regression models are developed to estimate pipeline component costs for different lengths, cross-sectional areas, and locations. An analysis of inaccuracy in pipeline cost estimation demonstrates that the cost estimation of pipeline cost components is biased except in the case of total costs. Overall overrun rates for pipeline material, labor, miscellaneous, ROW, and total costs are 4.9%, 22.4%, -0.9%, 9.1%, and 6.5%, respectively, and project size, capacity, diameter, location, and year of completion have different degrees of impact on cost overruns of pipeline cost components. Analysis of compressor station costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary in terms of capacity, year, and location. Average learning rates for compressor station material and labor costs are 12.1% and 7.48%, respectively. Overall average cost shares of material, labor, miscellaneous, and ROW are 50.6%, 27.2%, 21.5%, and 0.8%, respectively. Regression models are developed to estimate compressor station component costs in different capacities and locations. An investigation into inaccuracies in compressor station cost estimation demonstrates that the cost estimation for compressor stations is biased except in the case of material costs.
Overall average overrun rates for compressor station material, labor, miscellaneous, land, and total costs are 3%, 60%, 2%, -14%, and 11%, respectively, and cost overruns for cost components are influenced by location and year of completion to different degrees. Monte Carlo models are developed and simulated to evaluate the feasibility of an Alaska in-state gas pipeline by assigning triangular distributions to the values of economic parameters. Simulated results show that the construction of an Alaska in-state natural gas pipeline is feasible under three scenarios: 500 million cubic feet per day (mmcfd), 750 mmcfd, and 1000 mmcfd.
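The Monte Carlo feasibility step can be sketched with triangular draws as described above. The cost and cash-flow numbers below are invented placeholders, not the study's inputs.

```python
import random

def npv(capex, annual_cash, years, rate):
    # Net present value of one up-front cost plus a level annual cash flow.
    return -capex + sum(annual_cash / (1 + rate) ** t for t in range(1, years + 1))

def feasibility(n_trials=10000, seed=42):
    # Draw uncertain capital cost and annual cash flow from triangular
    # distributions and return the fraction of trials with positive NPV.
    rng = random.Random(seed)
    positive = 0
    for _ in range(n_trials):
        capex = rng.triangular(6.0, 12.0, 8.0)  # low, high, mode ($bn)
        cash = rng.triangular(0.6, 1.8, 1.1)    # low, high, mode ($bn/yr)
        if npv(capex, cash, years=30, rate=0.08) > 0:
            positive += 1
    return positive / n_trials
```

Note that `random.triangular` takes its arguments in the order (low, high, mode); a fixed seed makes the simulation reproducible.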
Song, Yan; Dhodda, Raj; Zhang, Jun; Sydor, Jens
2014-05-01
In the recent past, we have seen an increase in the outsourcing of bioanalysis by pharmaceutical companies in support of their drug development pipelines. This trend is largely driven by the effort to reduce internal cost, especially in support of late-stage pipeline assets where established bioanalytical assays are used to analyze a large volume of samples. This article highlights our perspective on how bioanalytical laboratories within pharmaceutical companies can become the best partner in the advancement of drug development pipelines, offering high-quality support at competitive cost.
Qasim, M; Farinella, G; Zhang, J; Li, X; Yang, L; Eastell, R; Viceconti, M
2016-09-01
A finite element modelling pipeline was adopted to predict femur strength in a retrospective cohort of 100 women. The effects of the imaging protocol and the meshing technique on the ability of the femur strength to classify the fracture and the control groups were analysed. The clinical standard to estimate the risk of osteoporotic hip fracture is based on the areal bone mineral density (aBMD). A few retrospective studies have concluded that finite element (FE)-based femoral strength is a better classifier of fracture and control groups than the aBMD, while others could not find significant differences. We investigated the effect of the imaging protocol and of the FE modelling techniques on the discriminatory power of femoral strength. A retrospective cohort of 100 post-menopausal women (50 with hip fracture, 50 controls) was examined. Each subject received a dual-energy absorptiometry (DXA) exam and a computed tomography (CT) scan of the proximal femur region. Each case was modelled a number of times, using different modelling pipelines, and the results were compared in terms of accuracy in discriminating the fracture and the control cases. The baseline pipeline involved local anatomical orientation and mesh morphing. Revised pipelines involved global anatomical orientation using a full-femur atlas registration and an optimised meshing algorithm. Minimum physiological (MPhyS) and pathological (MPatS) strengths were estimated for each subject. Area under the receiver operating characteristic (ROC) curve (AUC) was calculated to compare the ability of MPhyS, MPatS and aBMD to classify the control and the cases. Differences in the modelling protocol were found to considerably affect the accuracy of the FE predictors. For the most optimised protocol, logistic regression showed aBMDNeck, MPhyS and MPatS to be significantly associated with the fracture status, with AUC of 0.75, 0.75 and 0.79, respectively.
The study emphasized the necessity of modelling the whole femur anatomy to develop a robust FE-based tool for hip fracture risk assessment. FE-strength performed only slightly better than the aBMD in discriminating the fracture and control cases. Differences between the published studies can be explained in terms of differences in the modelling protocol and cohort design.
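The AUC comparison used above has a simple rank-sum form. The scores in the test are hypothetical, not the study's data.

```python
def roc_auc(case_scores, control_scores):
    # Mann-Whitney form of the AUC: the probability that a randomly chosen
    # fracture case scores higher than a randomly chosen control (ties count 1/2).
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            wins += 1.0 if c > k else (0.5 if c == k else 0.0)
    return wins / (len(case_scores) * len(control_scores))
```

An AUC of 0.5 means the predictor is no better than chance at separating cases from controls; 1.0 means perfect separation.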
v3NLP Framework: Tools to Build Applications for Extracting Concepts from Clinical Text
Divita, Guy; Carter, Marjorie E.; Tran, Le-Thuy; Redd, Doug; Zeng, Qing T; Duvall, Scott; Samore, Matthew H.; Gundlapalli, Adi V.
2016-01-01
Introduction: Substantial amounts of clinically significant information are contained only within the narrative of the clinical notes in electronic medical records. The v3NLP Framework is a set of “best-of-breed” functionalities developed to transform this information into structured data for use in quality improvement, research, population health surveillance, and decision support. Background: MetaMap, cTAKES and similar well-known natural language processing (NLP) tools do not have sufficient scalability out of the box. The v3NLP Framework evolved out of the necessity to scale these tools up and provide a framework to customize and tune techniques that fit a variety of tasks, including document classification, tuned concept extraction for specific conditions, patient classification, and information retrieval. Innovation: Beyond scalability, several v3NLP Framework-developed projects have been efficacy tested and benchmarked. While v3NLP Framework includes annotators, pipelines and applications, its functionalities enable developers to create novel annotators and to place annotators into pipelines and scaled applications. Discussion: The v3NLP Framework has been successfully utilized in many projects including general concept extraction, risk factors for homelessness among veterans, and identification of mentions of the presence of an indwelling urinary catheter. Projects as diverse as predicting colonization with methicillin-resistant Staphylococcus aureus and extracting references to military sexual trauma are being built using v3NLP Framework components. Conclusion: The v3NLP Framework is a set of functionalities and components that provide Java developers with the ability to create novel annotators and to place those annotators into pipelines and applications to extract concepts from clinical text. There are scale-up and scale-out functionalities to process large numbers of records. PMID:27683667
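The annotators-into-pipelines composition that v3NLP enables can be sketched generically. The regexes and labels below are simplified stand-ins, not v3NLP components (which are Java).

```python
import re

def catheter_annotator(doc):
    # Flag mentions of an indwelling urinary catheter.
    if re.search(r"\bindwelling (urinary )?catheter\b", doc["text"], re.I):
        doc["annotations"].append("indwelling_urinary_catheter")
    return doc

def homelessness_annotator(doc):
    # Flag mentions of homelessness as a risk factor.
    if re.search(r"\bhomeless(ness)?\b", doc["text"], re.I):
        doc["annotations"].append("homelessness_risk_factor")
    return doc

def annotate_note(text, annotators):
    # Pass one clinical note through a configurable chain of annotators.
    doc = {"text": text, "annotations": []}
    for annotate in annotators:
        doc = annotate(doc)
    return doc["annotations"]
```

The design point is that each annotator has the same document-in, document-out signature, so pipelines are just ordered lists that can be scaled out over many records.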
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, and the provision of a solution methodology to compute an input schedule that yields minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. The heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm is implemented in no longer than O(T·E) time. This dissertation also extends the study to examine some operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no longer than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions.
Only 25% of the tested problems exceeded the optimal values by more than 30%, and approximately 40% of the tested problems were solved optimally by the algorithms.
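The objective such a heuristic minimizes, total time violation of the due delivery time-windows, can be computed directly. The numbers in the test are illustrative only.

```python
def total_violation(delivery_times, windows):
    # Sum, over products, of how far each delivery time falls outside its
    # [start, end] due time-window; zero means every window is met.
    total = 0
    for t, (start, end) in zip(delivery_times, windows):
        if t < start:
            total += start - t   # early by (start - t)
        elif t > end:
            total += t - end     # late by (t - end)
    return total
```

A reversed-flow heuristic of the kind described would search over input schedules, scoring each candidate with exactly this objective.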
Internal Corrosion Detection in Liquids Pipelines
DOT National Transportation Integrated Search
2012-01-01
PHMSA project DTRS56-05-T-0005 "Development of ICDA for Liquid Petroleum Pipelines" led to the development of a Direct Assessment (DA) protocol to prioritize locations of possible internal corrosion. The underlying basis LP-ICDA is simple; corrosion ...
The Kepler Science Data Processing Pipeline Source Code Road Map
NASA Technical Reports Server (NTRS)
Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima;
2016-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.
Landslide and Land Subsidence Hazards to Pipelines
Baum, Rex L.; Galloway, Devin L.; Harp, Edwin L.
2008-01-01
Landslides and land subsidence pose serious hazards to pipelines throughout the world. Many existing pipeline corridors and more and more new pipelines cross terrain that is affected by either landslides, land subsidence, or both. Consequently the pipeline industry recognizes a need for increased awareness of methods for identifying and evaluating landslide and subsidence hazard for pipeline corridors. This report was prepared in cooperation with the U.S. Department of Transportation Pipeline and Hazardous Materials Safety Administration, and Pipeline Research Council International through a cooperative research and development agreement (CRADA) with DGH Consulting, Inc., to address the need for up-to-date information about current methods to identify and assess these hazards. Chapters in this report (1) describe methods for evaluating landslide hazard on a regional basis, (2) describe the various types of land subsidence hazard in the United States and available methods for identifying and quantifying subsidence, and (3) summarize current methods for investigating individual landslides. In addition to the descriptions, this report provides information about the relative costs, limitations and reliability of various methods.
Iturbe, Rosario; Flores, Carlos; Castro, Alejandrina; Torres, Luis G
2007-10-01
Oil spills from oil pipelines are a very frequent problem in Mexico. Petroleos Mexicanos (PEMEX), very concerned with the environmental agenda, has been developing inspection and correction plans for zones around oil pipeline pumping stations and pipeline rights-of-way. These stations are located at regular intervals of kilometres along the pipelines. In this study, two sections of an oil pipeline and two pipeline pumping station zones are characterized in terms of the presence of Total Petroleum Hydrocarbons (TPHs) and Polycyclic Aromatic Hydrocarbons (PAHs). The study comprises sampling of the areas, delimitation of the vertical and horizontal extent of contamination, analysis of the sampled soils for TPH content and, in some cases, for the 16 PAHs considered priority pollutants by USEPA, calculation of contaminated areas and volumes (according to Mexican legislation, specifically NOM-EM-138-ECOL-2002) and, finally, a proposal for the remediation techniques best suited to the contamination levels and the localization of contaminants.
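The contaminated area and volume calculation can be sketched with a simple grid model. The cell size, depth, and TPH limit below are illustrative placeholders, not the NOM-EM-138-ECOL-2002 values.

```python
def contaminated_area_volume(tph_mg_kg, limit_mg_kg, cell_area_m2, depth_m):
    # Each sample represents one grid cell; cells whose TPH concentration
    # exceeds the regulatory limit contribute their area, and
    # volume = contaminated area x affected depth.
    exceeding = [c for c in tph_mg_kg if c > limit_mg_kg]
    area = len(exceeding) * cell_area_m2
    return area, area * depth_m
```

Real delimitation work would interpolate between boreholes and vary the depth per cell; the grid model is the simplest defensible estimate.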
Bad bugs, no drugs: no ESKAPE! An update from the Infectious Diseases Society of America.
Boucher, Helen W; Talbot, George H; Bradley, John S; Edwards, John E; Gilbert, David; Rice, Louis B; Scheld, Michael; Spellberg, Brad; Bartlett, John
2009-01-01
The Infectious Diseases Society of America (IDSA) continues to view with concern the lean pipeline for novel therapeutics to treat drug-resistant infections, especially those caused by gram-negative pathogens. Infections now occur that are resistant to all current antibacterial options. Although the IDSA is encouraged by the prospect of success for some agents currently in preclinical development, there is an urgent, immediate need for new agents with activity against these panresistant organisms. There is no evidence that this need will be met in the foreseeable future. Furthermore, we remain concerned that the infrastructure for discovering and developing new antibacterials continues to stagnate, thereby risking the future pipeline of antibacterial drugs. The IDSA proposed solutions in its 2004 policy report, "Bad Bugs, No Drugs: As Antibiotic R&D Stagnates, a Public Health Crisis Brews," and recently issued a "Call to Action" to provide an update on the scope of the problem and the proposed solutions. A primary objective of these periodic reports is to encourage a community and legislative response to establish greater financial parity between antimicrobial development and the development of other drugs. Although recent actions of the Food and Drug Administration and the 110th US Congress present a glimmer of hope, significant uncertainty remains. Now, more than ever, it is essential to create a robust and sustainable antibacterial research and development infrastructure--one that can respond to current antibacterial resistance now and anticipate evolving resistance. This challenge requires that industry, academia, the National Institutes of Health, the Food and Drug Administration, the Centers for Disease Control and Prevention, the US Department of Defense, and the new Biomedical Advanced Research and Development Authority at the Department of Health and Human Services work productively together.
This report provides an update on potentially effective antibacterial drugs in the late-stage development pipeline, in the hope of encouraging such collaborative action.
Use of microwaves for the detection of corrosion under insulation: The effect of bends
NASA Astrophysics Data System (ADS)
Jones, R. E.; Simonetti, F.; Lowe, M. J. S.; Bradley, I. P.
2012-05-01
The detection of corrosion under insulation is an ongoing challenge in the oil and gas industry. An early warning of areas of pipe at risk of corrosion can be obtained by screening along the length of the pipeline to inspect the insulation layer for the presence of water, as water is a necessary precursor to corrosion. Long-range detection of water volumes can be achieved with microwave signals, using the structure of the clad and insulated pipeline as a coaxial waveguide, with water volumes presenting an impedance contrast and producing reflections of the incident microwave signal. An investigation into what effect bends in the pipeline will have on this inspection technique is presented here.
Prime the Pipeline Project (P[cube]): Putting Knowledge to Work
ERIC Educational Resources Information Center
Greenes, Carole; Wolfe, Susan; Weight, Stephanie; Cavanagh, Mary; Zehring, Julie
2011-01-01
With funding from NSF, the Prime the Pipeline Project (P[cube]) is responding to the need to strengthen the science, technology, engineering, and mathematics (STEM) pipeline from high school to college by developing and evaluating the scientific village strategy and the culture it creates. The scientific village, a community of high school…
Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai
2017-11-23
The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines available programs and ad hoc scripts based on an original algorithm for searching for conserved targets in deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
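The core of such a conservation scan can be sketched in a few lines. This is a simplified illustration, not the VirMut pipeline itself: the function names, the at-most-one-mismatch criterion, and the assumption that reads are pre-aligned so each read's first bases cover the target site are all choices made for the example.

```python
from collections import Counter

def mismatches(a, b):
    """Hamming distance between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def target_conservation(reads, target, max_mm=1):
    """Fraction of reads covering the target site that match it with at
    most max_mm mismatches, plus per-position mutation counts.
    Simplification: reads are assumed pre-aligned so that each read's
    first len(target) bases cover the site."""
    hits = 0
    covered = 0
    mut_counts = Counter()  # (position, observed base) -> count
    n = len(target)
    for r in reads:
        site = r[:n]
        if len(site) < n:
            continue  # read too short to cover the whole site
        covered += 1
        if mismatches(site, target) <= max_mm:
            hits += 1
        for i, (x, y) in enumerate(zip(site, target)):
            if x != y:
                mut_counts[(i, x)] += 1
    return (hits / covered if covered else 0.0), mut_counts

reads = ["ACGTACGT", "ACGAACGT", "TTTTACGT", "ACGTAC"]
frac, muts = target_conservation(reads, "ACGTACGT", max_mm=1)
```

A real pipeline would add the statistical thresholding the authors describe, to distinguish genuine low-frequency mutations from sequencing error.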
Identification of missing variants by combining multiple analytic pipelines.
Ren, Yingxue; Reddy, Joseph S; Pottier, Cyril; Sarangi, Vivekananda; Tian, Shulan; Sinnwell, Jason P; McDonnell, Shannon K; Biernacka, Joanna M; Carrasquillo, Minerva M; Ross, Owen A; Ertekin-Taner, Nilüfer; Rademakers, Rosa; Hudson, Matthew; Mainzer, Liudmila Sergeevna; Asmann, Yan W
2018-04-16
After decades of identifying risk factors using array-based genome-wide association studies (GWAS), genetic research of complex diseases has shifted to sequencing-based rare variants discovery. This requires large sample sizes for statistical power and has brought up questions about whether the current variant calling practices are adequate for large cohorts. It is well-known that there are discrepancies between variants called by different pipelines, and that using a single pipeline always misses true variants exclusively identifiable by other pipelines. Nonetheless, it is common practice today to call variants by one pipeline due to computational cost and assume that false negative calls are a small percent of total. We analyzed 10,000 exomes from the Alzheimer's Disease Sequencing Project (ADSP) using multiple analytic pipelines consisting of different read aligners and variant calling strategies. We compared variants identified by using two aligners in 50, 100, 200, 500, 1000, and 1952 samples; and compared variants identified by adding single-sample genotyping to the default multi-sample joint genotyping in 50, 100, 500, 2000, 5000, and 10,000 samples. We found that using a single pipeline missed increasing numbers of high-quality variants as sample size grew. By combining two read aligners and two variant calling strategies, we rescued 30% of pass-QC variants at a sample size of 2000, and 56% at 10,000 samples. The rescued variants had higher proportions of low frequency (minor allele frequency [MAF] 1-5%) and rare (MAF < 1%) variants, which are the very type of variants of interest. In 660 Alzheimer's disease cases with earlier onset ages of ≤65, 4 out of 13 (31%) previously-published rare pathogenic and protective mutations in APP, PSEN1, and PSEN2 genes were undetected by the default one-pipeline approach but recovered by the multi-pipeline approach.
Identification of the complete variant set from sequencing data is a prerequisite of genetic association analyses. The current practice of calling genetic variants from sequencing data using a single bioinformatics pipeline is no longer adequate for increasingly large projects. The number and percentage of variants that pass quality filters but are missed by the one-pipeline approach rapidly increase with sample size.
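The rescue effect described above reduces, at its core, to set operations over per-pipeline callsets. A minimal sketch, where the variant keys, example calls, and function names are illustrative rather than the ADSP tooling:

```python
def combine_callsets(callsets):
    """Union of variant calls from multiple pipelines.
    Each callset is a set of (chrom, pos, ref, alt) tuples."""
    union = set()
    for cs in callsets:
        union |= cs
    return union

def rescued_fraction(primary, others):
    """Fraction of the combined callset that the primary pipeline
    alone would have missed (the 'rescued' variants)."""
    union = combine_callsets([primary, *others])
    rescued = union - primary
    return len(rescued) / len(union) if union else 0.0

# Toy callsets from two hypothetical aligner/caller combinations
bwa_calls = {("1", 100, "A", "G"), ("2", 200, "C", "T")}
novo_calls = {("1", 100, "A", "G"), ("3", 300, "G", "A")}
frac = rescued_fraction(bwa_calls, [novo_calls])  # one of three variants rescued
```

In practice each call would also carry genotype and quality fields, and variants would be normalized (left-aligned, decomposed) before comparison so that the same variant is keyed identically across pipelines.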
Pipelining in a changing competitive environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.G.; Wishart, D.M.
1996-12-31
The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long term, cost effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance of incidents of pipeline failure. A creative systems approach to these challenges is often the key to the project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, international lending agencies and financial institutions.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
NASA Technical Reports Server (NTRS)
Brownston, Lee; Jenkins, Jon M.
2015-01-01
The Kepler Mission was launched in 2009 as NASA's first mission capable of finding Earth-size planets in the habitable zone of Sun-like stars. Its telescope consists of a 1.5-m primary mirror and a 0.95-m aperture. The 42 charge-coupled devices in its focal plane are read out every half hour, compressed, and then downlinked monthly. After four years, the second of four reaction wheels failed, ending the original mission. Back on Earth, the Science Operations Center developed the Science Pipeline to analyze about 200,000 target stars in Kepler's field of view, looking for evidence of periodic dimming suggesting that one or more planets had crossed the face of its host star. The Pipeline comprises several steps, from pixel-level calibration, through noise and artifact removal, to detection of transit-like signals and the construction of a suite of diagnostic tests to guard against false positives. The Kepler Science Pipeline consists of a pipeline infrastructure written in the Java programming language, which marshals data input to and output from MATLAB applications that are executed as external processes. The pipeline modules, which underwent continuous development and refinement even after data started arriving, employ several analytic techniques, many developed for the Kepler Project. Because of the large number of targets, the large amount of data per target and the complexity of the pipeline algorithms, the processing demands are daunting. Some pipeline modules require days to weeks to process all of their targets, even when run on NASA's 128-node Pleiades supercomputer. The software developers are still seeking ways to increase the throughput. To date, the Kepler project has discovered more than 4000 planetary candidates, of which more than 1000 have been independently confirmed or validated to be exoplanets. Funding for this mission is provided by NASA's Science Mission Directorate.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Plan for the American Burying Beetle for Pipelines and Well Field Development in Oklahoma and Texas..., operation, and repair of oil and gas pipelines, and related well field activities. Individual oil and gas... pipelines and related well field activities, and will include measures necessary to minimize and mitigate...
NASA Astrophysics Data System (ADS)
Artana, K. B.; Pitana, T.; Dinariyana, D. P.; Ariana, M.; Kristianto, D.; Pratiwi, E.
2018-06-01
The aim of this research is to develop an algorithm and application that can perform real-time monitoring of the safe operation of offshore platforms and subsea gas pipelines, as well as determine the need for ship inspection, using data obtained from the automatic identification system (AIS). The research also focuses on the integration of shipping databases, AIS data, and other sources to develop a prototype for a real-time monitoring system for offshore platforms and pipelines. A simple concept is used in the development of this prototype: an overlaying map compares the coordinates of the offshore platform and subsea gas pipeline with the ship's coordinates (longitude/latitude) as detected by AIS. Using such information, we can then build an early warning system (EWS) relayed through short message service (SMS), email, or other means when a ship enters the restricted and exclusion zones of platforms and pipelines. The ship inspection system is developed by combining several attributes; decision analysis software is employed to prioritize four vessel attributes: ship age, ship type, classification, and flag state. Results show that the EWS can increase the safety level of offshore platforms and pipelines, as well as the efficiency of patrol boats in monitoring the safety of the facilities. Meanwhile, ship inspection enables the port to prioritize the ships to be inspected in accordance with the priority ranking inspection score.
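The overlay check at the heart of such an EWS is a distance test between an AIS position and each asset's zone boundaries. A minimal sketch, assuming circular restricted/exclusion zones and illustrative radii; the paper's actual zone geometry, asset names, and thresholds are not specified here:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_zones(ship, assets, restricted_km=2.0, exclusion_km=0.5):
    """Classify an AIS position (lat, lon) against each asset's
    circular safety zones; returns (asset, zone) alert tuples."""
    alerts = []
    for name, (lat, lon) in assets.items():
        d = haversine_km(ship[0], ship[1], lat, lon)
        if d <= exclusion_km:
            alerts.append((name, "EXCLUSION"))
        elif d <= restricted_km:
            alerts.append((name, "RESTRICTED"))
    return alerts

# Hypothetical platform position and a ship about 1.1 km away
platforms = {"platform-A": (-5.60, 112.70)}
alerts = check_zones((-5.61, 112.70), platforms)
```

In a deployed system each alert would be relayed by SMS or email, as the abstract describes, and pipeline routes would be represented as polylines rather than single points.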
Numerical Modeling of Mechanical Behavior for Buried Steel Pipelines Crossing Subsidence Strata
Han, C. J.
2015-01-01
This paper addresses the mechanical behavior of buried steel pipelines crossing subsidence strata. The investigation is based on numerical simulation of the nonlinear response of the pipeline-soil system through the finite element method, considering large strain and displacement, inelastic material behavior of the buried pipeline and the surrounding soil, as well as contact and friction on the pipeline-soil interface. Effects of key parameters on the mechanical behavior of the buried pipeline were investigated, such as strata subsidence, diameter-thickness ratio, buried depth, internal pressure, friction coefficient and soil properties. The results show that the maximum strain appears on the outer transition subsidence section of the pipeline, and its cross-section is concave. As strata subsidence and diameter-thickness ratio increase, the out-of-roundness, longitudinal strain and equivalent plastic strain increase gradually. As buried depth increases, the deflection, out-of-roundness and strain of the pipeline decrease. Internal pressure and friction coefficient have little effect on the deflection of the buried pipeline; with increasing internal pressure, out-of-roundness decreases and strain increases gradually. The physical properties of the soil have a great influence on the mechanical behavior of the buried pipeline. The results from the present study can be used for the development of optimization design and preventive maintenance for buried steel pipelines. PMID:26103460
U.S. pipeline industry enters new era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnsen, M.R.
1999-11-01
The largest construction project in North America this year and next--the Alliance Pipeline--marks some advances for the US pipeline industry. With the Alliance Pipeline system (Alliance), mechanized welding and ultrasonic testing are making their debuts in the US as primary mainline construction techniques. Particularly in Canada and Europe, mechanized welding technology has been used for both onshore and offshore pipeline construction for at least 15 years. However, it has never before been used to build a cross-country pipeline in the US, although it has been tested on short segments. This time, however, an accelerated construction schedule, among other reasons, necessitated the use of mechanized gas metal arc welding (GMAW). The $3-billion pipeline will deliver natural gas from northwestern British Columbia and northeastern Alberta in Canada to a hub near Chicago, Ill., where it will connect to the North American pipeline grid. Once the pipeline is completed and buried, crews will return the topsoil. Corn and other crops will reclaim the land. While the casual passerby probably won't know the Alliance pipeline is there, it may have a far-reaching effect on the way mainline pipelines are built in the US. For even though mechanized welding and ultrasonic testing are being used for the first time in the United States on this project, some US workers had already gained experience with the technology on projects elsewhere. And work on this pipeline has certainly developed a much larger pool of experienced workers for industry to draw from. The Alliance project could well signal the start of a new era in US pipeline construction.
Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark
2016-07-05
There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others.
These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
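One of the per-module tasks mentioned, peak-location statistics relative to the TSS, can be illustrated with a short sketch. This is not the authors' R/Kepler code; the data layout, example coordinates, and function name are assumptions made for the illustration:

```python
def nearest_tss_distance(peaks, tss_sites):
    """For each peak summit (chrom, pos), return the signed distance to
    the nearest TSS (chrom, pos, strand), orienting the sign by gene
    strand so that negative means upstream of the gene. These distances
    are the raw input for the TSS-centered summary plots."""
    out = []
    for pchrom, ppos in peaks:
        best = None
        for tchrom, tpos, strand in tss_sites:
            if tchrom != pchrom:
                continue  # only compare features on the same chromosome
            d = ppos - tpos
            if strand == "-":
                d = -d  # flip sign for minus-strand genes
            if best is None or abs(d) < abs(best):
                best = d
        out.append(best)
    return out

# Toy peak summits and TSS annotations
peaks = [("chr1", 1200), ("chr1", 4700), ("chr2", 90)]
tss = [("chr1", 1000, "+"), ("chr1", 5000, "-"), ("chr2", 100, "+")]
dists = nearest_tss_distance(peaks, tss)
```

A production module would use an interval index rather than this quadratic scan, and would typically bin the distances into a histogram centered on the TSS.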
Study of stress-strain state of pipeline under permafrost conditions
NASA Astrophysics Data System (ADS)
Tarasenko, A. A.; Redutinskiy, M. N.; Chepur, P. V.; Gruchenkova, A. A.
2018-05-01
In this paper, the dependences of the stress-strain state (SSS) and subsidence of pipelines on the dimensions of the subsidence zone are obtained for the pipe sizes that have become most widespread in the construction of main oil pipelines (530x10, 820x12, 1020x12, 1020x14, 1020x16, 1220x14, 1220x16, 1220x18 mm). True values of stresses in the pipeline wall, as well as the exact location of maximum stresses for subsidence zones from 5 to 60 meters, are determined. For this purpose, the authors developed a finite element model of the pipeline that takes into account the actual interaction of the pipeline with the subgrade and allows calculating the SSS of the structure for a variable subsidence zone. Based on the obtained dependences for the underground laying of oil pipelines in permafrost areas, it is proposed to artificially limit the zone of possible subsidence by separation supports resting on soil with better building properties and physical-mechanical parameters. This technical solution would significantly reduce costs when constructing new oil pipelines in permafrost areas.
Urban Underground Pipelines Mapping Using Ground Penetrating Radar
NASA Astrophysics Data System (ADS)
Jaw, S. W.; Hashim, M.
2014-02-01
Underground space is now being exploited for transportation, utilities, and public usage, and the underground has become a spider's web of utility networks. Mapping of underground utility pipelines has therefore become a challenging and difficult task; in practice it is often a "hit-and-miss" affair that results in many catastrophic damages, particularly in urban areas. This study was conducted to extract locational information on urban underground utility pipelines using a trenchless measuring tool, namely ground penetrating radar (GPR). The focus of this study was to conduct underground utility pipeline mapping for retrieval of the geometric properties of the pipelines, using GPR. In doing this, a series of tests were first conducted at the preferred test site and in a real-life experiment, followed by modeling of a field-based model using the Finite-Difference Time-Domain (FDTD) method. Results provide the locational information of underground utility pipelines together with its mapping accuracy. This locational information is beneficial to civil infrastructure management and maintenance, which in the long term is time-saving and critically important for the development of metropolitan areas.
Research on prognostics and health management of underground pipeline
NASA Astrophysics Data System (ADS)
Zhang, Guangdi; Yang, Meng; Yang, Fan; Ni, Na
2018-04-01
With the development of cities, underground pipeline networks have become more and more complex; they are vital to a city's safety and normal operation and are known as "the lifeline of the city". This paper first introduces the principles of PHM (Prognostics and Health Management) technology, then proposes a fault diagnosis, prognostics and health management approach for underground pipelines: faults appearing during operation are diagnosed and predicted, and a health assessment of the whole underground pipe network is made to ensure that the pipelines operate safely. Finally, future research directions are summarized and discussed.
Identification of sewage leaks by active remote-sensing methods
NASA Astrophysics Data System (ADS)
Goldshleger, Naftaly; Basson, Uri
2016-04-01
The increasing length of sewage pipelines, and the concomitant risk of leaks due to urban and industrial growth and development, is exposing the surrounding land to contamination risk and environmental harm. It is therefore important to locate such leaks in a timely manner, to minimize the damage. Advances in active remote sensing, Ground Penetrating Radar (GPR) and Frequency Domain Electromagnetic (FDEM) technologies, were used to identify leaks potentially responsible for pollution and to identify minor spills before they cause widespread damage. This study focused on the development of these electromagnetic methods to replace conventional acoustic methods for the identification of leaks along sewage pipes. Electromagnetic methods provide an additional advantage in that they allow mapping of the fluid-transport system in the subsurface. Leak-detection systems using GPR and FDEM are not limited to large amounts of water, but enable detecting leaks of tens of liters per hour, because they can locate increases in environmental moisture content of only a few percent along the pipes. The importance and uniqueness of this research lies in the development of practical tools to provide a snapshot and monitoring of the spatial changes in soil moisture content up to depths of about 3-4 m, in open and paved areas, at relatively low cost, in real time or close to real time. Spatial measurements performed using GPR and FDEM systems allow monitoring many tens of thousands of measurement points per hectare, thus providing a picture of the spatial situation along pipelines and their surroundings. The main purpose of this study was to develop a method for detecting sewage leaks using the above-proposed geophysical methods, since their contaminants can severely affect public health. We focused on identifying, locating and characterizing such leaks in sewage pipes in residential and industrial areas.
Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard
NASA Astrophysics Data System (ADS)
Voronin, K. S.
2016-10-01
Main gas pipelines in operation are under the influence of permanent pressure drops, which lead to their lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under the influence of dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can be the basis for a method of monitoring the technical condition of the gas pipeline, and forecasting possible emergency situations allows planning and carrying out reconstruction works in due time on sections of the gas pipeline with a possible deviation from the design position.
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
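The simulate-then-port workflow described above rests on cycle-accurate software models of pipeline registers. A minimal sketch of such a model, written here in Python rather than the authors' C++ framework, with the three-stage multiply-accumulate structure chosen purely for illustration:

```python
class PipelinedMAC:
    """Cycle-level model of a three-stage pipelined multiply-accumulate:
    stage 1 latches inputs, stage 2 multiplies, stage 3 accumulates.
    A result first reflects an input three clock edges after it enters;
    None models a pipeline bubble."""

    def __init__(self):
        self.s1 = None   # stage-1 register: latched (a, b) operands
        self.s2 = None   # stage-2 register: product
        self.acc = 0     # stage-3 accumulator
        self.out = None

    def clock(self, a=None, b=None):
        # Evaluate stages in reverse order, as on a real clock edge,
        # so each stage consumes the value latched on the previous cycle.
        if self.s2 is not None:
            self.acc += self.s2          # stage 3: accumulate product
        self.out = self.acc
        self.s2 = (self.s1[0] * self.s1[1]) if self.s1 is not None else None
        self.s1 = (a, b) if a is not None else None  # stage 1: latch inputs
        return self.out

mac = PipelinedMAC()
outs = [mac.clock(a, b) for a, b in [(1, 2), (3, 4), (5, 6)]]
outs += [mac.clock() for _ in range(3)]  # flush the pipeline with bubbles
```

Evaluating the stages in reverse order inside one `clock()` call is the standard trick for simulating registers without double-buffering; the same structure ports naturally to an HDL where each stage becomes a clocked process.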
NASA Astrophysics Data System (ADS)
Ding, Wenhua; Li, Shaopo; Li, Jiading; Li, Qun; Chen, Tieqiang; Zhang, Hai
In recent years, several significant pipeline projects have been developed for the transmission of oil and gas from deepwater environments. Gas transmission pipelines for these applications demand heavy wall thickness, high strength, good low-temperature toughness and good weldability. To overcome the difficulty of producing consistent mechanical properties in heavy-wall pipe, research was conducted by Shougang Steel Research in cooperation with the 4.3-m heavy wide plate mill at Shougang Steel Qinhuangdao, China (Shouqin).
Pipeline enhances Norman Wells potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Approval of an oil pipeline from halfway down Canada's MacKenzie River Valley at Norman Wells to N. Alberta has raised the potential for development of large reserves along with controversy over native claims. The project involves 2 closely related proposals. One, by Esso Resources, the exploration and production unit of Imperial Oil, will increase oil production from the Norman Wells field from 3000 bpd currently to 25,000 bpd. The other proposal, by Interprovincial Pipeline (N.W.) Ltd., calls for construction of an underground pipeline to transport the additional production from Norman Wells to Alberta. The 560-mile, 12-in. pipeline will extend from Norman Wells, which is 90 miles south of the Arctic Circle on the north shore of the Mackenzie River, south to the end of an existing line at Zama in N. Alberta. There will be 3 pumping stations en route. This work also discusses recovery, potential, drilling limitations, the processing plant, positive impact, and further development of the Norman Wells project.
Using steady-state equations for transient flow calculation in natural gas pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddox, R.N.; Zhou, P.
1984-04-02
Maddox and Zhou have extended their technique for calculating the unsteady-state behavior of straight gas pipelines to complex pipeline systems and networks. After developing the steady-state flow rate and pressure profile for each pipe in the network, analysts can perform the transient-state analysis in the real-time step-wise manner described for this technique.
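For a horizontal, isothermal pipe at constant flow, the steady-state relation implies that the square of pressure varies linearly with distance along the pipe, which yields the per-pipe pressure profile such an analysis starts from. A sketch of that profile calculation; the units, discretization, and function name are illustrative, not taken from Maddox and Zhou:

```python
import math

def pressure_profile(p_in, p_out, length_km, n=5):
    """Steady-state isothermal pressure profile along a horizontal gas
    pipeline: for constant mass flow, P(x)^2 varies linearly with x, so
    P(x) = sqrt(P_in^2 - (P_in^2 - P_out^2) * x / L).
    Returns (position, pressure) pairs at n+1 evenly spaced stations."""
    return [
        (x, math.sqrt(p_in ** 2 - (p_in ** 2 - p_out ** 2) * x / length_km))
        for x in [length_km * i / n for i in range(n + 1)]
    ]

# Hypothetical pipe: 70 bar inlet, 50 bar outlet, 100 km long
profile = pressure_profile(70.0, 50.0, 100.0)
```

Note that the profile sags below the straight line between the endpoint pressures, since it is the *square* of pressure that is linear; a transient analysis would then step this steady-state profile forward in time as boundary conditions change.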
A Critique of the STEM Pipeline: Young People's Identities in Sweden and Science Education Policy
ERIC Educational Resources Information Center
Mendick, Heather; Berge, Maria; Danielsson, Anna
2017-01-01
In this article, we develop critiques of the pipeline model which dominates Western science education policy, using discourse analysis of interviews with two Swedish young women focused on "identity work". We argue that it is important to unpack the ways that the pipeline model fails to engage with intersections of gender, ethnicity,…
The Minimal Preprocessing Pipelines for the Human Connectome Project
Glasser, Matthew F.; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark
2013-01-01
The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinates spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP’s acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements for the pipelines. PMID:23668970
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules, that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of the ANN, while "the number of monthly breaks" was used as the prediction output. This study is the first attempt to incorporate earthquake data in a break-event ANN prediction model. The spatial distribution of the pipeline break-event data was analyzed and visualized by GIS, enabling users to swiftly identify leakage hotspots. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where break-event records are unavailable.
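The ANN setup described above maps the three input factors (pipe diameter, pipe material, magnitude-3(+) earthquake count) to a monthly break count. The following is a minimal pure-Python sketch of such a feed-forward network with one hidden layer, trained by stochastic gradient descent on hypothetical toy data; it illustrates the structure only and is not the authors' model or their data:

```python
import math
import random

def init_net(n_in, n_hidden, seed=0):
    """Small one-hidden-layer network with tanh units and a linear output."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    return w1, w2

def forward(net, x):
    """Return hidden activations and predicted monthly break count."""
    w1, w2 = net
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in w1]
    y = sum(wj * hj for wj, hj in zip(w2, h))
    return h, y

def train_step(net, x, target, lr=0.02):
    """One SGD step on squared error; returns the sample loss before the step."""
    w1, w2 = net
    h, y = forward(net, x)
    err = y - target
    for j in range(len(w2)):
        # hidden-layer gradient via the chain rule: d tanh(u)/du = 1 - h^2
        back = err * w2[j] * (1 - h[j] ** 2)
        for i in range(len(x)):
            w1[j][i] -= lr * back * x[i]
        w2[j] -= lr * err * h[j]
    return 0.5 * err * err

# Hypothetical (normalized diameter, material code, quake count) -> breaks/month
data = [([0.2, 1.0, 0.3], 2.0), ([0.8, 0.0, 0.1], 0.5), ([0.5, 1.0, 0.9], 3.0)]
net = init_net(3, 4)
for _ in range(300):
    for x, t in data:
        train_step(net, x, t)
```

A network fitted this way can rank pipeline segments by predicted monthly break count, which is the quantity the GIS layer then maps spatially.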
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., testing, repair or other appropriate action to remedy the identified risk condition. (b) How is an... the form of the proposed order in accordance with paragraphs (c) through (g) of this section. (4... conditions that pose a pipeline integrity risk to public safety, property, or the environment. (c) How is the...
Risk Analysis and Forecast Service for Geomagnetically Induced Currents in Europe
NASA Astrophysics Data System (ADS)
Wik, Magnus; Pirjola, Risto; Viljanen, Ari; Lundstedt, Henrik
Geomagnetically induced currents (GIC), occurring during magnetic storms, pose a widespread natural disaster risk to the reliable operation of electric power transmission grids, oil and gas pipelines, telecommunication cables and railway systems. Solar magnetic activity is the cause of GIC. Solar coronal holes can cause recurrent intervals of raised geomagnetic activity, and coronal mass ejections (CME) at the Sun, sometimes producing very high speed plasma clouds with enhanced magnetic fields and particle densities, can cause the strongest geomagnetic storms. When the solar wind interacts with the geomagnetic field, energy is transferred to the magnetosphere, driving strong currents in the ionosphere. When these currents change in time, a geoelectric field is induced at the surface of the Earth and in the ground. Finally, this field drives GIC in the ground and in any technological conductor systems. The worst consequence of a severe magnetic storm within a power grid is a complete blackout, as happened in the province of Québec, Canada, in March 1989, and in the city of Malmö, Sweden, in October 2003. Gas and oil pipelines are not regarded as vulnerable to the immediate impact of GIC, but the corrosion rate of buried steel pipes can increase due to GIC, which may thus shorten the lifetime of a pipe. European Risk from Geomagnetically Induced Currents (EURISGIC) is an EU project that, if approved, will produce the first European-wide real-time prototype forecast service of GIC in power systems, based on in-situ solar wind observations and comprehensive simulations of the Earth's magnetosphere. This project focuses on high-voltage power transmission networks, which are probably currently the most susceptible to GIC effects. Geomagnetic storms cover large geographical regions, at times the whole globe. Consequently, power networks are rightly described as European critical infrastructures whose disruption or destruction could have a significant impact.
The project includes six research institutes and one SME in Europe and the US. The Federal Emergency Management Agency (FEMA), the Swedish civil contingencies agency (MSB), and representatives from the European Commission are collaborating with the NOAA National Weather Service and other research institutes on various space weather scenarios: geomagnetic storms with widespread blackouts and disruptions in communications. The aim of this new project is to conduct a risk analysis of GIC on critical infrastructure. Large amounts of natural gas are transported from Russia to Central Europe. These long pipelines are prone to GIC impacts, which should also be evaluated quantitatively. We will use the EURISGIC project to inform the pipeline community of present European capability in GIC modelling, forecasting and in developing mitigation measures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... criteria as an alternative to the pressure testing in § 195.302(b)(1)(i)-(iii) and § 195.302(b)(2)(i) of... in paragraph (b) of this section as follows: (1) Risk Classification A if the location indicator is ranked as low or medium risk, the product and volume indicators are ranked as low risk, and the...
Code of Federal Regulations, 2011 CFR
2011-10-01
... criteria as an alternative to the pressure testing in § 195.302(b)(1)(i)-(iii) and § 195.302(b)(2)(i) of... in paragraph (b) of this section as follows: (1) Risk Classification A if the location indicator is ranked as low or medium risk, the product and volume indicators are ranked as low risk, and the...
ERIC Educational Resources Information Center
Hitt, Dallas Hambrick; Tucker, Pamela D.; Young, Michelle D.
2012-01-01
The professional pipeline represents a developmental perspective for fostering leadership capacity in schools and districts, from identification of potential talent during the recruitment phase to ensuring career-long learning through professional development. An intentional and mindful approach to supporting the development of educational leaders…
Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline
NASA Astrophysics Data System (ADS)
Ruiz, M.; Mujica, L. E.; Quintero, M.; Florez, J.; Quintero, S.
2015-07-01
Safety and reliability of hydrocarbon transportation pipelines represent a critical aspect for the Oil and Gas industry. Pipeline failures caused by corrosion, external agents and other factors can develop into leaks or even rupture, which can negatively impact population, natural environment, infrastructure and economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose its integrity. Over the last few years, different techniques under the concept of structural health monitoring (SHM) have continuously been in development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Component Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls. The data are recorded by sensors measuring the leakage magnetic field in segments with metal loss caused by cracking, corrosion and other damage. The data provide information on a pipeline with approximately 15 years of operation, which transports gas, has a diameter of 20 inches and a total length of 110 km (with several changes in topography). PCA, in turn, is a well-known technique that compresses the information and extracts the most relevant features, facilitating the detection of damage in several types of structures. The goal of this work is to detect and localize critical metal loss in a pipeline that is currently in operation.
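The PCA step described above can be illustrated by flagging measurement segments whose residual after projection onto the first principal component (the Q statistic) is large. The sketch below computes the first component by power iteration in pure Python; the data, and the idea of ranking segments by Q, are hypothetical illustrations, not the authors' MFL processing chain:

```python
def mean_center(X):
    """Subtract the per-feature mean; return centered rows and the mean vector."""
    n, d = len(X), len(X[0])
    mu = [sum(row[i] for row in X) / n for i in range(d)]
    return [[row[i] - mu[i] for i in range(d)] for row in X], mu

def first_pc(Xc, iters=200):
    """First principal component of centered data via power iteration."""
    d = len(Xc[0])
    v = [1.0] * d
    for _ in range(iters):
        # Compute (Xc^T Xc) v without forming the covariance matrix
        s = [sum(r[i] * v[i] for i in range(d)) for r in Xc]
        w = [sum(Xc[k][i] * s[k] for k in range(len(Xc))) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def q_statistic(row, mu, v):
    """Squared residual of a measurement after projection on the first PC."""
    xc = [a - b for a, b in zip(row, mu)]
    t = sum(a * b for a, b in zip(xc, v))
    resid = [a - t * b for a, b in zip(xc, v)]
    return sum(r * r for r in resid)

# Hypothetical two-channel MFL readings: four consistent segments, one outlier
X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0], [2.5, 2.0]]
Xc, mu = mean_center(X)
v = first_pc(Xc)
scores = [q_statistic(row, mu, v) for row in X]
```

Segments with the largest Q scores deviate most from the dominant correlation structure of the data, which is the usual PCA-based criterion for localizing damage.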
Pipeline transport and simultaneous saccharification of corn stover.
Kumar, Amit; Cameron, Jay B; Flynn, Peter C
2005-05-01
Pipeline transport of corn stover delivered by truck from the field is evaluated against a range of truck transport costs. Corn stover transported by pipeline at 20% solids concentration (wet basis) or higher could directly enter an ethanol fermentation plant, and hence the investment in the pipeline inlet end processing facilities displaces comparable investment in the plant. At 20% solids, pipeline transport of corn stover costs less than trucking at capacities in excess of 1.4 M dry tonnes/yr when compared to a mid-range of truck transport cost (excluding any credit for economies of scale achieved in the ethanol fermentation plant from larger scale due to multiple pipelines). Pipelining of corn stover gives the opportunity to conduct simultaneous transport and saccharification (STS). If current enzymes are used, this would require elevated temperature. Heating of the slurry for STS, which in a fermentation plant is achieved from waste heat, is a significant cost element (more than 5 cents/l of ethanol) if done at the pipeline inlet unless waste heat is available, for example from an electric power plant located adjacent to the pipeline inlet. Heat loss in a 1.26 m pipeline carrying 2 M dry tonnes/yr is about 5 degrees C at a distance of 400 km in typical prairie clay soils, and would not likely require insulation; smaller pipelines or different soil conditions might require insulation for STS. Saccharification in the pipeline would reduce the need for investment in the fermentation plant, saving about 0.2 cents/l of ethanol. Transport of corn stover in multiple pipelines offers the opportunity to develop a large ethanol fermentation plant, avoiding some of the diseconomies of scale that arise from smaller plants whose capacities are limited by issues of truck congestion.
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2014 CFR
2014-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2013 CFR
2013-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
NASA Astrophysics Data System (ADS)
Dudin, S. M.; Novitskiy, D. V.
2018-05-01
Research on modeling heterogeneous medium flows in pipelines under laboratory conditions includes work by researchers at VNIIgaz, Giprovostokneft, Kuibyshev NIINP, the Grozny Petroleum Institute, and others. Viewed objectively, the empirical relationships and calculation procedures obtained for pipelines transporting multiphase products constitute a bank of experimental data on the problem of multiphase pipeline transport. Based on an analysis of the published work, the main design requirements for experimental installations intended to study the flow regimes of gas-liquid flows in pipelines were formulated and taken into account by the authors when creating the experimental stand. The article describes the results of experimental studies of the flow regimes of a gas-liquid mixture in a pipeline and gives a methodological description of the experimental installation, as well as the software of the experimental scientific and educational stand developed with the authors' participation.
Feng, Qingshan; Li, Rui; Nie, Baohua; Liu, Shucong; Zhao, Lianyu; Zhang, Hong
2016-01-01
Girth weld cracking is one of the main failure modes in oil and gas pipelines; girth weld cracking inspection has great economic and social significance for the intrinsic safety of pipelines. This paper introduces the typical girth weld defects of oil and gas pipelines and the common nondestructive testing methods, and systematically summarizes progress in the technical principles, signal analysis, defect sizing methods and inspection reliability of magnetic flux leakage (MFL) inspection, liquid ultrasonic inspection, electromagnetic acoustic transducer (EMAT) inspection and remote field eddy current (RFEC) inspection for oil and gas pipeline girth weld defects. Additionally, it introduces the new technologies of composite ultrasonic, laser ultrasonic, and magnetostriction inspection, and provides a reference for the development and application of in-line inspection technology for oil and gas pipeline girth weld defects. PMID:28036016
Virtual Instrumentation Corrosion Controller for Natural Gas Pipelines
NASA Astrophysics Data System (ADS)
Gopalakrishnan, J.; Agnihotri, G.; Deshpande, D. M.
2012-12-01
Corrosion is an electrochemical process. Corrosion in natural gas (methane) pipelines leads to leakages. Corrosion occurs when an anode and a cathode are connected through an electrolyte. The rate of corrosion in a metallic pipeline can be controlled by impressing a current on it, thereby making it act as the cathode of the corrosion cell. A technologically advanced and energy-efficient corrosion controller is required to protect natural gas pipelines. The proposed virtual instrumentation (VI) based corrosion controller precisely controls external corrosion in underground metallic pipelines, enhances their life and ensures safety. The design and development of a proportional-integral-derivative (PID) corrosion controller using VI (LabVIEW) are described. When the designed controller is deployed in the field, it maintains the pipe-to-soil potential (PSP) within the safe operating limit without entering the over/under-protection zone. This technique can be deployed horizontally to protect any metallic structures, such as oil pipelines, that need corrosion protection.
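A PID loop of the kind described can be sketched in a few lines: the controller adjusts the impressed current so that a hypothetical first-order pipe/soil response settles at a PSP setpoint. All constants below (gains, setpoint, time constant, natural potential) are illustrative assumptions, not the authors' LabVIEW design:

```python
class PID:
    """Minimal textbook PID controller (illustrative gains, not field-tuned)."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_err = None

    def update(self, measured, dt):
        err = self.setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def simulate_psp(steps=500, dt=0.1):
    """Drive an assumed first-order pipe/soil response toward a -0.95 V
    pipe-to-soil potential setpoint; all plant constants are hypothetical."""
    pid = PID(kp=2.0, ki=1.0, kd=0.1, setpoint=-0.95)
    psp = -0.6                       # assumed natural (unprotected) potential, V
    natural, gain, tau = -0.6, 0.5, 1.0
    for _ in range(steps):
        current = pid.update(psp, dt)                  # signed controller output
        psp += dt * (-(psp - natural) + gain * current) / tau
    return psp
```

The integral term removes the steady-state offset, so the simulated PSP settles at the setpoint; in a real cathodic protection unit the controller output would drive the rectifier supplying the impressed current.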
Pipeline for Contraceptive Development
Blithe, Diana L.
2016-01-01
The high rates of unplanned pregnancy reflect an unmet need for effective contraceptive methods for women, especially for individuals with health risks such as obesity, diabetes, hypertension, and other conditions that may contraindicate use of an estrogen-containing product. Improvements in safety, user convenience, acceptability and availability of products remain important goals of the contraceptive development program. Another important goal is to minimize the impact of the products on the environment. Development of new methods for male contraception has the potential to address many of these issues with regard to safety for women who have contraindications to effective contraceptive methods but want to protect against pregnancy. It will also address a huge unmet need for men who want to control their fertility. Products under development for men would not introduce eco-toxic hormones into the waste water. Investment in contraceptive research to identify new products for women has been limited in the pharmaceutical industry relative to investment in drug development for other indications. Pharmaceutical R&D for male contraception was active in the 1990s but was abandoned over a decade ago. The Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) has supported a contraceptive development program since 1969. Through a variety of programs including research grants and contracts, NICHD has developed a pipeline of new targets/products for male and female contraception. A number of lead candidates are under evaluation in the NICHD Contraceptive Clinical Trials Network (CCTN) (1–3). PMID:27523300
Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows
Torri, Federica; Dinov, Ivo D.; Zamanyan, Alen; Hobel, Sam; Genco, Alex; Petrosyan, Petros; Clark, Andrew P.; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Knowles, James A.; Ames, Joseph; Kesselman, Carl; Toga, Arthur W.; Potkin, Steven G.; Vawter, Marquis P.; Macciardi, Fabio
2012-01-01
Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next generation sequencing (NGS) technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG), to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases) when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders. PMID:23139896
2017-03-01
Contribution to Project: Ian primarily focuses on developing a tissue imaging pipeline and performing imaging data analysis. Funding Support: Partially...3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous... analysis pipeline, to observe and quantify phenotypic metastatic landscape heterogeneity in situ with spatial and molecular resolution. Our implementation
Building the Pipeline for Hubble Legacy Archive Grism data
NASA Astrophysics Data System (ADS)
Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Lombardi, M.; Micol, A.; Rosa, M.; Stoehr, F.; Walsh, J. R.
2008-10-01
The Pipeline for Hubble Legacy Archive Grism data (PHLAG) is currently being developed as an end-to-end pipeline for the Hubble Legacy Archive (HLA). The inputs to PHLAG are slitless spectroscopic HST data with only the basic calibrations from standard HST pipelines applied; the outputs are fully calibrated, Virtual Observatory-compatible spectra, which will be made available through a static HLA archive. We give an overview of the various aspects of PHLAG. The pipeline consists of several subcomponents -- data preparation, data retrieval, image combination, object detection, spectral extraction using the aXe software, quality control -- which are discussed in detail. As a pilot project, PHLAG is currently being applied to NICMOS G141 grism data. Examples of G141 spectra reduced with PHLAG are shown.
Sequanix: a dynamic graphical interface for Snakemake workflows.
Desvillechabrol, Dimitri; Legendre, Rachel; Rioualen, Claire; Bouchier, Christiane; van Helden, Jacques; Kennedy, Sean; Cokelaer, Thomas
2018-06-01
We designed a PyQt graphical user interface, Sequanix, aimed at democratizing the use of Snakemake pipelines in the NGS space and beyond. By default, Sequanix includes Sequana NGS pipelines (Snakemake format) (http://sequana.readthedocs.io), and is also capable of loading any external Snakemake pipeline. New users can easily, visually, edit configuration files of expert-validated pipelines and can interactively execute these production-ready workflows. Sequanix will be useful both to Snakemake developers in exposing their pipelines and to a wide audience of users. Source on http://github.com/sequana/sequana, bio-containers on http://bioconda.github.io and Singularity hub (http://singularity-hub.org). dimitri.desvillechabrol@pasteur.fr or thomas.cokelaer@pasteur.fr. Supplementary data are available at Bioinformatics online.
Cohen Freue, Gabriela V.; Meredith, Anna; Smith, Derek; Bergman, Axel; Sasaki, Mayu; Lam, Karen K. Y.; Hollander, Zsuzsanna; Opushneva, Nina; Takhar, Mandeep; Lin, David; Wilson-McManus, Janet; Balshaw, Robert; Keown, Paul A.; Borchers, Christoph H.; McManus, Bruce; Ng, Raymond T.; McMaster, W. Robert
2013-01-01
Recent technical advances in the field of quantitative proteomics have stimulated a large number of biomarker discovery studies of various diseases, providing avenues for new treatments and diagnostics. However, inherent challenges have limited the successful translation of candidate biomarkers into clinical use, thus highlighting the need for a robust analytical methodology to transition from biomarker discovery to clinical implementation. We have developed an end-to-end computational proteomic pipeline for biomarker studies. At the discovery stage, the pipeline emphasizes different aspects of experimental design, appropriate statistical methodologies, and quality assessment of results. At the validation stage, the pipeline focuses on the migration of the results to a platform appropriate for external validation, and the development of a classifier score based on corroborated protein biomarkers. At the last stage towards clinical implementation, the main aims are to develop and validate an assay suitable for clinical deployment, and to calibrate the biomarker classifier using the developed assay. The proposed pipeline was applied to a biomarker study in cardiac transplantation aimed at developing a minimally invasive clinical test to monitor acute rejection. Starting with an untargeted screening of the human plasma proteome, five candidate biomarker proteins were identified. Rejection-regulated proteins reflect cellular and humoral immune responses, acute phase inflammatory pathways, and lipid metabolism biological processes. A multiplex multiple reaction monitoring mass spectrometry (MRM-MS) assay was developed for the five candidate biomarkers and validated by enzyme-linked immunosorbent (ELISA) and immunonephelometric assays (INA). A classifier score based on corroborated proteins demonstrated that the developed MRM-MS assay provides an appropriate methodology for an external validation, which is still in progress.
Plasma proteomic biomarkers of acute cardiac rejection may offer a relevant post-transplant monitoring tool to effectively guide clinical care. The proposed computational pipeline is highly applicable to a wide range of biomarker proteomic studies. PMID:23592955
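A classifier score built from a panel of corroborated protein biomarkers is commonly a weighted combination of measured levels passed through a logistic link. The sketch below illustrates that form only; the weights, bias, and levels are hypothetical and not the study's fitted model:

```python
import math

def classifier_score(levels, weights, bias=0.0):
    """Logistic score from a panel of biomarker levels.
    levels/weights are parallel lists; all values here are hypothetical."""
    z = bias + sum(w * x for w, x in zip(weights, levels))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative two-protein panel: one rejection-elevated, one rejection-suppressed
score = classifier_score([2.0, 1.0], weights=[1.0, -1.0])
```

In practice the weights would be fitted on the discovery cohort and the score recalibrated on the clinical-grade assay, as the pipeline's last stage describes.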
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de-novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly because the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package is accompanied by a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems after the necessary external programs are installed.
DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallesen, T.R.; Braestrup, M.W.; Jorgensen, O.
Development of Danish North Sea hydrocarbon resources includes the 17-km Rolf pipeline installed in 1985. This one consists of an insulated 8-in. two-phase flow product line with a 3-in. piggyback gas lift line. A practical solution to design of this insulated pipeline, including the small diameter, piggyback injection line was corrosion coating of fusion bonded epoxy (FBE) and polyethylene (PE) sleeve pipe. The insulation design prevents hydrate formation under the most conservative flow regime during gas lift production. Also, the required minimum flow rate during the initial natural lift period is well below the value anticipiated at the initiation ofmore » gas lift. The weight coating design ensures stability on the seabed during the summer months only; thus trenching was required during the same installation season. Installation of insulated flowlines serving marginal fields is a significant feature of North Sea hydrocarbon development projects. The Skjold field is connected to Gorm by a 6-in., two-phase-flow line. The 11-km line was installed in 1982 as the first insulated pipeline in the North Sea. The Rolf field, located 17 km west of Gorm, went on stream Jan. 2. The development includes an unmanned wellhead platform and an insulated, two-phase-flow pipeline to the Gorm E riser platform. After separation on the Gorm C process platform, the oil and condensate are transported to shore through the 20-in. oil pipeline, and the natural gas is piped to Tyra for transmission through the 30-in. gas pipeline. Oil production at Rolf is assisted by the injection of lift gas, transported from Gorm through a 3-in. pipeline, installed piggyback on the insulated 8-in. product line. The seabed is smooth and sandy, the water depth varying between 33.7 m (110.5 ft) at Rolf and 39.1 m (128 ft) at Gorm.« less
3D Visualization for Phoenix Mars Lander Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Keely, Leslie; Lees, David; Stoker, Carol
2012-01-01
Planetary surface exploration missions present considerable operational challenges in the form of substantial communication delays, limited communication windows, and limited communication bandwidth. 3D visualization software was developed and delivered to the 2008 Phoenix Mars Lander (PML) mission. The components of the system include an interactive 3D visualization environment called Mercator, terrain reconstruction software called the Ames Stereo Pipeline, and a server providing distributed access to terrain models. The software was successfully utilized during the mission for science analysis, site understanding, and science operations activity planning. A terrain server was implemented that provided distribution of terrain models from a central repository to clients running the Mercator software. The Ames Stereo Pipeline generates accurate, high-resolution, texture-mapped, 3D terrain models from stereo image pairs. These terrain models can then be visualized within the Mercator environment. The central cross-cutting goal for these tools is to provide an easy-to-use, high-quality, full-featured visualization environment that enhances the mission science team's ability to develop low-risk productive science activity plans. In addition, for the Mercator and Viz visualization environments, extensibility and adaptability to different missions and application areas are key design goals.
NASA Astrophysics Data System (ADS)
Marshall, R. A.; Waters, C. L.; Sciffer, M. D.
2010-05-01
Long, steel pipelines used to transport essential resources such as gas and oil are potentially vulnerable to space weather. In order to inhibit corrosion, the pipelines are usually coated in an insulating material and maintained at a negative electric potential with respect to Earth using cathodic protection units. During periods of enhanced geomagnetic activity, potential differences between the pipeline and surrounding soil (referred to as pipe-to-soil potentials (PSPs)) may exhibit large voltage swings which place the pipeline outside the recommended "safe range" and at an increased risk of corrosion. The PSP variations result from the "geoelectric" field at the Earth's surface and associated geomagnetic field variations. Previous research investigating the relationship between the surface geoelectric field and geomagnetic source fields has focused on the high-latitude regions where line currents in the ionosphere E region are often the assumed source of the geomagnetic field variations. For the Australian region Sq currents also contribute to the geomagnetic field variations and provide the major contribution during geomagnetic quiet times. This paper presents the results of a spectral analysis of PSP measurements from four pipeline networks from the Australian region with geomagnetic field variations from nearby magnetometers. The pipeline networks extend from Queensland in the north of Australia to Tasmania in the south and provide PSP variations during both active and quiet geomagnetic conditions. The spectral analyses show both consistent phase and amplitude relationships across all pipelines, even for large separations between magnetometer and PSP sites and for small-amplitude signals. Comparison between the observational relationships and model predictions suggests a method for deriving a geoelectric field proxy suitable for indicating PSP-related space weather conditions.
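The amplitude and phase relationships reported in the spectral analysis above can be estimated at a given frequency from a single bin of the discrete Fourier transform of the two time series. The sketch below does this for synthetic signals; the sampling, frequency, and amplitudes are illustrative, not the paper's PSP or magnetometer data:

```python
import cmath
import math

def dft_bin(x, k):
    """Value of the length-n DFT of the real sequence x at integer bin k."""
    n = len(x)
    return sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))

def transfer_at_bin(source, response, k):
    """Amplitude ratio and phase of response relative to source at DFT bin k."""
    h = dft_bin(response, k) / dft_bin(source, k)
    return abs(h), cmath.phase(h)

# Synthetic example: PSP responds at half the field amplitude, leading by 45 deg
n, k = 256, 8
b_field = [math.sin(2 * math.pi * k * t / n) for t in range(n)]
psp = [0.5 * math.sin(2 * math.pi * k * t / n + math.pi / 4) for t in range(n)]
amp_ratio, phase = transfer_at_bin(b_field, psp, k)
```

Repeating this over many bins yields the amplitude and phase spectra whose consistency across pipelines the study reports, and from which a geoelectric field proxy could be calibrated.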
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer.
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-05-30
External corrosion has been identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment for existing exposed pipelines has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments. An accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design.
Canada seeks US financing waiver to clear Alaska Gas Pipeline's path
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corrigan, R.
1981-09-26
A Canadian official outlines in an interview his government's hope that the US will proceed with the financing and construction of the Alaska Highway natural gas pipeline. The Canadian portion of the pipeline was begun in good faith because Canada sees her best interests served when US supply needs are met and when both countries have the energy to develop and prosper. Canada asks the Reagan administration to present Congress with a waiver package that will facilitate financing by eliminating a prohibition against pipeline share ownership by the owners of gas in Alaska. (DCK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wynne, Adam S.
2011-05-05
In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation with other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements.
The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low-friction, robust, open source middleware platform and extends it with component and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
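The queued, multi-stage design described above can be sketched generically. The following is an illustrative Python sketch of a queue-buffered two-stage pipeline, not the actual MIF API; the stage functions and queue layout are made up for the web-crawler example:

```python
import queue
import re
import threading

# Generic sketch of a queue-buffered, two-stage software pipeline in the
# spirit described above (illustrative only -- this is not the MIF API).
# Each stage reads from an input queue and writes to an output queue, so
# bursts of requests are buffered between stages.

def strip_tags(text):
    # Stand-in for the web-crawler example's "remove HTML tags" stage.
    return re.sub(r"<[^>]+>", "", text)

def run_stage(func, q_in, q_out):
    while True:
        item = q_in.get()
        if item is None:        # sentinel shuts the stage down
            q_out.put(None)
            break
        q_out.put(func(item))

q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=run_stage, args=(strip_tags, q1, q2)),
    threading.Thread(target=run_stage, args=(str.upper, q2, q3)),
]
for t in threads:
    t.start()

q1.put("<p>hello world</p>")
q1.put(None)                    # end-of-stream sentinel
for t in threads:
    t.join()

results = []
while True:
    item = q3.get()
    if item is None:
        break
    results.append(item)
print(results)  # -> ['HELLO WORLD']
```

Running multiple threads per stage against the same pair of queues is how such a design scales throughput, which is the behaviour the MIF run-time is described as automating.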
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.
2015-02-01
Specific features relating to development of the information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the lifetime to reaching inadmissible wall thinning in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.
NGSPanPipe: A Pipeline for Pan-genome Identification in Microbial Strains from Experimental Reads.
Kulsum, Umay; Kapil, Arti; Singh, Harpreet; Kaur, Punit
2018-01-01
Recent advancements in sequencing technologies have decreased both the time span and cost of sequencing a whole bacterial genome. High-throughput Next-Generation Sequencing (NGS) technology has led to the generation of enormous data concerning microbial populations, publicly available across various repositories. As a consequence, it has become possible to study and compare the genomes of different bacterial strains within a species or genus in terms of evolution, ecology and diversity. Studying the pan-genome provides insights into deciphering microevolution, global composition and diversity in virulence and pathogenesis of a species. It can also assist in identifying drug targets and proposing vaccine candidates. The effective analysis of these large genome datasets necessitates the development of robust tools. Current methods to develop a pan-genome do not support direct input of raw reads from the sequencer but require preprocessing of reads into an assembled protein/gene sequence file or a binary matrix of orthologous genes/proteins. We have designed an easy-to-use integrated pipeline, NGSPanPipe, which can directly identify the pan-genome from short reads. The output from the pipeline is compatible with other pan-genome analysis tools. We evaluated our pipeline against other methods for developing a pan-genome, i.e. reference-based assembly and de novo assembly, using simulated reads of Mycobacterium tuberculosis. The single-script pipeline (pipeline.pl) is applicable to all bacterial strains. It integrates multiple in-house Perl scripts and is freely accessible from https://github.com/Biomedinformatics/NGSPanPipe .
Robust, open-source removal of systematics in Kepler data
NASA Astrophysics Data System (ADS)
Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.
2017-10-01
We present ARC2 (Astrophysically Robust Correction 2), an open-source Python-based systematics-correction pipeline for the Kepler prime-mission long-cadence light curves. The ARC2 pipeline identifies and corrects any isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of any additional noise into the corrected light curves, while keeping any astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP, whenever the ability to model the impact of the systematics removal process on other kinds of signal is important.
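The shrinkage-prior detrending idea can be illustrated with a toy ridge regression against fabricated basis vectors. This is a sketch of the general technique, not the ARC2 code; the basis functions, noise level, and regularization strength `lam` are all invented assumptions:

```python
import numpy as np

# Toy sketch of detrending with co-trending basis vectors under an L2
# "shrinkage" prior (equivalent to ridge regression). Everything here --
# the basis, the noise level, and lam -- is made up to show the idea;
# it is not the ARC2 implementation.

rng = np.random.default_rng(0)
n = 500
t = np.linspace(0.0, 1.0, n)
basis = np.vstack([t, t**2, np.sin(6 * np.pi * t)]).T  # fake basis vectors
true_w = np.array([2.0, -1.0, 0.5])
flux = basis @ true_w + 0.01 * rng.standard_normal(n)  # trend + noise

lam = 0.1  # shrinkage strength: larger values pull weights toward zero
w = np.linalg.solve(basis.T @ basis + lam * np.eye(3), basis.T @ flux)
corrected = flux - basis @ w

print(w.round(2))  # weights close to true_w; residual scatter ~ noise level
```

Increasing `lam` trades a little residual trend for protection against overfitting, which is the "minimize injection of additional noise" behaviour the abstract attributes to the shrinkage priors.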
Design Optimization of Innovative High-Level Waste Pipeline Unplugging Technologies - 13341
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pribanic, T.; Awwad, A.; Varona, J.
2013-07-01
Florida International University (FIU) is currently working on the development and optimization of two innovative pipeline unplugging methods: the asynchronous pulsing system (APS) and the peristaltic crawler system (PCS). Experiments were conducted on the APS to determine how air in the pipeline influences the system's performance as well as to determine the effectiveness of air mitigation techniques in a pipeline. The results obtained during the experimental phase of the project, including data from pipeline pressure pulse tests along with air bubble compression tests, are presented. Single-cycle pulse amplification caused by a fast-acting cylinder piston pump in 21.8, 30.5, and 43.6 m pipelines was evaluated. Experiments were conducted on fully flooded pipelines as well as pipelines that contained various amounts of air to evaluate the system's performance when air is present in the pipeline. Also presented are details of the improvements implemented in the third-generation crawler system (PCS). The improvements include the redesign of the rims of the unit to accommodate a camera system that provides visual feedback of the conditions inside the pipeline. Visual feedback allows the crawler to be used as a pipeline unplugging and inspection tool. Tests conducted previously demonstrated a significant reduction of the crawler speed with increasing length of tether. Current improvements include the positioning of a pneumatic valve manifold system in close proximity to the crawler, rendering the crawler speed independent of tether length. Additional improvements to increase the crawler's speed were also investigated and are presented. Descriptions of the test beds, which were designed to emulate possible scenarios present on Department of Energy (DOE) pipelines, are presented. Finally, conclusions and recommendations for the systems are provided. (authors)
Health, safety, and environmental risks from energy production: A year-long reality check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, C.M.
2011-04-01
Large-scale carbon dioxide capture and storage (CCS) offers the benefit of reducing CO{sub 2} emissions and thereby mitigating climate change risk, but it will also bring its own health, safety, and environmental risks. Curtis M. Oldenburg, Editor-in-Chief, considers these risks in the context of the broader picture of energy production. Over the last year, there have been major acute health, safety, and environmental (HSE) consequences related to accidents involving energy production from every major primary energy source. These are, in chronological order: (i) the Upper Big Branch (coal) Mine disaster, (ii) the Gulf of Mexico Macondo (oil) well blowout, (iii) the San Bruno (natural gas) pipeline leak and explosion, and (iv) the Fukushima (nuclear) reactor radioactivity releases. Briefly, the Upper Big Branch Mine disaster occurred in West Virginia on April 5, 2010, when natural methane in the mine ignited, causing the deaths of 29 miners, the worst coal mine disaster in the USA since 1970. Fifteen days later, the Macondo oil well in the Gulf of Mexico suffered a blowout, with a gas explosion and fire on the floating drilling platform that killed 11 people. The oil and gas continued to flow out of the well at the seafloor until July 15, 2010, spilling a total of approximately 5 million barrels of oil into the sea. On September 9, 2010, a 30-inch (76-cm) buried, steel, natural gas pipeline in San Bruno, California, leaked gas and exploded in a residential neighborhood, killing 8 people in their homes and burning a total of 38 homes. Flames were up to 1000 ft (300 m) high, and the initial explosion itself reportedly measured 1.1 on the Richter scale. Finally, on March 11, 2011, a magnitude 9.0 earthquake off the coast of Japan's main island, Honshu, caused a tsunami that crippled the backup power and associated cooling systems for six reactor cores and their spent fuel storage tanks at the Fukushima nuclear power plant.
At time of writing, workers trying to bring the crisis under control have been exposed to dangerous levels of radiation, and radioactive water and particulates have been released to the sea and atmosphere. These four disasters, all of which occurred within the past 12 months, were not unprecedented; similar events differing only in detail have happened around the world before, and such events will occur again. Today, developed nations primarily use fossil fuels to create affordable energy for comforts such as lighting, heating and air-conditioning, refrigeration, transportation, education, and entertainment, as well as for powering manufacturing, which creates jobs and a wealth of material goods. In addition to the risks of the existing energy infrastructure that have become obvious through these recent disasters, there is also the ongoing risk of climate change that comes from the vast emissions of greenhouse gases, primarily CO{sub 2}, from the burning of fossil fuels. The implementation of CO{sub 2} capture and storage (CCS) will help mitigate CO{sub 2} emissions from fossil fuel energy, but it also carries with it HSE risks. In my personal interactions with the public and with students, the main concern voiced is whether CO{sub 2} could leak out of the deep reservoirs into which it is injected and rise up out of the ground, smothering people and animals at the ground surface. Another concern expressed is that CO{sub 2} pipelines could fail and cause similar gaseous plumes of CO{sub 2}. The widespread concerns about CO{sub 2} leaking out over the ground surface may be inspired by events that have happened within natural systems in equatorial Africa, in Indonesia, and in Italy. Researchers have been investigating a wide variety of HSE risks of geologic CO{sub 2} storage for some time and have determined that wells are the main potential pathways for significant leakage from the deep subsurface. 
I discuss the acute HSE risks of CO{sub 2} leakage through wells and from pipelines, and compare the behavior of failures in CO{sub 2} wells and pipelines with oil and gas analogues from which most of our experience derives.
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2012 CFR
2012-10-01
... risk condition. (b) How is an operator notified of the proposed issuance of a safety order and what are... the form of the proposed order in accordance with paragraphs (c) through (g) of this section. (4... conditions that pose a pipeline integrity risk to public safety, property, or the environment. (c) How is the...
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2014 CFR
2014-10-01
... risk condition. (b) How is an operator notified of the proposed issuance of a safety order and what are... the form of the proposed order in accordance with paragraphs (c) through (g) of this section. (4... conditions that pose a pipeline integrity risk to public safety, property, or the environment. (c) How is the...
2016-03-01
they traverse land [e.g., runway, road, rail line, pipeline, fence, pavement, electrical distribution line] and are reported by a linear unit of...locations. Furthermore, these officials stated that the new risk-based Interagency Security Committee standards provide a more flexible risk-based
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This is a short paper on the history and development of the Platte Pipe Line, which stretches 1,156 miles from Byron, Wyoming, to Wood River, Illinois. It discusses the development and significance of one of the most used crude oil pipelines in the United States. It also discusses its role in advanced pipeline control technology and the future of the system.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
... oil and gas pipelines. While subsequent efforts by industry to develop infrastructure such as oil and gas pipelines and their associated components are reasonably foreseeable, these elements are not... Highway to Umiat, to increase access to potential oil and gas resources for exploration and development...
Career Development of Women in Academia: Traversing the Leaky Pipeline
ERIC Educational Resources Information Center
Gasser, Courtney E.; Shaffer, Katharine S.
2014-01-01
Women's experiences in academia are laden with a fundamental set of issues pertaining to gender inequalities. A model reflecting women's career development and experiences around their academic pipeline (or career in academia) is presented. This model further conveys a new perspective on the experiences of women academicians before, during and…
Evidence-Based Professional Development Considerations along the School-to-Prison Pipeline
ERIC Educational Resources Information Center
Houchins, David E.; Shippen, Margaret E.; Murphy, Kristin M.
2012-01-01
This article addresses professional development (PD) issues for those who provide services to students in the school-to-prison pipeline (STPP). Emphasis is on implementing evidence-based practices. The authors use a modified version of Desimone's PD framework for the structure of this article involving (a) collective participation and common…
Gaps of Decision Support Models for Pipeline Renewal and Recommendations for Improvement
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less w...
GAPS OF DECISION SUPPORT MODELS FOR PIPELINE RENEWAL AND RECOMMENDATIONS FOR IMPROVEMENT (SLIDE)
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less wor...
Commercial Mobile Alert Service (CMAS) Alerting Pipeline Taxonomy
2012-03-01
for the consumer at the moment but will soon become a commoditized, basic requirement. For example, as the baby boomers grow older, mobile services...Commercial Mobile Alert Service (CMAS) Alerting Pipeline Taxonomy, The WEA Project Team, March 2012, SPECIAL REPORT CMU/SEI-2012-TR-019 CERT...report presents a taxonomy developed for the Commercial Mobile Alert Service (CMAS). The CMAS Alerting Pipeline Taxonomy is a hierarchical classification
Makropoulos, Antonios; Robinson, Emma C; Schuh, Andreas; Wright, Robert; Fitzgibbon, Sean; Bozek, Jelena; Counsell, Serena J; Steinweg, Johannes; Vecchiato, Katy; Passerat-Palmbach, Jonathan; Lenz, Gregor; Mortari, Filippo; Tenev, Tencho; Duff, Eugene P; Bastiani, Matteo; Cordero-Grande, Lucilio; Hughes, Emer; Tusor, Nora; Tournier, Jacques-Donald; Hutter, Jana; Price, Anthony N; Teixeira, Rui Pedro A G; Murgasova, Maria; Victor, Suresh; Kelly, Christopher; Rutherford, Mary A; Smith, Stephen M; Edwards, A David; Hajnal, Joseph V; Jenkinson, Mark; Rueckert, Daniel
2018-06-01
The Developing Human Connectome Project (dHCP) seeks to create the first 4-dimensional connectome of early life. Understanding this connectome in detail may provide insights into normal as well as abnormal patterns of brain development. Following established best practices adopted by the WU-MINN Human Connectome Project (HCP), and pioneered by FreeSurfer, the project utilises cortical surface-based processing pipelines. In this paper, we propose a fully automated processing pipeline for the structural Magnetic Resonance Imaging (MRI) of the developing neonatal brain. This proposed pipeline consists of a refined framework for cortical and sub-cortical volume segmentation, cortical surface extraction, and cortical surface inflation, which has been specifically designed to address considerable differences between adult and neonatal brains, as imaged using MRI. Using the proposed pipeline our results demonstrate that images collected from 465 subjects ranging from 28 to 45 weeks post-menstrual age (PMA) can be processed fully automatically; generating cortical surface models that are topologically correct, and correspond well with manual evaluations of tissue boundaries in 85% of cases. Results improve on state-of-the-art neonatal tissue segmentation models and significant errors were found in only 2% of cases, where these corresponded to subjects with high motion. Downstream, these surfaces will enhance comparisons of functional and diffusion MRI datasets, supporting the modelling of emerging patterns of brain connectivity. Copyright © 2018 Elsevier Inc. All rights reserved.
Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.
Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N
2018-05-28
The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Lan, G.; Jiang, J.; Li, D. D.; Yi, W. S.; Zhao, Z.; Nie, L. N.
2013-12-01
The calculation of water-hammer pressure for a single-phase liquid in a pipeline of uniform characteristics is already mature, but less research has addressed the calculation of water-hammer pressure in complex pipelines carrying slurry flows with solid particles. In this paper, based on developments in slurry pipelines at home and abroad, the fundamental principle and method of numerical simulation of transient processes are presented, and several boundary conditions are given. Through numerical simulation and analysis of the transient processes of a practical long-distance slurry transportation pipeline system, effective protection measures and operating suggestions are presented. A model for calculating the water-hammer interaction of the solid and fluid phases is established based on a practical long-distance slurry pipeline transportation system. After performing a numerical simulation of the transient process and analyzing and comparing the results, effective protection measures and operating advice are recommended, which has guiding significance for the design and operational management of practical long-distance slurry pipeline transportation systems.
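The transient simulation approach referenced above can be sketched in its simplest form. This is a bare-bones single-phase method-of-characteristics example with illustrative numbers; the slurry (solid-liquid) coupling and the paper's real boundary conditions are deliberately omitted:

```python
import numpy as np

# Bare-bones method-of-characteristics (MOC) sketch for single-phase water
# hammer in a uniform, frictionless pipe after instantaneous valve closure.
# All numbers are illustrative, not from the paper.

a = 1000.0              # wave speed [m/s]
g = 9.81                # gravity [m/s^2]
L, N = 1000.0, 10       # pipe length [m], number of reaches
dx = L / N
dt = dx / a             # Courant number a*dt/dx = 1
B = a / g               # characteristic impedance [s]

H = np.full(N + 1, 50.0)   # initial head [m]
V = np.full(N + 1, 1.0)    # initial velocity [m/s]
hmax = H[N]
for _ in range(100):
    Hn, Vn = H.copy(), V.copy()
    for i in range(1, N):
        Cp = H[i - 1] + B * V[i - 1]   # C+ characteristic
        Cm = H[i + 1] - B * V[i + 1]   # C- characteristic
        Hn[i] = 0.5 * (Cp + Cm)
        Vn[i] = (Cp - Cm) / (2.0 * B)
    Hn[0] = 50.0                                   # upstream reservoir
    Vn[0] = (50.0 - (H[1] - B * V[1])) / B
    Vn[N] = 0.0                                    # closed valve
    Hn[N] = H[N - 1] + B * V[N - 1]
    H, V = Hn, Vn
    hmax = max(hmax, H[N])

# Peak head at the valve matches the Joukowsky surge dH = a*V0/g
print(round(hmax, 1))  # -> 151.9
```

The same grid-and-characteristics scheme, extended with friction, solid-phase terms, and the paper's boundary conditions (valves, pumps, tanks), is the core of a practical slurry transient model.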
SNPhylo: a pipeline to construct a phylogenetic tree from huge SNP data.
Lee, Tae-Ho; Guo, Hui; Wang, Xiyin; Kim, Changsoo; Paterson, Andrew H
2014-02-26
Phylogenetic trees are widely used for genetic and evolutionary studies in various organisms. Advanced sequencing technology has dramatically enriched the data available for constructing phylogenetic trees based on single nucleotide polymorphisms (SNPs). However, massive SNP data make it difficult to perform reliable analysis, and there has been no ready-to-use pipeline to generate phylogenetic trees from these data. We developed a new pipeline, SNPhylo, to construct phylogenetic trees based on large SNP datasets. The pipeline enables users to construct a phylogenetic tree from any of three representative SNP data file formats. In addition, to increase the reliability of a tree, the pipeline includes steps such as removing low-quality data and accounting for linkage disequilibrium. A maximum likelihood method for the inference of phylogeny is also adopted in generating a tree in our pipeline. Using SNPhylo, users can easily produce a reliable phylogenetic tree from a large SNP data file. Thus, this pipeline can help a researcher focus more on interpretation of the results of analysis of voluminous data sets, rather than on the manipulations necessary to accomplish the analysis.
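The two data-reduction steps mentioned above (removing low-quality data and accounting for linkage disequilibrium) can be sketched on a toy genotype matrix. The matrix and both thresholds below are invented for illustration; this is the generic technique, not SNPhylo's code:

```python
import numpy as np

# Hedged sketch of two pre-processing steps that pipelines like SNPhylo
# apply before tree building: drop low-quality SNPs (high missing rate)
# and greedily prune SNPs in strong linkage disequilibrium.
# Rows = SNPs, columns = samples; -1 marks a missing call.

geno = np.array([
    [0, 1, 2, 1, 0, 2],
    [0, 1, 2, 1, 0, -1],   # 1 of 6 calls missing -> removed in step 1
    [0, 1, 2, 1, 1, 2],    # nearly identical to SNP 0 -> pruned in step 2
    [2, 0, 1, 2, 0, 1],
])

# Step 1: remove SNPs with more than 10% missing calls.
miss_rate = (geno == -1).mean(axis=1)
kept = geno[miss_rate <= 0.10]

# Step 2: greedy LD pruning -- drop a SNP whose squared correlation (r^2)
# with an already-retained SNP exceeds 0.5 (illustrative threshold).
retained = []
for snp in kept:
    if all(np.corrcoef(snp, other)[0, 1] ** 2 <= 0.5 for other in retained):
        retained.append(snp)

print(len(retained))  # -> 2
```

The surviving SNPs would then feed the maximum-likelihood tree inference; pruning correlated sites keeps the likelihood model's independence assumption closer to reality.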
Slaying Hydra: A Python-Based Reduction Pipeline for the Hydra Multi-Object Spectrograph
NASA Astrophysics Data System (ADS)
Seifert, Richard; Mann, Andrew
2018-01-01
We present a Python-based data reduction pipeline for the Hydra Multi-Object Spectrograph on the WIYN 3.5 m telescope, an instrument which enables simultaneous spectroscopy of up to 93 targets. The reduction steps carried out include flat-fielding, dynamic fiber tracing, wavelength calibration, optimal fiber extraction, and sky subtraction. The pipeline also supports the use of sky lines to correct for zero-point offsets between fibers. To account for the moving parts on the instrument and telescope, fiber positions and wavelength solutions are derived in real-time for each dataset. The end result is a one-dimensional spectrum for each target fiber. Quick and fully automated, the pipeline enables on-the-fly reduction while observing, and has been known to outperform the IRAF pipeline by more accurately reproducing known RVs. While Hydra has many configurations in both high- and low-resolution, the pipeline was developed and tested with only one high-resolution mode. In the future we plan to expand the pipeline to work in most commonly used modes.
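One of the reduction steps listed above, sky subtraction, can be illustrated with fabricated data. The median spectrum of dedicated sky fibres is subtracted from each target fibre; this shows the generic multi-fibre technique, not the Hydra pipeline's actual implementation:

```python
import numpy as np

# Illustrative sketch of fibre-spectrograph sky subtraction: estimate the
# sky from dedicated sky fibres (median is robust to outlier fibres) and
# subtract it from a target fibre. All data here are fabricated.

rng = np.random.default_rng(1)
n_pix = 100
sky_model = 5.0 + np.sin(np.linspace(0.0, 4.0, n_pix))        # fake sky
sky_fibres = sky_model + 0.05 * rng.standard_normal((8, n_pix))
target = 12.0 + sky_model                                      # source + sky

sky_estimate = np.median(sky_fibres, axis=0)
clean = target - sky_estimate
print(round(float(clean.mean()), 1))  # -> 12.0
```

In a real pipeline this step follows flat-fielding, tracing, extraction, and wavelength calibration, since all fibres must share a common wavelength grid before a sky spectrum can be subtracted.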
The JCSG high-throughput structural biology pipeline.
Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A
2010-10-01
The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.
Kepler Science Operations Center Architecture
NASA Technical Reports Server (NTRS)
Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal;
2010-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
NASA Astrophysics Data System (ADS)
Toropov, V. S.
2018-05-01
The paper suggests a set of measures for selecting equipment and its components to reduce energy costs while pulling a pipeline into the well when constructing trenchless pipeline crossings of various materials using horizontal directional drilling technology. A methodology for reducing energy costs has been developed by regulating the operating modes of equipment during the process of pulling the working pipeline into a drilled and pre-expanded well. Since the power of the drilling rig is the most important criterion in selecting equipment for the construction of a trenchless crossing, an algorithm is proposed for calculating the required capacity of the rig when operating in different modes in the process of pulling the pipeline into the well.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... the best available knowledge and expertise, and considers stakeholder perspectives. Specifically the... rooms. All public spaces are ADA accessible. Contact the Westin for more information. Refer to the...
ECDA of Cased Pipeline Segments
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56-07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on pipeline safety program. Although, not...
Next-Generation Sequencing in Oncology: Genetic Diagnosis, Risk Prediction and Cancer Classification
Kamps, Rick; Brandão, Rita D.; van den Bosch, Bianca J.; Paulussen, Aimee D. C.; Xanthoulea, Sofia; Blok, Marinus J.; Romano, Andrea
2017-01-01
Next-generation sequencing (NGS) technology has expanded in the last decades with significant improvements in the reliability, sequencing chemistry, pipeline analyses, data interpretation and costs. Such advances make the use of NGS feasible in clinical practice today. This review describes the recent technological developments in NGS applied to the field of oncology. A number of clinical applications are reviewed, i.e., mutation detection in inherited cancer syndromes based on DNA-sequencing, detection of spliceogenic variants based on RNA-sequencing, DNA-sequencing to identify risk modifiers and application for pre-implantation genetic diagnosis, cancer somatic mutation analysis, pharmacogenetics and liquid biopsy. Conclusive remarks, clinical limitations, implications and ethical considerations that relate to the different applications are provided. PMID:28146134
Development of Time-Distance Helioseismology Data Analysis Pipeline for SDO/HMI
NASA Technical Reports Server (NTRS)
DuVall, T. L., Jr.; Zhao, J.; Couvidat, S.; Parchevsky, K. V.; Beck, J.; Kosovichev, A. G.; Scherrer, P. H.
2008-01-01
The Helioseismic and Magnetic Imager of SDO will provide uninterrupted 4k x 4k-pixel Doppler-shift images of the Sun with approximately 40 sec cadence. These data will have a unique potential for advancing local helioseismic diagnostics of the Sun's interior structure and dynamics. They will help to understand the basic mechanisms of solar activity and develop predictive capabilities for NASA's Living with a Star program. Because of the tremendous amount of data the HMI team is developing a data analysis pipeline, which will provide maps of subsurface flows and sound-speed distributions inferred form the Doppler data by the time-distance technique. We discuss the development plan, methods, and algorithms, and present the status of the pipeline, testing results and examples of the data products.
Selection of pipe repair methods.
DOT National Transportation Integrated Search
2013-06-01
The objective of this research is to provide pipeline operators with testing procedures and : results of the performance of composite pipe repair methods and ultimately, improve their : selection and installation, and reduce the risks associated with...
Automated Monitoring of Pipeline Rights-of-Way
NASA Technical Reports Server (NTRS)
Frost, Chard Ritchie
2010-01-01
NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.
Kepler: A Search for Terrestrial Planets - SOC 9.3 DR25 Pipeline Parameter Configuration Reports
NASA Technical Reports Server (NTRS)
Campbell, Jennifer R.
2017-01-01
This document describes the manner in which the pipeline and algorithm parameters for the Kepler Science Operations Center (SOC) science data processing pipeline were managed. This document is intended for scientists and software developers who wish to better understand the software design for the final Kepler codebase (SOC 9.3) and the effect of the software parameters on the Data Release (DR) 25 archival products.
Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines
Tuck, Jeffrey; Lee, Pedro
2013-01-01
Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost-effective means of locating faults such as leaks, blockages and pipeline wall degradation while the system remains fully operational. The only requirement is that high-speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and an enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptibility of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall condition within the laboratory system, an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method-of-characteristics transient model to the experimental time-domain pressure responses. The accuracy of the detection technique is evaluated, and the benefits associated with various simplifying assumptions and simulation run times are investigated.
It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important to the accuracy of the inverse analysis procedure and can be used to differentiate the observed transient behaviour caused by changes in wall thickness from that caused by other known faults such as leaks. Further application of the method to real pipelines is discussed.
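The inverse procedure described above (evolving candidate wavespeed and diameter values with a genetic algorithm until a transient model reproduces the measured pressure trace) can be sketched with a toy forward model. Everything below is illustrative: the paper's forward model is a one-dimensional method-of-characteristics solver, not the single-reflection stand-in used here, and the parameter ranges and GA settings are invented.

```python
import math
import random

def forward_model(wavespeed, diam_ratio, length=50.0, n=200, dt=0.002):
    """Toy stand-in for the MOC solver: a wall-thickness change at distance
    `length` reflects a pressure pulse arriving at t = 2L/a, with amplitude
    set by the impedance mismatch (diam_ratio - 1)."""
    t_arrival = 2.0 * length / wavespeed
    amp = diam_ratio - 1.0
    return [amp * math.exp(-(((i * dt) - t_arrival) / 0.01) ** 2) for i in range(n)]

def misfit(candidate, measured):
    """Sum-of-squares mismatch between a candidate's trace and the data."""
    model = forward_model(*candidate)
    return sum((m - p) ** 2 for m, p in zip(measured, model))

def genetic_search(measured, pop_size=40, generations=80, seed=1):
    """Elitist GA over (wavespeed m/s, diameter ratio) candidates."""
    rng = random.Random(seed)
    pop = [(rng.uniform(900.0, 1400.0), rng.uniform(0.9, 1.2))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: misfit(c, measured))
        parents = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)       # crossover by averaging
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0.0, 5.0),
                             (a[1] + b[1]) / 2 + rng.gauss(0.0, 0.01)))
        pop = parents + children
    return min(pop, key=lambda c: misfit(c, measured))

measured = forward_model(1200.0, 1.05)   # synthetic "laboratory" trace
best_wavespeed, best_ratio = genetic_search(measured)
```

The same loop structure carries over when the forward model is a full MOC simulation; only `forward_model` changes, which is why the paper can trade accuracy against simulation run time.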
Development of Optimized Welding Solutions for X100 Linepipe Steel
DOT National Transportation Integrated Search
2011-09-01
This investigation is part of a major consolidated program of research sponsored by the US Department of Transportation (DOT) Pipeline Hazardous Materials Safety Administration (PHMSA) and the Pipeline Research Council International (PRCI) to advance...
Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network
NASA Astrophysics Data System (ADS)
Zulfikar, Can; Kariptas, Cagatay; Biyikoglu, Hikmet; Ozarpa, Cevat
2017-04-01
Istanbul Natural Gas Distribution Corporation (IGDAS) is one of the end users of the Istanbul Earthquake Early Warning (EEW) signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 750 district regulators and 474,000 service boxes. The natural gas reaches the Istanbul city borders at 70 bar in a 30-inch diameter steel pipeline. The gas pressure is reduced to 20 bar in RMS stations and distributed to district regulators inside the city. 110 of the 750 district regulators are instrumented with strong-motion accelerometers in order to cut the gas flow during an earthquake if ground motion parameters exceed certain threshold levels. In addition, state-of-the-art protection systems automatically cut the natural gas flow when breaks in the gas pipelines are detected. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information about quantities related to pipeline monitoring, including input-output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at the 750 district regulator sites. The IGDAS real-time earthquake risk reduction algorithm proceeds in four stages: 1) Real-time ground motion data are transmitted from 110 IGDAS and 110 KOERI (Kandilli Observatory and Earthquake Research Institute) acceleration stations to the IGDAS SCADA center and the KOERI data center. 2) During an earthquake, EEW information is sent from the IGDAS SCADA center to the IGDAS stations. 3) Automatic shut-off is applied at the IGDAS district regulators, and calculated parameters are sent from the stations to the IGDAS SCADA center and KOERI. 4) Integrated building and gas pipeline damage maps are prepared immediately after the earthquake.
Today's technology makes it possible to rapidly estimate the expected level of shaking once an earthquake begins. In the Istanbul case, however, for a potential Marmara Sea earthquake the available time is very limited even for estimating the level of shaking. A robust threshold-based EEW system is the only viable algorithm for such a near-source event: it can activate automatic shut-off mechanisms in critical infrastructure before the damaging waves arrive. Even a few seconds of early warning will help mitigate potential damage and secondary hazards.
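The core of a threshold-based shut-off scheme like the one described above reduces to comparing real-time ground motion parameters against preset levels at each instrumented site. A minimal sketch; the 0.2 g threshold and the station identifiers are invented for illustration, not IGDAS's actual configuration:

```python
def shutoff_decisions(station_pga, threshold_g=0.2):
    """Return the set of district regulators whose measured peak ground
    acceleration (in units of g) meets or exceeds the shut-off threshold.
    Threshold value and station naming are illustrative assumptions."""
    return {station for station, pga in station_pga.items() if pga >= threshold_g}

# Hypothetical real-time readings from three instrumented regulators
readings = {"regulator_12": 0.31, "regulator_47": 0.05, "regulator_80": 0.22}
to_close = shutoff_decisions(readings)
```

A production system would add hysteresis, sensor-health checks, and the SCADA reporting path described in stages 3 and 4, but the decision itself is this simple comparison, which is what makes it fast enough for a near-source event.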
Detection of underground pipeline based on Golay waveform design
NASA Astrophysics Data System (ADS)
Dai, Jingjing; Xu, Dazhuan
2017-08-01
The detection of underground pipelines is an important problem in urban development, but research on it is not yet mature. In this paper, based on the principles of waveform design in wireless communication, we design an acoustic signal detection system to locate underground pipelines. Following the principle of acoustic localization, we chose the DSP-F28335 as the development board, with DA and AD modules driven by the master control chip. The DA module emits a complementary Golay sequence, and the AD module acquires data synchronously, so that the echo signals containing the position information of the target can be recovered through signal processing. The test results show that the method can not only calculate the sound velocity in the soil but also locate underground pipelines accurately.
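The property that makes complementary Golay sequences attractive as probe waveforms can be verified in a few lines: the aperiodic autocorrelation sidelobes of the two sequences cancel exactly, leaving a single peak of height 2N at zero lag, which sharpens echo localization. A minimal sketch with a standard length-4 pair (the sequences below are a textbook example, not the ones used in the paper):

```python
def autocorrelation(x):
    """Aperiodic autocorrelation of a sequence for non-negative lags."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(n)]

def is_golay_pair(a, b):
    """True if the sequences are complementary: autocorrelation sidelobes
    cancel, leaving a single peak of height 2N at zero lag."""
    total = [p + q for p, q in zip(autocorrelation(a), autocorrelation(b))]
    return total[0] == 2 * len(a) and all(t == 0 for t in total[1:])

a, b = [1, 1, 1, -1], [1, 1, -1, 1]   # a classic length-4 complementary pair
```

In the detection system, the echo of each sequence is correlated against its transmitted copy and the two correlator outputs are summed, so the sidelobe cancellation above translates directly into a clean time-of-arrival estimate.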
The LCOGT Science Archive and Data Pipeline
NASA Astrophysics Data System (ADS)
Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.
2013-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
A Model for Oil-Gas Pipelines Cost Prediction Based on a Data Mining Process
NASA Astrophysics Data System (ADS)
Batzias, Fragiskos A.; Spanidis, Phillip-Mark P.
2009-08-01
This paper addresses the problems associated with the cost estimation of oil/gas pipelines during the elaboration of feasibility assessments. Techno-economic parameters, i.e., cost, length and diameter, are critical for such studies at the preliminary design stage. A methodology for the development of a cost prediction model based on Data Mining (DM) process is proposed. The design and implementation of a Knowledge Base (KB), maintaining data collected from various disciplines of the pipeline industry, are presented. The formulation of a cost prediction equation is demonstrated by applying multiple regression analysis using data sets extracted from the KB. Following the methodology proposed, a learning context is inductively developed as background pipeline data are acquired, grouped and stored in the KB, and through a linear regression model provide statistically substantial results, useful for project managers or decision makers.
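The regression step described above is ordinary least squares on techno-economic predictors. A minimal sketch, assuming a cost model linear in length and diameter and using synthetic data in place of the paper's Knowledge Base; the coefficients and data points are invented for illustration:

```python
def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination with partial pivoting. Each row of X
    already includes a leading 1 for the intercept."""
    m = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(m)] for i in range(m)]
    v = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    beta = [0.0] * m
    for i in range(m - 1, -1, -1):
        beta[i] = (v[i] - sum(A[i][j] * beta[j] for j in range(i + 1, m))) / A[i][i]
    return beta

# Hypothetical (length km, diameter in) records standing in for KB data;
# target costs are generated from invented coefficients so the fit is exact.
data = [(100, 16), (250, 20), (400, 24), (120, 30), (320, 36), (500, 42)]
y = [0.5 + 1.2 * L + 3.0 * D for L, D in data]
X = [[1.0, L, D] for L, D in data]
b0, b1, b2 = fit_ols(X, y)
```

In the paper's workflow the rows of `X` would come from KB queries grouped by pipeline class, and the fitted equation would then be used for preliminary-stage cost prediction.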
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-01-01
External corrosion has been identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment, for application to existing exposed pipelines, has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments. An accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design. PMID:28556805
A VLSI pipeline design of a fast prime factor DFT on a finite field
NASA Technical Reports Server (NTRS)
Truong, T. K.; Hsu, I. S.; Shao, H. M.; Reed, I. S.; Shyu, H. C.
1986-01-01
A conventional prime factor discrete Fourier transform (DFT) algorithm is used to realize a discrete Fourier-like transform on the finite field GF(q^n). A pipeline structure is used to implement this prime factor DFT over GF(q^n). This algorithm is developed to compute cyclic convolutions of complex numbers and to decode Reed-Solomon codes. Such a pipeline fast prime factor DFT algorithm over GF(q^n) is regular, simple, expandable, and naturally suitable for VLSI implementation. An example illustrating the pipeline aspect of a 30-point transform over GF(q^n) is presented.
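A small worked instance of a DFT over a finite field shows the property the convolution and decoding applications rely on: the convolution theorem holds, so cyclic convolutions reduce to pointwise products in the transform domain. The sketch uses GF(17) with a 4-point transform for readability, rather than the paper's GF(q^n) construction or its prime factor decomposition:

```python
P = 17   # prime, so arithmetic mod P is the field GF(17)
W = 4    # element of multiplicative order 4 in GF(17): 4**4 % 17 == 1

def ntt(x, w=W, p=P):
    """Direct N-point discrete Fourier transform over GF(p)."""
    n = len(x)
    return [sum(x[j] * pow(w, i * j, p) for j in range(n)) % p for i in range(n)]

def intt(X, w=W, p=P):
    """Inverse transform; inverses come from Fermat's little theorem."""
    n = len(X)
    w_inv = pow(w, p - 2, p)
    n_inv = pow(n, p - 2, p)
    return [n_inv * sum(X[j] * pow(w_inv, i * j, p) for j in range(n)) % p
            for i in range(n)]

def cyclic_convolution(a, b):
    """Convolution theorem over GF(p): transform, multiply pointwise, invert."""
    A, B = ntt(a), ntt(b)
    return intt([(x * y) % P for x, y in zip(A, B)])
```

A pipelined VLSI realization like the paper's streams these butterfly-style sums through fixed hardware stages; the algebra per stage is exactly the modular multiply-accumulate shown here.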
Bad Actors Criticality Assessment for Pipeline system
NASA Astrophysics Data System (ADS)
Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee
2015-04-01
Failure of a pipeline system can bring huge economic loss. In order to mitigate such catastrophic losses, it is necessary to evaluate and rank the impact of each bad actor in the pipeline system. In this study, bad actors are the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion and construction damage are critical and contribute most to pipeline system failure, at 48.0%, 12.4% and 6.0% respectively. Thus, a minor improvement in internal corrosion, external corrosion and construction damage would bring significant changes in pipeline system performance and reliability. These results can also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
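Both measures used above can be computed directly for a simple fault tree. A minimal sketch, assuming the top event is an OR gate over independent basic events; the occurrence probabilities are invented for illustration, not the values fitted in the paper:

```python
def or_gate_failure(probs):
    """Top-event probability for an OR gate over independent basic events."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def birnbaum_importance(probs, i):
    """Birnbaum importance I_B(i): top-event probability with event i forced
    to occur minus top-event probability with event i forced not to occur."""
    others = probs[:i] + probs[i + 1:]
    return or_gate_failure([1.0] + others) - or_gate_failure([0.0] + others)

def criticality_importance(probs, i):
    """Criticality importance: Birnbaum importance weighted by the event's
    own probability, normalized by the top-event probability."""
    return birnbaum_importance(probs, i) * probs[i] / or_gate_failure(probs)

# Hypothetical annual probabilities for three bad actors (internal corrosion,
# external corrosion, construction damage) -- illustrative values only.
bad_actors = [0.20, 0.10, 0.05]
ranking = sorted(range(len(bad_actors)),
                 key=lambda i: criticality_importance(bad_actors, i),
                 reverse=True)
```

Ranking by criticality rather than raw probability matters for more complex trees, where an unlikely event sitting in a single-path cut set can outrank a frequent one that is mitigated by redundancy.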
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, E.A.; Smed, P.F.; Bryndum, M.B.
The paper describes the numerical program PIPESIN, which simulates the behavior of a pipeline placed on an erodible seabed. PIPEline Seabed INteraction, from installation until a stable pipeline-seabed configuration has been reached, is simulated in the time domain, including all important physical processes. The program is the result of the joint research project "Free Span Development and Self-lowering of Offshore Pipelines", sponsored by the EU and a group of companies and carried out by the Danish Hydraulic Institute and Delft Hydraulics. The basic modules of PIPESIN are described. The description of the scouring processes has been based on, and verified through, physical model tests carried out as part of the research project. The program simulates a section of the pipeline (typically 500 m) in the time domain, the main input being time series of the waves and current. The main results include predictions of the onset of free spans, their length distribution, their variation in time, and the lowering of the pipeline as a function of time.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
Development of an Automated Imaging Pipeline for the Analysis of the Zebrafish Larval Kidney
Westhoff, Jens H.; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L.; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen
2013-01-01
The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems. PMID:24324758
33 CFR 148.217 - How can a State be designated as an adjacent coastal State?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: GENERAL Processing Applications... as an adjacent coastal State in the notice may request to be designated as one if the environmental risks to it are equal to or greater than the risks posed to a State directly connected by pipeline to...
Nayor, Jennifer; Borges, Lawrence F; Goryachev, Sergey; Gainer, Vivian S; Saltzman, John R
2018-07-01
ADR is a widely used colonoscopy quality indicator. Calculation of ADR is labor-intensive and cumbersome using current electronic medical databases. Natural language processing (NLP) is a method used to extract meaning from unstructured or free text data. The aim was to develop and validate an accurate automated process for calculation of adenoma detection rate (ADR) and serrated polyp detection rate (SDR) on data stored in widely used electronic health record systems, specifically the Epic electronic health record system, the Provation® endoscopy reporting system, and the Sunquest PowerPath pathology reporting system. Screening colonoscopies performed between June 2010 and August 2015 were identified using the Provation® reporting tool. An NLP pipeline was developed to identify adenomas and sessile serrated polyps (SSPs) on pathology reports corresponding to these colonoscopy reports. The pipeline was validated using a manual search. Precision, recall, and effectiveness of the natural language processing pipeline were calculated. ADR and SDR were then calculated. We identified 8032 screening colonoscopies that were linked to 3821 pathology reports (47.6%). The NLP pipeline had an accuracy of 100% for adenomas and 100% for SSPs. Mean total ADR was 29.3% (range 14.7-53.3%); mean male ADR was 35.7% (range 19.7-62.9%); and mean female ADR was 24.9% (range 9.1-51.0%). Mean total SDR was 4.0% (0-9.6%). We developed and validated an NLP pipeline that accurately and automatically calculates ADRs and SDRs using data stored in Epic, Provation® and Sunquest PowerPath. This NLP pipeline can be used to evaluate colonoscopy quality parameters at both individual and practice levels.
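The shape of such a pipeline can be sketched as pattern matching over pathology text followed by a rate calculation. The real system handles negation, colonoscopy-to-pathology linkage and many more term variants; the patterns and sample reports below are invented for illustration:

```python
import re

# Simplified stand-in for the NLP step: flag reports mentioning adenomas
# or sessile serrated polyps. Term lists here are illustrative only.
ADENOMA_RE = re.compile(r"tubular adenoma|tubulovillous adenoma|villous adenoma",
                        re.IGNORECASE)
SSP_RE = re.compile(r"sessile serrated (?:polyp|adenoma|lesion)", re.IGNORECASE)

def detection_rates(pathology_reports):
    """Return (ADR, SDR): the fraction of screening exams whose linked
    pathology report mentions an adenoma / a sessile serrated polyp."""
    n = len(pathology_reports)
    adr = sum(bool(ADENOMA_RE.search(r)) for r in pathology_reports) / n
    sdr = sum(bool(SSP_RE.search(r)) for r in pathology_reports) / n
    return adr, sdr

# Hypothetical linked pathology reports for four screening colonoscopies
reports = [
    "Ascending colon polyp: tubular adenoma, low-grade dysplasia.",
    "Sigmoid polyp: hyperplastic polyp.",
    "Cecal polyp: sessile serrated polyp without dysplasia.",
    "Transverse colon polyp: tubulovillous adenoma.",
]
adr, sdr = detection_rates(reports)
```

The validation reported in the abstract (precision/recall against a manual search) is what justifies replacing the labor-intensive chart review with this kind of automated extraction.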
Kepler Science Operations Center Pipeline Framework
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Middour, Christopher; Caldwell, Douglas A.; Jenkins, Jon M.
2010-01-01
The Kepler mission is designed to continuously monitor up to 170,000 stars at a 30 minute cadence for 3.5 years searching for Earth-size planets. The data are processed at the Science Operations Center (SOC) at NASA Ames Research Center. Because of the large volume of data and the memory and CPU-intensive nature of the analysis, significant computing hardware is required. We have developed generic pipeline framework software that is used to distribute and synchronize the processing across a cluster of CPUs and to manage the resulting products. The framework is written in Java and is therefore platform-independent, and scales from a single, standalone workstation (for development and research on small data sets) to a full cluster of homogeneous or heterogeneous hardware with minimal configuration changes. A plug-in architecture provides customized control of the unit of work without the need to modify the framework itself. Distributed transaction services provide for atomic storage of pipeline products for a unit of work across a relational database and the custom Kepler DB. Generic parameter management and data accountability services are provided to record the parameter values, software versions, and other meta-data used for each pipeline execution. A graphical console allows for the configuration, execution, and monitoring of pipelines. An alert and metrics subsystem is used to monitor the health and performance of the pipeline. The framework was developed for the Kepler project based on Kepler requirements, but the framework itself is generic and could be used for a variety of applications where these features are needed.
East Spar: Alliance approach for offshore gasfield development
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-01
East Spar is a gas/condensate field 25 miles west of Barrow Island, offshore Western Australia. Proved plus probable reserves at the time of development were estimated at 430 Bcf of gas and 28 million bbl of condensate. The field was discovered in early 1993, when the Western Australia gas market was deregulated and the concept of a gas pipeline to the gold fields was proposed. This created a window of opportunity for East Spar, but only if plans could be established quickly. A base-case development plan was established to support gas marketing while alternative plans were developed in parallel. The completed East Spar facilities comprise two subsea wells, a subsea gathering system, and a multiphase (gas/condensate/water) pipeline to new gas-processing facilities. The subsea facilities are controlled through a navigation, communication, and control (NCC) buoy. The control room and gas-processing plant are 39 miles east of the field on Varanus Island. Sales gas is exported through a pre-existing gas-sales pipeline to the Dampier-Bunbury and Goldfields Gas Transmission pipelines. Condensate is stored in and exported by use of pre-existing facilities on Varanus Island. Field development from approval to first production took 22 months. The paper describes the field development.
NASA Astrophysics Data System (ADS)
Razak, K. Abdul; Othman, M. I. H.; Mat Yusuf, S.; Fuad, M. F. I. Ahmad; yahaya, Effah
2018-05-01
Oil and gas fields today are developed at a range of water depths characterized as shallow, deep and ultra-deep. Among the major components involved in offshore installation are pipelines, which transport material through a pipe. In the oil and gas industry, a pipeline is assembled from many lengths of line pipe welded together, and pipelines can be divided into gas pipelines and oil pipelines. Pipeline installation requires a pipe-laying barge or pipe-laying vessel, of which there are two types: S-lay vessels and J-lay vessels. A pipe-lay vessel does not only perform pipeline installation; it also installs umbilicals and electrical cables. In short, a pipe-lay vessel installs all the connecting subsea infrastructure. The installation process requires special attention for it to succeed: for instance, heavy pipelines may exceed the lay vessel's tension capacity at certain water depths. Pipelines can be grouped or differentiated by parameters such as material grade, material type, diameter, wall thickness and strength. Wall-thickness parameter studies indicate, for instance, that using a higher steel grade contributes significantly to reducing pipeline wall thickness. During pipe lay, water depth is the most critical factor to monitor: the water depth cannot be controlled, but the characteristics of the pipe can, by selecting line pipe with a wall thickness suitable for the current water depth in order to avoid failure during installation. This research analyses whether the pipeline parameters meet the requirement limits and the minimum yield stress.
The study simulates pipe of grade API 5L X60 with wall thicknesses from 8 to 20 mm at water depths of 50 to 300 m. The results show that pipeline installation fails from a wall thickness of 18 mm onwards, as the critical yield percentage is exceeded.
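The kind of yield-percentage screen described above can be illustrated with a thin-wall hoop stress check against the material's specified minimum yield strength (SMYS, 415 MPa for API 5L X60). This is only a first-pass sketch under assumed conditions; a real installation analysis also covers collapse, combined bending and lay tension, which is where heavier walls can become the limiting factor:

```python
RHO_SEAWATER = 1025.0   # kg/m^3
G = 9.81                # m/s^2
SMYS_X60 = 415e6        # Pa, specified minimum yield strength of API 5L X60

def hoop_stress_ratio(depth_m, outer_diameter_m, wall_thickness_m):
    """Thin-wall hoop stress from external hydrostatic pressure,
    sigma = p*D/(2t), expressed as a fraction of SMYS."""
    pressure = RHO_SEAWATER * G * depth_m
    hoop = pressure * outer_diameter_m / (2.0 * wall_thickness_m)
    return hoop / SMYS_X60

# Illustrative 20-inch (0.508 m) X60 pipe at 300 m water depth,
# compared at two of the study's wall thicknesses
ratio_thick = hoop_stress_ratio(300.0, 0.508, 0.018)
ratio_thin = hoop_stress_ratio(300.0, 0.508, 0.008)
```

As expected, the thinner wall carries a higher fraction of yield for the same depth; the study's finding that thicker pipe fails during installation must therefore come from the lay loads (weight and tension), not from hydrostatic hoop stress alone.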
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
Digital Imaging of Pipeline Mechanical Damage and Residual Stress
DOT National Transportation Integrated Search
2010-02-19
The purpose of this program was to enhance characterization of mechanical damage in pipelines through application of digital eddy current imaging. Lift-off maps can be used to develop quantitative representations of mechanical damage and magnetic per...
Digital Mapping of Buried Pipelines with a Dual Array System
DOT National Transportation Integrated Search
2003-06-06
The objective of this research is to develop a non-invasive system for detecting, mapping, and inspecting ferrous and plastic pipelines in place using technology that combines and interprets measurements from ground penetrating radar and electromagne...
Identification of failure type in corroded pipelines: a Bayesian probabilistic approach.
Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J
2010-07-15
Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented to reduce increased hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry.
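The discrimination step near the uncertain boundary can be sketched as Bayes' rule on a damage feature, yielding a posterior probability of rupture versus leakage. The class-conditional Gaussians, priors and feature below are invented for illustration, not the distributions fitted from the burst-test data in the paper:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_rupture(x, prior_rupture=0.5,
                      mu_leak=0.4, sd_leak=0.1, mu_rupture=0.7, sd_rupture=0.1):
    """Bayes' rule on a single damage feature x (e.g. a normalized defect
    depth): P(rupture | x) from class-conditional likelihoods and priors.
    All parameter values here are illustrative assumptions."""
    num = gaussian_pdf(x, mu_rupture, sd_rupture) * prior_rupture
    den = num + gaussian_pdf(x, mu_leak, sd_leak) * (1.0 - prior_rupture)
    return num / den
```

Far from the boundary the posterior saturates near 0 or 1, while at the midpoint of the two class means it is exactly 0.5, which is the "uncertain region" the paper handles with its probabilistic boundary.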
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
Milchenko, Mikhail; Snyder, Abraham Z; LaMontagne, Pamela; Shimony, Joshua S; Benzinger, Tammie L; Fouke, Sarah Jost; Marcus, Daniel S
2016-07-01
Neuroimaging research often relies on clinically acquired magnetic resonance imaging (MRI) datasets that can originate from multiple institutions. Such datasets are characterized by high heterogeneity of modalities and variability of sequence parameters. This heterogeneity complicates the automation of image processing tasks such as spatial co-registration and physiological or functional image analysis. Given this heterogeneity, conventional processing workflows developed for research purposes are not optimal for clinical data. In this work, we describe an approach called Heterogeneous Optimization Framework (HOF) for developing image analysis pipelines that can handle the high degree of clinical data non-uniformity. HOF provides a set of guidelines for configuration, algorithm development, deployment, interpretation of results and quality control for such pipelines. At each step, we illustrate the HOF approach using the implementation of an automated pipeline for Multimodal Glioma Analysis (MGA) as an example. The MGA pipeline computes tissue diffusion characteristics of diffusion tensor imaging (DTI) acquisitions, hemodynamic characteristics using a perfusion model of susceptibility contrast (DSC) MRI, and spatial cross-modal co-registration of available anatomical, physiological and derived patient images. Developing MGA within HOF enabled the processing of neuro-oncology MR imaging studies to be fully automated. MGA has been successfully used to analyze over 160 clinical tumor studies to date within several research projects. Introduction of the MGA pipeline improved image processing throughput and, most importantly, effectively produced co-registered datasets that were suitable for advanced analysis despite high heterogeneity in acquisition protocols.
Wright, J. Fraser
2014-01-01
Adeno-associated virus (AAV)-based vectors expressing therapeutic genes continue to demonstrate great promise for the treatment of a wide variety of diseases and together with other gene transfer vectors represent an emerging new therapeutic paradigm comparable in potential impact on human health to that achieved by recombinant proteins and vaccines. A challenge for the current pipeline of AAV-based investigational products as they advance through clinical development is the identification, characterization and lot-to-lot control of the process- and product-related impurities present in even highly purified preparations. Especially challenging are AAV vector product-related impurities that closely resemble the vector itself and are, in some cases, without clear precedent in established biotherapeutic products. The determination of acceptable levels of these impurities in vectors prepared for human clinical product development, with the goal of new product licensure, requires careful risk and feasibility assessment. This review focuses primarily on the AAV product-related impurities that have been described in vectors prepared for clinical development. PMID:28548061
Continuous Turbidity Monitoring in the Indian Creek Watershed, Tazewell County, Virginia, 2006-08
Moyer, Douglas; Hyer, Kenneth
2009-01-01
Thousands of miles of natural gas pipelines are installed annually in the United States. These pipelines commonly cross streams, rivers, and other water bodies during pipeline construction. A major concern associated with pipelines crossing water bodies is increased sediment loading and the subsequent impact to the ecology of the aquatic system. Several studies have investigated the techniques used to install pipelines across surface-water bodies and their effect on downstream suspended-sediment concentrations. These studies frequently employ the evaluation of suspended-sediment or turbidity data that were collected using discrete sample-collection methods. No studies, however, have evaluated the utility of continuous turbidity monitoring for identifying real-time sediment input and providing a robust dataset for the evaluation of long-term changes in suspended-sediment concentration as it relates to a pipeline crossing. In 2006, the U.S. Geological Survey, in cooperation with East Tennessee Natural Gas and the U.S. Fish and Wildlife Service, began a study to monitor the effects of construction of the Jewell Ridge Lateral natural gas pipeline on turbidity conditions below pipeline crossings of Indian Creek and an unnamed tributary to Indian Creek, in Tazewell County, Virginia. The potential for increased sediment loading to Indian Creek is of major concern for watershed managers because Indian Creek is listed as one of Virginia's Threatened and Endangered Species Waters and contains critical habitat for two freshwater mussel species, purple bean (Villosa perpurpurea) and rough rabbitsfoot (Quadrula cylindrica strigillata). Additionally, Indian Creek contains the last known reproducing population of the tan riffleshell (Epioblasma florentina walkeri). Therefore, the objectives of the U.S. 
Geological Survey monitoring effort were to (1) develop a continuous turbidity monitoring network that attempted to measure real-time changes in suspended sediment (using turbidity as a surrogate) downstream from the pipeline crossings, and (2) provide continuous turbidity data that enable the development of a real-time turbidity-input warning system and assessment of long-term changes in turbidity conditions. Water-quality conditions were assessed using continuous water-quality monitors deployed upstream and downstream from the pipeline crossings in Indian Creek and the unnamed tributary. These paired upstream and downstream monitors were outfitted with turbidity, pH (for Indian Creek only), specific-conductance, and water-temperature sensors. Water-quality data were collected continuously (every 15 minutes) during three phases of the pipeline construction: pre-construction, during construction, and post-construction. Continuous turbidity data were evaluated at various time steps to determine whether the construction of the pipeline crossings had an effect on downstream suspended-sediment conditions in Indian Creek and the unnamed tributary. These continuous turbidity data were analyzed in real time with the aid of a turbidity-input warning system. A warning occurred when turbidity values downstream from the pipeline were 6 Formazin Nephelometric Units or 15 percent (depending on the observed range) greater than turbidity upstream from the pipeline crossing. Statistical analyses also were performed on monthly and phase-of-construction turbidity data to determine if the pipeline crossing served as a long-term source of sediment. Results of this intensive water-quality monitoring effort indicate that values of turbidity in Indian Creek increased significantly between the upstream and downstream water-quality monitors during the construction of the Jewell Ridge pipeline. 
The magnitude of the significant turbidity increase, however, was small (less than 2 Formazin Nephelometric Units). Patterns in the continuous turbidity data indicate that the actual pipeline crossing of Indian Creek had little influence on downstream water quality; co
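The turbidity-input warning rule described in this abstract (a warning when the downstream reading exceeds the upstream reading by 6 Formazin Nephelometric Units or by 15 percent, depending on the observed range) can be sketched as a simple threshold check. The sketch below is illustrative, not the USGS implementation: the report does not specify how "the observed range" selects between the two criteria, so the crossover at 40 FNU (where the absolute and relative thresholds coincide) is an assumption.

```python
def turbidity_warning(upstream_fnu, downstream_fnu,
                      abs_threshold=6.0, pct_threshold=0.15):
    """Flag a turbidity-input warning for a paired upstream/downstream
    monitor, per the criterion in the abstract: downstream exceeds
    upstream by >6 FNU (low readings) or >15 percent (higher readings).

    The switch between criteria at abs_threshold / pct_threshold
    (40 FNU) is an assumption for this sketch.
    """
    excess = downstream_fnu - upstream_fnu
    if upstream_fnu <= abs_threshold / pct_threshold:
        # Low observed range: apply the absolute 6-FNU criterion.
        return excess > abs_threshold
    # Higher observed range: apply the relative 15-percent criterion.
    return excess > pct_threshold * upstream_fnu
```

With 15-minute paired readings, such a check could drive a real-time alert of the kind the monitoring network used during construction.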
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
NASA Astrophysics Data System (ADS)
Papathanassiou, George
2016-04-01
Over the last decade, several pipeline corridors have been designed to transmit natural gas and oil from Asia to Europe. Although a pipeline is an underground structure, an analysis of earthquake-induced structural failures should be conducted in earthquake-prone countries, e.g., Greece and Italy in the EU. The aim of these analyses is to assess and evaluate the hazard and the associated risk induced by earthquake-triggered slope failures and soil liquefaction. The latter is a phenomenon that is triggered only under specific site conditions; in particular, the basic ingredients for the occurrence of liquefaction are a shallow water table, the existence of a non-plastic or low-plasticity soil layer, and the generation of strong ground motion. Regarding the liquefaction-induced deformation that should be assessed and evaluated in order to minimize the risk, it is concluded that the prevalent types of ground failure at level to gently sloping sites are ground settlements and lateral spreads. The goal of this study is to review the approaches most widely used for the computation of liquefaction-induced settlement and to present a more detailed, step-by-step description of the methodology recommended for the evaluation of lateral spreading.
Cottingham, Marci D.; Kalbaugh, Corey A.
2014-01-01
In spite of a growing literature on pharmaceuticalization, little is known about the pharmaceutical industry’s investments in research and development (R&D). Information about the drugs being developed can provide important context for existing case studies detailing the expanding – and often problematic – role of pharmaceuticals in society. To access the pharmaceutical industry’s pipeline, we constructed a database of drugs for which pharmaceutical companies reported initiating clinical trials over a five-year period (July 2006-June 2011), capturing 2,477 different drugs in 4,182 clinical trials. Comparing drugs in the pipeline that target diseases in high-income and low-income countries, we found that the number of drugs for diseases prevalent in high-income countries was 3.46 times higher than drugs for diseases prevalent in low-income countries. We also found that the plurality of drugs in the pipeline were being developed to treat cancers (26.2%). Interpreting our findings through the lens of pharmaceuticalization, we illustrate how investigating the entire drug development pipeline provides important information about patterns of pharmaceuticalization that are invisible when only marketed drugs are considered. PMID:25159693
Improved satellite and geospatial tools for pipeline operator decision support systems.
DOT National Transportation Integrated Search
2017-01-06
Under Cooperative Agreement No. OASRTRS-14-H-CAL, California Polytechnic State University San Luis Obispo (Cal Poly), partnered with C-CORE, MDA, PRCI, and Electricore to design and develop improved satellite and geospatial tools for pipeline operato...
Synergistic combinations of antifungals and anti-virulence agents to fight against Candida albicans.
Cui, Jinhui; Ren, Biao; Tong, Yaojun; Dai, Huanqin; Zhang, Lixin
2015-01-01
Candida albicans, one of the pathogenic Candida species, causes a high mortality rate in immunocompromised and high-risk surgical patients. In the last decade, only one new class of antifungal drugs, the echinocandins, was introduced. Increasing therapy failures, such as those caused by multi-drug resistance, demand innovative strategies for new, effective antifungal drugs. Synergistic combinations of antifungals and anti-virulence agents represent a pragmatic strategy to reduce the development of drug resistance and to potentially repurpose known antifungals, bypassing the costly and time-consuming pipeline of new drug development. Anti-virulence agents and synergistic combinations provide new options for antifungal drug discovery by counteracting the difficulty or failure of traditional therapy for fungal infections.
Pipeline for contraceptive development.
Blithe, Diana L
2016-11-01
The high rates of unplanned pregnancy reflect an unmet need for effective contraceptive methods for women, especially for individuals with health risks such as obesity, diabetes, hypertension, and other conditions that may contraindicate use of an estrogen-containing product. Improvements in safety, user convenience, acceptability, and availability of products remain important goals of the contraceptive development program. Another important goal is to minimize the impact of the products on the environment. Development of new methods for male contraception has the potential to address many of these issues of safety for women who have contraindications to effective contraceptive methods but want to protect against pregnancy. It would also address a huge unmet need for men who want to control their fertility. Products under development for men would not introduce ecotoxic hormones into the water system. Published by Elsevier Inc.
Pipelines subject to slow landslide movements: Structural modeling vs field measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruschi, R.; Glavina, S.; Spinazze, M.
1996-12-01
In recent years, finite element techniques have been increasingly used to investigate the behavior of buried pipelines subject to soil movements. The use of these tools provides a rational basis for defining minimum wall thickness requirements at landslide crossings. Furthermore, the design of mitigation measures or monitoring systems that control the development of undesirable strains in the pipe wall over time requires detailed structural modeling. The scope of this paper is to discuss the use of dedicated structural modeling with calibration against field measurements. The strain measurements used were regularly gathered from pipe sections at two different sites over a period of time long enough to record changes of axial strain due to soil movement. Detailed structural modeling of the pipeline layout at both sites, under operating conditions, is applied. Numerical simulations show the influence of the distribution of soil movement acting on the pipeline on the state of strain that can develop at certain locations. The role of soil nature and the direction of relative movements in determining the loads transferred to the pipeline is also discussed.
Song, Jia; Zheng, Sisi; Nguyen, Nhung; Wang, Youjun; Zhou, Yubin; Lin, Kui
2017-10-03
Because phylogenetic inference is an important basis for answering many evolutionary problems, a large number of algorithms have been developed. Some of these algorithms have been improved by integrating gene evolution models with the expectation of accommodating the hierarchy of evolutionary processes. To the best of our knowledge, however, there still is no single unifying model or algorithm that can take all evolutionary processes into account through a stepwise or simultaneous method. On the basis of three existing phylogenetic inference algorithms, we built an integrated pipeline for inferring the evolutionary history of a given gene family; this pipeline can model gene sequence evolution, gene duplication-loss, gene transfer and multispecies coalescent processes. As a case study, we applied this pipeline to the STIMATE (TMEM110) gene family, which has recently been reported to play an important role in store-operated Ca²⁺ entry (SOCE) mediated by ORAI and STIM proteins. We inferred their phylogenetic trees in 69 sequenced chordate genomes. By integrating three tree reconstruction algorithms with diverse evolutionary models, a pipeline for inferring the evolutionary history of a gene family was developed, and its application was demonstrated.
Implementation of Cloud based next generation sequencing data analysis in a clinical laboratory.
Onsongo, Getiria; Erdmann, Jesse; Spears, Michael D; Chilton, John; Beckman, Kenneth B; Hauge, Adam; Yohe, Sophia; Schomaker, Matthew; Bower, Matthew; Silverstein, Kevin A T; Thyagarajan, Bharat
2014-05-23
The introduction of next generation sequencing (NGS) has revolutionized molecular diagnostics, though several challenges remain that limit the widespread adoption of NGS testing into clinical practice. One such difficulty is the development of a robust bioinformatics pipeline that can handle the volume of data generated by high-throughput sequencing in a cost-effective manner. Analysis of sequencing data typically requires a substantial level of computing power that is often cost-prohibitive to most clinical diagnostics laboratories. To address this challenge, our institution has developed a Galaxy-based data analysis pipeline which relies on a web-based, cloud-computing infrastructure to process NGS data and identify genetic variants. It provides the additional flexibility needed to control storage costs, resulting in a pipeline that is cost-effective on a per-sample basis. It does not require the use of an EBS disk to run a sample. We demonstrate the validation and feasibility of implementing this bioinformatics pipeline in a molecular diagnostics laboratory. Four samples were analyzed in duplicate pairs and showed 100% concordance in mutations identified. This pipeline is currently being used in the clinic, and all identified pathogenic variants were confirmed using Sanger sequencing, further validating the software.
GIS characterization of spatially distributed lifeline damage
Toprak, Selcuk; O'Rourke, Thomas; Tutuncu, Ilker
1999-01-01
This paper describes the visualization of spatially distributed water pipeline damage following an earthquake using geographical information systems (GIS). Pipeline damage is expressed as a repair rate (RR). Repair rate contours are developed with GIS by dividing the study area into grid cells (n × n), determining the number of particular pipeline repairs in each grid cell, and dividing the number of repairs by the length of that pipeline in each cell area. The resulting contour plot is a two-dimensional visualization of point source damage. High damage zones are defined herein as areas with an RR value greater than the mean RR for the entire study area of interest. A hyperbolic relationship between visual display of high pipeline damage zones and grid size, n, was developed. The relationship is expressed in terms of two dimensionless parameters, threshold area coverage (TAC) and dimensionless grid size (DGS). The relationship is valid over a wide range of different map scales, spanning approximately 1,200 km² for the largest portion of the Los Angeles water distribution system to 1 km² for the Marina in San Francisco. This relationship can help GIS users obtain sufficiently refined, but easily visualized, maps of damage patterns.
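The per-cell repair-rate computation described above (count repairs per grid cell, divide by pipeline length in that cell, then flag cells above the mean RR as high-damage zones) can be sketched in a few lines. This is a minimal illustration under simplifying assumptions, not the paper's GIS workflow: function and variable names are invented here, and pipeline length is attributed to a cell by a representative point rather than by true geometric clipping of segments against cell boundaries.

```python
from collections import defaultdict

def repair_rate_grid(repairs, pipe_lengths, cell_size):
    """Repairs per unit pipeline length for each grid cell.

    repairs      -- iterable of (x, y) repair locations
    pipe_lengths -- iterable of (x, y, length) entries, where (x, y) is a
                    representative point for a stretch of pipe of the given
                    length (a simplification of GIS clipping, assumed here)
    cell_size    -- side length of the square grid cells (n x n gridding)
    """
    counts = defaultdict(int)
    lengths = defaultdict(float)
    for x, y in repairs:
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    for x, y, length in pipe_lengths:
        lengths[(int(x // cell_size), int(y // cell_size))] += length
    # RR is defined only where the cell actually contains pipeline.
    return {cell: counts[cell] / total
            for cell, total in lengths.items() if total > 0}

def high_damage_zones(rr):
    """Cells whose RR exceeds the mean RR over the study area."""
    mean_rr = sum(rr.values()) / len(rr)
    return {cell for cell, rate in rr.items() if rate > mean_rr}
```

Contouring the resulting per-cell rates then yields the two-dimensional damage visualization the paper describes; the grid size n trades off refinement against readability, per the TAC-DGS relationship.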
Mary Beth Adams; Pamela J. Edwards; W. Mark Ford; Joshua B. Johnson; Thomas M. Schuler; Melissa Thomas-Van Gundy; Frederica Wood
2011-01-01
Development of a natural gas well and pipeline on the Fernow Experimental Forest, WV, raised concerns about the effects on the natural and scientific resources of the Fernow, set aside in 1934 for long-term research. A case study approach was used to evaluate effects of the development. This report includes results of monitoring projects as well as observations...
Theory and Application of Magnetic Flux Leakage Pipeline Detection.
Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei
2015-12-10
Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted.
Simulation of pipeline in the area of the underwater crossing
NASA Astrophysics Data System (ADS)
Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.
2014-08-01
The article studies the stress-strain behavior of the Alexandrovskoye-Anzhero-Sudzhensk section of the main oil pipeline using the Ansys software system. This method of examining and assessing the technical condition of pipeline transport facilities studies the objects and the processes that affect their technical condition, including research based on computer simulation. Such an approach makes it possible to develop the theory, calculation methods, and design of pipeline transport facilities and of machine units and parts, regardless of industry and application, with a view to improving existing constructions and creating new structures and machines of high performance, durability, reliability, and maintainability, with low material consumption and cost, that are competitive on the world market.
NASA Astrophysics Data System (ADS)
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is to monitor data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, offering good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because of the predicted structural information that could uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
Minimum separation distances for natural gas pipeline and boilers in the 300 area, Hanford Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daling, P.M.; Graham, T.M.
1997-08-01
The U.S. Department of Energy (DOE) is proposing actions to reduce energy expenditures and improve energy system reliability at the 300 Area of the Hanford Site. These actions include replacing the centralized heating system with heating units for individual buildings or groups of buildings, constructing a new natural gas distribution system to provide a fuel source for many of these units, and constructing a central control building to operate and maintain the system. The individual heating units will include steam boilers that are to be housed in individual annex buildings located at some distance away from nearby 300 Area nuclear facilities. This analysis develops the basis for siting the package boilers and natural gas distribution systems to be used to supply steam to 300 Area nuclear facilities. The effects of four potential fire and explosion scenarios involving the boiler and natural gas pipeline were quantified to determine minimum separation distances that would reduce the risks to nearby nuclear facilities. The resulting minimum separation distances are shown in Table ES.1.
Production of bio-synthetic natural gas in Canada.
Hacatoglu, Kevork; McLellan, P James; Layzell, David B
2010-03-15
Large-scale production of renewable synthetic natural gas from biomass (bioSNG) in Canada was assessed for its ability to mitigate energy security and climate change risks. The land area within 100 km of Canada's network of natural gas pipelines was estimated to be capable of producing 67-210 Mt of dry lignocellulosic biomass per year with minimal adverse impacts on food and fiber production. Biomass gasification and subsequent methanation and upgrading were estimated to yield 16,000-61,000 Mm³ of pipeline-quality gas (equivalent to 16-63% of Canada's current gas use). Life-cycle greenhouse gas emissions of bioSNG-based electricity were calculated to be only 8.2-10% of the emissions from coal-fired power. Although predicted production costs ($17-21 GJ⁻¹) were much higher than current energy prices, a value for low-carbon energy would narrow the price differential. A bioSNG sector could infuse Canada's rural economy with $41-130 billion of investments and create 410,000-1,300,000 jobs while developing a nation-wide low-carbon energy system.
The application of Mike Urban model in drainage and waterlogging in Lincheng county, China
NASA Astrophysics Data System (ADS)
Luan, Qinghua; Zhang, Kun; Liu, Jiahong; Wang, Dong; Ma, Jun
2018-06-01
Recently, water disasters in cities, especially in Chinese mountainous cities, have become more serious due to the coupled influences of waterlogging and regional floods. It is necessary to study the surface runoff processes of mountainous cities and examine the regional drainage pipeline networks. In this study, the runoff processes of Lincheng county (located in Hebei province, China) under different scenarios were simulated with the Mike Urban model. The results show that runoff from the old town and from the more steeply sloped new residential area is significant, with full-pipe flow occurring in parts of the drainage network in these zones; overflow also occurs in parts of the drainage network for return periods of ten or twenty years, indicating that the waterlogging risk in this part of Lincheng is higher. Therefore, remodeling the drainage pipeline network in the old town of Lincheng and adding water storage ponds in the new residential areas are suggested. This research provides both technical support and a decision-making reference for local storm flood management, and offers experience for studying the runoff processes of similar cities.
NASA Technical Reports Server (NTRS)
Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris
2016-01-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines for both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.
Grizzly bears and calving caribou: What is the relation with river corridors?
Young, Donald D.; McCabe, Thomas R.
1998-01-01
Researchers have debated the effects of the Trans-Alaska Pipeline (TAP) and associated developments on caribou (Rangifer tarandus) of the Central Arctic herd (CAH) since the 1970s. Several studies have demonstrated that cows and calves of the CAH avoided the TAP corridor because of disturbance associated with the pipeline, whereas others have indicated that female caribou of the CAH avoided riparian habitats closely associated with the pipeline; this avoidance was explained as a predator-avoidance strategy. We investigated the relation between female caribou and grizzly bear (Ursus arctos) use of river corridors on the as-yet-undisturbed calving grounds of the Porcupine caribou herd (PCH) in northeastern Alaska. On the coastal plain, caribou were closer to river corridors than expected (P = 0.038), but bear use of river corridors did not differ from expected (P = 0.740). In the foothills, caribou use of river corridors did not differ from expected (P = 0.520), but bears were farther from rivers than expected (P = 0.001). Our results did not suggest avoidance of river corridors by calving caribou or a propensity for bears to be associated with riparian habitats, presumably for stalking or ambush cover. We propose that PCH caribou reduce the risk of predation to neonates by migrating to a common calving ground, where predator swamping is the operational antipredator strategy. Consequently, we hypothesize that nutritional demands, not predator-avoidance strategies, ultimately regulate habitat use patterns (e.g., use of river corridors) of calving PCH caribou.
Park, Byeonghyeok; Baek, Min-Jeong; Min, Byoungnam; Choi, In-Geol
2017-09-01
Genome annotation is a primary step in genomic research. To establish a light and portable prokaryotic genome annotation pipeline for use in individual laboratories, we developed a Shiny app package designated as "P-CAPS" (Prokaryotic Contig Annotation Pipeline Server). The package is composed of R and Python scripts that integrate publicly available annotation programs into a server application. P-CAPS is not only a browser-based interactive application but also a distributable Shiny app package that can be installed on any personal computer. The final annotation is provided in various standard formats and is summarized in an R markdown document. Annotation can be visualized and examined with a public genome browser. A benchmark test showed that the annotation quality and completeness of P-CAPS were reliable and compatible with those of currently available public pipelines.
Fuchs, Helmut; Aguilar-Pimentel, Juan Antonio; Amarie, Oana V; Becker, Lore; Calzada-Wack, Julia; Cho, Yi-Li; Garrett, Lillian; Hölter, Sabine M; Irmler, Martin; Kistler, Martin; Kraiger, Markus; Mayer-Kuckuk, Philipp; Moreth, Kristin; Rathkolb, Birgit; Rozman, Jan; da Silva Buttkus, Patricia; Treise, Irina; Zimprich, Annemarie; Gampe, Kristine; Hutterer, Christine; Stöger, Claudia; Leuchtenberger, Stefanie; Maier, Holger; Miller, Manuel; Scheideler, Angelika; Wu, Moya; Beckers, Johannes; Bekeredjian, Raffi; Brielmeier, Markus; Busch, Dirk H; Klingenspor, Martin; Klopstock, Thomas; Ollert, Markus; Schmidt-Weber, Carsten; Stöger, Tobias; Wolf, Eckhard; Wurst, Wolfgang; Yildirim, Ali Önder; Zimmer, Andreas; Gailus-Durner, Valérie; Hrabě de Angelis, Martin
2017-09-29
For decades, model organisms have provided an important approach to understanding the mechanistic basis of human diseases. The German Mouse Clinic (GMC) was the first phenotyping facility to establish a collaboration-based platform for phenotype characterization of mouse lines. To address individual projects with a tailor-made phenotyping strategy, the GMC has developed a series of pipelines with tests for the analysis of specific disease areas. For a general broad analysis, a screening pipeline covers the key parameters for the most relevant disease areas. For hypothesis-driven phenotypic analyses, thirteen additional pipelines focus on neurological and behavioral disorders, metabolic dysfunction, respiratory system malfunctions, immune-system disorders and imaging techniques. In this article, we give an overview of the pipelines and describe the scientific rationale behind the different test combinations. Copyright © 2017 Elsevier B.V. All rights reserved.
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
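The regression STRIP algorithm's job — deciding whether a jet component's trajectory is linear or non-linear — can be caricatured by comparing a linear fit against a quadratic one. This is an illustrative stand-in under a crude residual-improvement criterion, not the STRIP algorithm itself; the threshold and synthetic data are assumptions:

```python
import numpy as np

def classify_trajectory(t, x, rel_improvement=0.5):
    """Label a component position series 'non-linear' when a quadratic fit
    cuts the linear fit's residual sum of squares by more than
    `rel_improvement`; otherwise 'linear'. A crude stand-in criterion."""
    rss = []
    for degree in (1, 2):
        coeffs = np.polyfit(t, x, degree)
        resid = x - np.polyval(coeffs, t)
        rss.append(float(resid @ resid))
    if rss[0] > 0.0 and (rss[0] - rss[1]) / rss[0] > rel_improvement:
        return "non-linear"
    return "linear"

t = np.linspace(0.0, 10.0, 20)                          # epochs (years)
rng = np.random.default_rng(0)
ballistic = 0.3 * t + 0.05 * rng.normal(size=t.size)    # linear proper motion
curved = 0.3 * t + 0.08 * t**2                          # accelerating component
kind_ballistic = classify_trajectory(t, ballistic)
kind_curved = classify_trajectory(t, curved)
```

A real pipeline would weight epochs by positional uncertainty and use a proper model-selection statistic rather than a fixed threshold.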
Simulation and Experiment Research on Fatigue Life of High Pressure Air Pipeline Joint
NASA Astrophysics Data System (ADS)
Shang, Jin; Xie, Jianghui; Yu, Jian; Zhang, Deman
2017-12-01
The high-pressure air pipeline joint is an important part of the high-pressure air system, and its reliability bears on the safety and stability of the system. This work developed a new type of high-pressure air pipeline joint, carried out dynamic analyses of both the CB316-1995 joint and the new joint with the finite element method, and examined in depth the joint forms of different design schemes and the effect of materials on the stress, tightening torque and fatigue life of the joint. The research team set up a vibration/pulse test bench and carried out comparative joint fatigue life tests. The results show that the maximum stress of the joint occurs on the inner side of the outer sleeve nut, which is consistent with the failure mode of cracking of the outer sleeve nut observed in practice. In both simulation and experiment, the fatigue life and tightening torque of the new high-pressure air pipeline joint are better than those of the CB316-1995 joint.
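The paper's fatigue-life figures come from finite element simulation and bench testing; as a hedged illustration of the kind of stress-life relation such estimates rest on, here is a Basquin-type S-N calculation. The material constants `sigma_f_mpa` and `b` are placeholders, not values from the study:

```python
def basquin_cycles(stress_amp_mpa, sigma_f_mpa=900.0, b=-0.1):
    """Basquin S-N relation: stress_amp = sigma_f' * (2N)^b, solved for the
    number of cycles to failure N. sigma_f' and b are hypothetical
    material constants for illustration."""
    return 0.5 * (stress_amp_mpa / sigma_f_mpa) ** (1.0 / b)

# A redesign that lowers the stress amplitude at the outer sleeve nut
# extends the predicted fatigue life
life_redesigned = basquin_cycles(300.0)
life_original = basquin_cycles(450.0)
```

Because `b` is small and negative, modest stress reductions translate into large life gains — here a 33% stress reduction multiplies the predicted cycle count by roughly 58.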
2016-09-01
natural gas pipelines, water pipelines, and metallic USTs. The full and complete data sets for curve-fit development were not provided to ERDC... Dunmire (OUSD(AT&L)), Bernie Rodriguez (IMPW-E), and Valerie D. Hines (DAIM-ODF). The work was performed by the Materials and Structures Branch... of structures being tested increases, as in the case of pipelines that run many miles or the case of a structure's coating quality
ORAC-DR: Pipelining With Other People's Code
NASA Astrophysics Data System (ADS)
Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy
As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction, using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed-of-execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of "buy-in" effort) engine or tool.
Tang, Jing; Tang, Lin; Zhang, Chang; Zeng, Guangming; Deng, Yaocheng; Dong, Haoran; Wang, Jingjing; Wu, Yanan
2015-10-01
Semi-volatile organic compounds (SVOCs) derived from the plastic pipes widely used in water distribution influence our daily drinking water quality. There are still few scientific or integrated studies on the release and degradation of the migrating chemicals in pipelines. This investigation was carried out at field sites along a pipeline in Changsha, China. Two chemicals, 2,4-tert-butylphenol and 1,3-diphenylguanidine, were found to be migrating from high-density polyethylene (HDPE) pipe material. New pipes released more of these two compounds than older pipes, and microorganisms living in older pipes tended to degrade them faster, indicating that the aged pipes were safer for water transmission. Microbial degradation in water plays a dominant role in the control of these substances. To minimize the potential harm to humans, a more detailed study incorporating assessment of their risk should be carried out, along with the search for safer drinking-water pipes.
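The observation that older pipes host faster microbial degradation suggests first-order decay kinetics as the simplest model for the fate of a leached compound in the water column. A sketch under that assumption — the rate constants below are placeholders, not measured values from the study:

```python
import math

def residual_fraction(k_per_day, t_days):
    """First-order decay of a leached SVOC in bulk water: C(t)/C0 = exp(-k*t)."""
    return math.exp(-k_per_day * t_days)

def half_life_days(k_per_day):
    """Time for microbial degradation to halve the concentration."""
    return math.log(2) / k_per_day

# Faster degradation (larger k, as in aged pipes) leaves less residue
# after the same residence time
aged_pipe = residual_fraction(0.2, 5.0)
new_pipe = residual_fraction(0.1, 5.0)
```

Fitting such a `k` per pipe age from field samples would quantify how much "safer" the aged pipes are.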
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooker, J.N.
This report describes an investigation of the energy consumption and efficiency of oil pipelines in the US in 1978. It is based on a simulation of the actual movement of oil on a very detailed representation of the pipeline network, and it uses engineering equations to calculate the energy that pipeline pumps must have exerted on the oil to move it in this manner. The efficiencies of pumps and drivers are estimated so as to arrive at the amount of energy consumed at pumping stations. The throughput in each pipeline segment is estimated by distributing each pipeline company's reported oil movements over its segments in proportions predicted by regression equations that show typical throughput and throughput capacity as functions of pipe diameter. The form of the equations is justified by a generalized cost-engineering study of pipelining, and their parameters are estimated using new techniques developed for the purpose. A simplified model of flow scheduling is chosen on the basis of actual energy use data obtained from a few companies. The study yields energy consumption and intensiveness estimates for crude oil trunk lines, crude oil gathering lines and oil products lines, for the nation as well as by state and by pipe diameter. It characterizes the efficiency of typical pipelines of various diameters operating at capacity. Ancillary results include estimates of oil movements by state and by diameter and approximate pipeline capacity utilization nationwide.
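The report's station-level energy estimates rest on a standard hydraulic relation: pumping power is ρgQH, scaled up by pump and driver efficiencies. A minimal sketch of that calculation — the density, head, flow, and efficiency values below are illustrative assumptions, not figures from the report:

```python
def pump_energy_kwh(flow_m3_s, head_m, hours,
                    rho=870.0, g=9.81, pump_eff=0.8, driver_eff=0.95):
    """Energy a pumping station consumes to move `flow_m3_s` of oil
    (density `rho` kg/m^3) against `head_m` of hydraulic head for `hours`,
    using the hydraulic power relation P = rho*g*Q*H divided by the
    pump and driver efficiencies."""
    hydraulic_power_w = rho * g * flow_m3_s * head_m
    input_power_w = hydraulic_power_w / (pump_eff * driver_eff)
    return input_power_w * hours / 1000.0

# Illustrative segment: 0.5 m^3/s against 100 m of head for one day
energy_kwh = pump_energy_kwh(0.5, 100.0, 24.0)
```

In the report's framework, `head_m` would itself come from friction-loss equations as a function of pipe diameter and throughput, which is where the diameter-based regressions enter.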
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: first, many existing ultrasound pipelines are protected by intellectual property, rendering the exchange of code difficult; second, most pipelines are implemented in special hardware, resulting in limited flexibility of the implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and with regard to its run time. The pipeline shows image quality comparable to a clinical system and, backed by point-spread-function measurements, a comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radio-frequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
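SUPRA's first stage, beamforming, can be illustrated with the simplest delay-and-sum scheme: advance each channel by its focusing delay and sum across the aperture. This toy sketch (whole-sample delays, synthetic data) stands in for the idea only — it is not SUPRA's GPU implementation, which uses sub-sample interpolation and apodization:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Toy delay-and-sum beamformer: advance each channel by its focusing
    delay (whole samples only, for simplicity) and sum over the aperture."""
    n_channels, n_samples = channel_data.shape
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        d = int(delays_samples[ch])
        out[:n_samples - d] += channel_data[ch, d:]
    return out

# Synthetic echo arriving 0/2/4 samples later on successive elements;
# after delay compensation the three channels add coherently
channels = np.zeros((3, 32))
for ch, d in enumerate((0, 2, 4)):
    channels[ch, 10 + d] = 1.0
beamformed = delay_and_sum(channels, (0, 2, 4))
```

Running this on raw channel data is exactly the early-stage access the authors argue SUPRA opens up.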
The future of pre-exposure prophylaxis (PrEP) for human immunodeficiency virus (HIV) infection.
Özdener, Ayşe Elif; Park, Tae Eun; Kalabalik, Julie; Gupta, Rachna
2017-05-01
People at high risk for HIV acquisition should be offered pre-exposure prophylaxis (PrEP). Tenofovir disoproxil fumarate (TDF)/emtricitabine (FTC) is currently the only medication recommended for pre-exposure prophylaxis (PrEP) by the Centers for Disease Control and Prevention (CDC) in people at high risk for HIV acquisition. This article will review medications currently under investigation and the future landscape of PrEP therapy. Areas covered: This article will review clinical trials that have investigated nontraditional regimens of TDF/FTC, antiretroviral agents from different drug classes such as integrase strand transfer inhibitors (INSTI), nucleoside reverse transcriptase inhibitors (NRTI), and non-nucleoside reverse transcriptase inhibitors (NNRTI) as potential PrEP therapies. Expert commentary: Currently, there are several investigational drugs in the pipeline for PrEP against HIV infection. Increased utilization of PrEP therapy depends on provider identification of people at high risk for HIV transmission. Advances in PrEP development will expand options and access for people and reduce the risk of HIV acquisition.
DOT National Transportation Integrated Search
2013-02-15
The technical tasks in this study included activities to characterize the impact of selected : metallurgical processing and fabrication variables on ethanol stress corrosion cracking (ethanol : SCC) of new pipeline steels, develop a better understand...
The global pipeline of new medicines for the control and elimination of malaria
2012-01-01
Over the past decade, there has been a transformation in the portfolio of medicines to combat malaria. New fixed-dose artemisinin combination therapy is available, with four different types having received approval from Stringent Regulatory Authorities or the World Health Organization (WHO). However, there is still scope for improvement. The Malaria Eradication Research agenda identified several gaps in the current portfolio. Simpler regimens, such as a single-dose cure are needed, compared with the current three-day treatment. In addition, new medicines that prevent transmission and also relapse are needed, but with better safety profiles than current medicines. There is also a big opportunity for new medicines to prevent reinfection and to provide chemoprotection. This study reviews the global portfolio of new medicines in development against malaria, as of the summer of 2012. Cell-based phenotypic screening, and ‘fast followers’ of clinically validated classes, mean that there are now many new classes of molecules starting in clinical development, especially for the blood stages of malaria. There remain significant gaps for medicines blocking transmission, preventing relapse, and long-duration molecules for chemoprotection. The nascent pipeline of new medicines is significantly stronger than five years ago. However, there are still risks ahead in clinical development and sustainable funding of clinical studies is vital if this early promise is going to be delivered. PMID:22958514
Development of a design methodology for pipelines in ice scoured seabeds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, J.I.; Paulin, M.J.; Lach, P.R.
1994-12-31
Large areas of the continental shelf of northern oceans are frequently scoured or gouged by moving bodies of ice such as icebergs and sea ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by the scouring ice. It is generally acknowledged that if a pipeline, laid on the seabed, were hit by an iceberg or a pressure ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important to determine with some assurance the minimum depth required for safety, for both economic and environmental reasons. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice-scoured soil. Forces transmitted through the soil and soil displacement around the pipeline could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology that is based on phenomenological studies of ice-scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice-scoured terrain, are also discussed.
Leveraging public private partnerships to innovate under challenging budget times.
Portilla, Lili M; Rohrbaugh, Mark L
2014-01-01
The National Institutes of Health (NIH), academic medical centers and industry have a long and productive history of collaboration. Decreasing R&D budgets in both the private and public sector have made the need for such collaborations paramount to reduce the risk of further declines in the number of innovative drugs reaching the market to address pressing public health needs. Doing more with less has forced both industry and public sector research institutions (PSRIs) to leverage resources and expertise in order to de-risk projects. In addition, it provides an opportunity to envision and implement new approaches to accomplish these goals. We discuss several of these innovative collaborations and partnerships at the NIH that demonstrate how the NIH and industry are working together to strengthen the drug development pipeline.
Leveraging Public Private Partnerships to Innovate Under Challenging Budget Times
Portilla, Lili M.; Rohrbaugh, Mark
2014-01-01
The National Institutes of Health (NIH), academic medical centers and industry have a long and productive history of collaboration. Decreasing R&D budgets in both the private and public sector have made the need for such collaborations paramount to reduce the risk of further declines in the number of innovative drugs reaching the market to address pressing public health needs. Doing more with less has forced both industry and public sector research institutions (PSRIs) to leverage resources and expertise in order to de-risk projects. In addition, it provides an opportunity to envision and implement new approaches to accomplish these goals. We discuss several of these innovative collaborations and partnerships at the NIH that demonstrate how the NIH and industry are working together to strengthen the drug development pipeline. PMID:24283971
NASA Technical Reports Server (NTRS)
Zhao, J.; Couvidat, S.; Bogart, R. S.; Parchevsky, K. V.; Birch, A. C.; Duvall, Thomas L., Jr.; Beck, J. G.; Kosovichev, A. G.; Scherrer, P. H.
2011-01-01
The Helioseismic and Magnetic Imager onboard the Solar Dynamics Observatory (SDO/HMI) provides continuous full-disk observations of solar oscillations. We develop a data-analysis pipeline based on the time-distance helioseismology method to measure acoustic travel times using HMI Doppler-shift observations, and infer solar interior properties by inverting these measurements. The pipeline is used for routine production of near-real-time full-disk maps of subsurface wave-speed perturbations and horizontal flow velocities for depths ranging from 0 to 20 Mm, every eight hours. In addition, Carrington synoptic maps for the subsurface properties are made from these full-disk maps. The pipeline can also be used for selected target areas and time periods. We explain details of the pipeline organization and procedures, including processing of the HMI Doppler observations, measurements of the travel times, inversions, and constructions of the full-disk and synoptic maps. Some initial results from the pipeline, including full-disk flow maps, sunspot subsurface flow fields, and the interior rotation and meridional flow speeds, are presented.
NASA Astrophysics Data System (ADS)
Torres, Monica
The use of pipelines for the transmission of gas offers not only efficiency, but a number of economic advantages. Nevertheless, pipelines are subject to aggressive operating conditions and environments which can lead to in-service degradation [1] and thus to failures. These failures can have catastrophic consequences, such as environmental damage and loss of life [2]. One of the most dangerous threats to pipeline integrity is stress corrosion cracking (SCC). Despite the substantial progress that has been achieved in the field, due to the complex nature of this phenomenon there is still not a complete understanding of this form of external corrosion. This makes its detection and prevention a challenge and therefore a risk to pipeline integrity and, most importantly, to the safety of the population. SCC cracks are the result of the interaction between a corrosive environment, applied stresses, and a susceptible microstructure. To date, what defines a susceptible microstructure remains ambiguous, as SCC has been observed in a range of steel grades, microstructures, chemical compositions, and grain sizes. Therefore, in order to be able to accurately predict and prevent this hazardous form of corrosion, it is imperative to advance our knowledge on the subject and gain a better understanding of the microstructural features of highly susceptible pipeline materials, especially in the subsurface zone where crack nucleation must take place. Therefore, a microstructural characterization of the region near the surface layer was carried out utilizing TEM. TEM analysis revealed the dislocation character, ferrite morphology, and apparent carbide precipitation in some grain boundaries. Furthermore, light microscopy, SEM, and hardness testing were performed to expand our knowledge of the microscopic features of highly SCC-susceptible service components.
This investigation presents a new approach to SCC characterization, which exposed the microscopic characteristics of the sub-surface region of service components with confirmed SCC.
RGAugury: a pipeline for genome-wide prediction of resistance gene analogs (RGAs) in plants.
Li, Pingchuan; Quan, Xiande; Jia, Gaofeng; Xiao, Jin; Cloutier, Sylvie; You, Frank M
2016-11-02
Resistance gene analogs (RGAs), such as NBS-encoding proteins, receptor-like protein kinases (RLKs) and receptor-like proteins (RLPs), are potential R-genes that contain specific conserved domains and motifs. Thus, RGAs can be predicted based on their conserved structural features using bioinformatics tools. Computer programs have been developed for the identification of individual domains and motifs from the protein sequences of RGAs, but none offer a systematic assessment of the different types of RGAs. A user-friendly and efficient pipeline is needed for large-scale genome-wide RGA prediction in the growing number of sequenced plant genomes. An integrative pipeline, named RGAugury, was developed to automate RGA prediction. The pipeline first identifies RGA-related protein domains and motifs, namely nucleotide binding site (NB-ARC), leucine-rich repeat (LRR), transmembrane (TM), serine/threonine and tyrosine kinase (STTK), lysin motif (LysM), coiled-coil (CC) and Toll/Interleukin-1 receptor (TIR). RGA candidates are identified and classified into four major families based on the presence of combinations of these RGA domains and motifs: NBS-encoding, TM-CC, and membrane-associated RLP and RLK. All time-consuming analyses of the pipeline are parallelized to improve performance. The pipeline was evaluated using the well-annotated Arabidopsis genome. A total of 98.5, 85.2, and 100% of the reported NBS-encoding genes, membrane-associated RLPs and RLKs were validated, respectively. The pipeline was also successfully applied to predict RGAs for 50 sequenced plant genomes. A user-friendly web interface was implemented to ease command-line operations, facilitate visualization and simplify result management for multiple datasets. RGAugury is an efficient, integrative bioinformatics tool for large-scale genome-wide identification of RGAs. It is freely available at Bitbucket: https://bitbucket.org/yaanlpc/rgaugury
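The family-assignment step described above — mapping detected domain/motif combinations to NBS-encoding, RLP, RLK, or TM-CC — can be sketched as a small rule table. The precedence rules below are a simplified guess at the classification logic, not RGAugury's exact rules:

```python
def classify_rga(domains):
    """Assign an RGA candidate to one of four major families based on its
    detected domains/motifs. Simplified precedence rules for illustration,
    not RGAugury's exact decision logic."""
    d = set(domains)
    if "NB-ARC" in d:
        return "NBS-encoding"      # NBS genes, with or without LRR/CC/TIR
    if {"TM", "LRR", "STTK"} <= d:
        return "RLK"               # membrane-associated receptor-like kinase
    if {"TM", "LRR"} <= d:
        return "RLP"               # receptor-like protein, no kinase domain
    if {"TM", "CC"} <= d:
        return "TM-CC"
    return "unclassified"

family = classify_rga({"CC", "NB-ARC", "LRR"})
```

In the real pipeline, each domain call comes from a separate external predictor, which is why parallelizing those scans dominates the runtime.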
Kvamme, Bjørn; Kuznetsova, Tatiana; Jensen, Bjørnar; Stensholt, Sigvat; Bauman, Jordan; Sjøblom, Sara; Nes Lervik, Kim
2014-05-14
Deciding on the upper bound of water content permissible in a stream of dense carbon dioxide under pipeline transport conditions without facing the risks of hydrate formation is a complex issue. In this work, we outline and analyze ten primary routes of hydrate formation inside a rusty pipeline, with hydrogen sulfide, methane, argon, and nitrogen as additional impurities. A comprehensive treatment of equilibrium absolute thermodynamics as applied to multiple hydrate phase transitions is provided. We also discuss in detail the implications of the Gibbs phase rule that make it necessary to consider non-equilibrium thermodynamics. The analysis of hydrate formation risk has been revised for the dominant routes, including the one traditionally considered in industrial practice and hydrate calculators. The application of absolute thermodynamics with parameters derived from atomistic simulations leads to several important conclusions regarding the impact of hydrogen sulfide. When present at studied concentrations below 5 mol%, the presence of hydrogen sulfide will only support the carbon-dioxide-dominated hydrate formation on the phase interface between liquid water and hydrate formers entering from the carbon dioxide phase. This is in contrast to a homogeneous hydrate nucleation and growth inside the aqueous solution bulk. Our case studies indicate that hydrogen sulfide at higher than 0.1 mol% concentration in carbon dioxide can lead to growth of multiple hydrate phases immediately adjacent to the adsorbed water layers. We conclude that hydrate formation via water adsorption on rusty pipeline walls will be the dominant contributor to the hydrate formation risk, with initial concentration of hydrogen sulfide being the critical factor.
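The Gibbs phase rule argument can be made concrete: F = C − P + 2 counts the independent intensive variables. With two components (CO2 and water) and three coexisting phases (aqueous, CO2-rich fluid, hydrate), only one degree of freedom remains, so temperature and pressure cannot both be fixed independently at equilibrium — one reason the authors argue non-equilibrium thermodynamics is unavoidable. The phase counts below are illustrative:

```python
def gibbs_degrees_of_freedom(components, phases):
    """Gibbs phase rule: F = C - P + 2."""
    return components - phases + 2

# CO2 + H2O with aqueous, CO2-rich fluid and hydrate phases coexisting
f_two_component = gibbs_degrees_of_freedom(2, 3)   # -> 1
# Adding H2S as a third component buys back one degree of freedom
f_three_component = gibbs_degrees_of_freedom(3, 3)   # -> 2
```

With impurities like H2S, CH4, Ar and N2 raising C, multiple hydrate phases can coexist over ranges of conditions, consistent with the paper's multi-route analysis.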
nanopipe: Calibration and data reduction pipeline for pulsar timing
NASA Astrophysics Data System (ADS)
Demorest, Paul B.
2018-03-01
nanopipe is a data reduction pipeline for calibration, RFI removal, and pulse time-of-arrival measurement from radio pulsar data. It was developed primarily for use by the NANOGrav project. nanopipe is written in Python, and depends on the PSRCHIVE (ascl:1105.014) library.
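nanopipe's core measurement, the pulse time of arrival, is conventionally obtained by cross-correlating an observed profile with a template (nanopipe itself delegates this to PSRCHIVE). A minimal whole-bin version of that idea, with synthetic data — real TOA estimation works to sub-bin precision in the Fourier domain:

```python
import numpy as np

def toa_offset_bins(template, profile):
    """Estimate the pulse time of arrival as the lag (in whole bins) that
    maximizes the circular cross-correlation of template and profile."""
    n = len(template)
    spectrum = np.fft.rfft(profile) * np.conj(np.fft.rfft(template))
    corr = np.fft.irfft(spectrum, n)
    return int(np.argmax(corr))

bins = np.arange(64)
template = np.exp(-0.5 * ((bins - 20) / 2.0) ** 2)  # noise-free pulse shape
profile = np.roll(template, 7)                      # same pulse, arriving late
shift = toa_offset_bins(template, profile)          # -> 7
```

Calibration and RFI removal matter precisely because they keep `profile` close enough to the template for this correlation peak to be unbiased.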
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Public awareness. 192.616 Section 192.616... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Operations § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Public awareness. 192.616 Section 192.616... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Operations § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spread-sheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
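The final MLST step — collapsing per-locus allele calls into a sequence type (ST) — is a profile lookup. The three-locus mini-scheme and ST labels below are hypothetical, for illustration only; real MLST schemes typically use seven loci:

```python
def sequence_type(allele_calls, st_table):
    """Return the sequence type matching a strain's allele profile,
    or None for a novel (unassigned) profile."""
    profile = tuple(allele_calls[locus] for locus in sorted(allele_calls))
    return st_table.get(profile)

# Hypothetical mini-scheme: loci sorted alphabetically -> (aroC, dnaN, hemD)
st_table = {(1, 1, 2): "ST-1", (3, 1, 2): "ST-5"}
st = sequence_type({"aroC": 1, "dnaN": 1, "hemD": 2}, st_table)
```

In the pipeline described above, the allele calls themselves come from the automated sequence-trace evaluation, and novel profiles (a `None` result) are the ones flagged for manual review.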
Transforming Microbial Genotyping: A Robotic Pipeline for Genotyping Bacterial Strains
Velayudhan, Vimalkumar; Murphy, Ronan A.; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spread-sheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost. PMID:23144721
Assessing fugitive emissions of CH4 from high-pressure gas pipelines in the UK
NASA Astrophysics Data System (ADS)
Clancy, S.; Worrall, F.; Davies, R. J.; Almond, S.; Boothroyd, I.
2016-12-01
Concern over the greenhouse gas impact of the exploitation of unconventional natural gas from shale deposits has caused a spotlight to be shone on the entire hydrocarbon industry. Numerous studies have developed life-cycle emissions inventories to assess the impact that hydraulic fracturing has upon greenhouse gas emissions. Incorporated within life-cycle assessments are transmission and distribution losses, including infrastructure such as pipelines and compressor stations that pressurise natural gas for transport along pipelines. Estimates of fugitive emissions from transmission, storage and distribution have been criticized for reliance on old data from inappropriate sources (1970s Russian gas pipelines). In this study, we investigate fugitive emissions of CH4 from the UK high pressure national transmission system. The study took two approaches. First, CH4 concentration was measured by driving along roads bisecting high pressure gas pipelines and along an equivalent distance on routes with no high pressure gas pipeline nearby. Five pipelines and five equivalent control routes were driven, and the test was that CH4 measurements, when adjusted for distance and wind speed, should be greater on any route with a pipe than on any route without one. Second, 5 km of a high pressure gas pipeline and 5 km of equivalent farmland were walked, and soil gas (above the pipeline where present) was analysed every 7 m using a tunable diode laser. In all, 92 km of high pressure pipeline route and 72 km of control route were driven over a 10 day period. When adjusted for wind and distance, CH4 fluxes were significantly greater on routes with a pipeline than on those without. The smallest detectable leak was 3% above ambient (1.03 relative concentration); any reading below this threshold was treated as ambient.
The number of leaks detected along the pipelines correlates with the estimated length of pipe joints, implying that there are constant fugitive CH4 emissions from these joints. Scaling up to the UK's National Transmission System pipeline length of 7600 km gives a fugitive CH4 flux of 62.6 kt CH4/yr, with a CO2 equivalent of 1570 kt CO2eq/yr; this fugitive emission from high pressure pipelines is 0.14% of the annual gas supply.
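The scaled-up figures quoted above are easy to cross-check. The CH4-to-CO2 equivalence factor (a GWP of 25) used below is an assumption inferred from the quoted numbers, not stated in the abstract.

```python
# Back-of-envelope check of the scaled-up flux figures in the abstract:
# 62.6 kt CH4/yr over 7600 km of pipeline, converted to CO2 equivalent.

ch4_flux_kt = 62.6   # kt CH4 per year (quoted)
gwp_ch4 = 25         # 100-year global warming potential of CH4 (assumed)

co2eq_kt = ch4_flux_kt * gwp_ch4
print(f"{co2eq_kt:.0f} kt CO2eq/yr")  # ~1565, consistent with the quoted 1570
```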
Biennial report summary of hazardous materials transportation, 2005-2006
DOT National Transportation Integrated Search
2007-01-01
The Federal hazmat law requires the United States (U.S.) Department of Transportation : (DOT) to protect the public from the risks to life, property, and the environment inherent in : commercial transportation of hazardous materials. The Pipeline and...
External Corrosion Direct Assessment for Unique Threats to Underground Pipelines
DOT National Transportation Integrated Search
2007-11-01
External corrosion direct assessment process (ECDA) implemented in accordance with the NACE Recommended Practice RP0502-02 relies on above ground DA techniques to prioritize locations at risk for corrosion. Two special cases warrant special considera...
Magnetic pipeline for coal and oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knolle, E.
1998-07-01
A 1994 analysis of the recorded costs of the Alaska oil pipeline, in a paper entitled Maglev Crude Oil Pipeline (NASA CP-3247, pp. 671--684), concluded that, had the Knolle Magnetrans pipeline technology been available and used, some $10 million per day in transportation costs could have been saved over the 20 years of the Alaska oil pipeline's existence. This over 800 mile long pipeline requires about 500 horsepower per mile in pumping power, which together with the cost of the pipeline's capital investment consumes about one-third of the energy value of the pumped oil. This does not include the cost of getting the oil out of the ground. The reason maglev technology performs better than conventional pipelines is that by magnetically levitating the oil into contact-free suspension, there is no drag-causing adhesion. In addition, by using permanent magnets in repulsion, suspension is achieved without using energy. The pumped oil's adhesion to the inside of pipes also limits its speed. In the case of the Alaska pipeline the speed is limited to about 7 miles per hour, which, with its 48-inch pipe diameter and 1200 psi pressure, pumps about 2 million barrels per day. The maglev system, as developed by Knolle Magnetrans, would transport oil in magnetically suspended sealed containers and, thus free of adhesion, at speeds 10 to 20 times faster. Furthermore, the diameter of the levitated containers can be made smaller with the same capacity, which makes the construction of the maglev system light and inexpensive. There are similar advantages when using maglev technology to transport coal. A maglev system also has advantages over railroads in mountainous regions where coal is primarily mined: a maglev pipeline can travel, all year and in all weather, in a straight line to the end-user and can climb over steep hills without much difficulty, whereas railroads have difficult circuitous routes.
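The claim that levitated containers can be smaller at the same capacity follows from constant volumetric throughput Q = v * A: a k-fold speed increase shrinks the required diameter by 1/sqrt(k). A minimal sketch, using only the 48-inch pipe diameter from the abstract; the rest is illustrative:

```python
import math

# Volumetric throughput Q = v * A. At fixed Q, a k-fold speed increase
# reduces the required cross-section A by k, so the diameter scales as
# 1/sqrt(k). The 48 in diameter is from the abstract.

def equivalent_diameter(d_in, speedup):
    """Diameter (inches) giving the same throughput at `speedup` x the speed."""
    return d_in / math.sqrt(speedup)

d10 = equivalent_diameter(48, 10)  # ~15.2 in at 10x speed
d20 = equivalent_diameter(48, 20)  # ~10.7 in at 20x speed
```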
A Pipeline Tool for CCD Image Processing
NASA Astrophysics Data System (ADS)
Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.
MSSSO is part of a collaboration developing a wide field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using SUN Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat field corrections, there is scope to extend it to other processing.
DOT National Transportation Integrated Search
2010-06-18
The potential exists for stress corrosion cracking (SCC) of carbon steel pipelines transporting fuel grade ethanol (FGE) and FGE- gasoline blends. The objectives of SCC 4-4 were to: 1. Develop data necessary to make engineering assessments of the fea...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Public awareness. 195.440 Section 195.440... PIPELINE Operation and Maintenance § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Public awareness. 195.440 Section 195.440... PIPELINE Operation and Maintenance § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
Natural Gas in the Rocky Mountains: Developing Infrastructure
2007-01-01
This Supplement to the Energy Information Administration's Short-Term Energy Outlook analyzes current natural gas production, pipeline and storage infrastructure in the Rocky Mountains, as well as prospective pipeline projects in these states. The influence of these factors on regional prices and price volatility is examined.
Putting the environment into the NPV calculation -- Quantifying pipeline environmental costs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dott, D.R.; Wirasinghe, S.C.; Chakma, A.
1996-12-31
Pipeline projects impact the environment through soil and habitat disturbance, noise during construction and compressor operation, river crossing disturbance and the risk of rupture. Assigning monetary value to these negative project consequences enables the environment to be represented in the project cost-benefit analysis. This paper presents the mechanics and implications of two environmental valuation techniques: (1) the contingent valuation method and (2) the stated preference method. The use of environmental value at the project economic-evaluation stage is explained. A summary of research done on relevant environmental attribute valuation is presented and discussed. Recommendations for further research in the field are made.
Latest Development and Application of High Strength and Heavy Gauge Pipeline Steel in China
NASA Astrophysics Data System (ADS)
Yongqing, Zhang; Aimin, Guo; Chengjia, Shang; Qingyou, Liu; Gray, J. Malcolm; Barbaro, Frank
Over the past twenty years, significant advances have been made in the field of microalloying and its applications, among which one of the most successful is the HTP practice for heavy gauge, high strength pipeline steels. Combining the strengthening effects of TMCP with the retardation of austenite recrystallization by increased Nb in the austenite region, the HTP concept of low carbon, high niobium alloy design has been successfully applied to develop X80 coil with a thickness of 18.4 mm for China's Second West-East pipeline. During this process, great efforts were made to further develop and enrich the application of microalloying technology, and the strengthening effects of Nb were fully exploited, supported by improved metallurgical quality and quantitative analysis of microstructure. In this paper, the state and strengthening effect of Nb during reheating, rolling, cooling and welding are analyzed and characterized based on mass production samples and laboratory analysis. As confirmed, grain refinement remains the most basic strengthening measure for reducing the microstructure gradient through the thickness, which in turn enlarges the processing window for improving low temperature toughness and ultimately makes it possible to develop heavy gauge, high strength pipeline steels with more challenging fracture toughness requirements.
HESP: Instrument control, calibration and pipeline development
NASA Astrophysics Data System (ADS)
Anantha, Ch.; Roy, Jayashree; Mahesh, P. K.; Parihar, P. S.; Sangal, A. K.; Sriram, S.; Anand, M. N.; Anupama, G. C.; Giridhar, S.; Prabhu, T. P.; Sivarani, T.; Sundararajan, M. S.
Hanle Echelle SPectrograph (HESP) is a fibre-fed, high resolution (R = 30,000 and 60,000) spectrograph being developed for the 2m HCT telescope at IAO, Hanle. The major components of the instrument are a) Cassegrain unit b) Spectrometer instrument. An instrument control system interacting with a guiding unit at Cassegrain interface as well as handling spectrograph functions is being developed. An on-axis auto-guiding using the spill-over angular ring around the input pinhole is also being developed. The stellar light from the Cassegrain unit is taken to the spectrograph using an optical fiber which is being characterized for spectral transmission, focal ratio degradation and scrambling properties. The design of the thermal enclosure and thermal control for the spectrograph housing is presented. A data pipeline for the entire Echelle spectral reduction is being developed. We also plan to implement an instrument physical model based calibration into the main data pipeline and in the maintenance and quality control operations.
Sensor Network Architectures for Monitoring Underwater Pipelines
Mohamed, Nader; Jawhar, Imad; Al-Jaroodi, Jameela; Zhang, Liren
2011-01-01
This paper develops and compares different sensor network architecture designs that can be used for monitoring underwater pipeline infrastructures. These architectures are underwater wired sensor networks, underwater acoustic wireless sensor networks, RF (Radio Frequency) wireless sensor networks, integrated wired/acoustic wireless sensor networks, and integrated wired/RF wireless sensor networks. The paper also discusses the reliability challenges and enhancement approaches for these network architectures. The reliability evaluation, characteristics, advantages, and disadvantages among these architectures are discussed and compared. Three reliability factors are used for the discussion and comparison: the network connectivity, the continuity of power supply for the network, and the physical network security. In addition, the paper also develops and evaluates a hierarchical sensor network framework for underwater pipeline monitoring. PMID:22346669
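The reliability comparison among architectures can be illustrated with elementary series/parallel reliability arithmetic. This is a generic sketch with made-up per-hop reliabilities, not the paper's evaluation:

```python
# A wired chain of n relay nodes fails if any hop fails (series reliability),
# while adding an independent wireless backup per hop gives a parallel path
# at each hop. Per-hop reliabilities are illustrative numbers only.

def series(p_hop, n):
    """Reliability of n independent hops in series."""
    return p_hop ** n

def series_with_backup(p_wired, p_wireless, n):
    """Each hop survives if the wired link or its wireless backup works."""
    p_hop = 1 - (1 - p_wired) * (1 - p_wireless)
    return p_hop ** n

wired_only = series(0.99, 50)                    # ~0.605 end-to-end
integrated = series_with_backup(0.99, 0.90, 50)  # ~0.951 end-to-end
```

Even a modest wireless backup per hop dominates, which is the intuition behind the integrated wired/wireless architectures compared in the paper.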
Sensor network architectures for monitoring underwater pipelines.
Mohamed, Nader; Jawhar, Imad; Al-Jaroodi, Jameela; Zhang, Liren
2011-01-01
This paper develops and compares different sensor network architecture designs that can be used for monitoring underwater pipeline infrastructures. These architectures are underwater wired sensor networks, underwater acoustic wireless sensor networks, RF (radio frequency) wireless sensor networks, integrated wired/acoustic wireless sensor networks, and integrated wired/RF wireless sensor networks. The paper also discusses the reliability challenges and enhancement approaches for these network architectures. The reliability evaluation, characteristics, advantages, and disadvantages among these architectures are discussed and compared. Three reliability factors are used for the discussion and comparison: the network connectivity, the continuity of power supply for the network, and the physical network security. In addition, the paper also develops and evaluates a hierarchical sensor network framework for underwater pipeline monitoring.
NASA Astrophysics Data System (ADS)
Feng, Shuo; Liu, Dejun; Cheng, Xing; Fang, Huafeng; Li, Caifang
2017-04-01
Magnetic anomalies produced by underground ferromagnetic pipelines through polarization by the earth's magnetic field are used to obtain information on the location, burial depth and other parameters of pipelines. To achieve fast inversion and interpretation of measured data, a fast and stable forward method is necessary. Magnetic dipole reconstruction (MDR), a kind of integration numerical method, is well suited to simulating a thin pipeline anomaly. In MDR the pipeline model must be cut into small magnetic dipoles through different segmentation methods. The segmentation method affects the stability and speed of the forward calculation. Rapid and accurate simulation of deep-buried pipelines has been achieved with the existing segmentation method. In practical measurement, however, the depth of an underground pipe is uncertain, and for shallow-buried pipelines the existing segmentation may generate significant errors. This paper addresses the problem in three stages. First, the cause of the inaccuracy is analyzed by simulation experiment. Second, a new variable interval section segmentation is proposed based on the existing segmentation; it allows the MDR method to obtain simulation results quickly while ensuring the accuracy of models at different depths. Finally, measured data are inverted using the new segmentation method. The result proves that inversion based on the new segmentation can achieve fast and accurate recovery of the depth parameters of underground pipes without being limited by pipeline depth.
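The MDR idea of summing segment-wise dipole contributions can be sketched as follows. The point-dipole formula is standard; the geometry, dipole moment and units are illustrative, and the constant mu0/(4*pi) is dropped, so this is not the paper's calibrated forward model:

```python
import math

# Cut the pipeline into segments, place one equivalent dipole per segment,
# and sum point-dipole fields at each observation point.

def dipole_field(m, r):
    """Field of a point dipole with moment m at displacement r (constant dropped):
    B ~ (3 (m.rhat) rhat - m) / |r|^3."""
    rmag = math.sqrt(sum(c * c for c in r))
    rhat = tuple(c / rmag for c in r)
    mdotr = sum(mi * ri for mi, ri in zip(m, rhat))
    return tuple((3 * mdotr * ri - mi) / rmag**3 for ri, mi in zip(rhat, m))

def pipeline_anomaly(obs, depth, length, n_seg, m=(0.0, 0.0, 1.0)):
    """Total field at surface point `obs` from n_seg dipoles along a buried pipe."""
    total = [0.0, 0.0, 0.0]
    for i in range(n_seg):
        x = -length / 2 + (i + 0.5) * length / n_seg  # dipole position on pipe axis
        r = (obs[0] - x, obs[1], obs[2] + depth)      # displacement dipole -> obs
        for k, bk in enumerate(dipole_field(m, r)):
            total[k] += bk
    return tuple(total)

shallow = pipeline_anomaly((0, 0, 0), depth=2.0, length=100, n_seg=200)
deep = pipeline_anomaly((0, 0, 0), depth=10.0, length=100, n_seg=200)
```

As expected, the anomaly amplitude drops sharply with burial depth, which is why shallow pipes need finer segmentation near the observation point.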
A homology-based pipeline for global prediction of post-translational modification sites
NASA Astrophysics Data System (ADS)
Chen, Xiang; Shi, Shao-Ping; Xu, Hao-Dong; Suo, Sheng-Bao; Qiu, Jian-Ding
2016-05-01
The pathways of protein post-translational modifications (PTMs) have been shown to play particularly important roles in almost any biological process. Identification of PTM substrates along with information on the exact sites is fundamental for fully understanding or controlling biological processes. Alternative computational strategies would help to annotate PTMs in a high-throughput manner. Traditional algorithms are suited to the common organisms and tissues that have a complete PTM atlas or extensive experimental data, while annotation of rare PTMs in most organisms remains a clear challenge. To this end, we have developed a novel homology-based pipeline named PTMProber that allows identification of potential modification sites for most proteomes lacking PTM data. A cross-promotion E-value (CPE) is used in our pipeline as a stringent benchmark to evaluate homology to known modification sites. Independent validation tests show that PTMProber achieves over 58.8% recall with high precision by the CPE benchmark. Comparisons with other machine-learning tools show that the PTMProber pipeline performs better on general predictions. We have also integrated this pipeline into a web-based tool at http://bioinfo.ncu.edu.cn/PTMProber/index.aspx. In addition to pre-constructed PTM prediction models, the website allows users to customize models.
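The homology-transfer step at the heart of such a pipeline can be sketched as mapping known sites through a gapped pairwise alignment. The sequences and sites below are hypothetical, and the real pipeline additionally filters candidates by the cross-promotion E-value:

```python
# Align a query to a homolog with known modification sites, then map those
# sites through the alignment wherever the same residue is aligned.

def map_sites(aln_known, aln_query, known_sites):
    """Map 1-based sites on the known protein onto the query via a gapped alignment."""
    mapped, ik, iq = [], 0, 0
    for ck, cq in zip(aln_known, aln_query):
        if ck != "-":
            ik += 1
        if cq != "-":
            iq += 1
        if ck != "-" and cq != "-" and ik in known_sites and ck == cq:
            mapped.append(iq)  # same residue aligned: candidate PTM site
    return mapped

# Hypothetical alignment; known protein MKTSAQLS has PTM sites at 4 and 8 (both S).
aln_known = "MKTSA-QLS"
aln_query = "MK-SATQLS"
sites = map_sites(aln_known, aln_query, known_sites={4, 8})  # query positions
```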
Lu, Hao; Wang, Mingyang; Yang, Baohuai; Rong, Xiaoli
2013-01-01
With the development of subway engineering, and given the uncertain factors and serious accidents involved in subway construction, implementing risk assessment is necessary and may bring a number of benefits for construction safety. The Kent index method, extensively used in pipeline construction, is improved here to make the risk assessment of disastrous accidents in subway engineering much more practical. In the improved method, the indexes are divided into four categories, namely basic, design, construction, and consequence indexes. In this study, a risk assessment model containing the four kinds of indexes is provided and three kinds of risk occurrence modes are listed. A probability index model that considers the relativity of the indexes is established according to the risk occurrence modes. The model carries out the risk assessment process through the fault tree method and has been applied in the risk assessment of the Nanjing subway's river-crossing tunnel construction. Based on the assessment results, the builders were informed of which risks should be noted and what they should do to avoid them. The need for further research is discussed. Overall, this method may provide a tool for builders and improve the safety of construction. PMID:23710136
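The fault-tree step can be sketched with elementary gate arithmetic for independent events. The tree structure and probabilities below are illustrative, not taken from the study:

```python
# Combine basic-event probabilities through AND/OR gates to obtain a
# top-event probability, as in a standard fault-tree analysis.

def or_gate(probs):
    """P(at least one event occurs) for independent events."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

def and_gate(probs):
    """P(all events occur) for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Illustrative top event:
# water inrush = (high pressure AND lining defect) OR grouting failure
p_top = or_gate([and_gate([0.1, 0.05]), 0.02])  # = 1 - (1 - 0.005)(1 - 0.02)
```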
NASA Astrophysics Data System (ADS)
Kang, Jidong; Gianetto, James A.; Tyson, William R.
2018-03-01
Fracture toughness measurement is an integral part of structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading, where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines that allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edge-notched tension specimens, SENT or SE(T). We focus our review on test procedure development and automation, round-robin test results and some common concerns such as the effect of the crack tip, crack size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.
Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2012-06-01
Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
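The parallel-engine configuration described above, where a field is accepted only when engines agree, explains the combination of high positive predictive value and low sensitivity. A minimal sketch with stub functions standing in for the commercial Nuance and LEADTOOLS engines:

```python
# Run two independent OCR engines and accept a field only when both return
# the same text; disagreements fall back to human chart abstraction.

def engine_a(image):  # placeholder for a real OCR engine call
    return {"field1": "120", "field2": "OD"}.get(image)

def engine_b(image):  # placeholder for a second, independent engine
    return {"field1": "120", "field2": "OS"}.get(image)

def recognize(image):
    """Return text only when both engines agree; otherwise flag for review."""
    a, b = engine_a(image), engine_b(image)
    return a if a == b and a is not None else None

accepted = recognize("field1")  # engines agree
rejected = recognize("field2")  # disagreement -> manual abstraction
```

Requiring agreement discards most fields (low sensitivity) but the fields that survive are very likely correct (high PPV), matching the 94.6% / 13.5% trade-off reported.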
Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2011-01-01
Background Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871
A pipeline design of a fast prime factor DFT on a finite field
NASA Technical Reports Server (NTRS)
Truong, T. K.; Hsu, In-Shek; Shao, H. M.; Reed, Irving S.; Shyu, Hsuen-Chyun
1988-01-01
A conventional prime factor discrete Fourier transform (DFT) algorithm is used to realize a discrete Fourier-like transform on the finite field GF(q_n). This algorithm is developed to compute cyclic convolutions of complex numbers and to decode Reed-Solomon codes. Such a pipeline fast prime factor DFT algorithm over GF(q_n) is regular, simple, expandable, and naturally suitable for VLSI implementation. An example illustrating the pipeline aspect of a 30-point transform over GF(q_n) is presented.
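A Fourier-like transform over a finite field can be illustrated in a few lines. For simplicity this sketch works over the prime field GF(31), where 30-point transforms exist because 30 divides 31 - 1, rather than the paper's GF(q_n), and it computes the transform directly instead of via the prime-factor (30 = 2*3*5) decomposition:

```python
# 30-point number-theoretic (Fourier-like) transform over GF(31).
# 3 has multiplicative order 30 mod 31, so it serves as the 30th root of unity.

P = 31       # field size; length-30 transforms exist since 30 | P - 1
ALPHA = 3    # element of multiplicative order 30 in GF(31)
N = 30

def ntt(a, root=ALPHA):
    """A[k] = sum_j a[j] * root^(j*k) mod P."""
    return [sum(a[j] * pow(root, j * k, P) for j in range(N)) % P
            for k in range(N)]

def intt(A):
    """Inverse transform: use root^-1 and scale by N^-1 (both mod P)."""
    inv_root = pow(ALPHA, P - 2, P)   # Fermat inverse of ALPHA
    inv_n = pow(N, P - 2, P)          # Fermat inverse of N
    return [(x * inv_n) % P for x in ntt(A, root=inv_root)]

a = [(7 * j + 3) % P for j in range(N)]
assert intt(ntt(a)) == a  # the transform is invertible over GF(31)
```

The prime-factor algorithm of the paper would reorganize these index sums via the Chinese remainder theorem into short 2-, 3- and 5-point transforms, which is what makes the pipeline regular and VLSI-friendly.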
Critical Race Quantitative Intersections: A "testimonio" Analysis
ERIC Educational Resources Information Center
Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.
2018-01-01
The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…
18 CFR 157.215 - Underground storage testing and development.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 7 OF THE NATURAL GAS ACT Interstate Pipeline Blanket Certificates and Authorization Under Section 7..., construct and operate natural gas pipeline and compression facilities, including injection, withdrawal, and... the gas bubble. This map need not be filed if there is no material change from the map previously...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
... Services Limited Chongqing Petroleum Special Pipeline Factory of CNPC Sichuan Petroleum Goods & Material... Pipeline Materials Company Limited Shanghai Baodi Petroleum Pipe Development Co., Ltd Shanghai Baofu Steel... request. Respondent Selection In the event the Department limits the number of respondents for individual...
ERIC Educational Resources Information Center
Daniel, Rhonda; Caruthers, Devina
2016-01-01
This white paper, "Understanding Underrepresented Populations in the Business School Pipeline," examines the shifting US racial and ethnic demographics and projected growth among US minority populations and the challenges--and incentives--these developments pose for US business schools to increase the opportunities for minority students…
Rucker, Dale Franklin
2010-04-01
A former radioactive waste disposal site is surveyed with two non-intrusive geophysical techniques, magnetic gradiometry and electromagnetic induction. Data were gathered over the site by towing the geophysical equipment mounted to a non-electrically conductive and non-magnetic fibre-glass cart. Magnetic gradiometry, which detects the location of ferromagnetic material, including iron and steel, was used to map the existence of a previously unknown buried pipeline formerly used in the delivery of liquid waste to a number of surface disposal trenches and concrete vaults. The existence of a possible pipeline is reinforced by historical engineering drawings and photographs. The electromagnetic induction (EMI) technique was used to map areas of high and low electrical conductivity, which coincide with the magnetic gradiometry data. The EMI also provided information on areas of high electrical conductivity unrelated to a pipeline network. Both data sets demonstrate the usefulness of surface geophysical surveillance techniques to minimize the risk of exposure in the event of future remediation efforts.
Statistical analysis on the signals monitoring multiphase flow patterns in pipeline-riser system
NASA Astrophysics Data System (ADS)
Ye, Jing; Guo, Liejin
2013-07-01
Signals from monitoring petroleum transmission pipelines in the offshore oil industry usually contain abundant information about the multiphase flow relevant to flow assurance, which includes the avoidance of the most undesirable flow patterns. Extracting reliable features from these signals is therefore an alternative way to examine potential risks to the oil platform. This paper focuses on characterizing multiphase flow patterns in the pipeline-riser system often found in the offshore oil industry, and on finding an objective criterion to describe the transition between flow patterns. Statistical analysis of the pressure signal at the riser top is proposed, instead of the usual prediction methods based on inlet and outlet flow conditions, which cannot easily be determined in most situations. In addition, a machine learning method (least squares support vector machine) is used to classify the different flow patterns automatically. Experimental results from a small-scale loop show that the proposed method is effective for analyzing multiphase flow patterns.
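The statistical-feature approach can be sketched as computing summary statistics of the riser-top pressure trace and feeding them to a classifier. A nearest-centroid rule stands in for the least squares SVM here, and all signals and centroids are synthetic:

```python
import math

# Summarize a pressure trace with simple statistics, then assign the
# flow-pattern label whose feature centroid is nearest.

def features(signal):
    """Mean, standard deviation and peak-to-peak range of a pressure trace."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    return (mean, std, max(signal) - min(signal))

def classify(signal, centroids):
    """Nearest-centroid classification in feature space."""
    f = features(signal)
    return min(centroids, key=lambda lab: math.dist(f, centroids[lab]))

# Synthetic centroids: stable flow has small fluctuations; severe slugging
# produces large, slow pressure oscillations.
centroids = {"stable": (10.0, 0.2, 0.8), "severe slugging": (10.0, 3.0, 9.0)}
label = classify([10 + 3 * math.sin(x / 2) for x in range(200)], centroids)
```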
Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie
2017-01-27
Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed with various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, that uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using GlimmerHMM, SNAP, and AUGUSTUS, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's plantae dataset. Our evaluation shows that Seqping generated better gene predictions than three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that Seqping's predictions are more accurate than gene predictions from the other three approaches with their default or available HMMs.
A midas plugin to enable construction of reproducible web-based image processing pipelines
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016
A Midas plugin to enable construction of reproducible web-based image processing pipelines.
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web-based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based user interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK-based image processing workflows for diffusion-weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.
Deliverability on the interstate natural gas pipeline system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-05-01
Deliverability on the Interstate Natural Gas Pipeline System examines the capability of the national pipeline grid to transport natural gas to various US markets. The report quantifies the capacity levels and utilization rates of major interstate pipeline companies in 1996 and the changes since 1990, as well as changes in markets and end-use consumption patterns. It also discusses the effects of proposed capacity expansions on capacity levels. The report consists of five chapters, several appendices, and a glossary. Chapter 1 discusses some of the operational and regulatory features of the US interstate pipeline system and how they affect overall system design, system utilization, and capacity expansions. Chapter 2 looks at how the exploration, development, and production of natural gas within North America is linked to the national pipeline grid. Chapter 3 examines the capability of the interstate natural gas pipeline network to link production areas to market areas, on the basis of capacity and usage levels along 10 corridors. The chapter also examines capacity expansions that have occurred since 1990 along each corridor and the potential impact of proposed new capacity. Chapter 4 discusses the last step in the transportation chain, that is, deliverability to the ultimate end user. Flow patterns into and out of each market region are discussed, as well as the movement of natural gas between States in each region. Chapter 5 examines how shippers reserve interstate pipeline capacity in the current transportation marketplace and how pipeline companies are handling the secondary market for short-term unused capacity. Four appendices provide supporting data and additional detail on the methodology used to estimate capacity. 32 figs., 15 tabs.
Synergistic combinations of antifungals and anti-virulence agents to fight against Candida albicans
Cui, Jinhui; Ren, Biao; Tong, Yaojun; Dai, Huanqin; Zhang, Lixin
2015-01-01
Candida albicans, one of the pathogenic Candida species, causes high mortality rate in immunocompromised and high-risk surgical patients. In the last decade, only one new class of antifungal drug echinocandin was applied. The increased therapy failures, such as the one caused by multi-drug resistance, demand innovative strategies for new effective antifungal drugs. Synergistic combinations of antifungals and anti-virulence agents highlight the pragmatic strategy to reduce the development of drug resistant and potentially repurpose known antifungals, which bypass the costly and time-consuming pipeline of new drug development. Anti-virulence and synergistic combination provide new options for antifungal drug discovery by counteracting the difficulty or failure of traditional therapy for fungal infections. PMID:26048362
The HEASARC Swift Gamma-Ray Burst Archive: The Pipeline and the Catalog
NASA Technical Reports Server (NTRS)
Donato, Davide; Angelini, Lorella; Padgett, C.A.; Reichard, T.; Gehrels, Neil; Marshall, Francis E.; Sakamoto, Takanori
2012-01-01
Since its launch in late 2004, the Swift satellite triggered or observed an average of one gamma-ray burst (GRB) every 3 days, for a total of 771 GRBs by 2012 January. Here, we report the development of a pipeline that semi-automatically performs the data-reduction and data-analysis processes for the three instruments on board Swift (BAT, XRT, UVOT). The pipeline is written in Perl, and it uses only HEAsoft tools and can be used to perform the analysis of a majority of the point-like objects (e.g., GRBs, active galactic nuclei, pulsars) observed by Swift. We run the pipeline on the GRBs, and we present a database containing the screened data, the output products, and the results of our ongoing analysis. Furthermore, we created a catalog summarizing some GRB information, collected either by running the pipeline or from the literature. The Perl script, the database, and the catalog are available for downloading and querying at the HEASARC Web site.
Integration of a neuroimaging processing pipeline into a pan-canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
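Code-oriented frameworks like PSOM infer execution order from the file dependencies declared between jobs; a minimal Python sketch of that idea (the job names, file names, and resolution logic are illustrative, not PSOM's actual implementation):

```python
# Toy dependency resolver in the spirit of code-oriented pipeline frameworks:
# each job declares the files it reads and writes; a job becomes runnable
# once all of its inputs exist.
jobs = {
    "motion_correct": {"in": ["raw.nii"], "out": ["mc.nii"]},
    "smooth":         {"in": ["mc.nii"],  "out": ["smooth.nii"]},
    "register":       {"in": ["mc.nii"],  "out": ["reg.nii"]},
}

def schedule(jobs, available):
    """Return one valid execution order given the initially available files."""
    order, files, pending = [], set(available), dict(jobs)
    while pending:
        ready = [n for n, j in pending.items() if set(j["in"]) <= files]
        if not ready:
            raise ValueError("unsatisfiable dependencies")
        for name in sorted(ready):          # deterministic order for the demo
            order.append(name)
            files |= set(pending.pop(name)["out"])
    return order

print(schedule(jobs, ["raw.nii"]))  # ['motion_correct', 'register', 'smooth']
```

A grid platform like CBRAIN would additionally dispatch each ready job to a remote execution site rather than running it locally.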
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results: We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion: The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines. PMID:18269742
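The Scheduler/Executor split described above can be sketched in a few lines of Python (illustrative only; Cyrille2's real components persist state in a database and drive a compute cluster, not a local loop, and the tool names here are invented):

```python
# Minimal sketch of a Scheduler that turns incoming data into queued jobs,
# and an Executor that polls the queue and runs whatever it finds.
from collections import deque

class Scheduler:
    """Tracks incoming data items and schedules a job for each one."""
    def __init__(self):
        self.queue = deque()

    def data_arrived(self, dataset, tool):
        self.queue.append((tool, dataset))

class Executor:
    """Searches for scheduled jobs and executes them (here: function calls)."""
    def __init__(self, scheduler):
        self.scheduler = scheduler

    def run_all(self):
        results = []
        while self.scheduler.queue:
            tool, dataset = self.scheduler.queue.popleft()
            results.append(tool(dataset))
        return results

sched = Scheduler()
sched.data_arrived("reads.fastq", lambda d: f"aligned({d})")
sched.data_arrived("reads.fastq", lambda d: f"annotated({d})")
print(Executor(sched).run_all())  # ['aligned(reads.fastq)', 'annotated(reads.fastq)']
```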
Aerial surveillance for gas and liquid hydrocarbon pipelines using a flame ionization detector (FID)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riquetti, P.V.; Fletcher, J.I.; Minty, C.D.
1996-12-31
A novel application for the detection of airborne hydrocarbons has been successfully developed by means of a highly sensitive, fast-responding Flame Ionization Detector (FID). The traditional way to monitor pipeline leaks has been by ground crews using specific sensors or by airborne crews highly trained to observe anomalies associated with leaks during periodic surveys of the pipeline right-of-way. The goal has been to detect leaks in a fast and cost-effective way before the associated spill becomes a costly and hazardous problem. This paper describes a leak detection system combined with a global positioning system (GPS) and a computerized data output designed to pinpoint the presence of hydrocarbons in the air space of the pipeline's right-of-way. Fixed-wing aircraft as well as helicopters have been successfully used as airborne platforms. Natural gas, crude oil and finished products pipelines in Canada and the US have been surveyed using this technology with excellent correlation between the aircraft detection and in situ ground detection. The information obtained is processed with proprietary software and reduced to simple coordinates. Results are transferred to ground crews to effect the necessary repairs.
Guo, Li; Allen, Kelly S; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M; Wick, Robert L; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host-pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems.
Visual to Parametric Interaction (V2PI)
Maiti, Dipayan; Endert, Alex; North, Chris
2013-01-01
Typical data visualizations result from linear pipelines that start by characterizing data using a model or algorithm to reduce the dimension and summarize structure, and end by displaying the data in a reduced dimensional form. Sensemaking may take place at the end of the pipeline when users have an opportunity to observe, digest, and internalize any information displayed. However, some visualizations mask meaningful data structures when model or algorithm constraints (e.g., parameter specifications) contradict information in the data. Yet, due to the linearity of the pipeline, users do not have a natural means to adjust the displays. In this paper, we present a framework for creating dynamic data displays that rely on both mechanistic data summaries and expert judgement. The key is that we develop both the theory and methods of a new human-data interaction to which we refer as “Visual to Parametric Interaction” (V2PI). With V2PI, the pipeline becomes bi-directional in that users are embedded in the pipeline; users learn from visualizations and the visualizations adjust to expert judgement. We demonstrate the utility of V2PI and a bi-directional pipeline with two examples. PMID:23555552
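A bi-directional pipeline of this kind can be sketched concretely: the forward pass projects data to a display, and the "interaction" feeds the user's judgement back as model parameters (here, feature weights) before re-projecting. The data, the weighting scheme, and the update rule are invented for illustration; this is not the V2PI implementation.

```python
# Sketch of a bi-directional display loop: user judgement re-parameterizes
# the dimension-reduction model, and the display is recomputed.
import numpy as np

def project(X, weights):
    """1-D display: first principal component of the column-weighted data."""
    Xw = (X - X.mean(axis=0)) * weights
    _, _, vt = np.linalg.svd(Xw, full_matrices=False)
    return Xw @ vt[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # 20 observations, 3 features

w = np.ones(3)                        # forward pass: neutral parameters
display = project(X, w)

w[2] = 5.0                            # "interaction": feature 2 judged important
display = project(X, w)               # backward pass: display adjusts
```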
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADE is a software package for rapidly assembling analytic pipelines to manipulate data. The package consists of the engine that manages the data and coordinates the movement of data between the tasks performing a function; a set of core libraries consisting of plugins that perform common tasks; and a framework to extend the system, supporting the development of new plugins. Currently, through configuration files, a pipeline can be defined that maps the routing of data through a series of plugins. Pipelines can be run in a batch mode or can process streaming data; they can be executed from the command line or run through a Windows background service. There currently exist over a hundred plugins and over fifty pipeline configurations, and the software is now being used by about a half-dozen projects.
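The config-driven routing described above can be sketched in a few lines; the plugin names and the configuration shape are invented placeholders, since SADE's actual configuration format is not given here.

```python
# Sketch of a pipeline whose routing is defined by configuration: each record
# flows through the plugin chain named in the config, one plugin at a time.
PLUGINS = {
    "strip":   lambda rec: rec.strip(),
    "upper":   lambda rec: rec.upper(),
    "exclaim": lambda rec: rec + "!",
}

# Stands in for a configuration file mapping the routing of data.
config = {"pipeline": ["strip", "upper", "exclaim"]}

def run(config, records):
    """Route each record through the configured plugin chain (streaming-style)."""
    for rec in records:
        for name in config["pipeline"]:
            rec = PLUGINS[name](rec)
        yield rec

print(list(run(config, ["  hello ", " sade "])))  # ['HELLO!', 'SADE!']
```

Because `run` is a generator, the same chain serves both batch mode (collect the list) and streaming mode (consume records as they arrive).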
Extending the Fermi-LAT data processing pipeline to the grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmer, S.; Arrabito, L.; Glanzman, T.
2015-05-12
The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT) which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. Additionally, it receives heavy use in performing production Monte Carlo tasks.
Modelling of non-equilibrium flow in the branched pipeline systems
NASA Astrophysics Data System (ADS)
Sumskoi, S. I.; Sverchkov, A. M.; Lisanov, M. V.; Egorov, A. F.
2016-09-01
This article presents a mathematical model and a numerical method for solving the water hammer problem in a branched pipeline system. The problem is considered in a one-dimensional, non-stationary formulation that accounts for realities such as changes in the diameter of the pipeline and its branches. Comparison with an existing analytic solution shows that the proposed method possesses good accuracy. Using the developed model and numerical method, the problem of transmitting a complex of compression waves through a branching pipeline system when several shut-down valves operate has been solved. The model and method may also be readily adapted to a number of other problems, for example, describing the flow of blood in vessels.
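For context, the standard one-dimensional water-hammer equations that such models are built on are the coupled continuity and momentum equations for piezometric head \(H(x,t)\) and discharge \(Q(x,t)\); the paper's own formulation, with variable diameter and branch junctions, is not reproduced here.

```latex
% Standard 1-D water-hammer (unsteady pipe flow) equations, with
% a = pressure-wave speed, g = gravity, A = cross-sectional area,
% f = Darcy friction factor, D = pipe diameter.
\begin{aligned}
\frac{\partial H}{\partial t}
  + \frac{a^{2}}{gA}\,\frac{\partial Q}{\partial x} &= 0
  && \text{(continuity)}\\[4pt]
\frac{\partial Q}{\partial t}
  + gA\,\frac{\partial H}{\partial x}
  + \frac{f\,Q\,\lvert Q\rvert}{2DA} &= 0
  && \text{(momentum)}
\end{aligned}
```

These hyperbolic equations are what make compression waves propagate at speed \(a\) through the network and reflect at diameter changes and branch points.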
DIVERSITY IN THE BIOMEDICAL RESEARCH WORKFORCE: DEVELOPING TALENT
McGee, Richard; Saran, Suman; Krulwich, Terry A.
2012-01-01
Much has been written about the need for and barriers to achievement of greater diversity in the biomedical workforce from the perspectives of gender, race and ethnicity; this is not a new topic. These discussions often center around a ‘pipeline metaphor’ which imagines students flowing through a series of experiences to eventually arrive at a science career. Here we argue that diversity will only be achieved if the primary focus is on: what is happening within the pipeline, not just counting individuals entering and leaving it; de-emphasizing achieving academic milestones by ‘typical’ ages; and adopting approaches that most effectively develop talent. Students may develop skills at different rates based on factors such as earlier access to educational resources, exposure to science (especially research experiences), and competing demands for time and attention during high school and college. Therefore, there is wide variety among students at any point along the pipeline. Taking this view requires letting go of imagining the pipeline as a sequence of age-dependent steps in favor of milestones of skill and talent development decoupled from age or educational stage. Emphasizing talent development opens up many new approaches for science training outside of traditional degree programs. This article provides examples of such approaches, including interventions at the post-baccalaureate and PhD levels, as well as a novel coaching model that incorporates well-established social science theories and complements traditional mentoring. These approaches could significantly impact diversity by developing scientific talent, especially among currently underrepresented minorities. PMID:22678863
Zhang, Chengxin; Mortuza, S M; He, Baoji; Wang, Yanting; Zhang, Yang
2018-03-01
We develop two complementary pipelines, "Zhang-Server" and "QUARK", based on I-TASSER and QUARK pipelines for template-based modeling (TBM) and free modeling (FM), and test them in the CASP12 experiment. The combination of I-TASSER and QUARK successfully folds three medium-size FM targets that have more than 150 residues, even though the interplay between the two pipelines still awaits further optimization. Newly developed sequence-based contact prediction by NeBcon plays a critical role to enhance the quality of models, particularly for FM targets, by the new pipelines. The inclusion of NeBcon predicted contacts as restraints in the QUARK simulations results in an average TM-score of 0.41 for the best in top five predicted models, which is 37% higher than that by the QUARK simulations without contacts. In particular, there are seven targets that are converted from non-foldable to foldable (TM-score >0.5) due to the use of contact restraints in the simulations. Another additional feature in the current pipelines is the local structure quality prediction by ResQ, which provides a robust residue-level modeling error estimation. Despite the success, significant challenges still remain in ab initio modeling of multi-domain proteins and folding of β-proteins with complicated topologies bound by long-range strand-strand interactions. Improvements on domain boundary and long-range contact prediction, as well as optimal use of the predicted contacts and multiple threading alignments, are critical to address these issues seen in the CASP12 experiment. © 2017 Wiley Periodicals, Inc.
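For reference, the TM-score thresholds quoted above (0.41, and 0.5 as the "foldable" criterion) follow the standard Zhang-Skolnick definition:

```latex
% TM-score between a model and the target structure: d_i is the distance
% between the i-th pair of aligned residues; the max is over superpositions.
% Scores above 0.5 generally indicate the same fold.
\mathrm{TM\text{-}score}
  = \max\!\left[\frac{1}{L_{\mathrm{target}}}
      \sum_{i=1}^{L_{\mathrm{aligned}}}
      \frac{1}{1+\left(d_i/d_0\right)^{2}}\right],
\qquad
d_0 = 1.24\,\sqrt[3]{L_{\mathrm{target}}-15} - 1.8
```

Because \(d_0\) scales with target length, the score is length-independent, which is why a fixed 0.5 cutoff can be used across targets.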
A real-time coherent dedispersion pipeline for the giant metrewave radio telescope
NASA Astrophysics Data System (ADS)
De, Kishalay; Gupta, Yashwant
2016-02-01
A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with a NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
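The core of any coherent dedispersion pipeline is an FFT-based deconvolution: multiply the voltage spectrum by the inverse of the interstellar dispersion transfer function (the "chirp"). A compact sketch follows; the centre frequency, bandwidth, and DM are arbitrary illustrative values, and sign conventions and band layout vary between implementations, so this is not the GMRT code.

```python
# FFT-based coherent dedispersion of a complex baseband voltage series.
import numpy as np

KDM = 4.148808e3  # dispersion constant, MHz^2 pc^-1 cm^3 s

def dedisperse(voltages, f0_mhz, bw_mhz, dm):
    """Remove interstellar dispersion from a voltage time series."""
    n = voltages.size
    f = np.fft.fftfreq(n, d=1.0 / bw_mhz)   # offset from centre frequency, MHz
    # Phase of the inverse ISM transfer function (Hankins-Rickett chirp);
    # the overall sign depends on the adopted Fourier convention.
    phase = 2.0 * np.pi * KDM * dm * f**2 / (f0_mhz**2 * (f0_mhz + f))
    chirp = np.exp(1j * phase)
    return np.fft.ifft(np.fft.fft(voltages) * chirp)

rng = np.random.default_rng(1)
v = rng.normal(size=4096) + 1j * rng.normal(size=4096)
out = dedisperse(v, f0_mhz=325.0, bw_mhz=32.0, dm=50.0)
```

Since the chirp has unit modulus, the operation is a pure phase rotation: total power and the spectral magnitude are preserved, only the dispersive phase delay is undone.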
Anti-arrhythmic strategies for atrial fibrillation
Grandi, Eleonora; Maleckar, Mary M.
2016-01-01
Atrial fibrillation (AF), the most common cardiac arrhythmia, is associated with increased risk of cerebrovascular stroke, and with several other pathologies, including heart failure. Current therapies for AF are targeted at reducing risk of stroke (anticoagulation) and tachycardia-induced cardiomyopathy (rate or rhythm control). Rate control, typically achieved by atrioventricular nodal blocking drugs, is often insufficient to alleviate symptoms. Rhythm control approaches include antiarrhythmic drugs, electrical cardioversion, and ablation strategies. Here, we offer several examples of how computational modeling can provide a quantitative framework for integrating multi-scale data to: (a) gain insight into multi-scale mechanisms of AF; (b) identify and test pharmacological and electrical therapies and interventions; and (c) support clinical decisions. We review how modeling approaches have evolved and contributed to the research pipeline and preclinical development and discuss future directions and challenges in the field. PMID:27612549
Strain-Based Design Methodology of Large Diameter Grade X80 Linepipe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lower, Mark D.
2014-04-01
Continuous growth in energy demand is driving oil and natural gas production to areas that are often located far from major markets where the terrain is prone to earthquakes, landslides, and other types of ground motion. Transmission pipelines that cross this type of terrain can experience large longitudinal strains and plastic circumferential elongation as the pipeline experiences alignment changes resulting from differential ground movement. Such displacements can potentially impact pipeline safety by adversely affecting structural capacity and leak-tight integrity of the linepipe steel. Planning for new long-distance transmission pipelines usually involves consideration of higher strength linepipe steels because their use allows pipeline operators to reduce the overall cost of pipeline construction and increase pipeline throughput by increasing the operating pressure. The design trend for new pipelines in areas prone to ground movement has evolved over the last 10 years from a stress-based design approach to a strain-based design (SBD) approach to further realize the cost benefits from using higher strength linepipe steels. This report presents an overview of SBD for pipelines subjected to large longitudinal strain and high internal pressure with emphasis on the tensile strain capacity of high-strength microalloyed linepipe steel. The technical basis for this report involved engineering analysis and examination of the mechanical behavior of Grade X80 linepipe steel in both the longitudinal and circumferential directions. Testing was conducted to assess effects of material processing, including as-rolled, expanded, and heat-treated conditions intended to simulate coating application. Elastic-plastic and low-cycle fatigue analyses were also performed with varying internal pressures.
Proposed SBD models discussed in this report are based on classical plasticity theory and account for material anisotropy, triaxial strain, and microstructural damage effects developed from test data. The results are intended to enhance SBD and analysis methods for producing safe and cost-effective pipelines capable of accommodating large plastic strains in seismically active arctic areas.
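A back-of-the-envelope check shows why ground curvature pushes design into the strain-based regime: a pipe forced to conform to ground curvature of radius R develops a longitudinal bending strain of roughly D/(2R), the standard thin-pipe approximation. The diameter and radius below are illustrative, not values from the report's X80 test program.

```python
# Longitudinal bending strain of a pipe bent to a curvature radius R:
# outer-fiber strain ~ D / (2R) (thin-pipe approximation).
def bending_strain(diameter_m, curvature_radius_m):
    return diameter_m / (2.0 * curvature_radius_m)

# A 1.0 m (~40 in) line conforming to a 200 m radius ground-movement feature:
eps = bending_strain(1.0, 200.0)
print(f"{eps:.2%}")  # 0.25%, comparable to the elastic strain limit of linepipe steel
```

Strains of this order, superposed on the hoop stress from high internal pressure, are what the tensile strain capacity models in the report are meant to bound.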
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
NASA Astrophysics Data System (ADS)
de Bruijn, Renée; Dabekaussen, Willem; Hijma, Marc; Wiersma, Ane; Abspoel-Bukman, Linda; Boeije, Remco; Courage, Wim; van der Geest, Johan; Hamburg, Marc; Harmsma, Edwin; Helmholt, Kristian; van den Heuvel, Frank; Kruse, Henk; Langius, Erik; Lazovik, Elena
2017-04-01
Due to the heterogeneity of the subsurface in the delta environment of the Netherlands, differential subsidence over short distances results in tension and subsequent wear of subsurface infrastructure, such as water and gas pipelines. Due to uncertainties in the build-up of the subsurface, however, it is unknown where this problem is most prominent. This is a problem for asset managers deciding when a pipeline needs replacement: damaged pipelines endanger security of supply and pose a significant threat to safety, yet premature replacement raises needless expenses. In both cases, costs, financial or otherwise, are high. Therefore, an interdisciplinary research team of geotechnicians, geologists and Big Data engineers from the research institutes TNO, Deltares and SkyGeo developed a stochastic model to predict differential subsidence and the probability of consequent pipeline failure at a (sub-)street level. In this project, pipeline data from company databases is combined with a stochastic geological model and information on (historical) groundwater levels and overburden material. The probability of pipeline failure is modelled by coupling with a subsidence model and two separate models of pipeline behaviour under stress, using a probabilistic approach. The total length of pipelines (approximately 200,000 km operational in the Netherlands) and the complexity of the model chain needed to calculate a probability of failure result in large computational challenges, since massive evaluation of possible scenarios is required to reach the necessary level of confidence. To cope with this, a scalable computational infrastructure has been developed, composing a model workflow in which components have a heterogeneous technological basis.
Three pilot areas covering an urban, a rural and a mixed environment, characterised by different groundwater-management strategies and different overburden histories, are used to evaluate the differences in subsidence and uncertainties that come with different types of land use. Furthermore, the model provides results with a measure of reliability, and determines which input factor contributes most to the uncertainty. The model results can be validated and further improved using InSAR data for these pilot areas, by iteratively revising model parameters. The design of the model is such that it can be applied to the whole of the Netherlands. By assessing differential subsidence and its effect on pipelines over time, the model helps to establish when and where maintenance is due, by indicating which areas are particularly vulnerable, thereby increasing safety and lowering maintenance costs.
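The probabilistic coupling described above amounts to Monte Carlo evaluation of a model chain: sample uncertain subsidence, push it through a pipe-response model, and count exceedances of a strain limit. A minimal sketch follows; all distributions, the response model, and the limit are invented placeholders, not the TNO/Deltares/SkyGeo models.

```python
# Monte Carlo sketch of probability of pipeline failure under uncertain
# differential subsidence.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Uncertain differential settlement across a short span (toy distribution).
subsidence_mm = rng.lognormal(mean=2.0, sigma=0.6, size=N)
span_m = 10.0                                         # settlement trough length
strain = 2.0 * (subsidence_mm / 1000.0) / span_m      # toy strain response model
limit = rng.normal(loc=0.004, scale=0.0005, size=N)   # uncertain strain capacity

p_fail = np.mean(strain > limit)                      # fraction of exceedances
print(f"estimated failure probability: {p_fail:.4f}")
```

In the real model chain each sample would traverse the stochastic geological model, the subsidence model, and the two pipe-behaviour models, which is why a scalable computing infrastructure is needed to reach a stable estimate.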
Prescott, J S; Andrews, P A; Baker, R W; Bogdanffy, M S; Fields, F O; Keller, D A; Lapadula, D M; Mahoney, N M; Paul, D E; Platz, S J; Reese, D M; Stoch, S A; DeGeorge, J J
2017-08-01
Severely-debilitating or life-threatening (SDLT) diseases include conditions in which life expectancy is short or quality of life is greatly diminished despite available therapies. As such, the medical context for SDLT diseases is comparable to advanced cancer and the benefit vs. risk assessment and development of SDLT disease therapeutics should be similar to that of advanced cancer therapeutics. A streamlined development approach would allow patients with SDLT conditions earlier access to therapeutics and increase the speed of progression through development. In addition, this will likely increase the SDLT disease therapeutic pipeline, directly benefiting patients and reducing the economic and societal burden of SDLT conditions. Using advanced-stage heart failure (HF) as an example that illustrates the concepts applicable to other SDLT indications, this article proposes a streamlined development paradigm for SDLT disease therapeutics and recommends development of aligned global regulatory guidance. © 2017 American Society for Clinical Pharmacology and Therapeutics.
A Novel Application of Synthetic Biology and Directed Evolution to Engineer Phage-based Antibiotics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Meiye
The emergence of multiple-drug-resistant bacteria poses threats to human health, agriculture and food safety. Annually, over 100,000 deaths and up to $20 billion in losses to the U.S. economy are attributed to multiple-drug-resistant bacteria. With only four new chemical antibiotics in the drug development pipeline, we are in dire need of new solutions to address the emerging threat of multiple drug resistance. We propose a paradigm-changing approach to address the multi-drug-resistant bacteria problem by utilizing Synthetic Biology (SynBio) methodologies to create and evolve “designer” bacteriophages, or phages – viruses that specifically infect bacteria – to infect and kill newly emerging pathogenic bacterial strains WITHOUT the need for chemical antibiotics. A major advantage of using phage to combat pathogenic bacteria is that phages can co-evolve with their bacterial host, and Sandia can be the first in the world to establish an industrial-scale Synthetic Biology pipeline for phage directed evolution, offering a safe, targeted, customizable solution to bacterial drug resistance. Since there is no existing phage directed evolution effort within or outside of Sandia, this proposal is suitable as a high-risk LDRD effort to create the first pipeline for such an endeavor. The high potential reward of this proposal lies in its immediate impact on decontamination and restoration of surfaces and infrastructure, with longer-term impact in human and animal therapeutics. The synthetic biology and screening approaches will lead to fundamental knowledge of phage/bacteria co-evolution, making Sandia a world leader in directed evolution of bacteriophages.
Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines
NASA Astrophysics Data System (ADS)
Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.
2017-01-01
Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU "circular memory" data buffers that enable ready introduction of arbitrary functions into the processing path for "streams" of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
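The core pattern described here (algorithms as black-box blocks passing data through bounded buffers in a directed graph) can be sketched in plain Python. This is a generic illustration of the design, not Bifrost's actual API:

```python
import queue
import threading

def run_block(func, inq, outq):
    """Run one 'black box' block: consume items, apply func, forward results."""
    while True:
        item = inq.get()
        if item is None:              # sentinel: propagate shutdown downstream
            if outq is not None:
                outq.put(None)
            break
        result = func(item)
        if outq is not None:
            outq.put(result)

# A toy two-block pipeline: detrend -> detect, wired with bounded queues
# (bounded queues play the role of the circular-memory buffers).
q1, q2 = queue.Queue(maxsize=4), queue.Queue(maxsize=4)
results = []

t1 = threading.Thread(target=run_block, args=(lambda x: x - 5, q1, q2))
t2 = threading.Thread(target=run_block,
                      args=(lambda x: results.append(x > 0), q2, None))
t1.start(); t2.start()

for sample in [3, 7, 9]:              # a short 'stream' of data
    q1.put(sample)
q1.put(None)                          # end of stream
t1.join(); t2.join()
print(results)                        # detections above threshold: [False, True, True]
```

Because each block only touches its input and output queues, new processing stages can be spliced into the path without changing the others, which is the reconfigurability the abstract describes.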
SimVascular: An Open Source Pipeline for Cardiovascular Simulation.
Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C
2017-03-01
Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.
ERIC Educational Resources Information Center
Martin, Jennifer L.; Beese, Jane A.
2017-01-01
Teaching writing to students of high need in an urban school is simultaneously pedagogical, curricular, and political. Students labeled "at-risk" for school failure often have lowered expectations placed upon them from without that impact how they feel within. Compounding this problem of perception is the real issue of heightened…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler; Chu, Andrew
The ThunderBird Cup v3.0 (TBC3) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler Jake; Chu, Andrew Chun-An
The ThunderBird Cup v2.0 (TBC2) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
Latest Development and Application of Nb-Bearing High Strength Pipeline Steels
NASA Astrophysics Data System (ADS)
Zhang, Yongqing; Shang, Chengjia; Guo, Aimin; Zheng, Lei; Niu, Tao; Han, Xiulin
To address the pollution problem that has recently emerged in China, China's central government is making great efforts to raise the percentage of natural gas in the country's primary energy mix. This requires the construction of large pipelines to transport natural gas from the nation's resource-rich western regions to the energy-starved east, as well as imports from Central Asia and Russia. With this mainstream trend, high-strength, high-toughness, heavy-gauge, large-diameter pipeline steels are needed to improve transportation efficiency. This paper describes the latest progress in Nb-bearing high-strength pipeline steels with regard to metallurgical design, development and application, including X80 coil with a thickness up to 22.0 mm, X80 plate for pipe with a diameter as large as 1422 mm, X80 plate meeting low-temperature requirements, and low-Mn X65 for harsh sour-service environments. Moreover, based on the widely accepted TMCP and HTP practices with low-carbon and Nb micro-alloying design, this paper also examines new metallurgical phenomena enabled by powerful rolling mills and heavy ACC equipment.
Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming
2015-01-01
Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode narrative concepts and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the two. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
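For reference, the F-measure used to score such pipelines combines precision and recall over the extracted terms; a minimal sketch (the term names are invented for illustration, not from the study):

```python
def f_measure(predicted, reference):
    """F1 score over sets of extracted terms vs. a reference annotation."""
    predicted, reference = set(predicted), set(reference)
    tp = len(predicted & reference)                      # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(reference) if reference else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: pipeline found 4 Observation terms, gold standard has 5
found = ["heart rate", "bp", "temp", "spo2"]
gold  = ["heart rate", "bp", "temp", "rr", "weight"]
print(round(f_measure(found, gold), 2))  # 0.67
```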
Statistical method to compare massive parallel sequencing pipelines.
Elsensohn, M H; Leblay, N; Dimassi, S; Campan-Fournier, A; Labalme, A; Roucher-Boulez, F; Sanlaville, D; Lesca, G; Bardel, C; Roy, P
2017-03-01
Today, sequencing is frequently carried out by Massive Parallel Sequencing (MPS), which drastically cuts sequencing time and cost. Nevertheless, Sanger sequencing remains the main validation method to confirm the presence of variants. The analysis of MPS data involves the development of several bioinformatic tools, academic or commercial. We present here a statistical method to compare MPS pipelines and test it in a comparison between an academic pipeline (BWA-GATK) and a commercial one (TMAP-NextGENe®), with and without reference to a gold standard (here, Sanger sequencing), on a panel of 41 genes in 43 epileptic patients. This method used the number of variants to fit log-linear models for pairwise agreements between pipelines. To assess the heterogeneity of the margins and the odds ratios of agreement, four log-linear models were used: a full model, a homogeneous-margin model, a model with a single odds ratio for all patients, and a model with a single intercept. Then a log-linear mixed model was fitted, treating the biological variability as a random effect. Among the 390,339 base pairs sequenced, TMAP-NextGENe® and BWA-GATK found, on average, 2253.49 and 1857.14 variants (single nucleotide variants and indels), respectively. Against the gold standard, the pipelines had similar sensitivities (63.47% vs. 63.42%) and close but significantly different specificities (99.57% vs. 99.65%; p < 0.001). Same-trend results were obtained when only single nucleotide variants were considered (99.98% specificity and 76.81% sensitivity for both pipelines). The method thus allows pipeline comparison and selection. It is generalizable to all types of MPS data and all pipelines.
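The sensitivity and specificity reported against the gold standard follow the usual confusion-matrix definitions; a minimal sketch with made-up variant positions (not the study's data):

```python
def sens_spec(calls, gold, n_positions):
    """Sensitivity/specificity of a variant-calling pipeline vs. a gold standard.
    `calls` and `gold` are sets of positions where a variant was reported."""
    tp = len(calls & gold)            # variants found by both
    fn = len(gold - calls)            # gold-standard variants missed
    fp = len(calls - gold)            # spurious calls
    tn = n_positions - tp - fn - fp   # correctly called reference positions
    return tp / (tp + fn), tn / (tn + fp)

gold = {10, 25, 60, 99}               # Sanger-confirmed variant positions
pipeline_a = {10, 25, 60, 42}         # misses one variant, adds one false positive
sens, spec = sens_spec(pipeline_a, gold, n_positions=1000)
print(round(sens, 2), round(spec, 4))  # 0.75 0.999
```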
Closing the race and gender gaps in computer science education
NASA Astrophysics Data System (ADS)
Robinson, John Henry
Life in a technological society brings new paradigms and pressures to bear on education. These pressures are magnified for underrepresented students and must be addressed if they are to play a vital part in society. Educational pipelines need to be established to provide at-risk students with the means and opportunity to succeed in science, technology, engineering, and mathematics (STEM) majors. STEM educational pipelines are programs consisting of components that seek to facilitate students' completion of a college degree by providing access to higher education, intervention, mentoring, support infrastructure, and programs that encourage academic success. Success in the STEM professions means that more educators, scientists, engineers, and researchers will be available to add diversity to the professions and to provide role models for future generations. The issues that the educational pipelines must address are improving at-risk groups' perceptions and awareness of the math, science, and engineering professions. Additionally, the educational pipelines must provide intervention in math preparation, overcome gender and race socialization, and provide mentors and counseling to help students achieve better self-perceptions and positive role models. This study was designed to explore the underrepresentation of minorities and women in the computer science major at Rowan University through a multilayered action research methodology.
The purpose of this research study was to define and understand the needs of underrepresented students in computer science; to examine current policies and enrollment data for Rowan University; to develop a historical profile of the Computer Science program in terms of ethnicity and gender enrollment, in order to ascertain trends in students' choice of computer science as a major; and to determine whether raising awareness about computer science among incoming freshmen and providing an alternate route into the computer science major would entice more women and minorities to pursue a degree in computer science at Rowan University. Finally, this study examined my espoused leadership theories and my leadership theories in use through reflective practices as I progressed through the cycles of this project. The outcomes of this study indicated a large downward trend in the enrollment of women in computer science and a relatively flat trend in minority enrollment. The enrollment data at Rowan University were found to follow a nationwide trend for underrepresented students' enrollment in STEM majors. The study also indicated that students' mental models are based upon their race and gender socialization and their understanding of the world and society. The mental models were shown to play a large role in the students' choice of major. Finally, a computer science pipeline was designed and piloted as part of this study in an attempt to entice more students into the major and facilitate their success. Additionally, the mental models of the participants were challenged through interactions to make them aware of the possibilities available with a degree in computer science. The entire study was wrapped in my leadership, which was practiced and studied over the course of this work.
Jayashree, B; Hanspal, Manindra S; Srinivasan, Rajgopal; Vigneshwaran, R; Varshney, Rajeev K; Spurthi, N; Eshwar, K; Ramesh, N; Chandra, S; Hoisington, David A
2007-01-01
The large amounts of EST sequence data available from a single species of an organism, as well as for several species within a genus, provide an easy source for the identification of intra- and interspecies single nucleotide polymorphisms (SNPs). For model organisms, the available data are extensive, given the degree of redundancy in the deposited EST data. There are several available bioinformatics tools that can be used to mine this data; however, using them requires a certain level of expertise: the tools have to be used sequentially, with accompanying format conversions, and steps like clustering and assembly of sequences become time-intensive even for moderately sized datasets. We report here a pipeline of open source software, extended to run on multiple CPU architectures, that can be used to mine large EST datasets for SNPs and identify restriction sites for assaying the SNPs, so that cost-effective CAPS assays can be developed for SNP genotyping in genetics and breeding applications. At the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), the pipeline has been implemented to run on a Paracel high-performance system consisting of four dual AMD Opteron processors running Linux with MPICH. The pipeline can be accessed through user-friendly web interfaces at http://hpc.icrisat.cgiar.org/PBSWeb and is available on request for academic use. We have validated the developed pipeline by mining chickpea ESTs for interspecies SNPs, developing CAPS assays for SNP genotyping, and confirming the restriction digestion pattern at the sequence level.
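The CAPS step hinges on whether exactly one SNP allele carries a restriction-enzyme recognition site, so that digestion distinguishes the genotypes. A toy check of that condition (the flanking sequences are invented for illustration; EcoRI's GAATTC site is real):

```python
def caps_candidate(flank5, allele1, allele2, flank3, site):
    """A SNP is CAPS-assayable with a given enzyme if exactly one
    allele's local sequence contains the recognition site."""
    seq1 = flank5 + allele1 + flank3   # sequence carrying allele 1
    seq2 = flank5 + allele2 + flank3   # sequence carrying allele 2
    return (site in seq1) != (site in seq2)

# T/C SNP where the T allele completes an EcoRI site (GAATTC)
print(caps_candidate("TTGAA", "T", "C", "TCAGG", "GAATTC"))  # True
```

In the real pipeline this test is run against a library of enzyme recognition sequences over every SNP discovered in the clustered EST assemblies.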
A-Track: A new approach for detection of moving objects in FITS images
NASA Astrophysics Data System (ADS)
Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.
2016-10-01
We have developed a fast, open-source, cross-platform pipeline, called A-Track, for detecting the moving objects (asteroids and comets) in sequential telescope images in FITS format. The pipeline is coded in Python 3. The moving objects are detected using a modified line detection algorithm, called MILD. We tested the pipeline on astronomical data acquired by an SI-1100 CCD with a 1-meter telescope. We found that A-Track performs very well in terms of detection efficiency, stability, and processing time. The code is hosted on GitHub under the GNU GPL v3 license.
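The underlying idea of line-based moving-object detection (detections of a real mover across sequential frames fall on a line; noise does not) can be sketched as a constant-velocity consistency test. This is a simplification for illustration, not the actual MILD algorithm:

```python
def is_linear_track(detections, tol=1.0):
    """Check whether (t, x, y) detections from sequential frames are
    consistent with constant-velocity motion, i.e. lie on a line in
    both x(t) and y(t) within `tol` pixels."""
    (t0, x0, y0), (t1, x1, y1) = detections[0], detections[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # apparent velocity, px/frame
    return all(abs(x0 + vx * (t - t0) - x) <= tol and
               abs(y0 + vy * (t - t0) - y) <= tol
               for t, x, y in detections)

# An asteroid drifting ~2 px/frame vs. a set of spurious detections
asteroid = [(0, 100.0, 50.0), (1, 102.1, 50.4), (2, 104.0, 50.9)]
noise    = [(0, 10.0, 10.0), (1, 30.0, 5.0), (2, 12.0, 40.0)]
print(is_linear_track(asteroid), is_linear_track(noise))  # True False
```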
Nine Years of XMM-Newton Pipeline: Experience and Feedback
NASA Astrophysics Data System (ADS)
Michel, Laurent; Motch, Christian
2009-05-01
The Strasbourg Astronomical Observatory is a member of the Survey Science Centre (SSC) of the XMM-Newton satellite. Among other responsibilities, we provide database access to the 2XMMi catalogue and run the part of the data-processing pipeline that performs the cross-correlation of EPIC sources with archival catalogs. These tasks were all developed in Strasbourg. Pipeline processing has been operating flawlessly since 1999. We describe here the workload and infrastructure set up in Strasbourg to support SSC activities. Our nine-year SSC experience could be used in the framework of the Simbol-X ground segment.
Welding and NDT development in support of Oman-India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Even, T.M.; Laing, B.; Hirsch, D.
1995-12-01
The Oman to India gas pipeline is designed for a maximum water depth of 3,500 m. For such a pipeline, resistance to hydrostatic collapse is a critical factor and dictates that very heavy wall pipe be used, preliminarily 24 inch ID x 1.625 inch wall. Because of the water depth, much of the installation will be by J-Lay, which requires that the joint be welded and inspected in a single station. This paper describes the results of welding and NDT test programs conducted to determine the minimum time to perform these operations in heavy wall pipe.
Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data
Morris, Tiffany J.; Beck, Stephan
2015-01-01
The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. PMID:25233806
Corrosion monitoring along infrastructures using distributed fiber optic sensing
NASA Astrophysics Data System (ADS)
Alhandawi, Khalil B.; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia
2016-04-01
Pipeline Inspection Gauges (PIGs) are used for internal corrosion inspection of oil pipelines every 3-5 years. However, between inspection intervals, rapid corrosion may occur, potentially resulting in major accidents. The motivation behind this research project was to develop a safe distributed corrosion sensor placed inside oil pipelines that continuously monitors corrosion. The intrinsically safe nature of light motivated research into fiber optic sensors as a solution. The sensing fiber's cladding is a polymer that is chemically sensitive to hydrocarbons within crude oil mixtures. A layer of metal, as used in the oil pipeline's construction, is deposited on the polymer cladding; upon corrosion of this layer, the cladding is exposed to the surrounding hydrocarbons. The hydrocarbons' interaction with the cladding locally increases the cladding's refractive index in the radial direction. The light intensity of a traveling pulse is reduced due to the local reduction in modal capacity, which is interrogated by Optical Time Domain Reflectometry (OTDR). Backscattered light is captured in real time, using the time delay to resolve location, allowing real-time spatial monitoring of internal corrosion within pipelines spanning large distances. Step-index theoretical solutions were used to calculate the power loss due to changes in the intensity profile. The power loss is translated into an attenuation coefficient characterizing the expected OTDR trace, which was verified against similar experimental results from the literature. A laboratory-scale experiment is being developed to assess the validity of the model and the practicality of the solution.
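OTDR resolves event location from the round-trip time of the backscattered pulse via z = c*t/(2n), where n is the fibre group index. A minimal sketch (the group index value below is a typical assumption, not a figure from the paper):

```python
def otdr_event_location(round_trip_s, group_index=1.468):
    """Distance to a backscatter event from the round-trip time of an
    OTDR pulse: z = c * t / (2 * n). The factor of 2 accounts for the
    pulse travelling to the event and back."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    return c * round_trip_s / (2 * group_index)

# A backscatter feature arriving 98 microseconds after launch sits ~10 km away
z = otdr_event_location(98e-6)
print(round(z / 1000, 2), "km")  # 10.01 km
```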
Toward modeling locomotion using electromyography-informed 3D models: application to cerebral palsy.
Sartori, M; Fernandez, J W; Modenese, L; Carty, C P; Barber, L A; Oberhofer, K; Zhang, J; Handsfield, G G; Stott, N S; Besier, T F; Farina, D; Lloyd, D G
2017-03-01
This position paper proposes a modeling pipeline to develop clinically relevant neuromusculoskeletal models to understand and treat complex neurological disorders. Although applicable to a variety of neurological conditions, we provide direct pipeline applicative examples in the context of cerebral palsy (CP). This paper highlights technologies in: (1) patient-specific segmental rigid body models developed from magnetic resonance imaging for use in inverse kinematics and inverse dynamics pipelines; (2) efficient population-based approaches to derive skeletal models and muscle origins/insertions that are useful for population statistics and consistent creation of continuum models; (3) continuum muscle descriptions to account for complex muscle architecture including spatially varying material properties with muscle wrapping; (4) muscle and tendon properties specific to CP; and (5) neural-based electromyography-informed methods for muscle force prediction. This represents a novel modeling pipeline that couples for the first time electromyography extracted features of disrupted neuromuscular behavior with advanced numerical methods for modeling CP-specific musculoskeletal morphology and function. The translation of such pipeline to the clinical level will provide a new class of biomarkers that objectively describe the neuromusculoskeletal determinants of pathological locomotion and complement current clinical assessment techniques, which often rely on subjective judgment. WIREs Syst Biol Med 2017, 9:e1368. doi: 10.1002/wsbm.1368 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
Mandating responsible flagging practices as a strategy for reducing the risk of coastal oil spills.
Miller, Dana D; Hotte, Ngaio; Sumaila, U Rashid
2014-04-15
As human civilization is becoming more aware of the negative impact our actions can inflict upon the natural world, the intensification of fossil fuel extraction and industrial development is being met with increasing opposition. In Western Canada, proposals that would increase the volume of petroleum transported by pipelines and by tankers through the coastal waters of British Columbia have engaged the province in debate. To ease public concern on the risk of a coastal oil spill, there are additional commitments that involved parties could make. There is evidence to show that the practice of registering vessels under foreign flags of states that have exhibited failure in compliance with international obligations is more common amongst petroleum tankers that have been involved in large-scale oil spills. To prove that they are committed to reducing the risk of oil spills, businesses need to stop registering their vessels under flags of foreign, non-compliant states. Copyright © 2014 Elsevier Ltd. All rights reserved.
Onuki, Ritsuko; Yamaguchi, Rui; Shibuya, Tetsuo; Kanehisa, Minoru; Goto, Susumu
2017-01-01
Genome-wide scans for positive selection have become important for genomic medicine, and many studies aim to find genomic regions affected by positive selection that are associated with risk allele variations among populations. Most such studies are designed to detect recent positive selection. However, we hypothesize that ancient positive selection is also important for adaptation to pathogens, and has affected current immune-mediated common diseases. Based on this hypothesis, we developed a novel linkage disequilibrium-based pipeline, which aims to detect regions associated with ancient positive selection across populations from single nucleotide polymorphism (SNP) data. By applying this pipeline to the genotypes in the International HapMap project database, we show that genes in the detected regions are enriched in pathways related to the immune system and infectious diseases. The detected regions also contain SNPs reported to be associated with cancers and metabolic diseases, obesity-related traits, type 2 diabetes, and allergic sensitization. These SNPs were further mapped to biological pathways to determine the associations between phenotypes and molecular functions. Assessments of candidate regions to identify functions associated with variations in incidence rates of these diseases are needed in the future. PMID:28445522
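Linkage-disequilibrium-based scans of this kind build on pairwise r² between SNPs, computed from haplotype and allele frequencies; a minimal sketch of the standard formula:

```python
def ld_r2(p_ab, p_a, p_b):
    """Pairwise linkage disequilibrium r^2 between two biallelic SNPs:
    D = p_AB - p_A * p_B;  r^2 = D^2 / (p_A (1 - p_A) p_B (1 - p_B)),
    where p_AB is the haplotype frequency and p_A, p_B allele frequencies."""
    d = p_ab - p_a * p_b
    return d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Perfect LD: alleles A and B always co-occur on the same haplotype
print(round(ld_r2(p_ab=0.3, p_a=0.3, p_b=0.3), 6))  # 1.0
```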
Data processing pipeline for serial femtosecond crystallography at SACLA.
Nakane, Takanori; Joti, Yasumasa; Tono, Kensuke; Yabashi, Makina; Nango, Eriko; Iwata, So; Ishitani, Ryuichiro; Nureki, Osamu
2016-06-01
A data processing pipeline for serial femtosecond crystallography at SACLA was developed, based on Cheetah [Barty et al. (2014). J. Appl. Cryst. 47 , 1118-1131] and CrystFEL [White et al. (2016). J. Appl. Cryst. 49 , 680-689]. The original programs were adapted for data acquisition through the SACLA API, thread and inter-node parallelization, and efficient image handling. The pipeline consists of two stages: the first, online stage can analyse all images in real time, with a latency of less than a few seconds, to provide feedback on hit rate and detector saturation. The second, offline stage converts hit images into HDF5 files and runs CrystFEL for indexing and integration. The size of the filtered compressed output is comparable to that of a synchrotron data set. The pipeline enables real-time feedback and rapid structure solution during beamtime.
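The online stage's real-time hit-rate feedback can be sketched as a rolling counter over recent detector frames. This is an illustration of the idea only; the class, window size, and peak threshold are hypothetical, not SACLA's code:

```python
from collections import deque

class HitRateMonitor:
    """Rolling hit rate over the last `window` images, as an online
    monitoring stage might report it during beamtime."""
    def __init__(self, window=1000, min_peaks=20):
        self.recent = deque(maxlen=window)  # old frames fall off automatically
        self.min_peaks = min_peaks

    def record(self, n_bragg_peaks):
        # A frame counts as a 'hit' if enough Bragg peaks were found.
        self.recent.append(n_bragg_peaks >= self.min_peaks)

    @property
    def hit_rate(self):
        return sum(self.recent) / len(self.recent) if self.recent else 0.0

mon = HitRateMonitor()
for peaks in [3, 45, 0, 88, 12, 30]:   # peak counts from six detector frames
    mon.record(peaks)
print(mon.hit_rate)                    # 0.5
```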
Generic Data Pipelining Using ORAC-DR
NASA Astrophysics Data System (ADS)
Allan, Alasdair; Jenness, Tim; Economou, Frossie; Currie, Malcolm J.; Bly, Martin J.
A generic data reduction pipeline is, perhaps, the holy grail for data reduction software. We present work which sets us firmly on the path towards this goal. ORAC-DR is an online data reduction pipeline written by the Joint Astronomy Center (JAC) and the UK Astronomy Technology Center (ATC) and distributed as part of the Starlink Software collection (SSC). It is intended to run with a minimum of observer interaction, and is able to handle data from many different instruments, including SCUBA, CGS4, UFTI, IRCAM and Michelle, with support for IRIS2 and UIST under development. Recent work by Starlink in collaboration with the JAC has resulted in an increase in the pipeline's flexibility, opening up the possibility that it could be used for truly generic data reduction for data from any imaging, and eventually spectroscopic, detector.
Batista, Fernanda Aparecida Heleno
2018-01-01
Peroxisome proliferator-activated receptor beta/delta (PPARß/δ) is considered a therapeutic target for metabolic disorders, cancer, and cardiovascular diseases. Here, we developed a pipeline for the screening of PPARß/δ agonists that reduces cost, time, and false-positive hits. The first step is an optimized 3-day cellular transactivation assay based on reporter-gene technology, supported by automated liquid handlers. This primary screening is followed by a confirmatory transactivation assay and by two biophysical validation methods (thermal shift assay (TSA) and ANS fluorescence quenching), which allow calculation of the affinity constant, giving more information about the selected hits. All of the assays were validated using well-known commercial agonists, providing trustworthy data. Furthermore, to validate and test this pipeline, we screened a natural-extract library (560 extracts) and found one plant extract that might be interesting for PPARß/δ modulation. In conclusion, our results suggest that we developed a cheaper and more robust pipeline that goes beyond single-activation screening, as it also evaluates PPARß/δ tertiary-structure stabilization and the ligand affinity constant, selecting only molecules that directly bind to the receptor. Moreover, this approach might improve the effectiveness of screening for agonists that target PPARß/δ for drug development.
High-Precision Phenotyping of Grape Bunch Architecture Using Fast 3D Sensor and Automation.
Rist, Florian; Herzog, Katja; Mack, Jenny; Richter, Robert; Steinhage, Volker; Töpfer, Reinhard
2018-03-02
Wine growers prefer cultivars with looser bunch architecture because of the decreased risk for bunch rot. As a consequence, grapevine breeders have to select seedlings and new cultivars with regard to appropriate bunch traits. Bunch architecture is a mosaic of different single traits, which makes phenotyping labor-intensive and time-consuming. In the present study, a fast and high-precision phenotyping pipeline was developed. The optical sensor Artec Spider 3D scanner (Artec 3D, L-1466, Luxembourg) was used to generate dense 3D point clouds of grapevine bunches under lab conditions, and an automated analysis software called 3D-Bunch-Tool was developed to extract different single 3D bunch traits, i.e., the number of berries, berry diameter, single berry volume, total volume of berries, convex hull volume of grapes, bunch width and bunch length. The method was validated on whole bunches of different grapevine cultivars and phenotypically variable breeding material. Reliable phenotypic data were obtained, showing highly significant correlations (up to r² = 0.95 for berry number) with ground-truth data. Moreover, it was shown that the Artec Spider can be used directly in the field, where the achieved data show precision comparable to the lab application. This non-invasive and non-contact field application facilitates the first high-precision phenotyping pipeline based on 3D bunch traits in large plant sets.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
[Docket No. PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee. Notice of advisory committee meeting. SUMMARY: This notice announces a public meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee, which advise on safety policies for natural gas pipelines and for hazardous liquid pipelines.
How Do They Get Here?: Paths into Physics Education Research
ERIC Educational Resources Information Center
Barthelemy, Ramon S.; Henderson, Charles; Grunert, Megan L.
2013-01-01
Physics education research (PER) is a relatively new and rapidly growing area of Ph.D. specialization. To sustain the field of PER, a steady pipeline of talented scholars needs to be developed and supported. One aspect of building this pipeline is understanding how students come to graduate and postdoctoral work in PER and what their career goals…
Hydrostatic collapse research in support of the Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, P.R.; McKeehan, D.S.
1995-12-01
This paper provides a summary of the collapse test program conducted as part of the technical development for the Ultra Deep Oman to India Pipeline. The paper describes the motivation for conducting the collapse test program, outlines the test objectives and procedures, presents the results obtained, and draws conclusions on the factors affecting collapse resistance.
Melicher, Dacotah; Torson, Alex S; Dworkin, Ian; Bowsher, Julia H
2014-03-12
The Sepsidae family of flies is a model for investigating how sexual selection shapes courtship and sexual dimorphism in a comparative framework. However, like many non-model systems, there are few molecular resources available. Large-scale sequencing and assembly have not been performed in any sepsid, and the lack of a closely related genome makes investigation of gene expression challenging. Our goal was to develop an automated pipeline for de novo transcriptome assembly, and to use that pipeline to assemble and analyze the transcriptome of the sepsid Themira biloba. Our bioinformatics pipeline uses cloud computing services to assemble and analyze the transcriptome with off-site data management, processing, and backup. It uses a multiple k-mer length approach combined with a second meta-assembly to extend transcripts and recover more bases of transcript sequence than standard single k-mer assembly. We used 454 sequencing to generate 1.48 million reads from cDNA generated from embryos, larvae, and pupae of T. biloba and assembled a transcriptome consisting of 24,495 contigs. Annotation identified 16,705 transcripts, including those involved in embryogenesis and limb patterning. We assembled transcriptomes from an additional three non-model organisms to demonstrate that our pipeline assembled a higher-quality transcriptome than single k-mer approaches across multiple species. The pipeline we have developed for assembly and analysis increases contig length, recovers unique transcripts, and assembles more base pairs than other methods through the use of a meta-assembly. The T. biloba transcriptome is a critical resource for performing large-scale RNA-Seq investigations of gene expression patterns, and is the first transcriptome sequenced in this Dipteran family.
Bicycle: a bioinformatics pipeline to analyze bisulfite sequencing data.
Graña, Osvaldo; López-Fernández, Hugo; Fdez-Riverola, Florentino; González Pisano, David; Glez-Peña, Daniel
2018-04-15
High-throughput sequencing of bisulfite-converted DNA is a technique used to measure DNA methylation levels. Although a considerable number of computational pipelines have been developed to analyze such data, none of them tackles all the peculiarities of the analysis together, revealing limitations that can force the user to perform additional steps manually to complete the processing of the data. This article presents bicycle, an integrated, flexible analysis pipeline for bisulfite sequencing data. Bicycle analyzes whole genome bisulfite sequencing data, targeted bisulfite sequencing data and hydroxymethylation data. To show how bicycle surpasses other available pipelines, we compared them on a defined set of features, summarized in a table. We also tested bicycle with both simulated and real datasets to show its level of performance, and compared it to different state-of-the-art methylation analysis pipelines. Bicycle is publicly available under the GNU LGPL v3.0 license at http://www.sing-group.org/bicycle. Users can also download a customized Ubuntu LiveCD including bicycle and the other bisulfite sequencing data pipelines compared here. In addition, a docker image with bicycle and its dependencies, which allows straightforward use of bicycle on any platform (e.g. Linux, OS X or Windows), is also available. ograna@cnio.es or dgpena@uvigo.es. Supplementary data are available at Bioinformatics online.
Status of the TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Twicken, Joseph D.; Campbell, Jennifer; Tenenbaum, Peter; Sanderfer, Dwight; Davies, Misty D.; Smith, Jeffrey C.; Morris, Rob; Mansouri-Samani, Masoud; Girouard, Forrest;
2017-01-01
The Transiting Exoplanet Survey Satellite (TESS) science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler Mission science pipeline. Like the Kepler pipeline, the TESS science pipeline will provide calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars, observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline will search through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline will generate a suite of diagnostic metrics for each transit-like signature discovered, and extract planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search will be modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST.
Guo, Li; Allen, Kelly S.; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M.; Wick, Robert L.; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host–pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems. PMID:27462318
Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson
2012-01-01
We propose a measure (index) of expected risk to evaluate and follow up the performance of research projects, taking into account financial parameters and the structure adequate for their development. A ranking of acceptable results for research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model, accommodating the following variables: costs, high-impact publications, fundraising, and patent registrations. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analyses that characterizes risk with a modeling tool handling multiple variables interacting in semi-quantitative environments. The method was tested with data from three different projects in our institution (projects A, B and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) relative to its development and the results expected from the initial or full investment. The results showed that this model contributes significantly to risk analysis and planning, as well as to defining the necessary investments, while considering contingency actions that benefit the different stakeholders: the investor or donor, the project manager and the researchers.
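The ulcer index underlying the RoSI approach is a standard risk measure: the root-mean-square of percentage drawdowns from the running maximum of a series. A minimal sketch of that core calculation (the variable weighting and multi-variable interaction described in the abstract are not reproduced here):

```python
import math

def ulcer_index(values):
    """Ulcer index of a series of positive metric values: the RMS of
    percentage drawdowns from the running maximum. Larger sustained dips
    below the best level reached so far yield a larger (worse) index."""
    peak = float("-inf")
    squared_drawdowns = []
    for v in values:
        peak = max(peak, v)
        drawdown_pct = 100.0 * (v - peak) / peak
        squared_drawdowns.append(drawdown_pct ** 2)
    return math.sqrt(sum(squared_drawdowns) / len(squared_drawdowns))
```

A series that never dips below its running peak scores 0; a deep mid-series dip (e.g. 100, 50, 100) scores high even though the endpoints match, which is exactly the "sustained risk" behavior the RoSI curves exploit.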
Diversity in the biomedical research workforce: developing talent.
McGee, Richard; Saran, Suman; Krulwich, Terry A
2012-01-01
Much has been written about the need for and barriers to achievement of greater diversity in the biomedical workforce from the perspectives of gender, race, and ethnicity; this is not a new topic. These discussions often center around a "pipeline" metaphor that imagines students flowing through a series of experiences to eventually arrive at a science career. Here we argue that diversity will only be achieved if the primary focus is on (1) what is happening within the pipeline, not just counting individuals entering and leaving it; (2) de-emphasizing the achievement of academic milestones by typical ages; and (3) adopting approaches that most effectively develop talent. Students may develop skills at different rates based on factors such as earlier access to educational resources, exposure to science (especially research experiences), and competing demands for time and attention during high school and college. Therefore, there is wide variety among students at any point along the pipeline. Taking this view requires letting go of imagining the pipeline as a sequence of age-dependent steps in favor of milestones of skill and talent development decoupled from age or educational stage. Emphasizing talent development opens up many new approaches for science training outside of traditional degree programs. This article provides examples of such approaches, including interventions at the postbaccalaureate and PhD levels, as well as a novel coaching model that incorporates well-established social science theories and complements traditional mentoring. These approaches could significantly impact diversity by developing scientific talent, especially among currently underrepresented minorities. © 2012 Mount Sinai School of Medicine.
The Tuberculosis Drug Discovery and Development Pipeline and Emerging Drug Targets
Mdluli, Khisimuzi; Kaneko, Takushi; Upton, Anna
2015-01-01
The recent accelerated approval for use in extensively drug-resistant and multidrug-resistant-tuberculosis (MDR-TB) of two first-in-class TB drugs, bedaquiline and delamanid, has reinvigorated the TB drug discovery and development field. However, although several promising clinical development programs are ongoing to evaluate new TB drugs and regimens, the number of novel series represented is few. The global early-development pipeline is also woefully thin. To have a chance of achieving the goal of better, shorter, safer TB drug regimens with utility against drug-sensitive and drug-resistant disease, a robust and diverse global TB drug discovery pipeline is key, including innovative approaches that make use of recently acquired knowledge on the biology of TB. Fortunately, drug discovery for TB has resurged in recent years, generating compounds with varying potential for progression into developable leads. In parallel, advances have been made in understanding TB pathogenesis. It is now possible to apply the lessons learned from recent TB hit generation efforts and newly validated TB drug targets to generate the next wave of TB drug leads. Use of currently underexploited sources of chemical matter and lead-optimization strategies may also improve the efficiency of future TB drug discovery. Novel TB drug regimens with shorter treatment durations must target all subpopulations of Mycobacterium tuberculosis existing in an infection, including those responsible for the protracted TB treatment duration. This review summarizes the current TB drug development pipeline and proposes strategies for generating improved hits and leads in the discovery phase that could help achieve this goal. PMID:25635061
Hollow-core fiber sensing technique for pipeline leak detection
NASA Astrophysics Data System (ADS)
Challener, W. A.; Kasten, Matthias A.; Karp, Jason; Choudhury, Niloy
2018-02-01
Recently there has been increased interest on the part of federal and state regulators to detect and quantify emissions of methane, an important greenhouse gas, from various parts of the oil and gas infrastructure including well pads and pipelines. Pressure and/or flow anomalies are typically used to detect leaks along natural gas pipelines, but are generally very insensitive and subject to false alarms. We have developed a system to detect and localize methane leaks along gas pipelines that is an order of magnitude more sensitive by combining tunable diode laser spectroscopy (TDLAS) with conventional sensor tube technology. This technique can potentially localize leaks along pipelines up to 100 km in length with an accuracy of +/-50 m or less. A sensor tube buried along the pipeline with a gas-permeable membrane collects leaking gas during a soak period. The leak plume within the tube is then carried to the nearest sensor node along the tube in a purge cycle. The time-to-detection is used to determine leak location. Multiple sensor nodes are situated along the pipeline to minimize the time to detection, and each node is composed of a short segment of hollow core fiber (HCF) into which leaking gas is transported quickly through a small pressure differential. The HCF sensing node is spliced to standard telecom solid core fiber which transports the laser light for spectroscopy to a remote interrogator. The interrogator is multiplexed across the sensor nodes to minimize equipment cost and complexity.
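The localization principle (soak, then purge and time the plume's arrival at a node) can be sketched in a few lines. A toy illustration under assumed names, plug flow, and a constant purge velocity toward the detecting node; this is not the authors' implementation:

```python
def locate_leak(node_position_m, time_to_detection_s, purge_velocity_mps):
    """Position of a leak along the sensor tube, given where the detecting
    node sits and how long the purge took to carry the plume to it.
    Assumes plug flow at a constant purge velocity toward the node."""
    distance_travelled = purge_velocity_mps * time_to_detection_s
    return node_position_m - distance_travelled

# A plume detected 50 s into the purge at a node at the 1,000 m mark,
# with a 2 m/s purge flow, implies a leak near the 900 m mark.
leak_at = locate_leak(1000.0, 50.0, 2.0)
```

The +/-50 m localization figure then follows from how precisely the purge velocity and detection time are known, since the position error scales with both.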
Simplified Technique for Predicting Offshore Pipeline Expansion
NASA Astrophysics Data System (ADS)
Seo, J. H.; Kim, D. K.; Choi, H. S.; Yu, S. Y.; Park, K. S.
2018-06-01
In this study, we propose a method for estimating the amount of expansion that occurs in subsea pipelines, which could be applied in the design of robust structures that transport oil and gas from offshore wells. We begin with a literature review and general discussion of existing estimation methods and terminologies with respect to subsea pipelines. Due to the effects of high pressure and high temperature, the production of fluid from offshore wells typically causes physical deformation of subsea structures, e.g., expansion and contraction during the transportation process. In severe cases, vertical and lateral buckling occurs, which causes a significant negative impact on structural safety, and which is related to on-bottom stability, free-span, structural collapse, and many other factors. In addition, these factors may affect the production rate with respect to flow assurance, wax, and hydration, to name a few. In this study, we developed a simple and efficient method for generating a reliable pipe expansion design in the early stage, which can lead to savings in both cost and computation time. As such, in this paper, we propose an applicable diagram, which we call the standard dimensionless ratio (SDR) versus virtual anchor length (L_A) diagram, that utilizes an efficient procedure for estimating subsea pipeline expansion based on applied reliable scenarios. With this user guideline, offshore pipeline structural designers can reliably determine the amount of subsea pipeline expansion and the obtained results will also be useful for the installation, design, and maintenance of the subsea pipeline.
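To first order, the end expansion of the segment between the free end and the virtual anchor point follows from integrating the unrestrained strain over the anchor length L_A. A simplified sketch of the thermal term only (a full design calculation, as in the method above, also carries pressure end-cap, Poisson, and soil-friction terms; the numbers are assumed for illustration):

```python
def thermal_end_expansion(alpha, delta_T, anchor_length_m):
    """First-order thermal contribution to pipeline end expansion:
    unrestrained strain alpha * delta_T integrated over the virtual
    anchor length L_A (metres)."""
    return alpha * delta_T * anchor_length_m

# Carbon steel (alpha ~ 1.17e-5 /degC), a 60 degC temperature rise, and
# 1 km to the virtual anchor give roughly 0.7 m of end expansion from
# the thermal term alone.
expansion_m = thermal_end_expansion(1.17e-5, 60.0, 1000.0)
```

Even this crude estimate shows why expansion spools and buckle-initiation measures are sized early: tenths of a metre of free-end movement are routine at HP/HT operating conditions.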
An evolutionary approach to the architecture of effective healthcare delivery systems.
Towill, D R; Christopher, M
2005-01-01
Aims to show that material flow concepts developed and successfully applied to commercial products and services can equally well form the architectural infrastructure of effective healthcare delivery systems. The methodology is based on the "power of analogy", which demonstrates that healthcare pipelines may be classified via the Time-Space Matrix. A small number (circa 4) of substantially different healthcare delivery pipelines will cover the vast majority of patient needs and simultaneously create adequate added value from their perspective. The emphasis is firmly placed on total process mapping and analysis via established identification techniques. Healthcare delivery pipelines must be properly engineered and matched to life cycle phase if the service is to be effective. This small family of healthcare delivery pipelines needs to be designed via adherence to very specific-to-purpose principles, varying from "lean production" through to "agile delivery". The proposition for a strategic approach to healthcare delivery pipeline design is novel and positions much currently isolated research into a comprehensive organisational framework. It therefore provides a synthesis of the needs of global healthcare.
A low-latency pipeline for GRB light curve and spectrum using Fermi/GBM near real-time data
NASA Astrophysics Data System (ADS)
Zhao, Yi; Zhang, Bin-Bin; Xiong, Shao-Lin; Long, Xi; Zhang, Qiang; Song, Li-Ming; Sun, Jian-Chao; Wang, Yuan-Hao; Li, Han-Cheng; Bu, Qing-Cui; Feng, Min-Zi; Li, Zheng-Heng; Wen, Xing; Wu, Bo-Bing; Zhang, Lai-Yu; Zhang, Yong-Jie; Zhang, Shuang-Nan; Shao, Jian-Xiong
2018-05-01
Rapid response and short time latency are very important for Time Domain Astronomy, such as the observations of Gamma-ray Bursts (GRBs) and electromagnetic (EM) counterparts of gravitational waves (GWs). Based on near real-time Fermi/GBM data, we developed a low-latency pipeline to automatically calculate the temporal and spectral properties of GRBs. With this pipeline, some important parameters can be obtained, such as T90 and fluence, within ∼20 min after the GRB trigger. For ∼90% of GRBs, T90 and fluence are consistent with the GBM catalog results within 2σ errors. This pipeline has been used by the Gamma-ray Bursts Polarimeter (POLAR) and the Insight Hard X-ray Modulation Telescope (Insight-HXMT) to follow up the bursts of interest. For GRB 170817A, the first EM counterpart of GW events detected by Fermi/GBM and INTEGRAL/SPI-ACS, the pipeline gave T90 and spectral information 21 min after the GBM trigger, providing important information for POLAR and Insight-HXMT observations.
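T90 is conventionally the interval over which the central 90% of a burst's background-subtracted counts accumulate, i.e. from the 5% to the 95% point of the cumulative light curve. A minimal sketch of that computation; the pipeline's actual binning, background modeling, and error propagation are not described in the abstract and are not reproduced here:

```python
def t90(bin_times, counts):
    """T90 from a background-subtracted, binned light curve: the time
    between the bins where cumulative counts cross 5% and 95% of the
    total. `counts` must already have the background removed."""
    total = float(sum(counts))
    cumulative = 0.0
    t5 = t95 = None
    for t, c in zip(bin_times, counts):
        cumulative += c
        if t5 is None and cumulative >= 0.05 * total:
            t5 = t
        if t95 is None and cumulative >= 0.95 * total:
            t95 = t
    return t95 - t5
```

For a flat light curve binned 0..99 s the central 90% spans 90 s, while a single-bin spike gives T90 of zero, matching the intuition that T90 tracks burst duration rather than peak flux.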
Reliability evaluation of oil pipelines operating in aggressive environment
NASA Astrophysics Data System (ADS)
Magomedov, R. M.; Paizulaev, M. M.; Gebel, E. S.
2017-08-01
Given today's increased environmental and safety requirements, the development of a comprehensive set of diagnostic services is necessary to ensure the reliable operation of the gas transportation infrastructure. The technical condition of oil pipelines should be assessed not only to establish the current values of the technological parameters of equipment in operation, but also to predict the dynamics of changes in the material's physical and mechanical characteristics, the appearance of defects, etc., so as to ensure reliable and safe operation. In the paper, existing Russian and foreign methods for evaluating oil pipeline reliability are considered, taking into account corrosion, one of the main factors leading to crevice formation in the pipeline material, i.e., to a change in the shape of its cross-section. Without compromising the generality of the reasoning, uniform corrosion wear of an initially rectangular cross section was assumed. As a result, a formula for calculating the probability of failure-free operation was derived. The proposed mathematical model makes it possible to predict emergency situations, as well as to determine optimal operating conditions for oil pipelines.
CFHT data processing and calibration ESPaDOnS pipeline: Upena and OPERA (optical spectropolarimetry)
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, D.; Manset, Nadine
2011-03-01
CFHT is responsible for processing raw ESPaDOnS images, removing instrument-related artifacts, and delivering science-ready data to the PIs. Here we describe the Upena pipeline, the software used to reduce the echelle spectro-polarimetric data obtained with the ESPaDOnS instrument. Upena is an automated pipeline that performs calibration and reduction of raw images. It can both perform real-time reduction on an image-by-image basis and run a complete reduction after the observing night. Upena produces polarization and intensity spectra in FITS format. The pipeline is designed to use parallel computing for improved speed, which ensures that the final products are delivered to the PIs before noon HST after each night of observations. We also present the OPERA project, an open-source pipeline to reduce ESPaDOnS data that will be developed as a collaboration between CFHT and the scientific community. OPERA will match the core capabilities of Upena and in addition will be open-source, flexible and extensible.
Mittra, James; Tait, Joyce; Wield, David
2011-03-01
The pharmaceutical and agro-biotechnology industries have been confronted by dwindling product pipelines and rapid developments in life sciences, thus demanding a strategic rethink of conventional research and development. Despite offering both industries a solution to the pipeline problem, the life sciences have also brought complex regulatory challenges for firms. In this paper, we comment on the response of these industries to the life science trajectory, in the context of maturing conventional small-molecule product pipelines and routes to market. The challenges of managing transition from maturity to new high-value-added innovation models are addressed. Furthermore, we argue that regulation plays a crucial role in shaping the innovation systems of both industries, and as such, we suggest potentially useful changes to the current regulatory system. Copyright © 2010 Elsevier Ltd. All rights reserved.
Update on the SDSS-III MARVELS data pipeline development
NASA Astrophysics Data System (ADS)
Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.
2014-01-01
MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates have been reliably identified due to its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts and dispersion) and observation condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al. this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5 times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar type stars. With this fine-tuned pipeline and the soon to be processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.
Amar, David; Frades, Itziar; Danek, Agnieszka; Goldberg, Tatyana; Sharma, Sanjeev K; Hedley, Pete E; Proux-Wera, Estelle; Andreasson, Erik; Shamir, Ron; Tzfadia, Oren; Alexandersson, Erik
2014-12-05
For most organisms, even if their genome sequence is available, little functional information about individual genes or proteins exists. Several annotation pipelines have been developed for functional analysis based on sequence, 'omics', and literature data. However, researchers encounter little guidance on how well they perform. Here, we used the recently sequenced potato genome as a case study. Potato was selected since its genome is newly sequenced and it is a non-model plant for which there is nevertheless relatively ample information on individual genes, and multiple gene expression profiles are available. We show that the automatic gene annotations of potato have low accuracy when compared to a "gold standard" based on experimentally validated potato genes. Furthermore, we evaluate six state-of-the-art annotation pipelines and show that their predictions are markedly dissimilar (Jaccard similarity coefficient of 0.27 between pipelines on average). To overcome this discrepancy, we introduce a simple GO structure-based algorithm that reconciles the predictions of the different pipelines. We show that the integrated annotation covers more genes, increases by over 50% the number of highly co-expressed GO processes, and obtains much higher agreement with the gold standard. We find that different annotation pipelines produce different results, and show how to integrate them into a unified annotation that is of higher quality than each single pipeline. We offer an improved functional annotation of both PGSC and ITAG potato gene models, as well as tools that can be applied to additional pipelines and improve annotation in other organisms. This will greatly aid future functional analysis of '-omics' datasets from potato and other organisms with newly sequenced genomes. The new potato annotations are available with this paper.
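The Jaccard coefficient used to compare the pipelines, plus a majority-vote reconciliation in the spirit of (but much simpler than) the paper's GO-structure-based algorithm, can be sketched as follows. Function names are ours, and the vote does not traverse the GO hierarchy as the actual method does:

```python
from collections import Counter

def jaccard(terms_a, terms_b):
    """Jaccard similarity between two sets of predicted annotation terms:
    |intersection| / |union|, with two empty sets scored as identical."""
    a, b = set(terms_a), set(terms_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def consensus_terms(per_pipeline_terms, min_votes=2):
    """Keep terms predicted for a gene by at least `min_votes` pipelines.
    (Illustrative majority vote; the paper reconciles via GO structure.)"""
    votes = Counter(t for terms in per_pipeline_terms for t in set(terms))
    return {t for t, n in votes.items() if n >= min_votes}
```

An average pairwise Jaccard of 0.27 means the pipelines agree on barely a quarter of their pooled predictions, which is why naive union or intersection performs worse than a reconciled consensus.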
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. [[Page...
DOE Office of Scientific and Technical Information (OSTI.GOV)
El-Jundi, I.M.
The Qatar NGL/2 plant, commissioned in December 1979, was designed to process the associated gas from the offshore crude oil fields of Qatar. The dehydrated sour lean gas and wet sour liquids are transported via two separate lines to the Umm Said NGL Complex, about 120 km from the central offshore station. The liquids line, 300 mm (12 inch) in diameter, has suffered general and severe pitting corrosion. The lean gas line, 600 mm (24 inch) in diameter, has suffered corrosion and extensive hydrogen-induced cracking (HIC), also known as HIPC. Both lines have never performed to their design parameters, and many problems in the downstream facilities have been experienced. All efforts to clean the liquids line of solids (debris) have failed. This in turn interfered with the planned corrosion control program, allowing corrosion to continue. Investigation work has been done by various specialists in an attempt to find the origin of the solids and to recommend necessary remedial actions. Should the lines fail from pitting corrosion, a liquids leak at a pressure of about 11,000 kPa would be very dangerous, especially if it occurs onshore. In order to protect NGL-2 operations against possible risks, both in terms of safety and of losses in revenue, critical sections of the pipelines have been replaced, and the gas and liquids pipelines as a whole will be replaced shortly. Supplementary documents to the API standards were prepared by QPC for the replaced pipelines.
Method for Stereo Mapping Based on Objectarx and Pipeline Technology
NASA Astrophysics Data System (ADS)
Liu, F.; Chen, T.; Lin, Z.; Yang, Y.
2012-07-01
Stereo mapping is an important way to acquire 4D products. Based on the development of stereo mapping and the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme that realizes interaction between AutoCAD and a digital photogrammetry system is offered via ObjectARX and pipeline technology. An experiment was conducted to verify the feasibility of the scheme, using the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation) as an example; the experimental results show that the scheme is feasible and is of great significance for integrated data acquisition and editing.
Practical Approach for Hyperspectral Image Processing in Python
NASA Astrophysics Data System (ADS)
Annala, L.; Eskelinen, M. A.; Hämäläinen, J.; Riihinen, A.; Pölönen, I.
2018-04-01
Python is a very popular programming language among data scientists around the world, and it can also be used in hyperspectral data analysis. There are some toolboxes designed for spectral imaging, such as Spectral Python and HyperSpy, but there is a need for an analysis pipeline that is easy to use and agile enough for different solutions. We propose a Python pipeline built on the packages xarray, Holoviews and scikit-learn. We have also developed some tools of our own: MaskAccessor, VisualisorAccessor and a spectral index library. These likewise fulfill our goal of easy and agile data processing. In this paper we present our processing pipeline and demonstrate it in practice.
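A spectral index library of the kind mentioned reduces, at its core, to per-pixel band arithmetic over the data cube. A dependency-free sketch of one normalized-difference index over a cube stored as nested lists; the actual pipeline would express this with xarray arrays and accessors, and the function and band names here are ours:

```python
def normalized_difference(cube, band_a, band_b):
    """Per-pixel normalized difference (an NDVI-style index) for a
    hyperspectral cube indexed as cube[band][row][col]. Pixels whose
    band sum is zero are mapped to 0.0 to avoid division by zero."""
    a, b = cube[band_a], cube[band_b]
    return [
        [(av - bv) / (av + bv) if (av + bv) else 0.0
         for av, bv in zip(row_a, row_b)]
        for row_a, row_b in zip(a, b)
    ]

# 2-band, 1x2-pixel toy cube
cube = [[[0.8, 0.5]],   # band 0 (e.g. a near-infrared band)
        [[0.2, 0.5]]]   # band 1 (e.g. a red band)
index = normalized_difference(cube, 0, 1)
```

With xarray the same operation becomes a one-liner over labeled wavelength coordinates, which is precisely the agility argument the paper makes for building on that package.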
A single chip VLSI Reed-Solomon decoder
NASA Technical Reports Server (NTRS)
Shao, H. M.; Truong, T. K.; Hsu, I. S.; Deutsch, L. J.; Reed, I. S.
1986-01-01
A new VLSI design of a pipeline Reed-Solomon decoder is presented. The transform decoding technique used in a previous design is replaced by a time domain algorithm. A new architecture that implements such an algorithm permits efficient pipeline processing with minimum circuitry. A systolic array is also developed to perform erasure corrections in the new design. A modified form of Euclid's algorithm is implemented by a new architecture that maintains the throughput rate with less circuitry. Such improvements result in both enhanced capability and a significant reduction in silicon area, therefore making it possible to build a pipeline (31,15) RS decoder on a single VLSI chip.
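A (31,15) RS code works over GF(2^5), and every stage of such a decoder pipeline ultimately reduces to field arithmetic like the multiply below. A minimal software sketch (shift-and-add with reduction by x^5 + x^2 + 1, one common primitive polynomial; the chip described of course implements far more, and in hardware):

```python
def gf32_mul(a, b, prim_poly=0b100101):
    """Multiply two elements of GF(2^5), the symbol field of a (31,15)
    Reed-Solomon code, via carry-less shift-and-add with polynomial
    reduction. prim_poly encodes x^5 + x^2 + 1 (a common choice)."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # add (XOR) the shifted multiplicand
        a <<= 1
        if a & 0b100000:         # degree reached 5: reduce mod prim_poly
            a ^= prim_poly
        b >>= 1
    return result
```

Because addition is XOR and multiplication is this small combinational step, the operations pipeline naturally in silicon, which is what makes the systolic-array formulation of Euclid's algorithm practical on one chip.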
The Eastring gas pipeline in the context of the Central and Eastern European gas supply challenge
NASA Astrophysics Data System (ADS)
Mišík, Matúš; Nosko, Andrej
2017-11-01
Ever since the 2009 natural gas crisis, energy security has been a crucial priority for countries of Central and Eastern Europe. Escalating in 2014, the conflict between Ukraine and Russia further fuelled negative expectations about the future development of energy relations for the region predominantly supplied by Russia. As a response to the planned cessation of gas transit through the Brotherhood pipeline, which brings Russian gas to Europe via Ukraine and Slovakia, the Slovak transmission system operator Eustream proposed the Eastring pipeline. This Perspective analyses this proposal and argues that neither the perceived decrease in Slovak energy security nor the loss of economic rent from the international gas transit should be the main policy driver behind such a major infrastructure project. Although marketed as an answer to current Central and Eastern European gas supply security challenges, the Eastring pipeline is actually mainly focused on issues connected to the Slovak gas transit.
Solvepol: A Reduction Pipeline for Imaging Polarimetry Data
NASA Astrophysics Data System (ADS)
Ramírez, Edgar A.; Magalhães, Antônio M.; Davidson, James W., Jr.; Pereyra, Antonio; Rubinho, Marcelo
2017-05-01
We present a new, fully automated data pipeline, Solvepol, designed to reduce and analyze polarimetric data. It has been optimized for imaging data from the calcite Savart prism plate-based IAGPOL polarimeter of the Instituto de Astronomia, Geofísica e Ciências Atmosféricas (IAG) of the University of São Paulo (USP). Solvepol is also the basis of a reduction pipeline for the wide-field optical polarimeter that will execute SOUTH POL, a survey of the polarized southern sky. Solvepol was written in the Interactive Data Language (IDL) and is based on the Image Reduction and Analysis Facility (IRAF) task PCCDPACK, developed by our polarimetry group. We present and discuss reduced data from standard stars and other fields and compare these results with those obtained in the IRAF environment. Our analysis shows that Solvepol, in addition to being a fully automated pipeline, produces results consistent with those reduced by PCCDPACK and reported in the literature.
Development Of A Centrifugal Hydrogen Pipeline Gas Compressor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Bella, Francis A.
2015-04-16
Concepts NREC (CN) has completed a Department of Energy (DOE) sponsored project to analyze, design, and fabricate a pipeline-capacity hydrogen compressor. The pipeline compressor is a critical component in the DOE strategy to provide sufficient quantities of hydrogen to support the expected shift in transportation fuels from liquid and natural gas to hydrogen. The hydrogen would be generated from water by electrolysis powered by renewable energy (solar, wind, and perhaps even tidal or ocean power). The hydrogen would then be transported to the population centers in the U.S., where fuel-cell vehicles are expected to become popular and necessary to relieve dependency on fossil fuels. The specifications for the required pipeline hydrogen compressor indicate a need for a small package that is efficient, less costly, and more reliable than what is available in the form of a multi-cylinder, reciprocating (positive-displacement) compressor for compressing hydrogen in the gas industry.
Photometer Performance Assessment in TESS SPOC Pipeline
NASA Astrophysics Data System (ADS)
Li, Jie; Caldwell, Douglas A.; Jenkins, Jon Michael; Twicken, Joseph D.; Wohler, Bill; Chen, Xiaolan; Rose, Mark; TESS Science Processing Operations Center
2018-06-01
This poster describes the Photometer Performance Assessment (PPA) software component in the Transiting Exoplanet Survey Satellite (TESS) Science Processing Operations Center (SPOC) pipeline, which was developed from the Kepler science pipeline. The PPA component performs two tasks: the first is to assess the health and performance of the instrument based on the science data sets collected during each observation sector, identifying out-of-bounds conditions and generating alerts. The second is to combine the astrometric data collected for each CCD readout channel to construct a high-fidelity record of the pointing history for each of the 4 cameras and an attitude solution for the TESS spacecraft for each 2-min data collection interval. PPA is implemented with multiple pipeline modules: PPA Metrics Determination (PMD), PMD Aggregator (PAG), and PPA Attitude Determination (PAD). The TESS Mission is funded by NASA's Science Mission Directorate. The SPOC is managed and operated by NASA Ames Research Center.
Rapid, Vehicle-Based Identification of Location and Magnitude of Urban Natural Gas Pipeline Leaks.
von Fischer, Joseph C; Cooley, Daniel; Chamberlain, Sam; Gaylord, Adam; Griebenow, Claire J; Hamburg, Steven P; Salo, Jessica; Schumacher, Russ; Theobald, David; Ham, Jay
2017-04-04
Information about the location and magnitudes of natural gas (NG) leaks from urban distribution pipelines is important for minimizing greenhouse gas emissions and optimizing investment in pipeline management. To enable rapid collection of such data, we developed a relatively simple method using high-precision methane analyzers in Google Street View cars. Our data indicate that this automated leak survey system can document patterns in leak location and magnitude within and among cities, even without wind data. We found that urban areas with prevalent corrosion-prone distribution lines (Boston, MA; Staten Island, NY; and Syracuse, NY) leaked approximately 25-fold more methane than cities with more modern pipeline materials (Burlington, VT, and Indianapolis, IN). Although this mobile monitoring method produces conservative estimates of leak rates and leak counts, it can still help prioritize both leak repairs and replacement of leak-prone sections of distribution lines, thus minimizing methane emissions over short and long terms.
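The core signal-processing step, flagging methane excursions above ambient background as candidate leak indications, can be sketched in a few lines. This is an illustrative simplification under assumed names and thresholds, not the authors' statistical method:

```python
# Hypothetical sketch: locating candidate leaks in drive-by methane data.
# The baseline (ambient ~2 ppm) and threshold values are illustrative
# assumptions, not the paper's calibrated parameters.

def find_leak_candidates(ppm, baseline=2.0, threshold=0.3):
    """Return (index, excess) pairs where methane exceeds baseline + threshold
    and is a local maximum of the excess signal."""
    excess = [max(0.0, x - baseline) for x in ppm]
    peaks = []
    for i in range(1, len(excess) - 1):
        if excess[i] > threshold and excess[i] >= excess[i - 1] and excess[i] > excess[i + 1]:
            peaks.append((i, round(excess[i], 3)))
    return peaks

readings = [2.0, 2.1, 2.0, 2.9, 3.4, 2.8, 2.1, 2.0, 2.6, 2.2]
print(find_leak_candidates(readings))  # → [(4, 1.4), (8, 0.6)]
```

In practice the published estimates also account for repeated drive-bys and plume dispersion; this sketch conveys only the excursion-flagging idea.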
The Kepler Science Operations Center Pipeline Framework Extensions
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.;
2010-01-01
The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
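The unit-of-work idea, partitioning the data and packaging each partition with everything a worker needs, can be sketched as follows. The function and field names are illustrative assumptions, not the Kepler SOC's Java API:

```python
# Illustrative sketch (not the Kepler SOC code): a unit-of-work generator
# splits a target list into chunks and packages each chunk together with
# the shared inputs a worker needs, so each unit is self-contained.

def generate_units_of_work(target_ids, chunk_size, calibration):
    """Yield self-contained task packages for distribution across a cluster."""
    for start in range(0, len(target_ids), chunk_size):
        chunk = target_ids[start:start + chunk_size]
        yield {
            "targets": chunk,            # the partition this worker owns
            "calibration": calibration,  # shared inputs packaged per unit
        }

units = list(generate_units_of_work(list(range(10)), 4, {"gain": 1.2}))
print(len(units), units[0]["targets"], units[-1]["targets"])
```

Packaging everything per unit is what lets a single unit's file be pulled out and replayed offline to debug the science algorithms, as the abstract describes.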
GPU-Powered Coherent Beamforming
NASA Astrophysics Data System (ADS)
Magro, A.; Adami, K. Zarb; Hickish, J.
2015-03-01
Graphics processing unit (GPU)-based beamforming is a relatively unexplored area in radio astronomy, possibly due to the assumption that any such system will be severely limited by the PCIe bandwidth required to transfer data to the GPU. We have developed a CUDA-based GPU implementation of a coherent beamformer, specifically designed and optimized for deployment at the BEST-2 array, which can generate an arbitrary number of synthesized beams for a wide range of parameters. It achieves ˜1.3 TFLOPs on an NVIDIA Tesla K20, approximately 10x faster than an optimized, multithreaded CPU implementation. This kernel has been integrated into two real-time, GPU-based time-domain software pipelines deployed at the BEST-2 array in Medicina: a standalone beamforming pipeline and a transient detection pipeline. We present performance benchmarks for the beamforming kernel and for the transient detection pipeline with beamforming enabled, as well as results of a test observation.
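A coherent beamformer is, at its core, a phase-and-sum: each antenna's signal is multiplied by a steering phase and the products are summed, so signals from the chosen direction add constructively. A minimal CPU sketch for an assumed uniform linear array (the CUDA kernel parallelizes this weighted sum over beams, channels, and time; the geometry and parameters here are illustrative):

```python
import cmath
import math

# Phase-and-sum coherent beamforming for a uniform linear array with
# half-wavelength spacing. Array geometry is an assumption for this sketch.

def beamform(signals, steer_deg, spacing_wl=0.5):
    """Sum antenna signals with phase weights steering toward steer_deg."""
    phi = 2 * math.pi * spacing_wl * math.sin(math.radians(steer_deg))
    return sum(x * cmath.exp(-1j * n * phi) for n, x in enumerate(signals))

def arriving_wave(n_ant, arrival_deg, spacing_wl=0.5):
    """Unit-amplitude plane wave sampled at each antenna."""
    phi = 2 * math.pi * spacing_wl * math.sin(math.radians(arrival_deg))
    return [cmath.exp(1j * n * phi) for n in range(n_ant)]

x = arriving_wave(8, 20.0)
print(abs(beamform(x, 20.0)))        # steered at the source: coherent gain ≈ 8
print(abs(beamform(x, -40.0)) < 2.0)  # steered away: strongly attenuated
```

The GPU implementation applies exactly this inner product per output beam, which is why an arbitrary number of synthesized beams just scales the number of weight vectors.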
NASA Astrophysics Data System (ADS)
Barnsley, R. M.; Steele, Iain A.; Smith, R. J.; Mawson, Neil R.
2014-07-01
The Small Telescopes Installed at the Liverpool Telescope (STILT) project has been in operation since March 2009, collecting data with three wide field unfiltered cameras: SkycamA, SkycamT and SkycamZ. To process the data, a pipeline was developed to automate source extraction, catalogue cross-matching, photometric calibration and database storage. In this paper, modifications and further developments to this pipeline will be discussed, including a complete refactor of the pipeline's codebase into Python, migration of the back-end database technology from MySQL to PostgreSQL, and changing the catalogue used for source cross-matching from USNO-B1 to APASS. In addition to this, details will be given relating to the development of a preliminary front-end to the source extracted database which will allow a user to perform common queries such as cone searches and light curve comparisons of catalogue and non-catalogue matched objects. Some next steps and future ideas for the project will also be presented.
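A cone search, one of the common queries mentioned, reduces to an angular-separation filter around a sky position. A minimal sketch with an invented in-memory catalogue (the real front-end queries the PostgreSQL database):

```python
import math

# Hedged sketch of a catalogue cone search: return sources within
# radius_deg of (ra0, dec0). The tiny catalogue and its field names
# are illustrative assumptions.

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(dec1) * math.sin(dec2)
         + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))  # clamp rounding

def cone_search(catalogue, ra0, dec0, radius_deg):
    return [s for s in catalogue
            if ang_sep_deg(s["ra"], s["dec"], ra0, dec0) <= radius_deg]

catalogue = [
    {"id": "a", "ra": 150.00, "dec": 2.00},
    {"id": "b", "ra": 150.05, "dec": 2.02},
    {"id": "c", "ra": 151.50, "dec": 2.00},
]
print([s["id"] for s in cone_search(catalogue, 150.0, 2.0, 0.1)])  # → ['a', 'b']
```

A production database would use a spatial index (e.g. PostgreSQL with a sky-indexing extension) rather than scanning every row, but the filter itself is this separation test.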
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1992-12-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
GIS least-cost analysis approach for siting gas pipeline ROWs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1994-09-01
Geographic-information-system applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation corridors, endangered species habitats, wetlands, and public line surveys. A geographic information system was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas-pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
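The least-cost analysis named in the title can be illustrated with Dijkstra's algorithm over a cost raster, where cells in sensitive or constrained areas carry high traversal costs and the cheapest corridor between two cells is found. The grid and weights below are invented for the example; real analyses derive cell costs from weighted GIS data layers:

```python
import heapq

# Least-cost-path sketch of GIS corridor siting: 4-connected raster,
# entering a cell pays that cell's cost (including the start cell).
# The cost grid is invented; high values stand in for wetlands,
# habitat, or steep slopes.

def least_cost_path(cost, start, goal):
    """Return the minimum accumulated cost from start to goal."""
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > best.get((r, c), float("inf")):
            continue  # stale heap entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None

grid = [
    [1, 9, 1],
    [1, 9, 1],
    [1, 1, 1],
]
print(least_cost_path(grid, (0, 0), (0, 2)))  # → 7, skirting the high-cost column
```

The cheaper route goes around the column of 9s (total 7) rather than straight across (total 11), which is exactly the trade-off a least-cost corridor analysis formalizes.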
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.
1993-10-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for land use/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1992-01-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Pydpiper: a flexible toolkit for constructing novel registration pipelines.
Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P
2014-01-01
Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
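One of the five listed innovations, elimination of duplicate stages, can be sketched as stage registration keyed on what a stage would execute, so registering the same work twice adds only one stage. The class and method names here are illustrative, not Pydpiper's actual API:

```python
# Hedged sketch of duplicate-stage elimination in a registration pipeline
# framework. Stages are keyed by their command plus sorted inputs/outputs,
# so an exact repeat is silently collapsed into the existing stage.

class Pipeline:
    def __init__(self):
        self._stages = {}  # canonical key -> stage record

    def add_stage(self, cmd, inputs, outputs):
        key = (cmd, tuple(sorted(inputs)), tuple(sorted(outputs)))
        if key not in self._stages:  # skip exact duplicates
            self._stages[key] = {"cmd": cmd, "inputs": inputs, "outputs": outputs}
        return self._stages[key]

    def __len__(self):
        return len(self._stages)

p = Pipeline()
p.add_stage("register", ["a.img", "b.img"], ["a_to_b.xfm"])
p.add_stage("register", ["a.img", "b.img"], ["a_to_b.xfm"])  # duplicate, ignored
p.add_stage("resample", ["a.img", "a_to_b.xfm"], ["a_res.img"])
print(len(p))  # → 2
```

Deduplication matters in group-wise registration, where many pairwise strategies would otherwise regenerate identical intermediate transforms.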
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
... Texas. Pipelines include El Paso Natural Gas, Transwestern Pipeline, Natural Gas Pipeline Co. of America, Northern Natural Gas, Delhi Pipeline, Oasis Pipeline, EPGT Texas and Lone Star Pipeline. The Platt's ... pipelines. These pipelines bring in natural gas from fields in the Gulf Coast region and ship it to major...
Jóri, Balazs; Kamps, Rick; Xanthoulea, Sofia; Delvoux, Bert; Blok, Marinus J; Van de Vijver, Koen K; de Koning, Bart; Oei, Felicia Trups; Tops, Carli M; Speel, Ernst Jm; Kruitwagen, Roy F; Gomez-Garcia, Encarna B; Romano, Andrea
2015-12-01
The risk of developing colorectal and endometrial cancers among subjects testing positive for a pathogenic Lynch syndrome mutation varies, making risk prediction difficult. Genetic risk modifiers alter the risk conferred by inherited Lynch syndrome mutations, and their identification can improve genetic counseling. We aimed to identify rare genetic modifiers of the risk of Lynch syndrome endometrial cancer. A family-based approach was used to assess the presence of genetic risk modifiers among 35 Lynch syndrome mutation carriers having either a poor clinical phenotype (early age of endometrial cancer diagnosis or multiple cancers) or a neutral clinical phenotype. Putative genetic risk modifiers were identified by Next Generation Sequencing among a panel of 154 genes involved in endometrial physiology and carcinogenesis. A simple pipeline, based on an allele frequency lower than 0.001 and on predicted non-conservative amino-acid substitutions, returned 54 variants that were considered putative risk modifiers. The presence of two or more risk-modifying variants in women carrying a pathogenic Lynch syndrome mutation was associated with a poor clinical phenotype. A gene panel is proposed comprising genes that can carry variants with putative modifying effects on the risk of Lynch syndrome endometrial cancer. Validation in further studies is warranted before considering the possible use of this tool in genetic counseling.
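The two-rule filter described (population allele frequency below 0.001 and a predicted non-conservative amino-acid substitution) can be sketched as follows. The substitution groups and variant records are illustrative assumptions, not the study's actual annotation sources:

```python
# Hedged sketch of a rare-variant filtering step: keep variants that are
# both rare (AF < 0.001) and predicted non-conservative. The grouping of
# amino acids by physicochemical class is a common, simplified heuristic.

CONSERVATIVE_GROUPS = [
    set("ILVM"),  # aliphatic/hydrophobic
    set("FYW"),   # aromatic
    set("KRH"),   # basic
    set("DE"),    # acidic
    set("ST"),    # hydroxylic
]

def is_conservative(aa_from, aa_to):
    return any(aa_from in g and aa_to in g for g in CONSERVATIVE_GROUPS)

def putative_modifiers(variants, max_af=0.001):
    return [v for v in variants
            if v["af"] < max_af and not is_conservative(v["from"], v["to"])]

variants = [
    {"id": "v1", "af": 0.0002, "from": "D", "to": "G"},  # rare, non-conservative
    {"id": "v2", "af": 0.0002, "from": "I", "to": "V"},  # rare but conservative
    {"id": "v3", "af": 0.0500, "from": "D", "to": "G"},  # too common
]
print([v["id"] for v in putative_modifiers(variants)])  # → ['v1']
```

Real pipelines take both criteria from annotation databases and predictors rather than a hand-written table, but the filtering logic is this conjunction of rarity and predicted impact.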
Building Principal Pipelines: A Job That Urban Districts Can Do. Perspective
ERIC Educational Resources Information Center
Mendels, Pamela
2016-01-01
School district officials have faced the urgent task in recent years of ensuring that all schools, not just a lucky few, benefit from sure-footed leadership by professionals who know how to focus on instruction and improve it. The question boils down to this: How can districts develop a pipeline of great school principals? Research about a Wallace…
Our Deliberate Success: Recognizing What Works for Latina/o Students across the Educational Pipeline
ERIC Educational Resources Information Center
Rodríguez, Louie F.; Oseguera, Leticia
2015-01-01
The purpose of this article is to identify the best practices across the K-20 pipeline that work for Latina/o students for the purposes of developing a framework for Latina/o student success. The authors suggest that the field needs to be explicit when it comes to recognizing "what works" and encourage researchers, practitioners, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... can file your comments electronically using the eFiling feature located on the Commission's Web site ( www.ferc.gov ) under the Documents & Filings link. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must first create an...
ERIC Educational Resources Information Center
Cooper, Catherine R.; Chavira, Gabriela; Mena, Dolores D.
2005-01-01
This article maps recent progress on 5 key questions about "the academic pipeline problem" of different rates of persistence through school among ethnically diverse students across the nation. The article shows the complementary development of the Overlapping Spheres of Influence Theory and Sociocultural Theory and aligns concepts and measures…
ERIC Educational Resources Information Center
Rodriguez, Louie F.
2016-01-01
The educational system continues to inadequately serve Latina/o students across the educational pipeline. A key shortcoming is the system's inability to develop, support, and grow educational leaders that can respond. In this article, the author poses a series of pedagogical approaches using a Community Cultural Wealth (Yosso, 2005) lens. In the…
Building Principal Pipelines: A Job That Urban Districts Can Do. Perspective. Updated Edition
ERIC Educational Resources Information Center
Mendels, Pamela
2017-01-01
School district officials have faced the urgent task in recent years of ensuring that all schools, not just a lucky few, benefit from sure-footed leadership by professionals who know how to focus on instruction and improve it. The question boils down to this: How can districts develop a pipeline of great school principals? Research about a Wallace…
Ready for Work. Advocates Series. Action Brief No.2
ERIC Educational Resources Information Center
Forum for Youth Investment, 2006
2006-01-01
In the past, much attention has been paid to the leaks in the "education pipeline", but now employers, youth and communities are focusing on repairing the "work pipeline" to ensure that young people are ready for work by age 21. This issue brief is the second in a series developed by the Forum for Youth Investment, Connect for Kids, Voices for…
Pipeline of Known Chemical Classes of Antibiotics
d’Urso de Souza Mendes, Cristina; de Souza Antunes, Adelaide Maria
2013-01-01
Many approaches are used to discover new antibiotic compounds, one of the most widespread being the chemical modification of known antibiotics. This type of discovery has been so important in the development of new antibiotics that most antibiotics used today belong to the same chemical classes as antibiotics discovered in the 1950s and 1960s. Even though the discovery of new classes of antibiotics is urgently needed, the chemical modification of antibiotics in known classes is still widely used to discover new antibiotics, resulting in a great number of compounds in the discovery and clinical pipeline that belong to existing classes. In this scenario, the present article presents an overview of the R&D pipeline of new antibiotics in known classes of antibiotics, from discovery to clinical trial, in order to map out the technological trends in this type of antibiotic R&D, aiming to identify the chemical classes attracting most interest, their spectrum of activity, and the new subclasses under development. The result of the study shows that the new antibiotics in the pipeline belong to the following chemical classes: quinolones, aminoglycosides, macrolides, oxazolidinones, tetracyclines, pleuromutilins, beta-lactams, lipoglycopeptides, polymyxins and cyclic lipopeptides. PMID:27029317
Saeed, Isaam; Wong, Stephen Q.; Mar, Victoria; Goode, David L.; Caramia, Franco; Doig, Ken; Ryland, Georgina L.; Thompson, Ella R.; Hunter, Sally M.; Halgamuge, Saman K.; Ellul, Jason; Dobrovic, Alexander; Campbell, Ian G.; Papenfuss, Anthony T.; McArthur, Grant A.; Tothill, Richard W.
2014-01-01
Targeted resequencing by massively parallel sequencing has become an effective and affordable way to survey small to large portions of the genome for genetic variation. Despite the rapid development in open source software for analysis of such data, the practical implementation of these tools through construction of sequencing analysis pipelines still remains a challenging and laborious activity, and a major hurdle for many small research and clinical laboratories. We developed TREVA (Targeted REsequencing Virtual Appliance), making pre-built pipelines immediately available as a virtual appliance. Based on virtual machine technologies, TREVA is a solution for rapid and efficient deployment of complex bioinformatics pipelines to laboratories of all sizes, enabling reproducible results. The analyses that are supported in TREVA include: somatic and germline single-nucleotide and insertion/deletion variant calling, copy number analysis, and cohort-based analyses such as pathway and significantly mutated genes analyses. TREVA is flexible and easy to use, and can be customised by Linux-based extensions if required. TREVA can also be deployed on the cloud (cloud computing), enabling instant access without investment overheads for additional hardware. TREVA is available at http://bioinformatics.petermac.org/treva/. PMID:24752294
The LCOGT Observation Portal, Data Pipeline and Science Archive
NASA Astrophysics Data System (ADS)
Lister, Tim; LCOGT Science Archive Team
2014-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable the diverse LCOGT user community of scientific and educational users to request observations on the LCOGT Network and to see their progress and get access to their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
Luepke, Katherine H; Suda, Katie J; Boucher, Helen; Russo, Rene L; Bonney, Michael W; Hunt, Timothy D; Mohr, John F
2017-01-01
Growing antimicrobial resistance and a dwindling antibiotic pipeline have resulted in an emerging post-antibiotic era, as patients are now dying from bacterial infections that were once treatable. The fast-paced "Golden Age" of antibiotic development that started in the 1940s has lost momentum; from the 1980s to the early 2000s, there was a 90% decline in the approval of new antibiotics, and few novel classes were discovered. Many companies have shifted away from development due to scientific, regulatory, and economic hurdles that made antibiotic development less attractive than more lucrative therapeutic areas. National and global efforts are focusing attention on potential solutions for reinvigorating the antibiotic pipeline, including "push" incentives such as public-private partnerships and "pull" incentives such as reimbursement reform and market exclusivity. Hybrid models of incentives, global coordination among stakeholders, and the appropriate balance of antibiotic pricing, volume of drug used, and proper antimicrobial stewardship are key to maximizing drug development efforts and ensuring access for patients in need of these therapies. © 2016 Pharmacotherapy Publications, Inc.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials... for natural gas pipelines and for hazardous liquid pipelines. Both committees were established under...
77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...
Fast interactive exploration of 4D MRI flow data
NASA Astrophysics Data System (ADS)
Hennemuth, A.; Friman, O.; Schumann, C.; Bock, J.; Drexl, J.; Huellebrand, M.; Markl, M.; Peitgen, H.-O.
2011-03-01
1- or 2-directional MRI blood flow mapping sequences are an integral part of standard MR protocols for diagnosis and therapy control in heart diseases. Recent progress in rapid MRI has made it possible to acquire volumetric, 3-directional cine images in reasonable scan time. In addition to flow and velocity measurements relative to arbitrarily oriented image planes, the analysis of 3-dimensional trajectories enables the visualization of flow patterns, local features of flow trajectories, or possible paths into specific regions. The anatomical and functional information allows for advanced hemodynamic analysis in different application areas such as stroke risk assessment, congenital and acquired heart disease, aneurysms, abdominal collaterals, and cranial blood flow. The complexity of 4D MRI flow datasets and of the flow-related image analysis tasks makes the development of fast, comprehensive data exploration software for advanced flow analysis a challenging task. Most existing tools address only individual aspects of the analysis pipeline, such as pre-processing, quantification, or visualization, or are difficult for clinicians to use. The goal of the presented work is to provide a software solution that supports the whole image analysis pipeline and enables data exploration with fast, intuitive interaction and visualization methods. The implemented methods facilitate the segmentation and inspection of different vascular systems. Arbitrary 2- or 3-dimensional regions for quantitative analysis and particle tracing can be defined interactively. Synchronized views of animated 3D path lines, 2D velocity or flow overlays, and flow curves offer a detailed insight into local hemodynamics. The application of the analysis pipeline is shown for 6 cases from clinical practice, illustrating its usefulness for different clinical questions. Initial user tests show that the software is intuitive to learn and that even inexperienced users achieve good results within reasonable processing times.
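Particle tracing of the kind used for 3D path lines reduces to integrating particle positions through a time-varying velocity field. A minimal explicit-Euler sketch with a synthetic field (real pipelines interpolate measured 4D MRI velocities, and production tracers use higher-order integrators):

```python
# Path-line sketch: integrate dp/dt = velocity(p, t) with fixed-step
# explicit Euler. The uniform velocity field below is a stand-in for
# interpolated 4D MRI flow data.

def trace_path_line(velocity, p0, t0, t1, dt):
    """Return the list of positions visited from t0 to t1."""
    n_steps = round((t1 - t0) / dt)  # integer step count avoids float drift
    p = list(p0)
    path = [tuple(p)]
    for k in range(n_steps):
        t = t0 + k * dt
        v = velocity(p, t)
        p = [pi + vi * dt for pi, vi in zip(p, v)]
        path.append(tuple(p))
    return path

# Uniform 10 mm/s flow along x: the particle should drift linearly.
uniform = lambda p, t: (10.0, 0.0, 0.0)
path = trace_path_line(uniform, (0.0, 0.0, 0.0), 0.0, 1.0, 0.1)
print(path[-1])  # → (10.0, 0.0, 0.0) after 1 s
```

Seeding many such particles in an interactively defined region and animating the resulting curves is what the synchronized 3D path-line views visualize.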
Aman, Zachary M; Sloan, E Dendy; Sum, Amadeu K; Koh, Carolyn A
2014-12-07
Interfacial interactions between liquid-solid and solid-solid phases/surfaces are of fundamental importance to the formation of hydrate deposits in oil and gas pipelines. This work establishes the effect on clathrate hydrate adhesive force of five categories of physical and chemical modification to steel: oleamide, graphite, citric acid ester, nonanedithiol, and Rain-X anti-wetting agent. Hydrate adhesive forces were measured using a micromechanical force apparatus, under both dry and water-wet surface conditions. The results show that the graphite coating reduced hydrate-steel adhesion force by 79%, due to an increase in the water wetting angle from 42 ± 8° to 154 ± 7°. Two chemical surface coatings (nonanedithiol and the citric acid ester) induced rapid hydrate growth in the hydrate particles; nonanedithiol increased hydrate adhesive force by 49% from the baseline, while the citric acid ester coating reduced hydrate adhesion force by 98%. This result suggests that crystal growth may enable a strong adhesive pathway between hydrate and other crystalline structures; however, this effect may be negated in cases where water-hydrocarbon interfacial tension is minimised. When a liquid water droplet was placed on the modified steel surfaces, the graphite and citric acid ester became less effective at reducing adhesive force. In pipelines containing a free water phase wetting the steel surface, chemical or physical surface modifications alone may be insufficient to eliminate hydrate deposition risk. In further tests, the citric acid ester reduced hydrate cohesive forces by 50%, suggesting mild activity as a hybrid anti-agglomerant suppressing both hydrate deposition and particle agglomeration. These results demonstrate a new capability to develop polyfunctional surfactants, which simultaneously limit the capability of hydrate particles to aggregate and deposit on the pipeline wall.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0127] Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials...
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis, from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Region) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively, with a built-in capacity for customization and expansion.
Is there Place for Perfectionism in the NIR Spectral Data Reduction?
NASA Astrophysics Data System (ADS)
Chilingarian, Igor
2017-09-01
Despite the crucial importance of the near-infrared spectral domain for understanding star formation and galaxy evolution, NIR observations and data reduction represent a significant challenge. The known complexity of NIR detectors is aggravated by the airglow emission in the upper atmosphere and the water absorption in the troposphere, so that the astronomical community remains divided on whether ground-based NIR spectroscopy has a future or should move entirely to space (JWST, Euclid, WFIRST). I will share my experience of pipeline development for low- and intermediate-resolution spectrographs operated at Magellan and MMT. The MMIRS data reduction pipeline became the first example of sky subtraction quality approaching the limit set by the Poisson photon noise, and demonstrated the feasibility of low-resolution (R=1200-3000) NIR spectroscopy from the ground even for very faint (J=24.5) continuum sources. On the other hand, the FIRE Bright Source Pipeline, developed specifically for high signal-to-noise intermediate-resolution stellar spectra, proves that systematics in the flux calibration and telluric absorption correction can be pushed down to the (sub-)percent level. My conclusion is that even though substantial effort and time investment is needed to design and develop NIR spectroscopic pipelines for ground-based instruments, it will pay off, if done properly, and open new windows of opportunity in the ELT era.
Microcap pharmaceutical firms: linking drug pipelines to market value.
Beach, Robert
2012-01-01
This article examines predictors of the future market value of microcap pharmaceutical companies. This is problematic since the large majority of these firms seldom report positive net income. Their value comes from the potential of a liquidity event such as occurs when a key drug is approved by the FDA. The typical scenario is one in which the company is either acquired by a larger pharmaceutical firm or enters into a joint venture with another pharmaceutical firm. Binary logistic regression is used to determine the impact of the firm's drug treatment pipeline and its investment in research and development on the firm's market cap. Using annual financial data from 2007 through 2010, this study finds that the status of the firm's drug treatment pipeline and its research and development expenses are significant predictors of the firm's future stock value relative to other microcap pharmaceutical firms.
Chen, I-Hsuan; Aguilar, Hillary Andaluz; Paez Paez, J Sebastian; Wu, Xiaofeng; Pan, Li; Wendt, Michael K; Iliuk, Anton B; Zhang, Ying; Tao, W Andy
2018-05-15
Glycoproteins comprise more than half of current FDA-approved protein cancer markers, but the development of new glycoproteins as disease biomarkers has been stagnant. Here we present a pipeline to develop glycoproteins from extracellular vesicles (EVs) through integrating quantitative glycoproteomics with a novel reverse phase glycoprotein array and then apply it to identify novel biomarkers for breast cancer. EV glycoproteomics show promise in circumventing the problems plaguing current serum/plasma glycoproteomics and allowed us to identify hundreds of glycoproteins that have not been identified in blood. We identified 1,453 unique glycopeptides representing 556 glycoproteins in EVs, among which 20 were verified significantly higher in individual breast cancer patients. We further applied a novel glyco-specific reverse phase protein array to quantify a subset of the candidates. Together, this study demonstrates the great potential of this integrated pipeline for biomarker discovery.
Reliability-based management of buried pipelines considering external corrosion defects
NASA Astrophysics Data System (ADS)
Miran, Seyedeh Azadeh
Corrosion is one of the main deteriorating mechanisms that degrade energy pipeline integrity, since pipelines transport corrosive fluids or gases and interact with a corrosive environment. Corrosion defects are usually detected by periodical inspections using in-line inspection (ILI) methods. In order to ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed study is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of the external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The growth models' unknown parameters are estimated from the ILI data through Bayesian updating with the Markov Chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and have the ability to consider defects newly generated since the last inspection. Results of this part of the study show that both depth and length growth models can predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models considering prevailing uncertainties, where three failure modes, namely small leak, large leak, and rupture, are considered. Performance of the pipeline is evaluated through the failure probability per km (a sub-system), where each sub-system is treated as a series system of the detected and newly generated defects within that sub-system.
Sensitivity analysis is also performed to determine which of the parameters incorporated in the growth models the reliability of the studied pipeline is most sensitive to. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for predicting the long-term performance of the pipeline, and that the impact of statistical uncertainty in the model parameters is significant and should be accounted for in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspections, repair, and failure. A repair is conducted when the failure probability for any of the described failure modes exceeds a pre-defined probability threshold after an inspection. Moreover, this study also investigates the impact of repair threshold values and unit costs of inspection and failure on the expected total life-cycle cost and optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but can lower the failure cost, and that the repair cost is less significant compared to the inspection and failure costs.
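The growth-and-reliability calculation described above can be sketched as a small Monte Carlo estimate. The power-law depth growth matches the abstract, but the wall thickness, failure criterion, and parameter distributions below are illustrative assumptions, not the study's fitted values.

```python
import math
import random

def defect_depth(t, a, b):
    # Power-law growth of maximum defect depth (mm): d(t) = a * t**b
    return a * t ** b

def failure_probability(t_years, wall_mm=10.0, n_sim=20000, seed=1):
    """Monte Carlo estimate of P(depth exceeds 80% of wall) at time t.
    The lognormal/normal parameter distributions are assumed for illustration."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        a = rng.lognormvariate(math.log(0.3), 0.4)  # growth coefficient (assumed)
        b = rng.normalvariate(0.9, 0.1)             # growth exponent (assumed)
        if defect_depth(t_years, a, b) > 0.8 * wall_mm:
            failures += 1
    return failures / n_sim
```

Because depth grows monotonically in time for each sampled defect, the estimated failure probability rises with the evaluation horizon, which is the behaviour the reliability analysis exploits when setting inspection intervals.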
Generation of ethylene tracer by noncatalytic pyrolysis of natural gas at elevated pressure
Lu, Y.; Chen, S.; Rostam-Abadi, M.; Ruch, R.; Coleman, D.; Benson, L.J.
2005-01-01
There is a critical need within the pipeline gas industry for an inexpensive and reliable technology to generate an identification tag or tracer that can be added to pipeline gas to identify gas that may escape and improve the deliverability and management of gas in underground storage fields. Ethylene is an ideal tracer, because it does not exist naturally in the pipeline gas, and because its physical properties are similar to the pipeline gas components. A pyrolysis process, known as the Tragen process, has been developed to continuously convert the ~2%-4% ethane component present in pipeline gas into ethylene at common pipeline pressures of 800 psi. In our studies of the Tragen process, pyrolysis without steam addition achieved a maximum ethylene yield of 28%-35% at a temperature range of 700-775 °C, corresponding to an ethylene concentration of 4600-5800 ppm in the product gas. Coke deposition was determined to occur at a significant rate in the pyrolysis reactor without steam addition. The δ13C isotopic analysis of gas components showed a δ13C value of ethylene similar to ethane in the pipeline gas, indicating that most of the ethylene was generated from decomposition of the ethane in the raw gas. However, δ13C isotopic analysis of the deposited coke showed that coke was primarily produced from methane, rather than from ethane or other heavier hydrocarbons. No coke deposition was observed with the addition of steam at concentrations of > 20% by volume. The dilution with steam also improved the ethylene yield. © 2005 American Chemical Society.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Register (75 FR 72878) titled: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting...
Aerial image databases for pipeline rights-of-way management
NASA Astrophysics Data System (ADS)
Jadkowski, Mark A.
1996-03-01
Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with fewer people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company, which operates major gas pipelines in New England, New York, and New Jersey.
Force of resistance to pipeline pulling in plane and volumetrically curved wells
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, S. Yu; Toropov, E. S.
2018-05-01
A method has been developed for calculating the component of the pulling force of a pipeline, arising from the well curvature in one or several planes, with the assumption that the pipeline is ballasted by filling with water or otherwise until zero buoyancy in the drilling mud is reached. This paper shows that when calculating this force, one can neglect the effect of sections with zero curvature. In the other case, if buoyancy of the pipeline is other than zero, the resistance force in the curvilinear sections should be calculated taking into account the difference between the normal components of the buoyancy force and weight. In the paper, it is proved that without taking into account resistance forces from the viscosity of the drilling mud, if buoyancy of the pipeline is zero, the total resistance force is independent of the length of the pipe and is determined by the angle equal to the sum of the entry angle and the exit angle of the pipeline to the day surface. For the case of the well curvature in several planes, it is proposed to perform the calculation of such volumetrically curved well by the central angle of the well profile. Analytical dependences are obtained that allow calculating the pulling force for well profiles with a variable curvature radius, i.e. at different angles of deviation between the drill pipes along the well profile.
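The length-independence result resembles the classical capstan relation, in which tension grows exponentially with the total swept angle and the friction coefficient, regardless of how the curvature is distributed along the well. The sketch below illustrates that idea under assumed values; it is not the paper's derivation.

```python
import math

def pulling_force(t_in, mu, section_angles_rad):
    """Capstan-style estimate for a neutrally buoyant pipe: straight (zero-
    curvature) sections contribute nothing, and each curved section multiplies
    the tension by exp(mu * theta), so only the total swept angle matters."""
    theta_total = sum(section_angles_rad)
    return t_in * math.exp(mu * theta_total)

# Splitting the same total curvature across several sections does not
# change the result, consistent with the length-independence claim:
one_bend = pulling_force(100.0, 0.3, [0.5])
two_bends = pulling_force(100.0, 0.3, [0.2, 0.3])
```

For a volumetrically curved well, the paper's proposal corresponds to feeding the central angle of the 3D well profile into the same kind of relation.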
Deficiencies in drinking water distribution systems in developing countries.
Lee, Ellen J; Schwab, Kellogg J
2005-06-01
Rapidly growing populations and migration to urban areas in developing countries has resulted in a vital need for the establishment of centralized water systems to disseminate potable water to residents. Protected source water and modern, well-maintained drinking water treatment plants can provide water adequate for human consumption. However, ageing, stressed or poorly maintained distribution systems can cause the quality of piped drinking water to deteriorate below acceptable levels and pose serious health risks. This review will outline distribution system deficiencies in developing countries caused by: the failure to disinfect water or maintain a proper disinfection residual; low pipeline water pressure; intermittent service; excessive network leakages; corrosion of parts; inadequate sewage disposal; and inequitable pricing and usage of water. Through improved research, monitoring and surveillance, increased understanding of distribution system deficiencies may focus limited resources on key areas in an effort to improve public health and decrease global disease burden.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... interconnect pipelines to four existing offshore pipelines (Dauphin Natural Gas Pipeline, Williams Natural Gas Pipeline, Destin Natural Gas Pipeline, and Viosca Knoll Gathering System [VKGS] Gas Pipeline) that connect to the onshore natural gas transmission pipeline system. Natural gas would be delivered to customers...
Changes in the Arctic: Background and Issues for Congress
2016-05-12
discovery of new oil and gas deposits far from existing storage, pipelines, and shipping facilities cannot be developed until infrastructure is built...markets. Other questions in need of answers include the status of port, pipeline, and liquid natural gas infrastructure; whether methane hydrates...Changes to the Arctic brought about by warming temperatures will likely allow more exploration for oil, gas, and minerals. Warming that causes
A Systolic VLSI Design of a Pipeline Reed-solomon Decoder
NASA Technical Reports Server (NTRS)
Shao, H. M.; Truong, T. K.; Deutsch, L. J.; Yuen, J. H.; Reed, I. S.
1984-01-01
A pipeline structure of a transform decoder similar to a systolic array was developed to decode Reed-Solomon (RS) codes. An important ingredient of this design is a modified Euclidean algorithm for computing the error locator polynomial. The computation of inverse field elements is completely avoided in this modification of Euclid's algorithm. The new decoder is regular and simple, and naturally suitable for VLSI implementation.
A VLSI design of a pipeline Reed-Solomon decoder
NASA Technical Reports Server (NTRS)
Shao, H. M.; Truong, T. K.; Deutsch, L. J.; Yuen, J. H.; Reed, I. S.
1985-01-01
A pipeline structure of a transform decoder similar to a systolic array was developed to decode Reed-Solomon (RS) codes. An important ingredient of this design is a modified Euclidean algorithm for computing the error locator polynomial. The computation of inverse field elements is completely avoided in this modification of Euclid's algorithm. The new decoder is regular and simple, and naturally suitable for VLSI implementation.
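The decoder's key trick, avoiding field inversions in Euclid's algorithm, can be illustrated with a toy polynomial GCD over GF(7): each reduction cross-multiplies by the leading coefficients instead of dividing by them. A real Reed-Solomon decoder runs the same mechanics over GF(2^m) and stops once the remainder degree falls below the error-correction bound; the field and polynomials below are made-up illustrations of the reduction step only.

```python
P = 7  # toy prime field GF(7); RS decoders use GF(2^m) instead

def trim(a):
    """Drop leading zero coefficients (lists are highest-degree first)."""
    i = 0
    while i < len(a) - 1 and a[i] % P == 0:
        i += 1
    return [c % P for c in a[i:]]

def deg(a):
    return len(a) - 1

def reduce_once(a, b):
    """One inversion-free step: a <- lc(b)*a - lc(a)*x^k*b, which cancels
    the leading term of a without ever dividing by lc(b)."""
    k = deg(a) - deg(b)
    shifted = b + [0] * k  # x^k * b, aligned with a
    return trim([(b[0] * ai - a[0] * bi) % P for ai, bi in zip(a, shifted)])

def poly_gcd(a, b):
    """Euclidean remainder sequence built entirely from inversion-free steps."""
    a, b = trim(a), trim(b)
    while b != [0]:
        if deg(a) < deg(b):
            a, b = b, a
            continue
        a = reduce_once(a, b)
        if a == [0] or deg(a) < deg(b):
            a, b = b, a
    return a
```

For example, `poly_gcd([1, 0, 6], [1, 6])` returns `[1, 6]`, i.e. gcd(x²-1, x-1) = x-1 in GF(7).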
An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI
Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.
2015-01-01
BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667
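The pipeline-optimization idea can be sketched as a search over on/off choices for each preprocessing step, scoring every candidate with a combined prediction/reproducibility metric. The exhaustive search and the toy metric below are illustrative stand-ins for the paper's resampling framework, not its actual procedure.

```python
from itertools import product

def optimize_pipeline(steps, evaluate):
    """Score every subset of preprocessing steps (applied in a fixed order)
    and return the best (score, pipeline) pair."""
    best = None
    for switches in product([False, True], repeat=len(steps)):
        pipeline = [s for s, on in zip(steps, switches) if on]
        score = evaluate(pipeline)  # e.g. prediction + reproducibility
        if best is None or score > best[0]:
            best = (score, pipeline)
    return best

# Toy metric: motion correction and detrending help, smoothing hurts.
steps = ["motion_correct", "smooth", "detrend"]
toy_metric = lambda p: ("motion_correct" in p) + ("detrend" in p) - ("smooth" in p)
```

In practice the evaluate function would run the full analysis on resampled data splits, which is where the framework's cost and its validation power both come from.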
Characteristics of vibrational wave propagation and attenuation in submarine fluid-filled pipelines
NASA Astrophysics Data System (ADS)
Yan, Jin; Zhang, Juan
2015-04-01
As an important part of lifeline engineering in the development and utilization of marine resources, the submarine fluid-filled pipeline is a complex coupled system subjected to both internal and external flow fields. By utilizing Kennard's shell equations combined with the Helmholtz equations of the flow fields, the coupling equations of the submarine fluid-filled pipeline for n=0 axisymmetric wave motion are set up. Analytical expressions for wave speed are obtained for both the s=1 and s=2 waves, which correspond to a fluid-dominated wave and an axial shell wave, respectively. Numerical results for wave speed and wave attenuation are obtained and discussed subsequently. It is shown that the phase velocity depends on frequency, and that the attenuation of this mode depends strongly on the material parameters of the pipe and on the internal and external fluid fields. The characteristics of a PVC pipe are studied for comparison. The effects of the shell thickness/radius ratio and the density of the contained fluid on the model are also discussed. The study provides a theoretical basis for accurately predicting the behaviour of submarine pipelines and has practical application prospects in pipeline leakage detection.
Improving the results of forecasting using reservoir and surface network simulation
NASA Astrophysics Data System (ADS)
Hendri, R. S.; Winarta, J.
2018-01-01
This study aimed to obtain more representative production forecasts using integrated simulation of the pipeline gathering system of the X field. Five main scenarios were considered, covering production forecasts for the existing condition, workovers, and infill drilling, from which the best development scenario was determined. The method couples a reservoir simulator with a pipeline network simulator, an approach referred to as Integrated Reservoir and Surface Network Simulation. Well data from the reservoir simulator were integrated with the pipeline network simulator to construct a new schedule, which served as input for the whole simulation procedure; well designs were produced in a well modeling simulator and then exported to the pipeline simulator. In stand-alone reservoir prediction, the forecast depends on a fixed minimum tubing head pressure (THP) for each well, and the pressure drop in the gathering network is not calculated. The same scenarios were also run as single-reservoir simulations. The integrated simulation produces results closer to the actual reservoir behaviour, as confirmed by the THP profiles, which differ between the two methods; the difference between the integrated and single-model forecasts is 6-9%. The goal of resolving the back-pressure problem in the pipeline gathering system of the X field was thus achieved.
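The distinction drawn above (a fixed minimum THP versus a network-computed THP) can be sketched as a fixed-point coupling loop: the reservoir inflow model gives rate as a function of THP, the surface network gives THP as a function of rate, and the integrated solution is where the two agree. The linear inflow and network curves in the usage example are invented placeholders, not field data.

```python
def converge_well(q0, well_ipr, network_thp, tol=1e-6, max_iter=200):
    """Damped fixed-point iteration coupling a reservoir inflow relation
    (rate = well_ipr(thp)) with a surface network model (thp = network_thp(rate))."""
    q = q0
    for _ in range(max_iter):
        thp = network_thp(q)      # backpressure imposed by the gathering system
        q_new = well_ipr(thp)     # deliverability at that tubing head pressure
        if abs(q_new - q) < tol:
            return q_new, thp
        q = 0.5 * (q + q_new)     # damping keeps the iteration stable
    return q, network_thp(q)

# Placeholder linear models: deliverability falls with THP, THP rises with rate.
rate, thp = converge_well(50.0,
                          well_ipr=lambda p: 100.0 - 2.0 * p,
                          network_thp=lambda q: 10.0 + 0.1 * q)
```

A single-model run would instead hold THP at a fixed minimum, which is exactly the simplification that produces the 6-9% forecast difference reported above.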
The visual and radiological inspection of a pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.
1995-01-01
In the 1950s, the Savannah River Site built an open, unlined retention basin to temporarily store potentially radionuclide contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility that stored large quantities of nuclear fission byproducts in carbon steel tanks. The retention basin was retired from service in 1972 when a new, lined basin was completed. In 1978, the old retention basin was excavated, backfilled with uncontaminated dirt, and covered with grass. At the same time, much of the underground process pipeline leading to the basin was abandoned. Since the closure of the retention basin, new environmental regulations require that the basin undergo further assessment to determine whether additional remediation is required. A visual and radiological inspection of the pipeline was necessary to aid in the remediation decision making process for the retention basin system. A teleoperated pipe crawler inspection system was developed to survey the abandoned sections of underground pipelines leading to the retired retention basin. This paper will describe the background to this project, the scope of the investigation, the equipment requirements, and the results of the pipeline inspection.
The inspection of a radiologically contaminated pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.A.
1995-08-01
In the 1950s, the Savannah River Site built an open, unlined retention basin to temporarily store potentially radionuclide contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility that stored large quantities of nuclear fission byproducts in carbon steel tanks. The retention basin was retired from service in 1972 when a new, lined basin was completed. In 1978, the old retention basin was excavated, backfilled with uncontaminated dirt, and covered with grass. At the same time, much of the underground process pipeline leading to the basin was abandoned. Since the closure of the retention basin, new environmental regulations require that the basin undergo further assessment to determine whether additional remediation is required. A visual and radiological inspection of the pipeline was necessary to aid in the remediation decision making process for the retention basin system. A teleoperated pipe crawler inspection system was developed to survey the abandoned sections of underground pipelines leading to the retired retention basin. This paper will describe the background to this project, the scope of the investigation, the equipment requirements, and the results of the pipeline inspection.
Rapid Processing of Radio Interferometer Data for Transient Surveys
NASA Astrophysics Data System (ADS)
Bourke, S.; Mooley, K.; Hallinan, G.
2014-05-01
We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-Band data archive, imaging each integration to search for short-duration transients.
Pipeline inspection using an autonomous underwater vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egeskov, P.; Bech, M.; Bowley, R.
1995-12-31
Pipeline inspection can be carried out by means of small Autonomous Underwater Vehicles (AUVs), operating either with a control link to a surface vessel, or totally independently. The AUV offers an attractive alternative to conventional inspection methods where Remotely Operated Vehicles (ROVs) or paravanes are used. A flatfish-type AUV, "MARTIN" (Marine Tool for Inspection), has been developed for this purpose. The paper describes the proposed types of inspection jobs to be carried out by "MARTIN". The design and construction of the vessel, its hydrodynamic properties, and its propulsion and control systems are discussed. The pipeline tracking and survey systems, as well as the launch and recovery systems, are described.
Employing Machine-Learning Methods to Study Young Stellar Objects
NASA Astrophysics Data System (ADS)
Moore, Nicholas
2018-01-01
Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
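As a minimal illustration of this kind of supervised classification, the sketch below trains a nearest-centroid classifier on labelled infrared colours and applies it to unknown sources. The feature values and class names are invented; a real pipeline would use richer feature sets and more capable learners.

```python
import math

def train_centroids(labelled):
    """labelled: list of (features, label); returns label -> mean feature vector."""
    sums, counts = {}, {}
    for feats, label in labelled:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(feats, centroids):
    """Assign the label whose class centroid is nearest in feature space."""
    return min(centroids, key=lambda lab: math.dist(feats, centroids[lab]))

# Invented two-colour training set: YSOs sit redward of field stars.
training = [([0.2, 0.1], "star"), ([0.4, 0.3], "star"),
            ([1.5, 1.2], "yso"), ([1.7, 1.4], "yso")]
centroids = train_centroids(training)
```

Once trained on an already-classified catalogue, the same `classify` call can be applied in bulk to unclassified archival sources, which is the rapid-cataloguing use case described above.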
Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data.
Morris, Tiffany J; Beck, Stephan
2015-01-15
The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... natural gas pipelines, the Midwestern Gas Transmission line (3 miles distant) and/or the ANR Pipeline (4.5... Pipeline, Boardwalk/Texas Gas Pipeline, Shell/Capline Oil Pipeline, Panhandle/Trunkline Gas Pipeline, and... Rockport, IN, and CO 2 Pipeline; Conduct Additional Public Scoping Meetings; and Issue a Notice of...
Capsule injection system for a hydraulic capsule pipelining system
Liu, Henry
1982-01-01
An injection system for injecting capsules into a hydraulic capsule pipelining system, the pipelining system comprising a pipeline adapted for flow of a carrier liquid therethrough, and capsules adapted to be transported through the pipeline by the carrier liquid flowing through the pipeline. The injection system comprises a reservoir of carrier liquid, the pipeline extending within the reservoir and extending downstream out of the reservoir, and a magazine in the reservoir for holding capsules in a series, one above another, for injection into the pipeline in the reservoir. The magazine has a lower end in communication with the pipeline in the reservoir for delivery of capsules from the magazine into the pipeline.
Implementation of quality by design toward processing of food products.
Rathore, Anurag S; Kapoor, Gautam
2017-05-28
Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.
Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi
2014-03-01
Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
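The false-positive filtering such a tool performs can be sketched as a simple threshold pass over variant calls. The field names and cutoffs below are assumptions for illustration, not the tool's actual parameters.

```python
def filter_variants(variants, min_depth=30, min_vaf=0.10):
    """Keep only calls with adequate read depth and variant allele fraction;
    shallow or low-fraction calls are typical amplicon NGS false positives."""
    return [v for v in variants
            if v["depth"] >= min_depth and v["alt_reads"] / v["depth"] >= min_vaf]

# Hypothetical calls: only var1 passes both depth and allele-fraction cutoffs.
calls = [
    {"id": "var1", "depth": 120, "alt_reads": 48},  # kept: deep, VAF 0.40
    {"id": "var2", "depth": 12,  "alt_reads": 10},  # dropped: depth < 30
    {"id": "var3", "depth": 200, "alt_reads": 6},   # dropped: VAF 0.03
]
```

A web tool like the one described would layer coverage reporting and database-assisted clinical classification on top of this basic filtering step.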
Program for At-Risk Students Helps College, Too
ERIC Educational Resources Information Center
Carlson, Scott
2012-01-01
The author introduces a new program that brings city kids who really need college to a private rural campus that really needs kids. Under the program, called Pipelines Into Partnership, a handful of urban high schools and community organizations--the groups that know their kids beyond the black and white of their transcripts--determine which…
Antimicrobial Resistance in the Environment.
Waseem, Hassan; Williams, Maggie R; Stedtfeld, Robert D; Hashsham, Syed A
2017-10-01
This review summarizes selected publications of 2016 with emphasis on occurrence and treatment of antibiotic resistance genes and bacteria in the aquatic environment and wastewater and drinking water treatment plants. The review is conducted with emphasis on fate, modeling, risk assessment and data analysis methodologies for characterizing abundance. After providing a brief introduction, the review is divided into the following four sections: i) Occurrence of AMR in the Environment, ii) Treatment Technologies for AMR, iii) Modeling of Fate, Risk, and Environmental Impact of AMR, and iv) ARG Databases and Pipelines.
76 FR 73570 - Pipeline Safety: Miscellaneous Changes to Pipeline Safety Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-29
... pipeline facilities to facilitate the removal of liquids and other materials from the gas stream. These... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Changes to Pipeline Safety Regulations AGENCY: Pipeline and Hazardous Materials Safety Administration...
49 CFR 195.210 - Pipeline location.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Pipeline location. 195.210 Section 195.210 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY... PIPELINE Construction § 195.210 Pipeline location. (a) Pipeline right-of-way must be selected to avoid, as...
77 FR 61825 - Pipeline Safety: Notice of Public Meeting on Pipeline Data
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... program performance measures for gas distribution, gas transmission, and hazardous liquids pipelines. The... distribution pipelines (49 CFR 192.1007(e)), gas transmission pipelines (49 CFR 192.945) and hazardous liquids...
78 FR 41991 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Notice; Issuance of Advisory... Gas and Hazardous Liquid Pipeline Systems. Subject: Potential for Damage to Pipeline Facilities Caused...
78 FR 41496 - Pipeline Safety: Meetings of the Gas and Liquid Pipeline Advisory Committees
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-10
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0156] Pipeline Safety: Meetings of the Gas and Liquid Pipeline Advisory Committees AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of advisory committee...
76 FR 70953 - Pipeline Safety: Safety of Gas Transmission Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket ID PHMSA-2011-0023] RIN 2137-AE72 Pipeline Safety: Safety of Gas Transmission Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA); DOT. ACTION: Advance notice of...
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
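The core of a workflow manager like this is a stage runner that executes steps in order and records their status. The sketch below is a toy stand-in, not JMS code; the stage names and payload shape are invented for illustration.

```python
def run_workflow(stages, payload):
    """Run a linear multi-stage workflow, recording per-stage status.

    Each stage is a (name, callable) pair; the callable transforms the
    payload. A failing stage aborts the remainder, the way a batch
    scheduler marks downstream jobs as not run.
    """
    history = []
    for name, func in stages:
        try:
            payload = func(payload)
            history.append((name, "completed"))
        except Exception:
            history.append((name, "failed"))
            break
    return payload, history

stages = [
    ("parse", lambda d: {**d, "tokens": d["raw"].split()}),
    ("count", lambda d: {**d, "n": len(d["tokens"])}),
]
result, log = run_workflow(stages, {"raw": "align trim call"})
print(result["n"], log)
```

A real system like JMS layers resource-manager integration, persistence, and a web UI on top of this basic loop.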
Submarine pipeline on-bottom stability. Volume 2: Software and manuals
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, due largely to research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach to weight-coating design, which he can use with confidence because the tools have been developed based on full-scale and near-full-scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. This includes: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of its use are also included in Volume two.
DOE Office of Scientific and Technical Information (OSTI.GOV)
El-Jundi, I.M.
The Qatar NGL-2 plant, commissioned in December 1979, was designed to process the associated gas from the offshore crude oil fields of Qatar. The dehydrated, sour, lean gas and wet, sour liquids are transported by two separate lines to the Umm Said NGL complex about 120 km (75 miles) from the central offshore station. The 300-mm (12-in.) -diameter liquids line has suffered general pitting corrosion, and the 600-mm (24-in.) -diameter lean gas line has suffered corrosion and extensive hydrogen-induced cracking (HIC or HIPC). Neither line performed to its design parameters, and many problems in the downstream facilities have been experienced. All efforts to clean the solids (debris) from the liquids lines have failed. This in turn interfered with the planned corrosion control program, thus allowing corrosion to continue. Various specialists have investigated the lines in an attempt to find the origin of the solids and to recommend necessary remedial actions. Should the lines fail from pitting corrosion, the effect of a leak at a pressure of about 11 000 kPa (1,595 psi) will be very dangerous, especially if it occurs onshore. To protect the NGL-2 operations against possible risks - both in terms of safety and of losses in revenue - critical sections of the pipelines have been replaced, and all gas liquids pipelines will be replaced soon. Supplementary documents to the API standards were prepared for the replaced pipelines.
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sayak; Saha, Rohini; Palanisamy, Anbarasi; Ghosh, Madhurima; Biswas, Anupriya; Roy, Saheli; Pal, Arijit; Sarkar, Kathakali; Bagh, Sangram
2016-05-01
Microgravity is a prominent health hazard for astronauts, yet we understand little about its effects at the molecular systems level. In this study, we have integrated a set of systems-biology tools and databases and have analysed more than 8000 molecular pathways in published global gene expression datasets of human cells in microgravity. Hundreds of new pathways have been identified with statistical confidence for each dataset and, despite the differences in cell types and experiments, around 100 of the new pathways appear to be common across the datasets. They are related to reduced inflammation, autoimmunity, diabetes and asthma. We have identified downregulation of the NfκB pathway via Notch1 signalling as a new pathway for reduced immunity in microgravity. Induction of a few cancer types, including liver cancer and leukaemia, and increased drug response to cancer in microgravity are also found. An increase in olfactory signal transduction is also identified. Genes are clustered based on their expression patterns, and mathematically stable clusters are identified. The network mapping of genes within a cluster indicates plausible functional connections in microgravity. This pipeline gives a new systems-level picture of human cells under microgravity, generates testable hypotheses and may help in estimating risk and developing medicine for space missions.
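Pathway analyses of this kind typically score over-representation with a hypergeometric test. The sketch below is a generic version of that statistic, not the authors' exact method; the gene counts in the example are made up.

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """One-sided hypergeometric p-value for pathway over-representation.

    N: genes in the background, K: genes annotated to the pathway,
    n: differentially expressed genes, k: the overlap. The p-value is
    the probability of drawing k or more pathway genes by chance.
    """
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 20 of 100 selected genes fall in a 200-gene pathway (background of 8000);
# the expected overlap by chance is only 100 * 200 / 8000 = 2.5 genes.
p = enrichment_pvalue(8000, 200, 100, 20)
print(p < 1e-6)  # True: strong over-representation
```

Real tools then correct such p-values for testing thousands of pathways (e.g. Benjamini-Hochberg).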
Schuhmacher, Alexander; Gassmann, Oliver; McCracken, Nigel; Hinder, Markus
2018-05-08
Historically, research and development (R&D) in the pharmaceutical sector has predominantly been an in-house activity. To enable investments for game-changing late-stage assets and to enable better and less costly go/no-go decisions, most companies have employed a fail-early paradigm through the implementation of clinical proof-of-concept organizations. To fuel their pipelines, some pioneers started to complement their internal R&D efforts through collaborations as early as the 1990s. In recent years, multiple extrinsic and intrinsic factors induced an opening for external sources of innovation and resulted in new models for open innovation, such as open sourcing, crowdsourcing, public-private partnerships, innovation centres, and the virtualization of R&D. Three factors seem to determine the breadth and depth of how companies approach external innovation: (1) the company's legacy, (2) the company's willingness and ability to take risks and (3) the company's need to control IP and competitors. In addition, these factors often constitute the major hurdles to effectively leveraging external opportunities and assets. Conscious and differential choices of the R&D and business models for different companies and different divisions in the same company seem to best allow a company to fully exploit the potential of both internal and external innovations.
iGAS: A framework for using electronic intraoperative medical records for genomic discovery.
Levin, Matthew A; Joseph, Thomas T; Jeff, Janina M; Nadukuru, Rajiv; Ellis, Stephen B; Bottinger, Erwin P; Kenny, Eimear E
2017-03-01
Design and implement a HIPAA- and Integrating the Healthcare Enterprise (IHE) profile-compliant automated pipeline, the integrated Genomics Anesthesia System (iGAS), linking genomic data from the Mount Sinai Health System (MSHS) BioMe biobank to electronic anesthesia records, including physiological data collected during the perioperative period. The resulting repository of multi-dimensional data can be used for precision-medicine analysis of physiological readouts, acute medical conditions, and adverse events that can occur during surgery. A structured pipeline was developed atop our existing anesthesia data warehouse using open-source tools. The pipeline is automated using scheduled tasks: it runs weekly and identifies all new and existing anesthetic records for BioMe participants. The pipeline went live in June 2015 with 49.2% (n=15,673) of BioMe participants linked to 40,947 anesthetics. Each weekly run completes in minimal time. After eighteen months, an additional 3671 participants were enrolled in BioMe and the number of matched anesthetic records grew 21% to 49,545. The overall percentage of BioMe patients with anesthetics remained similar at 51.1% (n=18,128). Seven patients opted out during this time. The median number of anesthetics per participant was 2 (range 1-144). Collectively, there were over 35 million physiologic data points and 480,000 medication administrations linked to genomic data. To date, two projects are using the pipeline at MSHS. Automated integration of biobank and anesthetic data sources is feasible and practical. This integration enables large-scale genomic analyses that might inform variable physiological response to anesthetic and surgical stress, and examine genetic factors underlying adverse outcomes during and after surgery. Copyright © 2017 Elsevier Inc. All rights reserved.
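The weekly matching step can be pictured as a join between biobank enrolment and anesthesia records. The field names and the opt-out handling below are illustrative assumptions, not the iGAS schema.

```python
def link_records(participants, anesthetics, opted_out=()):
    """Link biobank participants to their anesthesia records by patient id.

    Returns {participant_id: [record ids]} for participants with at least
    one anesthetic, skipping anyone who has opted out of the biobank.
    """
    linked = {}
    for rec in anesthetics:
        pid = rec["patient_id"]
        if pid in participants and pid not in opted_out:
            linked.setdefault(pid, []).append(rec["record_id"])
    return linked

participants = {"P1", "P2", "P3"}
anesthetics = [
    {"record_id": "A1", "patient_id": "P1"},
    {"record_id": "A2", "patient_id": "P1"},
    {"record_id": "A3", "patient_id": "P2"},
    {"record_id": "A4", "patient_id": "P9"},  # not enrolled in the biobank
]
print(link_records(participants, anesthetics, opted_out={"P2"}))
# {'P1': ['A1', 'A2']}
```

In a production system this join would run inside the data warehouse; the in-memory version only conveys the matching logic.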
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-13
... Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements.'' The final rule...
76 FR 303 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 195 [Docket ID PHMSA-2010-0229] RIN 2137-AE66 Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of...
PyEmir: Data Reduction Pipeline for EMIR, the GTC Near-IR Multi-Object Spectrograph
NASA Astrophysics Data System (ADS)
Pascual, S.; Gallego, J.; Cardiel, N.; Eliche-Moral, M. C.
2010-12-01
EMIR is the near-infrared wide-field camera and multi-slit spectrograph being built for the Gran Telescopio Canarias. We present here the work being done on its data processing pipeline. PyEmir is based on Python and will automatically process data taken in both imaging and spectroscopy modes. PyEmir is being developed by the UCM Group of Extragalactic Astrophysics and Astronomical Instrumentation.
Reanalysis of 24 Nearby Open Clusters using Gaia data
NASA Astrophysics Data System (ADS)
Yen, Steffi X.; Reffert, Sabine; Röser, Siegfried; Schilbach, Elena; Kharchenko, Nina V.; Piskunov, Anatoly E.
2018-04-01
We have developed a fully automated cluster characterization pipeline, which simultaneously determines cluster membership and fits the fundamental cluster parameters: distance, reddening, and age. We present results for 24 established clusters and compare them to literature values. Given the large amount of stellar data for clusters available from Gaia DR2 in 2018, this pipeline will be beneficial to analyzing the parameters of open clusters in our Galaxy.
A Pipeline Software Architecture for NMR Spectrum Data Translation
Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.
2012-01-01
The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
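A pipeline architecture for format translation composes independent stages, each consuming the previous stage's output. The sketch below is a toy illustration with invented stages, not the CONNJUR Spectrum Translator's actual design.

```python
def make_pipeline(*stages):
    """Compose translation stages into a single converter.

    Each stage is a callable that consumes the previous stage's output,
    so new formats only require new reader/writer stages, not a rewrite
    of the whole translator.
    """
    def convert(data):
        for stage in stages:
            data = stage(data)
        return data
    return convert

# Toy format translation: comma-separated text -> rows -> tab-separated text
def parse_csv(text):
    return [line.split(",") for line in text.splitlines()]

def emit_tsv(rows):
    return "\n".join("\t".join(row) for row in rows)

translate = make_pipeline(parse_csv, emit_tsv)
print(translate("a,b\nc,d"))  # "a\tb\nc\td"
```

The benefit is combinatorial: with N readers and M writers, N + M stages cover N x M format conversions.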
MetaDB a Data Processing Workflow in Untargeted MS-Based Metabolomics Experiments.
Franceschi, Pietro; Mylonas, Roman; Shahaf, Nir; Scholz, Matthias; Arapitsas, Panagiotis; Masuero, Domenico; Weingart, Georg; Carlin, Silvia; Vrhovsek, Urska; Mattivi, Fulvio; Wehrens, Ron
2014-01-01
Due to their sensitivity and speed, mass-spectrometry-based analytical technologies are widely used in metabolomics to characterize biological phenomena. To address issues like metadata organization, quality assessment, data processing, data storage, and, finally, submission to public repositories, bioinformatic pipelines of a non-interactive nature are often employed, complementing the interactive software used for initial inspection and visualization of the data. These pipelines are often created as open-source software, allowing complete and exhaustive documentation of each step and ensuring the reproducibility of the analysis of extensive and often expensive experiments. In this paper, we review the major steps which constitute such a data processing pipeline, discussing them in the context of an open-source software package for untargeted MS-based metabolomics experiments recently developed at our institute. The software has been developed by integrating our metaMS R package with a user-friendly web-based application written in Grails. MetaMS takes care of data pre-processing and annotation, while the interface deals with the creation of the sample lists, the organization of the data storage, and the generation of survey plots for quality assessment. Experimental and biological metadata are stored in the ISA-Tab format, making the proposed pipeline fully integrated with the Metabolights framework.
NASA Astrophysics Data System (ADS)
Jones, Christopher F.
2009-12-01
Coal canals, oil pipelines, and electricity transmission wires transformed the built environment of the American mid-Atlantic region between 1820 and 1930. By transporting coal, oil, and electrons cheaply, reliably, and in great quantities, these technologies reshaped the energy choices available to mid-Atlantic residents. In particular, canals, pipelines, and wires created new energy landscapes: systems of transport infrastructure that enabled the ever-increasing consumption of fossil fuels. Energy Landscapes integrates history of technology, environmental history, and business history to provide new perspectives on how Americans began to use fossil fuels and the social implications of these practices. First, I argue that the development of transport infrastructure played critical, and underappreciated, roles in shaping social energy choices. Rather than simply responding passively to the needs of producers and consumers, canals, pipelines, and wires structured how, when, where, and in what quantities energy was used. Second, I analyze the ways fossil fuel consumption transformed the society, economy, and environment of the mid-Atlantic. I link the consumption of coal, oil, and electricity to the development of an urban and industrialized region, the transition from an organic to a mineral economy, and the creation of a society dependent on fossil fuel energy.
Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration
2018-05-01
The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
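Early variable-versus-transient categorization can be caricatured with a single light-curve feature. The rule below is only a toy stand-in for the pipeline's multiband machine-learning features; the threshold and example light curves are invented.

```python
from statistics import mean

def categorize(lightcurve, fade_fraction=0.5):
    """Crude early categorization of an alert's light curve.

    A transient rises and then fades away (late flux well below peak),
    whereas a variable keeps oscillating around its mean. Input is a
    list of flux measurements in time order.
    """
    peak = max(lightcurve)
    late = mean(lightcurve[-3:])  # average of the last three epochs
    return "transient" if late < fade_fraction * peak else "variable"

supernova_like = [1, 5, 9, 10, 8, 5, 2, 1, 1]   # rise and decay
rr_lyrae_like = [5, 8, 5, 2, 5, 8, 5, 2, 5]     # periodic oscillation
print(categorize(supernova_like), categorize(rr_lyrae_like))
# transient variable
```

A real broker replaces this single hand-crafted rule with many features fed to a trained classifier, and updates the label as new epochs arrive.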
NASA Astrophysics Data System (ADS)
Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.
2017-10-01
A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called ``sTools'', based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated, as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.
Collapse of Corroded Pipelines under Combined Tension and External Pressure
Ye, Hao; Yan, Sunting; Jin, Zhijiang
2016-01-01
In this work, the collapse of corroded pipelines under combined external pressure and tension is investigated through numerical methods. Axially uniform corrosion with symmetric imperfections is considered first. After verification against existing experimental results, the finite element model is used to study the effect of tension on collapse pressure. An extensive parametric study is carried out using Python scripts and FORTRAN subroutines to investigate the influence of geometric parameters on the collapse behavior under combined loads. The results are used to develop an empirical equation for estimating the collapse pressure under tension. In addition, the effects of loading path, initial imperfection length, yielding anisotropy and corrosion defect length on the collapse behavior are also investigated. It is found that tension has a significant influence on the collapse pressure of corroded pipelines. Loading path and anisotropic yielding are also important factors affecting the collapse behavior. For pipelines with relatively long corrosion defects, axially uniform corrosion models can be used to estimate the collapse pressure. PMID:27111544
Aerodynamics of electrically driven freight pipeline system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundgren, T.S.; Zhao, Y.
2000-06-01
This paper examines the aerodynamic characteristics of a freight pipeline system in which freight capsules are individually propelled by electrical motors. The fundamental difference between this system and the more extensively studied pneumatic capsule pipeline is the different role played by aerodynamic forces. In a driven system the propelled capsules are resisted by aerodynamic forces and, in reaction, pump air through the tube. In contrast, in a pneumatically propelled system external blowers pump air through the tubes, and this provides the thrust for the capsules. An incompressible transient analysis is developed to study the aerodynamics of multiple capsules in a cross-linked two-bore pipeline. An aerodynamic friction coefficient is used as a cost parameter to compare the effects of capsule blockage and headway and to assess the merits of adits and vents. The authors conclude that optimum efficiency for off-design operation is obtained with long platoons of capsules in vented or adit-connected tubes.