Creation and Implementation of a Workforce Development Pipeline Program at MSFC
NASA Technical Reports Server (NTRS)
Hix, Billy
2003-01-01
Within the context of NASA's Education Programs, this Workforce Development Pipeline guide describes the goals and objectives of MSFC's Workforce Development Pipeline Program as well as the principles and strategies for guiding implementation. It is designed to support the initiatives described in the NASA Implementation Plan for Education, 1999-2003 (EP-1998-12-383-HQ) and represents the vision of the members of the Education Programs office at MSFC. This document: 1) Outlines NASA's contribution to national priorities; 2) Sets the context for the Workforce Development Pipeline Program; 3) Describes Workforce Development Pipeline Program strategies; 4) Articulates the Workforce Development Pipeline Program goals and aims; 5) Lists the actions to build a unified approach; 6) Outlines the Workforce Development Pipeline Program's guiding principles; and 7) Presents the results of implementation.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
77 FR 51848 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Program for Gas Distribution Pipelines. DATES: Interested persons are invited to submit comments on or.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and...
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
BigDataScript: a scripting language for data pipelines
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
Automated Monitoring of Pipeline Rights-of-Way
NASA Technical Reports Server (NTRS)
Frost, Chard Ritchie
2010-01-01
NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.
ECDA of Cased Pipeline Segments
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56-07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on pipeline safety program. Although, not...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler; Chu, Andrew
The ThunderBird Cup v3.0 (TBC3) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler Jake; Chu, Andrew Chun-An
The ThunderBird Cup v2.0 (TBC2) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Development of Protective Coatings for Co-Sequestration Processes and Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwagen, Gordon; Huang, Yaping
2011-11-30
The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted regarding pipeline corrosion sensors to monitor pipes used in handling co-sequestration fluids. Research was to ensure safety and reliability for a pipeline involving transport of SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for internal pipeline coating to transport SCCO2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, E.A.; Smed, P.F.; Bryndum, M.B.
The paper describes the numerical program, PIPESIN, which simulates the behavior of a pipeline placed on an erodible seabed. PIPEline Seabed INteraction, from installation until a stable pipeline-seabed configuration has occurred, is simulated in the time domain, including all important physical processes. The program is the result of the joint research project, "Free Span Development and Self-lowering of Offshore Pipelines," sponsored by the EU and a group of companies and carried out by the Danish Hydraulic Institute and Delft Hydraulics. The basic modules of PIPESIN are described. The description of the scouring processes has been based on and verified through physical model tests carried out as part of the research project. The program simulates a section of the pipeline (typically 500 m) in the time domain, the main input being time series of the waves and current. The main results include predictions of the onset of free spans, their length distribution, their variation in time, and the lowering of the pipeline as a function of time.
The Hyper Suprime-Cam software pipeline
Bosch, James; Armstrong, Robert; Bickerton, Steven; ...
2017-10-12
In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
The Hyper Suprime-Cam software pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosch, James; Armstrong, Robert; Bickerton, Steven
In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
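The validity check described above (verifying a pipeline-configuration diagram before translating it to machine code) is essentially a graph problem: stages must form an acyclic flow. A minimal sketch of such a check, via Kahn's topological sort, is shown below; this is an illustrative stand-in for the kind of validation the visual editor performs, not the actual Navier-Stokes computer toolchain.

```python
from collections import deque

def validate_pipeline(stages, edges):
    """Check that a pipeline diagram is acyclic and return a valid
    stage ordering (Kahn's algorithm); raise ValueError on a cycle."""
    indeg = {s: 0 for s in stages}
    adj = {s: [] for s in stages}
    for src, dst in edges:
        adj[src].append(dst)
        indeg[dst] += 1
    queue = deque(s for s in stages if indeg[s] == 0)
    order = []
    while queue:
        s = queue.popleft()
        order.append(s)
        for nxt in adj[s]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                queue.append(nxt)
    if len(order) != len(stages):
        raise ValueError("pipeline diagram contains a cycle")
    return order
```

A code generator can then emit instructions for the stages in the returned order, confident that every stage's inputs are produced upstream.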
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2014 CFR
2014-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2013 CFR
2013-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
Hydrostatic collapse research in support of the Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, P.R.; McKeehan, D.S.
1995-12-01
This paper provides a summary of the collapse test program conducted as part of the technical development for the Ultra Deep Oman to India Pipeline. The paper describes the motivation for conducting the collapse test program, outlines the test objectives and procedures, presents the results obtained, and draws conclusions on the factors affecting collapse resistance.
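One of the factors such collapse test programs calibrate against is the classic elastic collapse pressure of a long, thin-walled pipe, P_el = 2E/(1 - nu^2) * (t/D)^3. The sketch below evaluates only this elastic term; real deepwater design combines it with plastic collapse and ovality effects (as in offshore pipeline standards), so treat this as a back-of-envelope illustration, not a design formula for the Oman-India line.

```python
def elastic_collapse_pressure(E, nu, t, D):
    """Elastic collapse (buckling) pressure of a long round pipe under
    external pressure: P_el = 2E/(1 - nu^2) * (t/D)^3.
    E: Young's modulus [Pa], nu: Poisson's ratio,
    t: wall thickness [m], D: diameter [m]. Returns pressure in Pa."""
    return 2.0 * E / (1.0 - nu ** 2) * (t / D) ** 3
```

Note the cubic dependence on t/D: doubling the wall thickness raises the elastic collapse pressure eightfold, which is why ultra-deep pipelines are dominated by wall-thickness (and collapse-resistance) considerations.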
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de-novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package includes a script for automated installation of necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems after the necessary external programs are installed.
DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
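The scaling argument in the DeNoGAP abstract (linear rather than quadratic growth, because each new sequence is scored against per-family models instead of against every previously seen sequence) can be illustrated with a toy greedy clustering. Here a plain `similar` predicate stands in for an HMM score against a family model; this is a conceptual sketch, not DeNoGAP's actual algorithm.

```python
def assign_families(sequences, similar):
    """Greedy, model-based family assignment: each new sequence is
    compared against one representative per family rather than against
    all earlier sequences, so for a bounded number of families the total
    work grows linearly with the number of sequences (the idea behind
    iteratively refined HMM-based homology assignment)."""
    families = []      # one representative per family
    assignment = []    # family index for each input sequence
    for seq in sequences:
        for idx, rep in enumerate(families):
            if similar(seq, rep):
                assignment.append(idx)
                break
        else:
            families.append(seq)   # no match: start a new family
            assignment.append(len(families) - 1)
    return assignment, families
```

An all-vs-all pairwise approach would instead perform on the order of N^2/2 comparisons over N sequences, which is what becomes prohibitive at hundreds of genomes.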
NASA Technical Reports Server (NTRS)
Charity, Pamela C.; Klein, Paul B.; Wadhwa, Bhushan
1995-01-01
The Cleveland State University Minority Engineering Program Pipeline consists of programs which foster engineering career awareness, academic enrichment, and professional development for historically underrepresented minority students. The programs involved are the Access to Careers in Engineering (ACE) Program for high school pre-engineering students; the LINK Program for undergraduate students pursuing degrees which include engineering; and the PEP (Pre-calculus Enrichment Program) and EPIC (Enrichment Program in Calculus) mathematics programs for undergraduate academic enrichment. The pipeline is such that high school graduates from the ACE Program who enroll at Cleveland State University in pursuit of engineering degrees are admitted to the LINK Program for undergraduate-level support. LINK Program students are among the minority participants who receive mathematics enrichment through the PEP and EPIC Programs for successful completion of their required engineering math courses. These programs are interdependent and share the goal of preparing minority students for engineering careers by enabling them to achieve academically and obtain college degrees and career-related experience.
Development of Optimized Welding Solutions for X100 Linepipe Steel
DOT National Transportation Integrated Search
2011-09-01
This investigation is part of a major consolidated program of research sponsored by the US Department of Transportation (DOT) Pipeline Hazardous Materials Safety Administration (PHMSA) and the Pipeline Research Council International (PRCI) to advance...
Digital Imaging of Pipeline Mechanical Damage and Residual Stress
DOT National Transportation Integrated Search
2010-02-19
The purpose of this program was to enhance characterization of mechanical damage in pipelines through application of digital eddy current imaging. Lift-off maps can be used to develop quantitative representations of mechanical damage and magnetic per...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Nancy Porter
2003-05-01
The two broad categories of deposited weld metal repair and fiber-reinforced composite repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repairs and for fiber-reinforced composite repair. To date, all of the experimental work pertaining to the evaluation of potential repair methods has focused on fiber-reinforced composite repairs. Hydrostatic testing was also conducted on four pipeline sections with simulated corrosion damage: two with composite liners and two without.
77 FR 15453 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... information collection titled, ``Gas Pipeline Safety Program Certification and Hazardous Liquid Pipeline... collection request that PHMSA will be submitting to OMB for renewal titled, ``Gas Pipeline Safety Program...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Public awareness. 192.616 Section 192.616... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Operations § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Public awareness. 192.616 Section 192.616... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Operations § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Public awareness. 195.440 Section 195.440... PIPELINE Operation and Maintenance § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Public awareness. 195.440 Section 195.440... PIPELINE Operation and Maintenance § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
Pipelining in a changing competitive environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.G.; Wishart, D.M.
1996-12-31
The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations, and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting-edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long-term, cost-effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance to incidents of pipeline failure. A creative systems approach to these challenges is often the key to the project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, and international lending agencies and financial institutions.
Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark
2016-07-05
There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others.
These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
49 CFR 192.911 - What are the elements of an integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.911 What are the elements of an integrity management program...
49 CFR 192.913 - When may an operator deviate its program from certain requirements of this subpart?
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.913 When may an operator deviate its program...
49 CFR 192.945 - What methods must an operator use to measure program effectiveness?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.945 What methods must an operator use to measure program...
Tappis, Hannah; Doocy, Shannon; Amoako, Stephen
2013-01-01
Despite decades of support for international food assistance programs by the U.S. Agency for International Development (USAID) Office of Food for Peace, relatively little is known about the commodity pipeline and management issues these programs face in post-conflict and politically volatile settings. Based on an audit of the program's commodity tracking system and interviews with 13 key program staff, this case study documents the experiences of organizations implementing the first USAID-funded non-emergency (development) food assistance program approved for Sudan and South Sudan. Key challenges and lessons learned in this experience about food commodity procurement, transport, and management may help improve the design and implementation of future development food assistance programs in a variety of complex, food-insecure settings around the world. Specifically, expanding shipping routes in complex political situations may facilitate reliable and timely commodity delivery. In addition, greater flexibility to procure commodities locally, rather than shipping U.S.-procured commodities, may avoid unnecessary shipping delays and reduce costs. PMID:25276532
The Stanford Medical Youth Science Program: Educational and Science-Related Outcomes
ERIC Educational Resources Information Center
Crump, Casey; Ned, Judith; Winkleby, Marilyn A.
2015-01-01
Biomedical preparatory programs (pipeline programs) have been developed at colleges and universities to better prepare youth for entering science- and health-related careers, but outcomes of such programs have seldom been rigorously evaluated. We conducted a matched cohort study to evaluate the Stanford Medical Youth Science Program's Summer…
Pharmaceutical new product development: the increasing role of in-licensing.
Edwards, Nancy V
2008-12-01
Many pharmaceutical companies are facing a pipeline gap because of the increasing economic burden and uncertainty associated with internal research and development programs designed to develop new pharmaceutical products. To fill this pipeline gap, pharmaceutical companies are increasingly relying on in-licensing opportunities. New business development identifies new pharmaceuticals that satisfy unmet needs and are a good strategic fit for the company, completes valuation models and forecasts, evaluates the ability of the company to develop and launch products, and pursues in-licensing agreements for pharmaceuticals that cannot be developed internally on a timely basis. These agreements involve the transfer of access rights for patents, trademarks, or similar intellectual property from an outside company in exchange for payments. Despite the risks, in-licensing is increasingly becoming the preferred method for pharmaceutical companies with pipeline gaps to bring new pharmaceuticals to the clinician.
Development of Time-Distance Helioseismology Data Analysis Pipeline for SDO/HMI
NASA Technical Reports Server (NTRS)
DuVall, T. L., Jr.; Zhao, J.; Couvidat, S.; Parchevsky, K. V.; Beck, J.; Kosovichev, A. G.; Scherrer, P. H.
2008-01-01
The Helioseismic and Magnetic Imager of SDO will provide uninterrupted 4k x 4k-pixel Doppler-shift images of the Sun with approximately 40 sec cadence. These data will have a unique potential for advancing local helioseismic diagnostics of the Sun's interior structure and dynamics. They will help to understand the basic mechanisms of solar activity and develop predictive capabilities for NASA's Living with a Star program. Because of the tremendous amount of data, the HMI team is developing a data analysis pipeline, which will provide maps of subsurface flows and sound-speed distributions inferred from the Doppler data by the time-distance technique. We discuss the development plan, methods, and algorithms, and present the status of the pipeline, testing results, and examples of the data products.
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint; from a user perspective, running a pipeline is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
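The "single input and output endpoint" idea above can be sketched as a small meta-script helper that assembles a `docker run` invocation mounting one input and one output directory. This is a minimal illustration of the design, not the project's actual meta-script: the image name, mount paths, and port are hypothetical placeholders.

```python
from pathlib import Path

def docklet_command(image, input_dir, output_dir, port=8080):
    """Build a `docker run` command exposing a single input mount
    (read-only) and a single output mount, in the spirit of the
    Bio-Docklets single-endpoint design. Image name, container
    paths, and port are illustrative assumptions."""
    return [
        "docker", "run", "--rm",
        "-v", f"{Path(input_dir).resolve()}:/data/input:ro",
        "-v", f"{Path(output_dir).resolve()}:/data/output",
        "-p", f"{port}:8080",
        image,
    ]

# Example: a hypothetical RNA-seq Bio-Docklet over a local reads folder
cmd = docklet_command("bio-docklet/rnaseq", "reads", "results")
```

A real meta-script would additionally drive the running pipeline via BioBlend and the Galaxy API rather than stopping at container launch.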
1997 annual report : environmental monitoring program Louisiana offshore oil port pipeline.
DOT National Transportation Integrated Search
1998-06-01
The Louisiana Offshore Oil Port (LOOP) Environmental Monitoring Program includes an onshore pipeline vegetation and wildlife survey as a continuing study designed to measure the immediate and long-term impacts of LOOP-related pipeline construction an...
77 FR 61825 - Pipeline Safety: Notice of Public Meeting on Pipeline Data
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... program performance measures for gas distribution, gas transmission, and hazardous liquids pipelines. The... distribution pipelines (49 CFR 192.1007(e)), gas transmission pipelines (49 CFR 192.945) and hazardous liquids...
Acosta, David; Olsen, Polly
2006-10-01
Minority populations in the United States are growing rapidly, but physician workforce diversity has not kept pace with the needs of underserved communities. Minorities comprised 26.4% of the population in 1995; by 2050, these groups will comprise nearly half. Medical schools must enlist greater numbers of minority physicians and train all physicians to provide culturally responsive care. The University of Washington School of Medicine (UWSOM) is the nation's only medical school that serves a five-state region (Washington, Wyoming, Alaska, Montana, and Idaho). Its mission addresses the need to serve the region, rectify primary care shortages, and meet increasing regional demands for underserved populations. The UWSOM Native American Center of Excellence (NACOE) was established as one important way to respond to this charge. The authors describe pipeline and minority recruitment programs at UWSOM, focusing on the NACOE and other activities to recruit American Indian/Alaskan Native (AI/AN) applicants to medical schools. These programs have increased the numbers of AI/AN medical students; developed the Indian Health Pathway; worked to prepare students to provide culturally responsive care for AI/AN communities; researched health disparities specific to AI/AN populations; provided retention programs and services to ensure successful completion of medical training; developed mentorship networks; and provided faculty-development programs to increase entry of AI/AN physicians into academia. Challenges lie ahead. Barriers to the pipeline will continue to plague students, and inadequate federal funding will have a significant and negative impact on achieving needed physician-workforce diversity. Medical schools must play a larger role in resolving these challenges, and continue to provide pipeline programs, retention programs, and minority faculty development that can make a difference.
2016-06-10
... Democratic Society White House Leadership Development Program (WHLD) Harvard Kennedy School (HKS)–Senior Executive Fellows Program George...... Nurse Leaders: An Exploration of Current Nurse Leadership Development in the Veterans Health Administration
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many of these tools, however, require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai
2017-11-23
The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines available programs with ad hoc scripts based on an original algorithm for finding conserved targets in deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
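The core task described above — tallying how often each position of a candidate drug target is mutated across sequencing reads — can be sketched in a few lines. This is a toy stand-in for VirMut's actual algorithm (which handles alignment, quality, and statistical thresholds); the exact-offset scan and the function name are illustrative assumptions.

```python
def site_mismatch_frequencies(reads, target):
    """For each read, find the best (fewest-mismatch) placement of
    `target` and tally per-site mismatches, yielding the fraction of
    covering reads mutated at each target position. A simplified
    version of the conserved-target search in deep-sequencing data."""
    n = len(target)
    counts = [0] * n      # mismatches observed per target position
    covered = 0           # reads long enough to span the target
    for read in reads:
        best = None
        for i in range(len(read) - n + 1):
            mm = [j for j in range(n) if read[i + j] != target[j]]
            if best is None or len(mm) < len(best):
                best = mm
        if best is not None:
            covered += 1
            for j in best:
                counts[j] += 1
    freqs = [c / covered for c in counts] if covered else [0.0] * n
    return covered, freqs
```

With reads `["AAACGTAAA", "AAACGAAAA"]` and siRNA target `"ACGT"`, the second read carries a mutation at the final target position, so that site shows frequency 0.5.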
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
... From OMB of One Current Public Collection of Information: Pipeline Corporate Security Review Program... current security practices in the pipeline industry by way of TSA's Pipeline Corporate Security Review... Collection Requirement The TSA Pipeline Security Branch is responsible for conducting Pipeline Corporate...
Programming the Navier-Stokes computer: An abstract machine model and a visual editor
NASA Technical Reports Server (NTRS)
Middleton, David; Crockett, Tom; Tomboulian, Sherry
1988-01-01
The Navier-Stokes computer is a parallel computer designed to solve Computational Fluid Dynamics problems. Each processor contains several floating point units which can be configured under program control to implement a vector pipeline with several inputs and outputs. Since the development of an effective compiler for this computer appears to be very difficult, machine-level programming seems necessary and support tools for this process have been studied. These support tools are organized into a graphical program editor. A programming process is described by which appropriate computations may be efficiently implemented on the Navier-Stokes computer. The graphical editor would support this programming process, verifying various programmer choices for correctness and deducing values such as pipeline delays and network configurations. Step-by-step details are provided and demonstrated with two example programs.
Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie
2017-01-27
Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed by various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, that uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using the GlimmerHMM, SNAP, and AUGUSTUS pipelines, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's plantae dataset. Our evaluation shows that Seqping was able to generate better gene predictions compared to three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that the Seqping pipeline predictions are more accurate than gene predictions using the other three approaches with the default or available HMMs.
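The combination step — merging predictions from several gene-finders — can be illustrated with a toy majority-vote combiner over exon intervals. This is not MAKER2's actual evidence-weighting algorithm (which scores partial overlaps against transcript evidence); exact-interval voting and the function name are simplifying assumptions.

```python
from collections import Counter

def consensus_exons(predictions, min_support=2):
    """Keep exon intervals predicted identically by at least
    `min_support` of the input gene-finders. A toy stand-in for the
    evidence combination performed by MAKER2 in the Seqping pipeline.
    `predictions` is a list (one entry per tool) of (start, end) tuples."""
    votes = Counter()
    for tool_exons in predictions:
        for exon in set(tool_exons):   # each tool votes at most once per exon
            votes[exon] += 1
    return sorted(e for e, v in votes.items() if v >= min_support)

# Three hypothetical tools agree on one exon and split on another
preds = [
    [(100, 200), (300, 400)],   # e.g. GlimmerHMM-style output
    [(100, 200), (350, 400)],   # e.g. SNAP-style output
    [(100, 200), (300, 400)],   # e.g. AUGUSTUS-style output
]
```

Here `consensus_exons(preds)` keeps the two intervals supported by at least two tools.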
77 FR 19799 - Pipeline Safety: Pipeline Damage Prevention Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
...,602 to $3,445,975. Evaluating just the lower range of benefits over ten years results in a total... consequences resulting from excavation damage to pipelines. A comprehensive damage prevention program requires..., including that resulting from excavation, digging, and other impacts, is also precipitated by operators...
NASA Astrophysics Data System (ADS)
Borden, Paula D.
This dissertation study concerned the lack of underrepresented minority students matriculating through the health professions pipeline. The term pipeline is "the educational avenue by which one must travel to successfully enter a profession" (Sullivan Alliance, 2004). There are a significant number of health professional pipeline programs based across the United States and, for the purposes of this study, a focus was placed on the Science Enrichment Preparation (S.E.P.) Program which is based at The University of North Carolina at Chapel Hill. The S.E.P. Program is an eight-week residential summer experience designed to help underrepresented minority pre-health students develop the competitive edge for successful admission into health professional school programs. At its core, this dissertation study examined the relationships between cognitive variables, non-cognitive variables, and the academic performance of students in the S.E.P. Program from 2005-2013. The study was undertaken to provide a clearer understanding for the NC Health Careers Access Program's (NC-HCAP) leadership with regard to variables associated with the students' academic performance in the S.E.P. Program. The data outcomes were informative for NC-HCAP in identifying cognitive and non-cognitive variables associated with student academic performance. Additionally, these findings provided direction as to what infrastructures may be put into place to more effectively support the S.E.P. participants. It is the researcher's hope that this study may serve as an educational model and resource to pipeline programs and others with similar educational missions. The consequences and implications of a non-diverse healthcare workforce are high and far reaching. Without parity representation in the healthcare workforce, health disparities between racial and economic groups will likely continue to grow.
Gazda, Nicholas P; Griffin, Emily; Hamrick, Kasey; Baskett, Jordan; Mellon, Meghan M; Eckel, Stephen F; Granko, Robert P
2018-04-01
Purpose: The purpose of this article is to share experiences after the development of a health-system pharmacy administration residency with a MS degree and express the need for additional programs in nonacademic medical center health-system settings. Summary: Experiences with the development and implementation of a health-system pharmacy administration residency at a large community teaching hospital are described. Resident candidates benefit from collaborations with other health-systems through master's degree programs and visibility to leaders at your health-system. Programs benefit from building a pipeline of future pharmacy administrators and by leveraging the skills of residents to contribute to projects and department-wide initiatives. Tools to assist in the implementation of a new pharmacy administration program are also described and include rotation and preceptor development, marketing and recruiting, financial evaluation, and steps to prepare for accreditation. Conclusion: Health-system pharmacy administration residents provide the opportunity to build a pipeline of high-quality leaders, provide high-level project involvement, and produce a positive return on investment (ROI) for health-systems. These programs should be explored in academic and nonacademic-based health-systems.
Welding and NDT development in support of Oman-India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Even, T.M.; Laing, B.; Hirsch, D.
1995-12-01
The Oman to India gas pipeline is designed for a maximum water depth of 3,500 m. For such a pipeline, resistance to hydrostatic collapse is a critical factor and dictates that very heavy wall pipe be used, preliminarily 24 inch ID x 1.625 inch wall. Because of the water depth, much of the installation will be by J-Lay, which requires that the joint be welded and inspected in a single station. This paper describes the results of welding and NDT test programs conducted to determine the minimum time to perform these operations in heavy wall pipe.
A graph-based approach for designing extensible pipelines
2012-01-01
Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism on the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. 
The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
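The central idea above — components as graph edges, formats as nodes, and paths through the graph as pipelines — can be sketched with a breadth-first search that composes a shortest tool chain on demand. This is an illustrative reimplementation of the concept, not the project's Java code; tool and format names are hypothetical.

```python
from collections import deque

def compose_pipeline(tools, source, target):
    """Treat each converter as a directed edge between format nodes
    and find a shortest tool chain from `source` to `target` by BFS,
    mirroring the paper's view that paths through the graph are
    pipelines. `tools` maps tool name -> (input_format, output_format)."""
    adj = {}  # adjacency: format -> list of (tool, next_format)
    for name, (src, dst) in tools.items():
        adj.setdefault(src, []).append((name, dst))
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        fmt, chain = queue.popleft()
        if fmt == target:
            return chain
        for name, nxt in adj.get(fmt, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, chain + [name]))
    return None  # no combination of tools performs the conversion

# Hypothetical format-conversion tools for a genetics application
tools = {
    "ped2vcf": ("ped", "vcf"),
    "vcf2bcf": ("vcf", "bcf"),
    "ped2bed": ("ped", "bed"),
}
```

Asking for `ped` to `bcf` automatically composes the two-step chain, and adding a new converter later extends the reachable conversions without touching existing code — the extensibility property the paper emphasizes.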
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that in order for pipeline computers to impact the economic feasibility of large nonlinear analyses it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
Deep ocean corrosion research in support of Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, F.W.; McKeehan, D.S.
1995-12-01
The increasing interest in deepwater exploration and production has motivated the development of technologies required to accomplish tasks heretofore possible only onshore and in shallow water. The tremendous expense of technology development and the cost of specialized equipment has created concerns that the design life of these facilities may be compromised by corrosion. The requirements to develop and prove design parameters to meet these demands will require an ongoing environmental testing and materials evaluation and development program. This paper describes a two-fold corrosion testing program involving: (1) the installation of two corrosion test devices installed in-situ, and (2) a laboratory test conducted in simulated site-specific seawater. These tests are expected to qualify key parameters necessary to design a cathodic protection system to protect the Oman-to-India pipeline.
Park, Byeonghyeok; Baek, Min-Jeong; Min, Byoungnam; Choi, In-Geol
2017-09-01
Genome annotation is a primary step in genomic research. To establish a lightweight and portable prokaryotic genome annotation pipeline for use in individual laboratories, we developed a Shiny app package designated as "P-CAPS" (Prokaryotic Contig Annotation Pipeline Server). The package is composed of R and Python scripts that integrate publicly available annotation programs into a server application. P-CAPS is not only a browser-based interactive application but also a distributable Shiny app package that can be installed on any personal computer. The final annotation is provided in various standard formats and is summarized in an R markdown document. Annotation can be visualized and examined with a public genome browser. A benchmark test showed that the annotation quality and completeness of P-CAPS were reliable and compatible with those of currently available public pipelines.
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
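The chain-partitioning flavor of this problem has a compact dynamic-programming illustration: assign a chain of module weights to k processors as contiguous blocks so that the heaviest block (the bottleneck) is minimized. This small DP conveys the idea only; Bokhari's sum-bottleneck path algorithm handles richer variants (communication costs, host-satellite systems) that this sketch omits.

```python
def min_bottleneck(weights, k):
    """Partition a chain of module weights into k contiguous blocks,
    minimizing the maximum block weight. dp[p][i] is the best
    bottleneck achievable using p processors for the first i modules."""
    n = len(weights)
    prefix = [0] * (n + 1)                    # prefix sums for block weights
    for i, w in enumerate(weights):
        prefix[i + 1] = prefix[i] + w
    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0
    for p in range(1, k + 1):
        for i in range(1, n + 1):
            for j in range(i):                # last block covers modules j..i-1
                block = prefix[i] - prefix[j]
                dp[p][i] = min(dp[p][i], max(dp[p - 1][j], block))
    return dp[k][n]
```

For module weights `[2, 3, 4, 5]` on two processors, the best split is `[2, 3] | [4, 5]` with bottleneck 9. The same contiguity constraint is what makes pipelined programs amenable to these techniques.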
A Pipeline Tool for CCD Image Processing
NASA Astrophysics Data System (ADS)
Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.
MSSSO is part of a collaboration developing a wide field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using SUN Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat field corrections, there is scope to extend it to other processing.
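The calibration stage the tool implements — up to and including flat-field correction — follows a standard recipe: subtract the bias frame, then divide by the normalized flat. The sketch below is a pure-Python illustration of that arithmetic under assumed frame layouts, not the tool's actual IRAF/mscred processing.

```python
def flat_field_correct(raw, bias, flat):
    """Basic CCD calibration: subtract the bias frame, then divide by
    the flat field normalized to its mean. Frames are nested lists of
    equal shape; a real pipeline would operate on FITS data via
    IRAF/mscred or an array library."""
    flat_mean = sum(sum(row) for row in flat) / (len(flat) * len(flat[0]))
    return [
        [(r - b) / (f / flat_mean) for r, b, f in zip(rrow, brow, frow)]
        for rrow, brow, frow in zip(raw, bias, flat)
    ]
```

With a uniform flat, the correction reduces to plain bias subtraction, which makes the routine easy to sanity-check.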
75 FR 32836 - Pipeline Safety: Workshop on Public Awareness Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... American Public Gas Association Association of Oil Pipelines American Petroleum Institute Interstate... the pipeline industry). Hazardous Liquid Gas Transmission/Gathering Natural Gas Distribution (10...
Long-Term Monitoring of Cased Pipelines Using Longrange Guided-Wave Technique
DOT National Transportation Integrated Search
2009-05-19
Integrity management programs for gas transmission pipelines are required by The Office of Pipeline Safety (OPS)/DOT. Direct Assessment (DA) and 'Other Technologies' have become the focus of assessment options for pipeline integrity on cased crossing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grafe, J.L.
During the past decade many changes have taken place in the natural gas industry, not the least of which is the way information (data) is acquired, moved, compiled, integrated and disseminated within organizations. At El Paso Natural Gas Company (EPNG) the Operations Control Department has been at the center of these changes. The Systems Section within Operations Control has been instrumental in developing the computer programs that acquire and store real-time operational data, and then make it available to not only the Gas Control function, but also to anyone else within the company who might require it and, to a limited degree, any supplier or purchaser of gas utilizing the El Paso pipeline. These computer programs which make up the VISA system are, in effect, the tools that help move the data that flows in the pipeline of information within the company. Their integration into this pipeline process is the topic of this paper.
Increasing Diversity and Gender Parity by working with Professional Organizations and HBCUs
NASA Astrophysics Data System (ADS)
Wims, T. R.
2017-12-01
Context/Purpose: This abstract proposes tactics for recruiting diverse applicants and addressing gender parity in the geoscience workforce. Methods: The geoscience community should continue to develop and expand a pipeline of qualified potential employees and managers at all levels. Recruitment from minority-based professional organizations, such as the National Society of Black Engineers (NSBE) and the Society of Hispanic Professional Engineers (SHPE), provides senior and midlevel scientists, engineers, program managers, and corporate managers/administrators with proven track records of success. Geoscience organizations should consider increasing hiring from the 100+ Historically Black Colleges and Universities (HBCUs), which have a proven track record of producing high-quality graduates with math, science, computer science, and engineering backgrounds. HBCU alumni have been working in all levels of government and corporate organizations for more than 50 years. Results: Professional organizations like NSBE have members with one to 40 years of applicable work experience, who are prime candidates for employment in the geoscience community at all levels. NSBE also operates pipeline programs to graduate 10,000 bachelor degree minority candidates per year by 2025, up from the current 3,620/year. HBCUs have established educational programs and several pipelines for attracting undergraduate students into the engineering and science fields. Since many HBCUs enroll more women than men, they are also addressing gender parity. Both professional organizations and HBCUs have pipeline programs that reach children in high school. Interpretation: Qualified and capable minority and women candidates are available in the United States. Pipelines for employing senior, mid-level, and junior skill sets are in place, but underutilized by some geoscience companies and organizations.
Practical Approach for Hyperspectral Image Processing in Python
NASA Astrophysics Data System (ADS)
Annala, L.; Eskelinen, M. A.; Hämäläinen, J.; Riihinen, A.; Pölönen, I.
2018-04-01
Python is a very popular programming language among data scientists around the world. Python can also be used in hyperspectral data analysis. There are some toolboxes designed for spectral imaging, such as Spectral Python and HyperSpy, but there is a need for an analysis pipeline that is easy to use and agile enough for different solutions. We propose a Python pipeline which is built on the packages xarray, Holoviews and scikit-learn. We have also developed some of our own tools: MaskAccessor, VisualisorAccessor and a spectral index library. They likewise fulfill our goal of easy and agile data processing. In this paper we will present our processing pipeline and demonstrate it in practice.
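A spectral index library of the kind mentioned above typically computes per-pixel band ratios. The sketch below shows an NDVI-style normalized difference over a cube laid out as `cube[row][col][band]`, in pure Python for illustration; the paper's actual pipeline operates on xarray DataArrays, and the cube layout here is an assumption.

```python
def normalized_difference(cube, band_a, band_b):
    """Compute a normalized-difference index, (a - b) / (a + b), for
    every pixel of a hyperspectral cube indexed as cube[row][col][band].
    A minimal stand-in for a spectral index library routine."""
    return [
        [(px[band_a] - px[band_b]) / (px[band_a] + px[band_b])
         for px in row]
        for row in cube
    ]

# One-pixel cube with two bands (e.g. near-infrared and red reflectance)
cube = [[[0.6, 0.2]]]
```

For this pixel the index is (0.6 - 0.2) / (0.6 + 0.2) = 0.5, the classic NDVI arithmetic.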
NASA Technical Reports Server (NTRS)
Brownston, Lee; Jenkins, Jon M.
2015-01-01
The Kepler Mission was launched in 2009 as NASA's first mission capable of finding Earth-size planets in the habitable zone of Sun-like stars. Its telescope consists of a 1.5-m primary mirror and a 0.95-m aperture. The 42 charge-coupled devices in its focal plane are read out every half hour, compressed, and then downlinked monthly. After four years, the second of four reaction wheels failed, ending the original mission. Back on Earth, the Science Operations Center developed the Science Pipeline to analyze about 200,000 target stars in Kepler's field of view, looking for evidence of periodic dimming suggesting that one or more planets had crossed the face of its host star. The Pipeline comprises several steps, from pixel-level calibration, through noise and artifact removal, to detection of transit-like signals and the construction of a suite of diagnostic tests to guard against false positives. The Kepler Science Pipeline consists of a pipeline infrastructure written in the Java programming language, which marshals data input to and output from MATLAB applications that are executed as external processes. The pipeline modules, which underwent continuous development and refinement even after data started arriving, employ several analytic techniques, many developed for the Kepler Project. Because of the large number of targets, the large amount of data per target, and the complexity of the pipeline algorithms, the processing demands are daunting. Some pipeline modules require days to weeks to process all of their targets, even when run on NASA's 128-node Pleiades supercomputer. The software developers are still seeking ways to increase the throughput. To date, the Kepler project has discovered more than 4000 planetary candidates, of which more than 1000 have been independently confirmed or validated to be exoplanets. Funding for this mission is provided by NASA's Science Mission Directorate.
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Shipkov, A. A.; Lovchev, V. N.; Gutsev, D. F.
2016-10-01
Problems of metal flow-accelerated corrosion (FAC) in the pipelines and equipment of the condensate-feeding and wet-steam paths of NPP power-generating units (PGUs) are examined. The goals, objectives, and main principles of the methodology for implementing an integrated program of AO Concern Rosenergoatom for preventing unacceptable FAC thinning and for increasing the operational flow-accelerated corrosion resistance of NPP equipment and pipelines (EaP) are formulated (hereinafter, the Program). The role and potential of Russian software packages in evaluating and predicting FAC rate are shown for solving practical problems of timely detection of unacceptable FAC thinning in elements of the pipelines and equipment of the secondary circuit of NPP PGUs. Information is given on the structure, properties, and functions of the software systems for supporting plant personnel in monitoring and planning the in-service inspection of FAC-thinned elements of pipelines and equipment of the secondary circuit of NPP PGUs, which have been created and implemented at several Russian NPPs equipped with VVER-1000, VVER-440, and BN-600 reactors. It is noted that one of the most important practical results of the software packages for supporting NPP personnel on flow-accelerated corrosion is the identification of elements at risk of intense local FAC thinning. Examples are given of successful practice at some Russian NPPs in using these software systems to support personnel in the early detection of secondary-circuit pipeline elements with FAC thinning close to an unacceptable level. Intermediate results of work on the Program are presented, and new tasks set in 2012 as part of the updated Program are described. The prospects of the developed methods and tools within the scope of the Program at the design and construction stages of NPP PGUs are discussed.
The main directions of work on solving the problems of flow-accelerated corrosion of pipelines and equipment in Russian NPP PGUs are defined.
Completion of development of robotics systems for inspecting unpiggable transmission pipelines.
DOT National Transportation Integrated Search
2013-02-01
This document presents the final report for a program focusing on the completion of the research, development and demonstration effort, which was initiated in 2001, for the development of two robotic systems for the in-line, live inspection of un...
Carthon, J. Margo Brooks; Nguyen, Thai-Huy; Chittams, Jesse; Park, Elizabeth; Guevara, James
2015-01-01
Objectives: The purpose of this study was to identify common components of diversity pipeline programs across a national sample of nursing institutions and determine what effect these programs have on increasing underrepresented minority enrollment and graduation. Design: Linked data from an electronic survey conducted November 2012 to March 2013 and American Association of Colleges of Nursing baccalaureate graduation and enrollment data (2008 and 2012). Participants: Academic and administrative staff of 164 nursing schools in 26 states and Puerto Rico in the United States. Methods: Chi-square statistics were used to (1) describe organizational features of nursing diversity pipeline programs and (2) determine significant trends in underrepresented minorities' graduation and enrollment between nursing schools with and without diversity pipeline programs. Results: Twenty percent (n = 33) of surveyed nursing schools reported a structured diversity pipeline program. The program measures most frequently associated with pipeline programs included mentorship and academic and psychosocial support. Asian, Hispanic, and Native Hawaiian/Pacific Islander nursing student enrollment increased between 2008 and 2012. Hispanic/Latino graduation rates increased (7.9%–10.4%, p = .001), but they decreased among Black (6.8%–5.0%, p = .004) and Native American/Pacific Islander students (2.1%–0.3%, p ≤ .001). Conclusions: Nursing diversity pipeline programs are associated with increases in nursing school enrollment and graduation for some, although not all, minority students. Future initiatives should build on current trends while creating targeted strategies to reverse downward graduation trends among Black, Native American, and Pacific Islander nursing students. PMID:24880900
Brooks Carthon, J Margo; Nguyen, Thai-Huy; Chittams, Jesse; Park, Elizabeth; Guevara, James
2014-01-01
The purpose of this study was to identify common components of diversity pipeline programs across a national sample of nursing institutions and determine what effect these programs have on increasing underrepresented minority enrollment and graduation. Linked data from an electronic survey conducted November 2012 to March 2013 and American Association of Colleges of Nursing baccalaureate graduation and enrollment data (2008 and 2012) were used. Participants were academic and administrative staff of 164 nursing schools in 26 states and Puerto Rico in the United States. Chi-square statistics were used to (1) describe organizational features of nursing diversity pipeline programs and (2) determine significant trends in underrepresented minorities' graduation and enrollment between nursing schools with and without diversity pipeline programs. Twenty percent (n = 33) of surveyed nursing schools reported a structured diversity pipeline program. The program measures most frequently associated with pipeline programs included mentorship and academic and psychosocial support. Asian, Hispanic, and Native Hawaiian/Pacific Islander nursing student enrollment increased between 2008 and 2012. Hispanic/Latino graduation rates increased (7.9%–10.4%, p = .001), but they decreased among Black (6.8%–5.0%, p = .004) and Native American/Pacific Islander students (2.1%–0.3%, p ≤ .001). Nursing diversity pipeline programs are associated with increases in nursing school enrollment and graduation for some, although not all, minority students. Future initiatives should build on current trends while creating targeted strategies to reverse downward graduation trends among Black, Native American, and Pacific Islander nursing students. Copyright © 2014 Elsevier Inc. All rights reserved.
Germaine Reyes-French; Timothy J. Cohen
1991-01-01
This paper outlines a mitigation program for pipeline construction impacts to oak tree habitat by describing the requirements for the Offsite Oak Mitigation Program for the All American Pipeline (AAPL) in Santa Barbara County, California. After describing the initial environmental analysis, the County regulatory structure is described under which the plan was required...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Evaluations AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice... improve performance. For gas transmission pipelines, Sec. Sec. 192.911(i) and 192.945 define the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-03
... Information Collection Activity Under OMB Review: Pipeline Corporate Security Review AGENCY: Transportation.... Information Collection Requirement Title: Pipeline Corporate Security Review (PCSR). Type of Request... current industry security practices through its Pipeline Corporate Security Review (PCSR) program. The...
Data processing pipeline for serial femtosecond crystallography at SACLA.
Nakane, Takanori; Joti, Yasumasa; Tono, Kensuke; Yabashi, Makina; Nango, Eriko; Iwata, So; Ishitani, Ryuichiro; Nureki, Osamu
2016-06-01
A data processing pipeline for serial femtosecond crystallography at SACLA was developed, based on Cheetah [Barty et al. (2014). J. Appl. Cryst. 47, 1118-1131] and CrystFEL [White et al. (2016). J. Appl. Cryst. 49, 680-689]. The original programs were adapted for data acquisition through the SACLA API, thread and inter-node parallelization, and efficient image handling. The pipeline consists of two stages: the first, online stage can analyse all images in real time, with a latency of less than a few seconds, to provide feedback on hit rate and detector saturation. The second, offline stage converts hit images into HDF5 files and runs CrystFEL for indexing and integration. The size of the filtered compressed output is comparable to that of a synchrotron data set. The pipeline enables real-time feedback and rapid structure solution during beamtime.
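The online hit-finding stage that feeds the real-time hit-rate feedback can be caricatured in a few lines. Everything below is a synthetic stand-in for illustration — the frames, the intensity threshold, and the peak-count criterion are invented and are not the SACLA implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def is_hit(frame, peak_threshold=4.0, min_peaks=20):
    # Count pixels well above the background as a crude proxy for Bragg peaks.
    return int((frame > peak_threshold).sum()) >= min_peaks

# Synthetic stream of 64x64 detector frames with Gaussian background noise.
frames = [rng.normal(0.0, 1.0, size=(64, 64)) for _ in range(40)]
# Inject a few artificial "hit" frames containing bright pixels.
for i in (3, 17, 29):
    frames[i].flat[:50] = 10.0

# Online stage: flag hits as frames arrive and report the running hit rate.
hits = [i for i, f in enumerate(frames) if is_hit(f)]
hit_rate = len(hits) / len(frames)
print(hits, hit_rate)
```

In a real two-stage design, only the flagged frame indices would be handed to the offline stage for conversion and indexing, which is what keeps the filtered output small.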
49 CFR 192.909 - How can an operator change its integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.909 How can an operator change its integrity management...
SEALS: an Innovative Pipeline Program Targeting Obstacles to Diversity in the Physician Workforce.
Fritz, Cassandra D L; Press, Valerie G; Nabers, Darrell; Levinson, Dana; Humphrey, Holly; Vela, Monica B
2016-06-01
Medical schools may find implementing pipeline programs for minority pre-medical students prohibitive due to a number of factors, including the lack of well-described programs in the literature, the limited evidence for program development, and institutional financial barriers. Our goals were to (1) design a pipeline program based on educational theory; (2) deliver the program in a low-cost, sustainable manner; and (3) evaluate intermediate outcomes of the program. SEALS is a 6-week program for minority pre-medical students based on an asset-bundles model designed to promote (1) socialization and professionalism, (2) education in science learning tools, (3) acquisition of financial literacy, (4) the leveraging of mentorship and networks, and (5) social expectations and resilience. This is a prospective mixed methods study. Students completed survey instruments pre-program, post-program, and 6 months post-program, establishing intermediate outcome measures. Thirteen students matriculated to SEALS. The SEALS cohort rated themselves as improved or significantly improved when asked to rate their familiarity with MCAT components (p < 0.01), ability to ask for a letter of recommendation (p = 0.04), and importance of interview skills (p = 0.04) compared with before the program. Over 90% of students referenced the health disparities lecture series as an inspiration to advocate for minority health. Six-month surveys suggested that SEALS students acquired and applied four of the five assets at their college campuses. This low-cost, high-quality program can be undertaken by medical schools interested in promoting a diverse workforce that may ultimately begin to address and reduce health care disparities.
The Role of Mentoring in Leadership Development.
Crisp, Gloria; Alvarado-Young, Kelly
2018-06-01
This chapter discusses the role of mentoring in facilitating leadership development for students throughout the educational pipeline. Related literature is summarized and practical guidance is provided for designing, implementing, and evaluating programs with a focus toward developing students as leaders. © 2018 Wiley Periodicals, Inc.
A Controlled Evaluation of a High School Biomedical Pipeline Program: Design and Methods
ERIC Educational Resources Information Center
Winkleby, Marilyn A.; Ned, Judith; Ahn, David; Koehler, Alana; Fagliano, Kathleen; Crump, Casey
2014-01-01
Given limited funding for school-based science education, non-school-based programs have been developed at colleges and universities to increase the number of students entering science- and health-related careers and address critical workforce needs. However, few evaluations of such programs have been conducted. We report the design and methods of…
ERIC Educational Resources Information Center
Pinckney, Charlyene Carol
2014-01-01
The current study was undertaken to examine the effectiveness of the Rowan University-School of Osteopathic Medicine - Summer Pre-Medical Research and Education Program (Summer PREP), a postsecondary medical sciences enrichment pipeline program for under-represented and disadvantaged students. Thirty-four former program participants were surveyed…
Code of Federal Regulations, 2010 CFR
2010-10-01
... Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge...
Employing Machine-Learning Methods to Study Young Stellar Objects
NASA Astrophysics Data System (ADS)
Moore, Nicholas
2018-01-01
Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
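The train-on-labeled, apply-to-unknown workflow this abstract describes is the standard supervised-classification pattern in scikit-learn. The sketch below uses synthetic data; the two "infrared color" features and the three source classes are hypothetical placeholders, not the catalogues or features of the project:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical training catalogue: two infrared colors per source (e.g.
# [3.6]-[4.5] and [5.8]-[8.0] magnitudes) with known classes
# (0 = field star, 1 = YSO candidate, 2 = galaxy). All values are synthetic.
X = rng.normal(size=(600, 2))
y = rng.integers(0, 3, size=600)

# Hold out part of the labeled data to check the trained model.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Apply the trained classifier to an "unknown" catalogue of 50 sources.
unknown = rng.normal(size=(50, 2))
pred = clf.predict(unknown)
print(pred[:5])
```

With real archival data the payoff is exactly what the abstract claims: once trained, the classifier labels an entire catalogue in seconds rather than requiring per-source inspection.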
Bartels, Stephen J; Lebowitz, Barry D; Reynolds, Charles F; Bruce, Martha L; Halpain, Maureen; Faison, Warachal E; Kirwin, Paul D
2010-01-01
This report summarizes the findings and recommendations of an expert consensus workgroup that addressed the endangered pipeline of geriatric mental health (GMH) researchers. The workgroup was convened at the Summit on Challenges in Recruitment, Retention, and Career Development in Geriatric Mental Health Research in late 2007. Major identified challenges included attracting and developing early-career investigators into the field of GMH research; a shortfall of geriatric clinical providers and researchers; a disproportionate lack of minority researchers; inadequate mentoring and career development resources; and the loss of promising researchers during the vulnerable period of transition from research training to independent research funding. The field of GMH research has been at the forefront of developing successful programs that address these issues while spanning the spectrum of research career development. These programs serve as a model for other fields and disciplines. Core elements of these multicomponent programs include summer internships to foster early interest in GMH research (Summer Training on Aging Research Topics-Mental Health Program), research sponsorships aimed at recruitment into the field of geriatric psychiatry (Stepping Stones), research training institutes for early career development (Summer Research Institute in Geriatric Psychiatry), mentored intensive programs on developing and obtaining a first research grant (Advanced Research Institute in Geriatric Psychiatry), targeted development of minority researchers (Institute for Research Minority Training on Mental Health and Aging), and a Web-based clearinghouse of mentoring seminars and resources (MedEdMentoring.org). This report discusses implications of and principles for disseminating these programs, including examples of replications in fields besides GMH research.
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56- 07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on their pipeline safety program. Althou...
Optimal Energy Consumption Analysis of Natural Gas Pipeline
Liu, Enbin; Li, Changjun; Yang, Yi
2014-01-01
There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported under different boot (compressor start-up) programs and inlet pressures, combined with temperature control parameters, and different transport schemes have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are set empirically by dispatchers, resulting in high energy consumption, a practice at odds with energy-conservation policies. Therefore, based on a full understanding of the actual needs of pipeline companies, we introduce production unit consumption indicators to establish an objective function for lowering energy consumption. By solving the model with a dynamic programming method and preparing calculation software, we ensure that the solution process is quick and efficient. Using the established optimization method, we analyzed the energy savings for the XQ gas pipeline. By optimizing the boot program, the inlet station pressure, and the temperature parameters, we achieved the optimal energy consumption. Compared with the measured energy consumption, the pipeline has the potential to reduce energy consumption by 11 to 16 percent. PMID:24955410
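The dynamic-programming idea behind such an optimization can be illustrated with a deliberately simplified model: stages are compressor stations, the state is the discrete outlet pressure chosen at each one, and the stage cost is an idealized compression work. The candidate pressures, the fixed per-segment pressure drop, and the logarithmic work formula are all assumptions for illustration, not the paper's model:

```python
import math

# Candidate outlet pressures at each station, MPa (assumed values).
pressures = [5.0, 5.5, 6.0, 6.5]
drop = 1.2          # pressure lost along each pipeline segment, MPa (assumed)
n_stations = 4
inlet0 = 4.0        # pressure entering the first station, MPa (assumed)

def energy(p_in, p_out):
    """Idealized compression work, proportional to log of the pressure ratio.
    Zero when no compression is needed (throttling is treated as free)."""
    if p_out <= p_in:
        return 0.0
    return math.log(p_out / p_in)

# best maps each outlet pressure to the minimal cumulative energy so far.
best = {p: energy(inlet0, p) for p in pressures}
for _ in range(n_stations - 1):
    nxt = {}
    for p_out in pressures:
        # Try every previous outlet pressure; the segment drop gives the
        # inlet pressure at this station.
        nxt[p_out] = min(best[p_prev] + energy(p_prev - drop, p_out)
                         for p_prev in pressures)
    best = nxt

print(min(best.values()))
```

The real problem adds flow rates, temperatures, and unit start-up choices to the state, but the recursion has the same shape: minimize over the previous stage's states plus a transition cost.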
77 FR 31827 - Pipeline Safety: Pipeline Damage Prevention Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
...://www.regulations.gov . FOR FURTHER INFORMATION CONTACT: For further information contact Sam Hall, Program Manager, PHMSA by email at sam[email protected] or by telephone at (804) 556-4678 or Larry White...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2012 CFR
2012-07-01
... interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for the benefit of the...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2011 CFR
2011-07-01
... interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for the benefit of the...
A Study Skills Curriculum for Pipeline Programs.
ERIC Educational Resources Information Center
Saks, Norma Susswein, Ed.; Killeya, Ley A., Ed.; Rushton, Joan, Ed.
This study skills curriculum is part of a "pipeline" program designed to recruit, matriculate, and graduate educationally disadvantaged students at the University of Medicine and Dentistry of New Jersey-Robert Wood Johnson Medical School (UMDNJ-RWJMS). It is an integral part of the Biomedical Careers Program (BCP) and the Science…
ERIC Educational Resources Information Center
Masterson, Kathryn
2008-01-01
The University of Michigan at Ann Arbor is offering a development internship program that is designed to give students real-world experience working in development jobs and the chance to meet major donors and network with alumni. Its goals are lofty: to create a pipeline of young people for the development profession; diversify the fund-raising…
Edlow, Brian L.; Hamilton, Karen; Hamilton, Roy H.
2007-01-01
This article provides an overview of the University of Pennsylvania School of Medicine’s Pipeline Neuroscience Program, a multi-tiered mentorship and education program for Philadelphia high school students in which University of Pennsylvania undergraduates are integrally involved. The Pipeline Neuroscience Program provides mentorship and education for students at all levels. High school students are taught by undergraduates, who learn from medical students who, in turn, are guided by neurology residents and fellows. Throughout a semester-long course, undergraduates receive instruction in neuroanatomy, neuroscience, and clinical neurology as part of the Pipeline’s case-based curriculum. During weekly classes, undergraduates make the transition from students to community educators by integrating their new knowledge into lesson plans that they teach to small groups of medically and academically underrepresented Philadelphia high school students. The Pipeline program thus achieves the dual goals of educating undergraduates about neuroscience and providing them with an opportunity to perform community service. PMID:23493190
78 FR 57455 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...
Guo, Li; Allen, Kelly S; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M; Wick, Robert L; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host-pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems.
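The binning step at the heart of such a metatranscriptomic analysis — assign each assembled transcript to host or pathogen by its best database hit, then rank by expression — can be sketched in a few lines. The transcript IDs, hit organisms, and TPM values below are invented for illustration and are not data from the study:

```python
# Hypothetical post-assembly records: (transcript_id, best_hit_organism, TPM).
transcripts = [
    ("t1", "Ocimum basilicum", 850.0),
    ("t2", "Peronospora belbahrii", 1200.0),
    ("t3", "Ocimum basilicum", 15.0),
    ("t4", "Peronospora belbahrii", 430.0),
]

# Bin each transcript as host (sweet basil) or pathogen (downy mildew)
# according to the organism of its best hit.
bins = {"host": [], "pathogen": []}
for tid, organism, tpm in transcripts:
    key = "host" if organism == "Ocimum basilicum" else "pathogen"
    bins[key].append((tpm, tid))

# The most highly expressed transcript in each bin is a candidate defense
# gene (host) or virulence gene (pathogen).
top = {k: sorted(v, reverse=True)[0][1] for k, v in bins.items()}
print(top)
```

In the actual pipeline the "best hit" would come from a sequence-search program and the TPMs from a quantification tool; the point is that even this simple partition-and-rank step already surfaces the candidate genes the abstract describes.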
Building the Minority Faculty Development Pipeline.
ERIC Educational Resources Information Center
Gates, Paul E.; Ganey, James H.; Brown, Marc D.
2003-01-01
Describes efforts toward minority faculty development in dentistry, including those of Harlem Hospital-Columbia University School of Dentistry and Oral Surgery, the National Dental Association Foundation, and Bronx Lebanon Hospital Center. Explains that critical elements in the success of these programs are environment, selection criteria,…
OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping
2017-02-01
The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Safety orders. 190.239 Section 190.239 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY PIPELINE SAFETY PROGRAMS AND RULEMAKING...
ML-o-Scope: A Diagnostic Visualization System for Deep Machine Learning Pipelines
2014-05-16
ML-o-scope: a diagnostic visualization system for deep machine learning pipelines. Daniel Bruckner, Electrical Engineering and Computer Sciences. The report presents the system as a support for tuning large-scale object-classification pipelines, motivated by a new generation of pipelined machine learning models.
Implementing a Workforce Development Pipeline
NASA Technical Reports Server (NTRS)
Hix, Billy
2002-01-01
Research shows that the number of highly trained scientists and engineers continued a steady decline during the 1990s. Furthermore, at the high school level, almost 40% of graduates seek technical skills in preparation for entering the workforce directly. The decline in enrollment in technology and science programs, along with the lack of viable vocational programs, haunts educators and businesses alike. However, MSFC (Marshall Space Flight Center) has the opportunity to become a leading-edge model of workforce development by offering a unified program of apprenticeships, workshops, and educational initiatives. These programs will be designed to encourage young people of all backgrounds to pursue the fields of technology and science, to support research opportunities, and to assist teachers with the systemic changes they are facing. The emphasis of our program, by grade level, will be: elementary level, exposure to the workforce; middle school, examining the workforce; high school and beyond, instructing the workforce. It is proposed that MSFC create a well-integrated Workforce Development Pipeline Program. The program will act to integrate the many and varied programs offered across MSFC directorates and offices. It will offer a clear path of programs for students throughout middle school, high school, technical training, and colleges and universities. The end result would be technicians and holders of bachelor's degrees, master's degrees, and PhDs in science and engineering fields entering the nation's workforce, with a focus on NASA's future personnel needs.
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
Rep. Young, Don [R-AK-At Large]
2011-02-08
House - 02/09/2011 Referred to the Subcommittee on Railroads, Pipelines, and Hazardous Materials.
78 FR 59906 - Pipeline Safety: Class Location Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
... 192 [Docket No. PHMSA-2013-0161] Pipeline Safety: Class Location Requirements AGENCY: Pipeline and... Location Requirements,'' seeking comments on whether integrity management program (IMP) requirements, or... for class location requirements. PHMSA has received two requests to extend the comment period to allow...
Pipeline safety and security : improved workforce planning and communication needed
DOT National Transportation Integrated Search
2002-08-01
Pipelines transport about 65 percent of the crude oil and refined oil products and nearly all of the natural gas in the United States. The Office of Pipeline Safety (OPS), within the Department of Transportation's (DOT) Research and Special Programs ...
49 CFR 192.1003 - What do the regulations in this subpart cover?
Code of Federal Regulations, 2010 CFR
2010-10-01
... AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas...? General. This subpart prescribes minimum requirements for an IM program for any gas distribution pipeline...
The Rural Girls in Science Project: from Pipelines to Affirming Science Education
NASA Astrophysics Data System (ADS)
Ginorio, Angela B.; Huston, Michelle; Frevert, Katie; Seibel, Jane Bierman
The Rural Girls in Science (RGS) program was developed to foster interest in science, engineering, and mathematics among rural high school girls in the state of Washington. Girls served include American Indians, Latinas, and Whites. This article provides an overview of the program and its outcomes not only for the participants (girls, teachers, counselors, and schools) but also for the researchers. Lessons learned from and about the participants are presented, and lessons learned from the process are discussed to illustrate how RGS moved from a focus on individuals to a focus on the school. The initial guiding concepts (self-esteem and scientific pipeline) were replaced by "possible selves" and our proposed complementary concepts: science-affirming and affirming science education.
Development and Applications of Pipeline Steel in Long-Distance Gas Pipeline of China
NASA Astrophysics Data System (ADS)
Chunyong, Huo; Yang, Li; Lingkang, Ji
Over the past decades, with the wide application of microalloying and Thermal Mechanical Control Processing (TMCP) technology, a good balance of strength, toughness, plasticity, and weldability has been achieved in pipeline steel, allowing oil and gas pipelines to be developed extensively in China to meet strong domestic energy demand. In this paper, the development history of pipeline steel and gas pipelines in China is briefly reviewed. The microstructural characteristics and mechanical performance of pipeline steels used in some representative Chinese gas pipelines built at different stages are summarized. Through analysis of the evolution of the pipeline service environment, prospective trends in the application of pipeline steel in China are also presented.
The American Science Pipeline: Sustaining Innovation in a Time of Economic Crisis
ERIC Educational Resources Information Center
Hue, Gillian; Sales, Jessica; Comeau, Dawn; Lynn, David G.; Eisen, Arri
2010-01-01
Significant limitations have emerged in America's science training pipeline, including inaccessibility, inflexibility, financial limitations, and lack of diversity. We present three effective programs that collectively address these challenges. The programs are grounded in rigorous science and integrate through diverse disciplines across…
Comeau, Donald C.; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W. John
2014-01-01
BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net PMID:24935050
Developing and Managing Talent in the SEA. Benchmark. No. 4
ERIC Educational Resources Information Center
Gross, B.; Jochim, A.
2013-01-01
State education agencies (SEAs) are reframing their work to be more coordinated and strategic but talent in most SEAs continues to be in large part defined by federal programs and oriented toward the routines of compliance. Existing talent pipelines in SEAs are rooted in the historic functions of administering federal programs and doing little…
DOT National Transportation Integrated Search
2009-01-01
The Army maintains the capability to employ temporary petroleum pipelines. With the fiscal year (FY) 08-13 program objective memorandum (POM) force, the Army proposes to retain two Active and twelve Reserve Petroleum Pipeline and Terminal Operating...
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
ERIC Educational Resources Information Center
Knox, Ronny D.
2013-01-01
This research project used the Narrative Non-fiction method to examine the school-to-prison pipeline phenomenon through the experiences of four previously incarcerated adult males who had been placed in Discipline Alternative Educational Programs (DAEPs) during their public school education. In 1981, DAEPs were instituted as a pilot program to…
Natural Gas Pipeline Replacement Programs Reduce Methane Leaks and Improve Consumer Safety
NASA Astrophysics Data System (ADS)
Jackson, R. B.
2015-12-01
From production through distribution, oil and natural gas infrastructure provide the largest source of anthropogenic methane in the U.S. and the second largest globally. To examine the prevalence of natural gas leaks downstream in distribution systems, we mapped methane leaks across 595, 750, and 247 road miles of three U.S. cities—Durham, NC, Cincinnati, OH, and Manhattan, NY, respectively—at different stages of pipeline replacement of cast iron and other older materials. We compare results with those for two cities we mapped previously, Boston and Washington, D.C. Overall, cities with pipeline replacement programs have considerably fewer leaks per mile than cities without such programs. Similar programs around the world should provide additional environmental, economic, and consumer safety benefits.
RGAugury: a pipeline for genome-wide prediction of resistance gene analogs (RGAs) in plants.
Li, Pingchuan; Quan, Xiande; Jia, Gaofeng; Xiao, Jin; Cloutier, Sylvie; You, Frank M
2016-11-02
Resistance gene analogs (RGAs), such as NBS-encoding proteins, receptor-like protein kinases (RLKs) and receptor-like proteins (RLPs), are potential R-genes that contain specific conserved domains and motifs. Thus, RGAs can be predicted based on their conserved structural features using bioinformatics tools. Computer programs have been developed for the identification of individual domains and motifs from the protein sequences of RGAs but none offer a systematic assessment of the different types of RGAs. A user-friendly and efficient pipeline is needed for large-scale genome-wide RGA predictions of the growing number of sequenced plant genomes. An integrative pipeline, named RGAugury, was developed to automate RGA prediction. The pipeline first identifies RGA-related protein domains and motifs, namely nucleotide binding site (NB-ARC), leucine rich repeat (LRR), transmembrane (TM), serine/threonine and tyrosine kinase (STTK), lysin motif (LysM), coiled-coil (CC) and Toll/Interleukin-1 receptor (TIR). RGA candidates are identified and classified into four major families based on the presence of combinations of these RGA domains and motifs: NBS-encoding, TM-CC, and membrane associated RLP and RLK. All time-consuming analyses of the pipeline are parallelized to improve performance. The pipeline was evaluated using the well-annotated Arabidopsis genome. A total of 98.5%, 85.2%, and 100% of the reported NBS-encoding genes, membrane associated RLPs and RLKs were validated, respectively. The pipeline was also successfully applied to predict RGAs for 50 sequenced plant genomes. A user-friendly web interface was implemented to ease command line operations, facilitate visualization and simplify result management for multiple datasets. RGAugury is an efficient, integrative bioinformatics tool for large scale genome-wide identification of RGAs. It is freely available at Bitbucket: https://bitbucket.org/yaanlpc/rgaugury.
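As an illustration of the classification step this abstract describes, the toy function below assigns an RGA family from a set of detected domains. The decision rules and family labels are simplified assumptions for illustration only, not RGAugury's actual logic:

```python
# Toy sketch: classify a protein into an RGA family from the combination
# of domains/motifs detected in it. Rules are illustrative assumptions.

def classify_rga(domains: set[str]) -> str:
    """Return an RGA family name for a set of detected domain labels."""
    if "NB-ARC" in domains:
        return "NBS-encoding"          # nucleotide-binding-site family
    if "STTK" in domains and "TM" in domains:
        return "RLK"                   # membrane-associated receptor-like kinase
    if "LRR" in domains and "TM" in domains:
        return "RLP"                   # receptor-like protein (no kinase domain)
    if "CC" in domains and "TM" in domains:
        return "TM-CC"                 # transmembrane coiled-coil family
    return "not an RGA candidate"

print(classify_rga({"TIR", "NB-ARC", "LRR"}))  # prints NBS-encoding
```

Because each family is defined by a domain combination rather than a single hit, a rule cascade like this is a natural fit; the real pipeline additionally runs the underlying domain predictors in parallel.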
A Controlled Evaluation of a High School Biomedical Pipeline Program: Design and Methods
NASA Astrophysics Data System (ADS)
Winkleby, Marilyn A.; Ned, Judith; Ahn, David; Koehler, Alana; Fagliano, Kathleen; Crump, Casey
2014-02-01
Given limited funding for school-based science education, non-school-based programs have been developed at colleges and universities to increase the number of students entering science- and health-related careers and address critical workforce needs. However, few evaluations of such programs have been conducted. We report the design and methods of a controlled trial to evaluate the Stanford Medical Youth Science Program's Summer Residential Program (SRP), a 25-year-old university-based biomedical pipeline program. This 5-year matched cohort study uses an annual survey to assess educational and career outcomes among four cohorts of students who participate in the SRP and a matched comparison group of applicants who were not chosen to participate in the SRP. Matching on sociodemographic and academic background allows control for potential confounding. This design enables the testing of whether the SRP has an independent effect on educational- and career-related outcomes above and beyond the effects of other factors such as gender, ethnicity, socioeconomic background, and pre-intervention academic preparation. The results will help determine which curriculum components contribute most to successful outcomes and which students benefit most. After 4 years of follow-up, the results demonstrate high response rates from SRP participants and the comparison group with completion rates near 90 %, similar response rates by gender and ethnicity, and little attrition with each additional year of follow-up. This design and methods can potentially be replicated to evaluate and improve other biomedical pipeline programs, which are increasingly important for equipping more students for science- and health-related careers.
The Stanford Medical Youth Science Program: educational and science-related outcomes.
Crump, Casey; Ned, Judith; Winkleby, Marilyn A
2015-05-01
Biomedical preparatory programs (pipeline programs) have been developed at colleges and universities to better prepare youth for entering science- and health-related careers, but outcomes of such programs have seldom been rigorously evaluated. We conducted a matched cohort study to evaluate the Stanford Medical Youth Science Program's Summer Residential Program (SRP), a 25-year-old university-based biomedical pipeline program that reaches out to low-income and underrepresented ethnic minority high school students. Five annual surveys were used to assess educational outcomes and science-related experience among 96 SRP participants and a comparison group of 192 youth who applied but were not selected to participate in the SRP, using ~2:1 matching on sociodemographic and academic background to control for potential confounders. SRP participants were more likely than the comparison group to enter college (100.0 vs. 84.4 %, p = 0.002), and both of these matriculation rates were more than double the statewide average (40.8 %). In most areas of science-related experience, SRP participants reported significantly more experience (>twofold odds) than the comparison group at 1 year of follow-up, but these differences did not persist after 2-4 years. The comparison group reported substantially more participation in science or college preparatory programs, more academic role models, and less personal adversity than SRP participants, which likely influenced these findings toward the null hypothesis. SRP applicants, irrespective of whether selected for participation, had significantly better educational outcomes than population averages. Short-term science-related experience was better among SRP participants, although longer-term outcomes were similar, most likely due to college and science-related opportunities among the comparison group. We discuss implications for future evaluations of other biomedical pipeline programs.
Exposure of Seventh and Eighth Grade Urban Youth to Dentistry and Oral Health Careers.
Mayberry, Melanie E; Young, Deirdre D; Sawilowsky, Shlomo; Hoelscher, Diane
2018-01-01
While pipeline programs for students from underrepresented minority groups have been established at the high school and college levels, fewer programs have been developed for middle school students. In an effort to reach this cohort, the University of Detroit Mercy School of Dentistry embarked on a grassroots collaborative pipeline program with two distinct segments: Urban Impressions and Dental Imprint. Their purpose is to expose Detroit-area seventh and eighth grade students to careers in dentistry, provide oral health education, and introduce role models. The aim of this pilot study was to determine outcomes for the middle school participants in Urban Impressions (n=86) and Dental Imprint (n=68). Both segments featured hands-on dental activities at the dental school. Outcomes were assessed by pretest-posttest surveys. Across the three cohorts, a total of 86 students participated in one or more sessions, with 57 completing the pre- and post-program surveys, for a 66% response rate. The results showed that the Dental Imprint respondents' knowledge of oral health, dental admissions, and specialties increased by an average 26% over three years. The gain in knowledge for each cohort was statistically significant (p<0.001). Overall, 91% of Urban Impressions and 95% of Dental Imprint respondents were positive about the value of the program. Thirty-one of 57 Urban Impressions respondents indicated interest in dentistry as a career following the program. These results suggest that the two segments of this program are meeting their goals of increasing middle grade students' awareness of oral health professions including dentistry and providing access to role models. Institutions may benefit from the description of strategies used by this program to address challenges related to establishing early pipeline programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wynne, Adam S.
2011-05-05
In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation from other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high-volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high-volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements.
The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low-friction, robust, open source middleware platform and extends it with component- and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
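The staged Web-crawler example in this abstract can be sketched as a minimal linear pipeline. The stage names and logic below are illustrative stand-ins (not MeDICi/MIF code): each stage is a self-contained function that consumes the previous stage's output.

```python
import re

def strip_html(doc: str) -> str:
    """Stage 1: remove HTML tags, leaving only the raw text."""
    return re.sub(r"<[^>]+>", " ", doc)

def tokenize(text: str) -> list[str]:
    """Stage 2: break raw text into word tokens."""
    return re.findall(r"[A-Za-z']+", text)

def find_capitalized(tokens: list[str]) -> list[str]:
    """Stage 3: crude stand-in for extracting names of people/places."""
    return [t for t in tokens if t[0].isupper()]

def run_pipeline(doc: str) -> list[str]:
    """Run the stages in sequence; each works in isolation."""
    data = doc
    for stage in (strip_html, tokenize, find_capitalized):
        data = stage(data)
    return data

print(run_pipeline("<p>Alice visited <b>Paris</b> in May.</p>"))
# prints ['Alice', 'Paris', 'May']
```

A linear chain like this needs no infrastructure beyond a scripting language, which is exactly the abstract's point; it is the forks, joins, queuing and distribution of the more complex topologies that motivate a framework like MIF.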
DIVERSITY IN THE BIOMEDICAL RESEARCH WORKFORCE: DEVELOPING TALENT
McGee, Richard; Saran, Suman; Krulwich, Terry A.
2012-01-01
Much has been written about the need for and barriers to achievement of greater diversity in the biomedical workforce from the perspectives of gender, race and ethnicity; this is not a new topic. These discussions often center around a ‘pipeline metaphor’ which imagines students flowing through a series of experiences to eventually arrive at a science career. Here we argue that diversity will only be achieved if the primary focus is on: what is happening within the pipeline, not just counting individuals entering and leaving it; de-emphasizing achieving academic milestones by ‘typical’ ages; and adopting approaches that most effectively develop talent. Students may develop skills at different rates based on factors such as earlier access to educational resources, exposure to science (especially research experiences), and competing demands for time and attention during high school and college. Therefore, there is wide variety among students at any point along the pipeline. Taking this view requires letting go of imagining the pipeline as a sequence of age-dependent steps in favor of milestones of skill and talent development decoupled from age or educational stage. Emphasizing talent development opens up many new approaches for science training outside of traditional degree programs. This article provides examples of such approaches, including interventions at the post-baccalaureate and PhD levels, as well as a novel coaching model that incorporates well-established social science theories and complements traditional mentoring. These approaches could significantly impact diversity by developing scientific talent, especially among currently underrepresented minorities. PMID:22678863
Guo, Li; Allen, Kelly S.; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M.; Wick, Robert L.; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host–pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems. PMID:27462318
World Bank oil-pipeline project designed to prevent HIV transmission.
Kigotho, A W
1997-11-29
A World Bank-funded oil pipeline project, in Chad and Cameroon, is the first large-scale construction project in sub-Saharan Africa to incorporate an HIV/AIDS prevention component. The project entails the development of oil fields in southern Chad and construction of 1100 km of pipeline to port facilities on Cameroon's Atlantic coast. 3000 construction workers from the two countries will be employed between 1998 and 2001, including about 600 truck drivers. In some areas along the pipeline route, 50% of the prostitutes (who are frequented by truck drivers) are HIV-infected. The HIV/AIDS intervention aims to prevent HIV and sexually transmitted diseases (STDs) among project workers through social marketing of condoms, treatment of STDs in prostitutes along the route, and health education to modify high-risk behaviors. The program is considered a test case for African governments and donors interested in whether the integration of a health component in major construction projects can avoid AIDS epidemics in affected countries.
49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... understanding and analysis of the failure mechanisms or threats to integrity of each pipeline segment. (2) An... pipeline, information and data used for the information analysis; (13) results of the information analyses...
A real-time coherent dedispersion pipeline for the giant metrewave radio telescope
NASA Astrophysics Data System (ADS)
De, Kishalay; Gupta, Yashwant
2016-02-01
A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with a NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
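The core operation of coherent dedispersion can be sketched as a frequency-domain phase correction applied to the voltage signal. The sketch below uses the standard cold-plasma dispersion phase with illustrative band parameters; it is a conceptual illustration, not the GMRT GSB pipeline code:

```python
import numpy as np

# Dispersion constant, MHz^2 pc^-1 cm^3 s (standard pulsar convention).
D_CONST = 4.148808e3

def dispersion_phase(df_mhz, f0_mhz, dm):
    """Phase (radians) imparted by interstellar dispersion at frequency
    offset df_mhz from band centre f0_mhz, for dispersion measure dm."""
    return (2 * np.pi * D_CONST * 1e6 * dm
            * df_mhz**2 / (f0_mhz**2 * (f0_mhz + df_mhz)))

def apply_dispersion(voltage, f0_mhz, bw_mhz, dm, sign=+1):
    """Disperse (sign=+1) or coherently dedisperse (sign=-1) a complex
    baseband voltage series with a single FFT/IFFT pass."""
    n = voltage.size
    df = np.fft.fftfreq(n, d=1.0 / bw_mhz)   # offsets from f0, in MHz
    h = np.exp(sign * 1j * dispersion_phase(df, f0_mhz, dm))
    return np.fft.ifft(np.fft.fft(voltage) * h)

# Round trip: dedispersion exactly undoes dispersion, because the ISM's
# transfer function is a pure, deterministic phase rotation.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
y = apply_dispersion(x, f0_mhz=327.0, bw_mhz=32.0, dm=26.8)
x_rec = apply_dispersion(y, f0_mhz=327.0, bw_mhz=32.0, dm=26.8, sign=-1)
print(np.allclose(x, x_rec))  # prints True
```

Incoherent dedispersion, by contrast, only shifts detected (squared) intensities between channels and cannot remove the smearing within a channel, which is why the coherent approach recovers higher time resolution.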
Measuring the Success of a Pipeline Program to Increase Nursing Workforce Diversity.
Katz, Janet R; Barbosa-Leiker, Celestina; Benavides-Vaello, Sandra
2016-01-01
The purpose of this study was to understand changes in knowledge and opinions of underserved American Indian and Hispanic high school students after attending a 2-week summer pipeline program, using and testing a pre/post survey. The research aims were to (a) psychometrically analyze the survey to determine if scale items could be summed to create a total scale score or subscale scores; (b) assess change in scores pre/postprogram; and (c) examine the survey to make suggestions for modifications and further testing to develop a valid tool to measure changes in student perceptions about going to college and nursing as a result of pipeline programs. Psychometric analysis indicated poor model fit for a 1-factor model for the total scale and the majority of subscales. Nonparametric tests indicated statistically significant increases in 13 items and decreases in 2 items. Therefore, while total scores or subscale scores cannot be used to assess changes in perceptions from pre- to postprogram, the survey can be used to examine changes over time in each item. Students did not have an accurate view of nursing and college and underestimated the support needed to attend college. However, students realized that nursing was a profession with autonomy, respect, and honor. Copyright © 2016 Elsevier Inc. All rights reserved.
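Item-level pre/post comparisons of the kind reported above are commonly run as Wilcoxon signed-rank tests. A hedged sketch assuming Likert-style responses in a students-by-items array; the function and variable names are illustrative, not the study's code:

```python
import numpy as np
from scipy.stats import wilcoxon

def item_change_tests(pre, post, alpha=0.05):
    """Per-item pre/post comparison with Wilcoxon signed-rank tests.

    pre, post: (n_students, n_items) arrays of Likert-style responses.
    Returns (increased, decreased): lists of item indices whose change
    is statistically significant at the given alpha.
    """
    increased, decreased = [], []
    for j in range(pre.shape[1]):
        diff = post[:, j] - pre[:, j]
        if not np.any(diff):          # the test is undefined when nothing changed
            continue
        _, p = wilcoxon(pre[:, j], post[:, j])
        if p < alpha:
            (increased if diff.mean() > 0 else decreased).append(j)
    return increased, decreased
```

Because the abstract reports that total and subscale scores did not hold up psychometrically, a per-item procedure like this is the appropriate granularity for tracking change over time.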
A CONTROLLED EVALUATION OF A HIGH SCHOOL BIOMEDICAL PIPELINE PROGRAM: DESIGN AND METHODS
Winkleby, Marilyn A.; Ned, Judith; Ahn, David; Koehler, Alana; Fagliano, Kathleen; Crump, Casey
2013-01-01
Given limited funding for school-based science education, non-school-based programs have been developed at colleges and universities to increase the number of students entering science- and health-related careers and address critical workforce needs. However, few evaluations of such programs have been conducted. We report the design and methods of a controlled trial to evaluate the Stanford Medical Youth Science Program’s Summer Residential Program (SRP), a 25-year-old university-based biomedical pipeline program. This 5-year matched cohort study uses an annual survey to assess educational and career outcomes among four cohorts of students who participate in the SRP and a matched comparison group of applicants who were not chosen to participate in the SRP. Matching on sociodemographic and academic background allows control for potential confounding. This design enables the testing of whether the SRP has an independent effect on educational- and career-related outcomes above and beyond the effects of other factors such as gender, ethnicity, socioeconomic background, and pre-intervention academic preparation. The results will help determine which curriculum components contribute most to successful outcomes and which students benefit most. After 4 years of follow-up, the results demonstrate high response rates from SRP participants and the comparison group with completion rates near 90%, similar response rates by gender and ethnicity, and little attrition with each additional year of follow-up. This design and methods can potentially be replicated to evaluate and improve other biomedical pipeline programs, which are increasingly important for equipping more students for science- and health-related careers. PMID:24563603
Chemical calculations on Cray computers
NASA Technical Reports Server (NTRS)
Taylor, Peter R.; Bauschlicher, Charles W., Jr.; Schwenke, David W.
1989-01-01
The influence of recent developments in supercomputing on computational chemistry is discussed with particular reference to Cray computers and their pipelined vector/limited parallel architectures. After reviewing Cray hardware and software the performance of different elementary program structures are examined, and effective methods for improving program performance are outlined. The computational strategies appropriate for obtaining optimum performance in applications to quantum chemistry and dynamics are discussed. Finally, some discussion is given of new developments and future hardware and software improvements.
DEVELOPMENT OF AN ENVIRONMENTALLY BENIGN MICROBIAL INHIBITOR TO CONTROL INTERNAL PIPELINE CORROSION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristine L. Lowe; Bill W. Bogan; Wendy R. Sullivan
2004-07-30
The overall program objective is to develop and evaluate environmentally benign agents or products that are effective in the prevention, inhibition, and mitigation of microbially influenced corrosion (MIC) in the internal surfaces of metallic natural gas pipelines. The goal is to develop one or more environmentally benign (a.k.a. ''green'') products that can be applied to maintain the structure and dependability of the natural gas infrastructure. Previous testing indicated that the growth of, and the metal corrosion caused by, pure cultures of sulfate-reducing bacteria were inhibited by hexane extracts of some pepper plants. This quarter, tests were performed with mixed bacterial cultures obtained from natural gas pipelines. Treatment with the pepper extracts affected the growth and metabolic activity of the microbial consortia. Specifically, the growth and metabolism of sulfate-reducing bacteria were inhibited. The demonstration that pepper extracts can inhibit the growth and metabolism of sulfate-reducing bacteria in mixed cultures is a significant observation validating a key hypothesis of the project. Future tests to determine the effects of pepper extracts on mature/established biofilms will be performed.
Rashied-Henry, Kweli; Fraser-White, Marilyn; Roberts, Calpurnyia B; Wilson, Tracey E; Morgan, Rochelle; Brown, Humberto; Shaw, Raphael; Jean-Louis, Girardin; Graham, Yvonne J; Brown, Clinton; Browne, Ruth
2012-01-01
The purpose of this paper was to describe the development and implementation of a health disparities summer internship program for minority high school students that was created to increase their knowledge of health disparities, provide hands-on training in community-engaged research, support their efforts to advocate for policy change, and further encourage youth to pursue careers in the health professions. Fifty-one high school students who were enrolled in a well-established, science-enrichment after-school program in Brooklyn, New York, participated in a 4-week summer internship program. Students conducted a literature review, focus groups/interviews, geographic mapping or survey development that focused on reducing health disparities at 1 of 15 partnering community-based organizations (CBOs). Overall, student interns gained an increase in knowledge of racial/ethnic health disparities. There was a 36.2% increase in students expressing an interest in pursuing careers in minority health post program. The majority of the participating CBOs were able to utilize the results of the student-led research projects for their programs. In addition, research conclusions and policy recommendations based on the students' projects were given to local elected officials. As demonstrated by our program, community-academic partnerships can provide educational opportunities to strengthen the academic pipeline for students of color interested in health careers and health disparities research.
Pipeline repair development in support of the Oman to India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abadie, W.; Carlson, W.
1995-12-01
This paper provides a summary of development which has been conducted to date for the ultra deep, diverless pipeline repair system for the proposed Oman to India Gas Pipeline. The work has addressed critical development areas involving testing and/or prototype development of tools and procedures required to perform a diverless pipeline repair in water depths of up to 3,525 m.
Opportunity Knocks: Pipeline Programs Offer Minority Students a Path to Dentistry
ERIC Educational Resources Information Center
Fauteux, Nicole
2012-01-01
Minority students have traditionally been underrepresented in dental schools, and that reality is reflected in their woeful underrepresentation among practicing dentists, which is why enrichment and pipeline programs aimed at helping minority students are necessary. Hispanics made up only 5.8 percent of practicing dentists in 2011, according to the…
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2013 CFR
2013-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
49 CFR 198.39 - Qualifications for operation of one-call notification system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Qualifications for operation of one-call...) PIPELINE SAFETY REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS Adoption of One-Call Damage Prevention Program § 198.39 Qualifications for operation of one-call notification system. A one-call...
DOT National Transportation Integrated Search
1978-12-01
This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...
Code of Federal Regulations, 2012 CFR
2012-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192... internal corrosion, external corrosion, and stress corrosion cracking; (2) Static or resident threats, such... its integrity management program addressing actions it will take to respond to findings from this data...
Building Effective Pipelines to Increase Diversity in the Geosciences
NASA Astrophysics Data System (ADS)
Snow, E.; Robinson, C. R.; Neal-Mujahid, R.
2017-12-01
The U.S. Geological Survey (USGS) recognizes and understands the importance of a diverse workforce in advancing our science. Valuing Differences is one of the guiding principles of the USGS, and is the critical basis of the collaboration among the Youth and Education in Science (YES) program in the USGS Office of Science, Quality, and Integrity (OSQI), the Office of Diversity and Equal Opportunity (ODEO), and USGS science centers to build pipeline programs targeting diverse young scientists. Pipeline programs are robust, sustained relationships between two entities that provide a pathway from one to the other, in this case, from minority-serving institutions to the USGS. The USGS has benefited from pipeline programs for many years. Our longest running program, with the University of Puerto Rico Mayaguez (UPR), is a targeted outreach and internship program that has been managed by USGS scientists in Florida since the mid-1980's. Originally begun as the Minority Participation in the Earth Sciences (MPES) Program, it has evolved over the years, and in its several forms has brought dozens of interns to the USGS. Based in part on that success, in 2006 USGS scientists in Woods Hole MA worked with their Florida counterparts to build a pipeline program with City College of New York (CCNY). In this program, USGS scientists visit CCNY monthly, giving a symposium and meeting with students and faculty. The talks are so successful that the college created a course around them. In 2017, the CCNY and UPR programs brought 12 students to the USGS for summer internships. The CCNY model has been so successful that USGS is exploring creating similar pipeline programs. The YES office is coordinating with ODEO and USGS science centers to identify partner universities and build relationships that will lead to robust partnerships where USGS scientists will visit regularly to engage with faculty and students and recruit students for USGS internships.
The ideal partner universities will have a high population of underserved students, strong support for minority and first-generation students, proximity to a USGS office, and faculty and/or majors in several of the fields most important to USGS science: geology, geochemistry, energy, biology, ecology, environmental health, hydrology, climate science, GIS, high-capacity computing, and remote sensing.
Thurmond, V B; Cregler, L L
1999-04-01
To track gifted underrepresented minority (URM) students who entered the pipeline to health professional school when they were in high school and to determine whether and why students left the pipeline to enter other professions. A questionnaire was mailed to 162 students who had participated in the Student Educational Enrichment Program (SEEP) in health sciences at the Medical College of Georgia between 1984 and 1991; 123 (75%) responded. Students in the study population had higher graduation rates than the average state or national student. Fifty-nine (48%) of the students had entered health care careers; 98% had stated that intention when they were in high school. Although some of the students stated trouble with course work and GPA as reasons for their decisions to change career tracks, many students said that their interests in non-medical careers had been fostered by mentors or by opportunities to serve internships. Early intervention is important to retaining students in a pipeline that leads to a health care career. Summer programs are successful, but may not be enough to help students with difficult science courses in college, especially chemistry. However, another important conclusion is that much more needs to be done to help students find mentors with whom they can develop relationships and to give them opportunities to work in health care settings.
NASA Astrophysics Data System (ADS)
Kyrychok, Vladyslav; Torop, Vasyl
2018-03-01
The present paper addresses the assessment of probable crack growth in the nozzle zone of pressure vessels under cyclic seismic loads. Approaches to modeling distributed pipeline systems connected to equipment are proposed. The possibility of jointly using different finite element program packages for accurate estimation of the strength of bonded pipeline and pressure vessel systems is shown and justified. The authors propose checking the danger of defects in the nozzle domain and evaluating the residual life of the system based on the developed approach.
Simulation of a manual electric-arc welding in a working gas pipeline. 1. Formulation of the problem
NASA Astrophysics Data System (ADS)
Baikov, V. I.; Gishkelyuk, I. A.; Rus', A. M.; Sidorovich, T. V.; Tonkonogov, B. A.
2010-11-01
This paper considers the mathematical simulation of the temperature stresses arising in the wall of a pipe of a cross-country gas pipeline during electric-arc welding of defects in it. Mathematical models of the formation of temperatures, deformations, and stresses in a gas pipe subjected to phase transformations have been developed. These models were numerically realized as algorithms forming part of an application-program package. Results of verification of the computational complex and calculation results obtained with it are presented.
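As a toy illustration of the kind of thermal model involved, the sketch below solves the 1-D heat equation dT/dt = alpha d²T/dx² by explicit finite differences for heat spreading from a weld bead at the centre of a pipe-wall strip. All parameters and names are illustrative, not taken from the paper, and the real models also couple deformation, stress, and phase transformations:

```python
import numpy as np

def weld_cooling_1d(length_m=0.1, nx=101, alpha=1.2e-5, t_end=5.0,
                    t_ambient=20.0, t_weld=1500.0):
    """Explicit finite-difference solution of the 1-D heat equation,
    modelling heat spreading from a weld bead at the strip centre.
    alpha is a thermal diffusivity in m^2/s; temperatures in Celsius.
    """
    dx = length_m / (nx - 1)
    dt = 0.4 * dx**2 / alpha              # satisfies the explicit stability limit
    T = np.full(nx, t_ambient)
    T[nx // 2] = t_weld                   # localized weld heat source
    for _ in range(int(t_end / dt)):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        T[0] = T[-1] = t_ambient          # far-field ends held at ambient
    return T
```

The stability factor 0.4 keeps the explicit scheme below the classic dt ≤ dx²/(2 alpha) limit; an implicit scheme would be the usual choice in a production package.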
Amateur Image Pipeline Processing using Python plus PyRAF
NASA Astrophysics Data System (ADS)
Green, Wayne
2012-05-01
A template pipeline spanning observing planning to publishing is offered as a basis for establishing a long term observing program. The data reduction pipeline encapsulates all policy and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment using Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework. The framework quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.
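One common way to encapsulate processing decisions in an auditable Python framework, as the abstract describes, is to register each reduction step and log its effect. A minimal sketch; the class, step names, and stand-in arithmetic are hypothetical illustrations, not the PyRAF API or the author's pipeline:

```python
import logging
from dataclasses import dataclass, field
from typing import Any, Callable

logging.basicConfig(level=logging.INFO)

@dataclass
class Pipeline:
    """Minimal auditable pipeline: every step is recorded so each
    processing decision leaves a traceable record."""
    name: str
    steps: list = field(default_factory=list)
    log: list = field(default_factory=list)

    def step(self, fn: Callable[[Any], Any]) -> Callable[[Any], Any]:
        """Decorator registering a processing step in order."""
        self.steps.append(fn)
        return fn

    def run(self, data: Any) -> Any:
        for fn in self.steps:
            data = fn(data)
            self.log.append((fn.__name__, repr(data)))   # audit trail
            logging.info("%s: %s -> %r", self.name, fn.__name__, data)
        return data

pipe = Pipeline("demo")

@pipe.step
def subtract_bias(frame):
    return [v - 100 for v in frame]       # stand-in for bias subtraction

@pipe.step
def flat_field(frame):
    return [v / 2.0 for v in frame]       # stand-in for flat-fielding
```

Keeping the audit log alongside the data is what makes the framework "accountable": a reviewer (or student) can replay exactly which decisions produced a given result.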
Diversity in the biomedical research workforce: developing talent.
McGee, Richard; Saran, Suman; Krulwich, Terry A
2012-01-01
Much has been written about the need for and barriers to achievement of greater diversity in the biomedical workforce from the perspectives of gender, race, and ethnicity; this is not a new topic. These discussions often center around a "pipeline" metaphor that imagines students flowing through a series of experiences to eventually arrive at a science career. Here we argue that diversity will only be achieved if the primary focus is on (1) what is happening within the pipeline, not just counting individuals entering and leaving it; (2) de-emphasizing the achievement of academic milestones by typical ages; and (3) adopting approaches that most effectively develop talent. Students may develop skills at different rates based on factors such as earlier access to educational resources, exposure to science (especially research experiences), and competing demands for time and attention during high school and college. Therefore, there is wide variety among students at any point along the pipeline. Taking this view requires letting go of imagining the pipeline as a sequence of age-dependent steps in favor of milestones of skill and talent development decoupled from age or educational stage. Emphasizing talent development opens up many new approaches for science training outside of traditional degree programs. This article provides examples of such approaches, including interventions at the postbaccalaureate and PhD levels, as well as a novel coaching model that incorporates well-established social science theories and complements traditional mentoring. These approaches could significantly impact diversity by developing scientific talent, especially among currently underrepresented minorities. © 2012 Mount Sinai School of Medicine.
The Tuberculosis Drug Discovery and Development Pipeline and Emerging Drug Targets
Mdluli, Khisimuzi; Kaneko, Takushi; Upton, Anna
2015-01-01
The recent accelerated approval for use in extensively drug-resistant and multidrug-resistant tuberculosis (MDR-TB) of two first-in-class TB drugs, bedaquiline and delamanid, has reinvigorated the TB drug discovery and development field. However, although several promising clinical development programs are ongoing to evaluate new TB drugs and regimens, the number of novel series represented is few. The global early-development pipeline is also woefully thin. To have a chance of achieving the goal of better, shorter, safer TB drug regimens with utility against drug-sensitive and drug-resistant disease, a robust and diverse global TB drug discovery pipeline is key, including innovative approaches that make use of recently acquired knowledge on the biology of TB. Fortunately, drug discovery for TB has resurged in recent years, generating compounds with varying potential for progression into developable leads. In parallel, advances have been made in understanding TB pathogenesis. It is now possible to apply the lessons learned from recent TB hit generation efforts and newly validated TB drug targets to generate the next wave of TB drug leads. Use of currently underexploited sources of chemical matter and lead-optimization strategies may also improve the efficiency of future TB drug discovery. Novel TB drug regimens with shorter treatment durations must target all subpopulations of Mycobacterium tuberculosis existing in an infection, including those responsible for the protracted TB treatment duration. This review summarizes the current TB drug development pipeline and proposes strategies for generating improved hits and leads in the discovery phase that could help achieve this goal. PMID:25635061
Training Families To Learn Science Together Using Astronomical Topics
NASA Astrophysics Data System (ADS)
Noel-Storr, Jacob; Wyllie, G.; Lierheimer, D.
2012-05-01
We present a collection of messages and lessons learned from a set of Family Science programs that have been developed, implemented and/or evaluated by the RIT Insight Lab over the past 5 years. The programs are connected by their use of astronomical topics to serve as the motivator for engagement and learning. The programs all focus on the development of inquiry skills and connecting family members to each other as science learning communities, rather than focusing on the development of specific content knowledge. We show how family science programs can increase engagement in STEM for parents and their children alike, and strengthen the pipeline of the next generation of scientists and engineers.
African-American Mentoring Program (AAMP): Addressing the Cracks in the Graduate Education Pipeline
ERIC Educational Resources Information Center
Green, Tonika Duren; Ammah, Beverly Booker; Butler-Byrd, Nola; Brandon, Regina; McIntosh, Angela
2017-01-01
In this conceptual article, we focus on mentoring as a strategy to mend the cracks in the education pipeline for African American graduate students. Our article highlights the African American Mentoring Program (AAMP) model and examines the unique methods it uses to support the retention and graduation of African American graduate students from a…
Aerial image databases for pipeline rights-of-way management
NASA Astrophysics Data System (ADS)
Jadkowski, Mark A.
1996-03-01
Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with less people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.
Effect of Ethanol Blends and Batching Operations on SCC of Carbon Steel
DOT National Transportation Integrated Search
2011-02-08
This is the draft final report of the project on blending and batching (WP#325) of the Consolidated Program on Development of Guidelines for Safe and Reliable Pipeline Transportation of Ethanol Blends. The other two aspects of the consolidated progra...
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.
2017-01-01
The experience acquired through development, implementation, and operation of the Kepler/K2 science pipelines can provide lessons learned for the development of science pipelines for other missions, such as NASA's Transiting Exoplanet Survey Satellite and ESA's PLATO mission.
NASA Astrophysics Data System (ADS)
Wallace, Eric W.; Perry, Justin C.; Ferguson, Robert L.; Jackson, Debbie K.
2015-08-01
The present study investigated the impact of a Science, Technology, Engineering, Mathematics and Health (STEM+H) university-based pipeline program, the Careers in Health and Medical Professions Program, over the course of two summers among predominantly African-American high school students recruited from urban school districts (N = 155). Based on a mixed methods approach, results indicated that youth made significant gains in both academic and career knowledge. Furthermore, youth generally rated the program's sessions favorably, but also rated sessions with varying levels of satisfaction. The limitations and implications for program delivery and evaluation methods among pipeline programs are discussed.
Update on the SDSS-III MARVELS data pipeline development
NASA Astrophysics Data System (ADS)
Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.
2014-01-01
MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates have been reliably identified due to its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts and dispersion) and observation condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al. this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5 times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar type stars. With this fine-tuned pipeline and the soon to be processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.
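The quoted RMS error after removing a known planet signal can be illustrated with a simplified procedure: fit and subtract a circular-orbit (sinusoidal) model at the known period by linear least squares, then take the RMS of the residuals. This is a hedged stand-in for the MARVELS procedure, not its actual code, and a real fit would use a full Keplerian model since HIP-14810 b's orbit is not circular:

```python
import numpy as np

def rms_after_planet_removal(t, rv, period_days):
    """Remove a circular-orbit (sinusoidal) signal of known period by
    linear least squares, then return the RMS of the residuals.

    t: observation epochs in days; rv: radial velocities (any units);
    the returned RMS is in the same units as rv.
    """
    omega = 2.0 * np.pi / period_days
    # Design matrix: sine, cosine, and a constant velocity offset.
    A = np.column_stack([np.sin(omega * t), np.cos(omega * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
    resid = rv - A @ coef
    return np.sqrt(np.mean(resid**2))
```

Writing the sinusoid as sine plus cosine terms keeps the fit linear, so no iterative optimizer is needed when the period is already known.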
Improved Photometry for the DASCH Pipeline
NASA Astrophysics Data System (ADS)
Tang, Sumin; Grindlay, Jonathan; Los, Edward; Servillat, Mathieu
2013-07-01
The Digital Access to a Sky Century@Harvard (DASCH) project is digitizing the ˜500,000 glass plate images obtained (full sky) by the Harvard College Observatory from 1885 to 1992. Astrometry and photometry for each resolved object are derived with photometric rms values of ˜0.15 mag for the initial photometry analysis pipeline. Here we describe new developments for DASCH photometry, applied to the Kepler field, that have yielded further improvements, including better identification of image blends and plate defects by measuring image profiles and astrometric deviations. A local calibration procedure using nearby stars in a similar magnitude range as the program star (similar to what has been done for visual photometry from the plates) yields additional improvement for a net photometric rms of ˜0.1 mag. We also describe statistical measures of light curves that are now used in the DASCH pipeline processing to identify new variables autonomously. The DASCH photometry methods described here are used in the pipeline processing for the data releases of DASCH data, as well as for a forthcoming paper on the long-term variables discovered by DASCH in the Kepler field.
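The local-calibration idea, calibrating a program star against nearby comparison stars in a similar magnitude range, can be sketched as a median zero-point offset. The function name, magnitude window, and use of the median are illustrative assumptions, not the DASCH pipeline's actual algorithm:

```python
import numpy as np

def local_calibration(target_inst, comp_inst, comp_catalog, window=1.0):
    """Calibrate a program star's instrumental magnitude using comparison
    stars within `window` mag of the target.

    target_inst: instrumental magnitude of the program star.
    comp_inst / comp_catalog: instrumental and catalog magnitudes of
    nearby comparison stars. Returns the calibrated magnitude.
    """
    comp_inst = np.asarray(comp_inst, dtype=float)
    comp_catalog = np.asarray(comp_catalog, dtype=float)
    near = np.abs(comp_inst - target_inst) <= window   # similar-magnitude stars
    if not near.any():
        raise ValueError("no comparison stars in the magnitude window")
    # Median zero-point offset is robust to a few blended or defective stars.
    offset = np.median(comp_catalog[near] - comp_inst[near])
    return target_inst + offset
```

Restricting the comparisons to a similar magnitude range matters on photographic plates because the plate response is nonlinear, so the zero-point offset itself varies with magnitude.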
Image processing pipeline for synchrotron-radiation-based tomographic microscopy.
Hintermüller, C; Marone, F; Isenegger, A; Stampanoni, M
2010-07-01
With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 x 1024 to 2048 x 2048 pixels and are acquired in 5-15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and to fine-tune the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality for new scan protocols to be added, such as an extended field of view, or new physical signals such as phase-contrast or dark-field imaging, etc.
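The reconstruction step at the heart of such a pipeline is typically filtered back-projection. The toy NumPy/SciPy sketch below, far simpler than any production implementation, shows the two essential operations for parallel-beam data: ramp-filtering each projection in Fourier space, then smearing it back across the image plane at its acquisition angle:

```python
import numpy as np
from scipy.ndimage import rotate

def radon(image, angles_deg):
    """Toy forward projection (sinogram) of a square image: rotate and
    sum along columns for each projection angle."""
    return np.array([rotate(image, -theta, reshape=False, order=1).sum(axis=0)
                     for theta in angles_deg])

def fbp_reconstruct(sinogram, angles_deg):
    """Minimal filtered back-projection for parallel-beam data."""
    n = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))                 # Ram-Lak (ramp) filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    recon = np.zeros((n, n))
    for proj, theta in zip(filtered, angles_deg):
        # Smear the filtered projection across the plane at its angle.
        recon += rotate(np.tile(proj, (n, 1)), theta, reshape=False, order=1)
    return recon * np.pi / (2.0 * len(angles_deg))
```

Production pipelines like the one described above differ mainly in scale, splitting slices across a cluster and streaming reconstructed previews to the web interface as they complete.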
Oil and gas pipeline construction cost analysis and developing regression models for cost estimation
NASA Astrophysics Data System (ADS)
Thaduri, Ravi Kiran
In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, right-of-way (ROW) and miscellaneous costs make up the total cost of pipeline construction. The pipelines are analyzed by length, diameter, location, volume and year of completion. In pipeline construction, labor costs dominate the total costs with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths and locations. The compressor stations are analyzed based on capacity, year of completion and location. Unlike pipeline costs, material costs dominate the total costs in the construction of compressor stations, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
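A minimal sketch of this modeling approach, using synthetic rather than the study's data: component costs are commonly fit as power laws of pipeline parameters, which become linear in log space and can be estimated by ordinary least squares.

```python
import numpy as np

# Hypothetical illustration (not the study's data): fit a power-law model
# cost = a * length^b * diameter^c by linear least squares in log space.
rng = np.random.default_rng(1)
length = rng.uniform(10, 500, 60)      # miles
diameter = rng.uniform(8, 42, 60)      # inches
true_a, true_b, true_c = 50.0, 1.0, 1.2
cost = true_a * length**true_b * diameter**true_c * rng.lognormal(0, 0.05, 60)

# Design matrix: intercept, log(length), log(diameter)
X = np.column_stack([np.ones(60), np.log(length), np.log(diameter)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
log_a, b, c = coef   # b and c should recover ~1.0 and ~1.2
```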
Pipeline for Contraceptive Development
Blithe, Diana L.
2016-01-01
The high rates of unplanned pregnancy reflect unmet need for effective contraceptive methods for women, especially for individuals with health risks such as obesity, diabetes, hypertension, and other conditions that may contraindicate use of an estrogen-containing product. Improvements in safety, user convenience, acceptability and availability of products remain important goals of the contraceptive development program. Another important goal is to minimize the impact of the products on the environment. Development of new methods for male contraception has the potential to address many of these issues with regard to safety for women who have contraindications to effective contraceptive methods but want to protect against pregnancy. It will also address a huge unmet need for men who want to control their fertility. Products under development for men would not introduce eco-toxic hormones into the wastewater. Investment in contraceptive research to identify new products for women has been limited in the pharmaceutical industry relative to investment in drug development for other indications. Pharmaceutical R&D for male contraception was active in the 1990s but was abandoned over a decade ago. The Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) has supported a contraceptive development program since 1969. Through a variety of programs including research grants and contracts, NICHD has developed a pipeline of new targets/products for male and female contraception. A number of lead candidates are under evaluation in the NICHD Contraceptive Clinical Trials Network (CCTN) (1–3). PMID:27523300
Code of Federal Regulations, 2010 CFR
2010-10-01
... addressing time dependent and independent threats for a transmission pipeline operating below 30% SMYS not in... pipeline system are covered for purposes of the integrity management program requirements, an operator must... system, or an operator may apply one method to individual portions of the pipeline system. (Refer to...
The Importance of Outreach Programs to Unblock the Pipeline and Broaden Diversity in ICT Education
ERIC Educational Resources Information Center
Lang, Catherine; Craig, Annemieke; Egan, Mary Anne
2016-01-01
There is a need for outreach programs to attract a diverse range of students to the computing discipline. The lack of qualified computing graduates to fill the growing number of computing vacancies is of concern to government and industry and there are few female students entering the computing pipeline at high school level. This paper presents…
Pydpiper: a flexible toolkit for constructing novel registration pipelines.
Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P
2014-01-01
Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
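One of the listed innovations, elimination of duplicate stages, can be sketched generically. The code below is not Pydpiper's actual API; it only illustrates the idea that keying each stage on its command and inputs lets a framework schedule identical work once and reuse its output.

```python
# Hypothetical sketch (not Pydpiper's actual API) of duplicate-stage
# elimination: a stage is identified by its command and inputs, so a
# second request for identical work reuses the first stage's output.
class Pipeline:
    def __init__(self):
        self.stages = {}   # key -> stage record
        self.order = []    # insertion order of unique stages

    def add_stage(self, cmd, inputs, output):
        key = (cmd, tuple(inputs))
        if key in self.stages:            # duplicate: reuse existing output
            return self.stages[key]["output"]
        self.stages[key] = {"cmd": cmd, "inputs": inputs, "output": output}
        self.order.append(key)
        return output

p = Pipeline()
p.add_stage("blur", ["a.mnc"], "a_blur.mnc")
p.add_stage("blur", ["a.mnc"], "a_blur_dup.mnc")   # collapsed with the first
p.add_stage("register", ["a_blur.mnc", "b.mnc"], "a_to_b.xfm")
# len(p.order) == 2: the duplicate blur stage was eliminated
```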
2014-06-01
SCADA/ICS Cyber Test Lab initiated in 2013. Psychosocial: academic research exists; opportunity for sharing and developing impact assessment...ecosystems and species at risk), accidents/system failure (rail; pipelines; ferries). CSSP strategy for the North: focus on regional (and local) problem...Guidance; business planning; environmental scan; proposal evaluation; and performance measurement. Program Risk Management: Guidelines for project
Derck, Jordan; Zahn, Kate; Finks, Jonathan F; Mand, Simanjit; Sandhu, Gurjit
2016-01-01
Racial minorities continue to be underrepresented in medicine (URiM). Increasing provider diversity is an essential component of addressing disparity in health delivery and outcomes. The pool of URiM students who are competitive applicants to medical school is often limited early on by educational inequalities in primary and secondary schooling. A growing body of evidence recognizing the importance of diversifying health professions advances the need for medical schools to develop outreach collaborations with primary and secondary schools to attract URiMs. The goal of this paper is to describe and evaluate a program that seeks to create a pipeline for URiMs early in secondary schooling by connecting these students with support and resources in the medical community that may be transformative in empowering these students to be stronger university and medical school applicants. The authors describe a medical student-led, action-oriented pipeline program, Doctors of Tomorrow, which connects faculty and medical students at the University of Michigan Medical School with 9th grade students at Cass Technical High School (Cass Tech) in Detroit, Michigan. The program includes a core curriculum of hands-on experiential learning, development and presentation of a capstone project, and mentoring of 9th grade students by medical students. Cass Tech student feedback was collected using focus groups, critical incident written narratives, and individual interviews. Medical student feedback was collected by reviewing monthly meeting minutes from the Doctors of Tomorrow medical student leadership. Data were analyzed using thematic analysis. Two strong themes emerged from the Cass Tech student feedback: (i) personal identity and its perceived effect on goal achievement and (ii) the positive effect of direct mentorship and engagement with current healthcare providers through Doctors of Tomorrow.
A challenge noted by the medical students was the lack of structured curriculum beyond the 1st year of the program; however, this was complemented by their commitment to the program for continued longitudinal development. The authors propose that development of outreach pipeline programs that are context specific, culturally relevant, and established in collaboration with community partners have the potential to provide underrepresented students with opportunities and skills early in their formative education to be competitive applicants to college and ultimately to medical school.
NASA Astrophysics Data System (ADS)
Duan, Yanzhi
2017-01-01
The gas pipeline networks in the Sichuan and Chongqing (Sichuan-Chongqing) region have formed a fully fledged gas pipeline transportation system in China, which supports and promotes the rapid development of the gas market in the region. As the market-oriented economy develops further, it is necessary to deepen pipeline system reform in the areas of the investment/financing system, the operation system and the pricing system, to lay a solid foundation for improving future gas production and marketing capability, to adapt to the national gas system reform, and to achieve the objectives of multiparty participation in pipeline construction, improved pipeline transportation efficiency, and fair and rational pipeline transportation prices. This article addresses the main thinking on reform in these three areas and the major deployment, and recommends corresponding measures on developing a shared pipeline economy, providing financial support for pipeline construction, setting up an independent regulatory agency to strengthen supervision of gas pipeline transportation, and promoting the construction of a regional gas trade market.
Hal: an automated pipeline for phylogenetic analyses of genomic data.
Robbertse, Barbara; Yoder, Ryan J; Boyd, Alex; Reeves, John; Spatafora, Joseph W
2011-02-07
The rapid increase in genomic and genome-scale data is resulting in unprecedented levels of discrete sequence data available for phylogenetic analyses. Major analytical impasses exist, however, prior to analyzing these data with existing phylogenetic software. Obstacles include the management of large data sets without standardized naming conventions, identification and filtering of orthologous clusters of proteins or genes, and the assembly of alignments of orthologous sequence data into individual and concatenated super alignments. Here we report the production of an automated pipeline, Hal, that produces multiple alignments and trees from genomic data. These alignments can be produced by a choice of four alignment programs and analyzed by a variety of phylogenetic programs. In short, the Hal pipeline connects the programs BLASTP, MCL, user-specified alignment programs, GBlocks, ProtTest and user-specified phylogenetic programs to produce species trees. The script is available at sourceforge (http://sourceforge.net/projects/bio-hal/). The results from an example analysis of Kingdom Fungi are briefly discussed.
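The chaining idea behind such a pipeline can be sketched as folding the data through an ordered list of steps. The real tools (BLASTP, MCL, alignment, GBlocks, tree inference) are replaced here by stand-in functions so the example is self-contained; the numbers are purely illustrative.

```python
# Minimal sketch of tool chaining in a pipeline like Hal: each step
# consumes the previous step's output. Stand-in functions replace the
# real external programs so the example runs anywhere.
def blast_all_vs_all(proteomes):  return {"hits": len(proteomes) ** 2}
def cluster_orthologs(hits):      return {"clusters": hits["hits"] // 2}
def align_and_trim(clusters):     return {"alignments": clusters["clusters"]}
def infer_tree(alignments):       return f"tree from {alignments['alignments']} loci"

steps = [blast_all_vs_all, cluster_orthologs, align_and_trim, infer_tree]
data = ["fungus_A.faa", "fungus_B.faa", "fungus_C.faa"]
for step in steps:
    data = step(data)
# data == "tree from 4 loci"
```

In the real pipeline each step would invoke an external program and pass files rather than dictionaries, but the control flow is the same.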
VIV analysis of pipelines under complex span conditions
NASA Astrophysics Data System (ADS)
Wang, James; Steven Wang, F.; Duan, Gang; Jukes, Paul
2009-06-01
Spans occur when a pipeline is laid on a rough undulating seabed or when upheaval buckling occurs due to constrained thermal expansion. This not only results in static and dynamic loads on the flowline at span sections, but also generates vortex induced vibration (VIV), which can lead to fatigue issues. The phenomenon, if not predicted and controlled properly, will negatively affect pipeline integrity, leading to expensive remediation and intervention work. Span analysis can be complicated by: long span lengths, a large number of spans caused by a rough seabed, and multi-span interactions. In addition, the complexity can be more onerous and challenging when soil uncertainty, concrete degradation and unknown residual lay tension are considered in the analysis. This paper describes the latest developments and a ‘state-of-the-art’ finite element analysis program that has been developed to simulate the span response of a flowline under complex boundary and loading conditions. Both VIV and direct wave loading are captured in the analysis and the results are sequentially used for the ultimate limit state (ULS) check and fatigue life calculation.
Mathematical simulation for compensation capacities area of pipeline routes in ship systems
NASA Astrophysics Data System (ADS)
Ngo, G. V.; Sakhno, K. N.
2018-05-01
In this paper, the authors considered the problem of enhancing the manufacturability of ship system pipelines at the design stage. The arrangements and possibilities for compensating deviations of pipeline routes have been analyzed. The task was set to produce the “fit pipe” together with the rest of the pipes in the route. It was proposed to compensate for deviations by movement of the pipeline route during pipe installation and to calculate the maximum values of these displacements in the analyzed path. Theoretical bases of deviation compensation for pipeline routes using rotations of parallel pairs of pipe sections are developed. Mathematical and graphical simulations of the compensation capacity areas of pipeline routes with various configurations are completed. Prerequisites have been created for an automated program that will determine the values of the compensatory capacity area for pipeline routes and assign the quantities of necessary allowances.
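The underlying geometry can be illustrated with Rodrigues' rotation formula: turning a pipe about the axis of an adjacent straight section displaces its free end, and that displacement is the compensation margin available during installation. The numbers below are illustrative, not taken from the paper.

```python
import math

# Illustrative sketch (not the paper's program): rotating a pipe section
# about the axis of an adjacent straight section displaces the free end,
# which is the mechanism used to compensate route deviations.
def rotate_about_axis(p, axis, theta):
    """Rodrigues' rotation of point p about a unit axis through the origin."""
    ax, ay, az = axis
    px, py, pz = p
    c, s = math.cos(theta), math.sin(theta)
    dot = ax * px + ay * py + az * pz
    cross = (ay * pz - az * py, az * px - ax * pz, ax * py - ay * px)
    return tuple(pi * c + cri * s + ai * dot * (1 - c)
                 for pi, cri, ai in zip(p, cross, axis))

end = (0.0, 300.0, 0.0)   # free pipe end, 300 mm from the joint
axis = (1.0, 0.0, 0.0)    # axis of the adjacent straight section
new_end = rotate_about_axis(end, axis, math.radians(2.0))
# a 2 degree turn moves the end by about 10.5 mm: the compensation margin
```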
An Aperture Photometry Pipeline for K2 Data
NASA Astrophysics Data System (ADS)
Buzasi, Derek L.; Carboneau, Lindsey; Lezcano, Andy; Vydra, Ekaterina
2016-01-01
As part of an ongoing research program with undergraduate students at Florida Gulf Coast University, we have constructed an aperture photometry pipeline for K2 data. The pipeline performs dynamic automated aperture mask definition for all targets in the K2 fields, followed by aperture photometry and detrending. Our pipeline is currently used to support a number of projects, including studies of stellar rotation and activity, red giant asteroseismology, gyrochronology, and exoplanet searches. In addition, output is used to support an undergraduate class on exoplanets aimed at a student audience of both majors and non-majors. The pipeline is designed for both batch and single-target use and is easily extensible to data from other missions; pipeline output is available to the community. This paper describes our pipeline and its capabilities and illustrates the quality of the results, drawing on all of the applications for which it is currently used.
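A toy sketch of the two core steps named above, aperture mask definition and aperture photometry, on a simplistic synthetic target image. This is not the FGCU pipeline's code; real K2 processing must also handle pointing drift and systematics, which this illustration ignores.

```python
import numpy as np

# Toy illustration of dynamic aperture photometry: define an aperture
# mask by thresholding the mean target image, then sum the flux inside
# the mask for each cadence and normalize.
rng = np.random.default_rng(2)
star = np.zeros((9, 9))
star[3:6, 3:6] = 100.0                           # simplistic stellar image
frames = star + rng.normal(0, 1.0, (50, 9, 9))   # 50 noisy cadences

mask = frames.mean(axis=0) > 20.0                # dynamic aperture definition
flux = frames[:, mask].sum(axis=1)               # aperture photometry per cadence
flux_detrended = flux / np.median(flux)          # crude normalization
```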
Enrichment programs to create a pipeline to biomedical science careers.
Cregler, L L
1993-01-01
The Student Educational Enrichment Programs at the Medical College of Georgia in the School of Medicine were created to increase underrepresented minorities in the pipeline to biomedical science careers. Eight-week summer programs are conducted for high school, research apprentice, and intermediate and advanced college students. There is a prematriculation program for accepted medical, dental, and graduate students. Between 1979 and 1990, 245 high school students attended 12 summer programs. Of these, 240 (98%) entered college 1 year later. In 1986, after eight programs, 162 (68%) high school participants graduated from college with a baccalaureate degree, and 127 responded to a follow-up survey. Sixty-two (49%) of the college graduates attended health science schools, and 23 (18%) of these matriculated to medical school. Of college students, 504 participated in 13 summer programs. Four hundred (79%) of these students responded to a questionnaire, which indicated that 348 (87%) of the 400 entered health science occupations and/or professional schools; 179 (45%) of these students matriculated to medical school. Minority students participating in enrichment programs have greater success in gaining acceptance to college and professional school. These data suggest that early enrichment initiatives increase the number of underrepresented minorities in the biomedical science pipeline.
75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Pipelines AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), Department of... Gas Pipeline Safety Act of 1968, Public Law 90-481, delegated to DOT the authority to develop...
New Developments At The Science Archives Of The NASA Exoplanet Science Institute
NASA Astrophysics Data System (ADS)
Berriman, G. Bruce
2018-06-01
The NASA Exoplanet Science Institute (NExScI) at Caltech/IPAC is the science center for NASA's Exoplanet Exploration Program and, as such, operates three scientific archives: the NASA Exoplanet Archive (NEA), the Exoplanet Follow-up Observation Program Website (ExoFOP), and the Keck Observatory Archive (KOA). The NASA Exoplanet Archive supports research and mission planning by the exoplanet community by operating a service that provides confirmed and candidate planets, numerous project and contributed data sets, and integrated analysis tools. The ExoFOP provides an environment for exoplanet observers to share and exchange data, observing notes, and information regarding the Kepler, K2, and TESS candidates. KOA serves all raw science and calibration observations acquired by all active and decommissioned instruments at the W. M. Keck Observatory, as well as reduced data sets contributed by Keck observers. In the coming years, the NExScI archives will support a series of major endeavours allowing flexible, interactive analysis of the data available at the archives. These endeavours exploit a common infrastructure based upon modern interfaces such as JupyterLab and Python. The first service will enable reduction and analysis of precision radial velocity data from the HIRES Keck instrument. The Exoplanet Archive is developing a JupyterLab environment based on the HIRES PRV interactive environment. Additionally, KOA is supporting an Observatory initiative to develop modern, Python-based pipelines, and as part of this work, it has delivered a NIRSPEC reduction pipeline. The ensemble of pipelines will be accessible through the same environments.
Pierre, Jon Paul; Young, Michael H; Wolaver, Brad D; Andrews, John R; Breton, Caroline L
2017-11-01
Spatio-temporal trends in infrastructure footprints, energy production, and landscape alteration were assessed for the Eagle Ford Shale of Texas. The period of analysis was over four 2-year periods (2006-2014). Analyses used high-resolution imagery, as well as pipeline data to map EF infrastructure. Landscape conditions from 2006 were used as baseline. Results indicate that infrastructure footprints varied from 94.5 km² in 2008 to 225.0 km² in 2014. By 2014, decreased land-use intensities (ratio of land alteration to energy production) were noted play-wide. Core-area alteration by period was highest (3331.6 km²) in 2008 at the onset of play development, and increased from 582.3 to 3913.9 km² by 2014, though substantial revegetation of localized core areas was observed throughout the study (i.e., alteration improved in some areas and worsened in others). Land-use intensity in the eastern portion of the play was consistently lower than that in the western portion, while core alteration remained relatively constant east to west. Land alteration from pipeline construction was ~65 km² for all time periods, except in 2010 when alteration was recorded at 47 km². The percent of total alteration from well-pad construction increased from 27.3% in 2008 to 71.5% in 2014. The average number of wells per pad across all 27 counties increased from 1.15 to 1.7. This study presents a framework for mapping landscape alteration from oil and gas infrastructure development. However, the framework could be applied to other energy development programs, such as wind or solar fields, or any other regional infrastructure development program.
Landscape alteration caused by hydrocarbon pipeline installation in Val Verde County, Texas.
Formicola, Allan J; D'Abreu, Kim C; Tedesco, Lisa A
2010-10-01
By now, all dental schools should understand the need to increase the enrollment of underrepresented minority (URM) students. While there has been a major increase in the number of Hispanic/Latino, African American/Black, and Native American applicants to dental schools over the past decade, there has not been a major percent increase in the enrollment of URM students except in the schools participating in the Pipeline, Profession, and Practice: Community-Based Dental Education program, which have far exceeded the percent increase in enrollment of URM students in other U.S. dental schools during Phase I of the program (2002-07). Assuming that all dental schools wish to improve the diversity of their student bodies, chapters 9-12 of this report--for which this chapter serves as an introduction--provide strategies learned from the Pipeline schools to increase the applications and enrollment of URM students. Some of the changes that the Pipeline schools put into place were the result of two focus group studies of college and dental students of color. These studies provided guidance on some of the barriers and challenges students of color face when considering dentistry as a career. New accreditation standards make it clear that the field of dentistry expects dental schools to re-energize their commitment to diversity.
Pipeline safety fund : minimum balance was not reasonably estimated
DOT National Transportation Integrated Search
2001-04-01
The Office of Pipeline Safety (OPS), a component of the Research and Special Programs Administration (RSPA) of the Department of Transportation (DOT), performs a variety of activities related to the safety of natural gas (NG) and hazardous liquid (HL...
77 FR 74275 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and... control room. Affected Public: Operators of both natural gas and hazardous liquid pipeline systems. Annual...
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
Channel erosion surveys along TAPS route, Alaska, 1974
Childers, Joseph; Jones, Stanley H.
1975-01-01
Repeated site surveys and aerial photographs at 26 stream crossings along the trans-Alaska pipeline system (TAPS) route during the period 1969-74 provide chronologic records of channel changes that predate pipeline-related construction at the sites. The 1974 surveys and photographs show some of the channel changes wrought by construction of the haul road from the Yukon River to Prudhoe Bay and by construction of camps and working pads all along the pipeline route. No pipeline crossings were constructed before 1975. These records of channel changes, together with flood and icing measurements, are part of the United States Department of the Interior's continuing surveillance program to document the hydrologic aspects of the trans-Alaska pipeline and its environmental impacts.
Department of Energy: Nuclear S&T workforce development programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingham, Michelle; Bala, Marsha; Beierschmitt, Kelly
The U.S. Department of Energy (DOE) national laboratories use their expertise in nuclear science and technology (S&T) to support a robust national nuclear S&T enterprise from the ground up. Traditional academic programs do not provide all the elements necessary to develop this expertise, so the DOE has initiated a number of supplemental programs to develop and support the nuclear S&T workforce pipeline. This document catalogs existing workforce development programs that are supported by a number of DOE offices (such as the Offices of Nuclear Energy, Science, Energy Efficiency, and Environmental Management), and by the National Nuclear Security Administration (NNSA) and the Naval Reactor Program. Workforce development programs in nuclear S&T administered through the Department of Homeland Security, the Nuclear Regulatory Commission, and the Department of Defense are also included. The information about these programs, which is cataloged below, is drawn from the program websites. Some programs, such as the Minority Serving Institutes Partnership Programs (MSIPPs), are available through more than one DOE office, so they appear in more than one section of this document.
Rep. Young, Don [R-AK-At Large]
2010-09-29
House - 09/30/2010 Referred to the Subcommittee on Railroads, Pipelines, and Hazardous Materials.
NASA Astrophysics Data System (ADS)
Henclik, Sławomir
2018-03-01
The influence of dynamic fluid-structure interaction (FSI) on the course of water hammer (WH) can be significant in non-rigid pipeline systems. The essence of this effect is the dynamic transfer of liquid energy to the pipeline structure and back, which is important for elastic structures and can be negligible for rigid ones. In the paper a special model of such behavior is analyzed. A straight pipeline with a steady flow, fixed to the floor with several rigid supports, is assumed. The transient is generated by a quickly closed valve installed at the end of the pipeline. FSI effects are assumed to be present mainly at the valve, which is fixed with a spring dash-pot attachment. Analysis of WH runs, especially transient pressure changes, for various stiffness and damping parameters of the spring dash-pot valve attachment is presented in the paper. The solutions are found analytically and numerically. Numerical results have been computed with a custom computer program developed on the basis of the four-equation model of WH-FSI and the specific boundary conditions formulated at the valve. Analytical solutions have been found with the separation-of-variables method under slightly simplified assumptions. Damping at the dash-pot is taken into account within the numerical study. The influence of the valve attachment parameters on the WH courses was investigated, and it was found that the transient amplitudes can be reduced. Such a system, an elastically attached shut-off valve in a pipeline or an equivalent design, can be a practical solution.
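For scale, the classical limiting case that such an FSI analysis refines is the Joukowsky surge for instantaneous valve closure, dp = rho * a * dv, with fundamental wave period 4L/a. The values below are assumed for illustration, not taken from the paper.

```python
# Joukowsky surge for instantaneous valve closure: dp = rho * a * dv.
# All parameter values are assumptions chosen for illustration.
rho = 1000.0   # water density, kg/m^3
a = 1200.0     # pressure wave speed in the pipe, m/s
dv = 2.0       # flow velocity arrested at the valve, m/s

dp = rho * a * dv          # 2.4e6 Pa = 24 bar surge above steady pressure
period = 4 * 500.0 / a     # fundamental wave period 4L/a for a 500 m line, s
```

An elastic valve attachment of the kind studied in the paper can reduce the transient amplitude below this rigid-system bound.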
Sahin, Sükran; Kurum, Ekrem
2009-09-01
Ecological monitoring is a complementary component of the overall environmental management and monitoring program of any Environmental Impact Assessment (EIA) report. The monitoring method should be developed for each project phase and allow for periodic reporting and assessment of compliance with the environmental conditions and requirements of the EIA. Also, this method should incorporate a variance request program since site-specific conditions can affect construction on a daily basis and require time-critical application of alternative construction scenarios or environmental management methods integrated with alternative mitigation measures. Finally, taking full advantage of the latest information and communication technologies can enhance the quality of, and public involvement in, the environmental management program. In this paper, a landscape-scale ecological monitoring method for major construction projects is described using, as a basis, 20 months of experience on the Baku-Tbilisi-Ceyhan (BTC) Crude Oil Pipeline Project, covering Turkish Sections Lot B and Lot C. This analysis presents suggestions for improving ecological monitoring for major construction activities.
PipelineDog: a simple and flexible graphic pipeline construction and maintenance tool.
Zhou, Anbo; Zhang, Yeting; Sun, Yazhou; Xing, Jinchuan
2018-05-01
Analysis pipelines are an essential part of bioinformatics research, and ad hoc pipelines are frequently created by researchers for prototyping and proof-of-concept purposes. However, most existing pipeline management systems or workflow engines are too complex for rapid prototyping or for learning the pipeline concept. A lightweight, user-friendly and flexible solution is thus desirable. In this study, we developed a new pipeline construction and maintenance tool, PipelineDog. This is a web-based integrated development environment with a modern web graphical user interface. It offers cross-platform compatibility, project management capabilities, code formatting and error checking functions, and an online repository. It uses an easy-to-read/write script system that encourages code reuse. With the online repository, it also encourages the sharing of pipelines, which enhances analysis reproducibility and accountability. For most users, PipelineDog requires no software installation. Overall, this web application provides a way to rapidly create and easily manage pipelines. The PipelineDog web app is freely available at http://web.pipeline.dog. The command line version is available at http://www.npmjs.com/package/pipelinedog and the online repository at http://repo.pipeline.dog. ysun@kean.edu or xing@biology.rutgers.edu or ysun@diagnoa.com. Supplementary data are available at Bioinformatics online.
Good, now keep going: challenging the status quo in STEM pipeline and access programs
NASA Astrophysics Data System (ADS)
Wiseman, Dawn; Herrmann, Randy
2018-03-01
This contribution engages in conversation with McMahon, Griese, and Kenyon (this issue) to consider how the SURE program they describe represents a pragmatic approach to addressing the underrepresentation of Indigenous people in STEM post-secondary programs. We explore how such programs are generally positioned and how they might be positioned differently to challenge the status quo within Western post-secondary institutions. The challenge arises from moving beyond the immediate pragmatics of addressing an identifiable issue framed as a problem to considering how post-secondary institutions and people developing access and recruitment programs might begin unlearning colonialism.
tcpl: The ToxCast Pipeline for High-Throughput Screening Data
Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...
Closha: bioinformatics workflow system for the analysis of massive sequencing data.
Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook
2018-02-19
While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in biomedical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag-and-drop functionality and to modify the parameters of pipeline tools. Users can also import Galaxy pipelines into Closha. Closha is a hybrid system that enables users to use both traditional analysis programs and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit large amounts of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha comprises 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner, and provides a user-friendly interface that helps genomic scientists derive accurate results from NGS platform data. The Closha cloud server is freely available for use at http://closha.kobic.re.kr/ .
Southeast geysers effluent pipeline project. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dellinger, M.
1998-01-15
The project concept originated in 1990 with the convergence of two problems: (1) a need for augmented injection to mitigate declining reservoir productivity at The Geysers; and (2) a need for a new method of wastewater disposal for Lake County communities near The Geysers. A public/private partnership of Geysers operators and the Lake County Sanitation District (LACOSAN) was formed in 1991 to conduct a series of engineering, environmental, and financing studies of transporting treated wastewater effluent from the communities to the southeast portion of The Geysers via a 29-mile pipeline. By 1994, these evaluations concluded that the concept was feasible and the stakeholders proceeded to formally develop the project, including pipeline and associated facilities design; preparation of an environmental impact statement; negotiation of construction and operating agreements; and assembly of $45 million in construction funding from the stakeholders and from state and federal agencies with related program goals. The project development process culminated in the system's dedication on October 16, 1997. As of this writing, all project components have been constructed or installed, successfully tested in compliance with design specifications, and are operating satisfactorily.
A Bridge to the Stars: A Model High School-to-College Pipeline to Improve Diversity in STEM
NASA Astrophysics Data System (ADS)
McIntosh, Daniel H.; Jennings, Derrick H.
2017-01-01
Increasing participation by historically underrepresented Americans in the STEM workforce remains a national priority. Existing strategies have failed to increase diversity especially in the physical sciences despite federal mandates. To meet this urgent challenge, it is imperative to immediately identify and support the expansion of effective high school-to-college STEM pipelines. A Bridge to the Stars (ABttS) is a creative and tested pipeline designed to steadily increase the numbers of disadvantaged 15-21 year-olds pursuing and completing 4-year STEM degrees. This unique program offers extended engagement in astronomy, arguably the most accessible window to science, through a 3-tier STEM immersion program of innovative learning (in a freshman science course), authentic research training (in a freshman science lab), and supportive near-peer mentoring at U.Missouri-Kansas City, an urban research university. Each tier of the ABttS pipeline by itself has the potential to broaden student aspirations for careers as technological innovators or STEM educators. Students who elect to transition through multiple tiers will substantially reinforce their successes with STEM activities, and significantly bolster their self-esteem necessary to personally manifest STEM aspirations. We will summarize the impact of this program after 5 years, and share our latest improvements. The long-term mission of ABttS is to see urban educational institutions across the U.S. adopt similar pipelines in all STEM disciplines built on the ABttS model.
Chinese-American headway on some environmental issues
NASA Astrophysics Data System (ADS)
Showstack, Randy
Although Chinese Premier Zhu Rongji may have failed to gain entrance for his country into the World Trade Organization during his April visit to the United States, the two countries concluded a series of agreements as part of the Second Session of the 2-year-old U.S.-China Policy Forum on Environment and Development. A memorandum of understanding on a $100 million clean energy program accelerates the export of clean U.S. environmental technologies in the areas of energy efficiency, renewable energy, and pollution reduction. A statement of intent on the development of a Sulfur Dioxide (SO2) Emissions Trading Feasibility Study calls for China to develop a study to test the effectiveness of emissions trading in China as a market-based approach to reducing greenhouse gas emissions. And a Memorandum of Understanding on a natural gas pipeline project, signed by the Enron Corporation and the China National Petroleum Corporation, opens the way to jointly developing a natural gas pipeline to help offer an alternative to fossil fuels.
Hydrocarbons pipeline transportation risk assessment
NASA Astrophysics Data System (ADS)
Zanin, A. V.; Milke, A. A.; Kvasov, I. N.
2018-04-01
The paper addresses risk assessment in pipeline transportation under Arctic conditions. Pipeline quality characteristics in this environment have been assessed. To achieve the stated objective, a mathematical model of the pipeline was designed and visualized using the software product SOLIDWORKS. The results obtained from the mathematical model made it possible to define the optimal characteristics for pipelines designed for the Arctic sea bottom. In the course of the research, the risks of avalanche pipe collapse were examined, internal longitudinal and circular loads acting on the pipeline were analyzed, and the hydrodynamic force of water impact was taken into consideration. The calculations can contribute to the further development of pipeline transport under the harsh climate of the Russian Federation's Arctic shelf.
Building a pipeline of talent for operating radio observatories
NASA Astrophysics Data System (ADS)
Wingate, Lory M.
2016-07-01
The National Radio Astronomy Observatory's (NRAO) National and International Non-Traditional Exchange (NINE) Program teaches concepts of project management and systems engineering in a focused, nine-week, continuous effort that includes a hands-on build project with the objective of constructing and verifying the performance of a student-level basic radio instrument. Completing this build with a project management (PM)/systems engineering (SE) approach based on internationally recognized standards demonstrates clearly to the learner the positive net effects of following methodical approaches to achieve optimal results. It also exposes the learner to basic radio science theory. An additional simple research project is used both to reinforce the methodical approach and to provide a basic understanding of the learner's functional area of interest. This program is designed to teach sustainable skills spanning the full spectrum of activities associated with constructing, operating and maintaining radio astronomy observatories. NINE Program learners then return to their host sites and implement the program in their own locations as NINE Hubs. This requires forming a committed relationship (through a formal Letter of Agreement), establishing a site location, and developing a program that takes into consideration the needs of the community they represent. The anticipated outcome of this program is worldwide partnerships with fast-growing radio astronomy communities designed to facilitate the exchange of staff and the mentoring of under-represented groups of learners, thereby developing a strong pipeline of global talent to construct, operate and maintain radio astronomy observatories.
Oman India Pipeline: An operational repair strategy based on a rational assessment of risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
German, P.
1996-12-31
This paper describes the development of a repair strategy for the operational phase of the Oman India Pipeline based upon the probability and consequences of a pipeline failure. The risk and cost-benefit analyses performed provide guidance on the level of deepwater repair development effort appropriate for the Oman India Pipeline project and identify critical areas toward which more intense development effort should be directed. The risk analysis results indicate that the likelihood of a failure of the Oman India Pipeline during its 40-year life is low. Furthermore, the probability of operational failure of the pipeline in deepwater regions is extremely low, the major proportion of operational failure risk being associated with the shallow water regions.
Developing a Program to Increase Diversity in the Geosciences
NASA Astrophysics Data System (ADS)
Prendeville, J. C.
2001-05-01
The Geosciences have a history of poor participation by minorities: African Americans, Hispanics, Native Americans, and persons with disabilities. Demographic data concerning population trends over the next decades make it clear that, without intervention, underrepresentation of these groups in the geosciences will only worsen. The Directorate for Geosciences of the National Science Foundation has acknowledged the problem of underrepresentation and the loss of intellectual resources that it represents. The Directorate has established a program to create a pool of students from underrepresented groups who will take their place in the future as both scientific researchers and educators, as well as scientifically knowledgeable citizens. The strategy employed in developing the Geosciences Diversity program emphasizes community direction and inclusion. Steps in developing the program included examining data that demonstrate where the "leaks" in the educational pipeline occur; reviewing the programs that are offered by the NSF, by other federal agencies and by professional societies; and gaining insights from individuals who have developed or managed programs with similar goals.
Developing physician-leaders: key competencies and available programs.
Stoller, James K
2008-01-01
Because effective leadership is critical to organizational success, frontrunner organizations cultivate leaders for bench depth and pipeline development. The many challenges in healthcare today create a special need for great leadership. This paper reviews the leadership competencies needed by physician-leaders and current experience with developing physician-leaders in healthcare institution-sponsored programs. On the basis of this review, six key leadership competency domains are proposed: 1. technical skills and knowledge (regarding operational, financial, and information systems, human resources, and strategic planning), 2. industry knowledge (e.g., regarding clinical processes, regulation, and healthcare trends), 3. problem-solving skills, 4. emotional intelligence, 5. communication, and 6. a commitment to lifelong learning. Review of current experience indicates that, in addition to leadership training through degree and certificate-granting programs (e.g., by universities and/or official medical societies), healthcare institutions themselves are developing intramural programs to cultivate physician-leaders. Greater attention is needed to assessing the impact and effectiveness of such programs in developing leaders and benefiting organizational outcomes.
DOT National Transportation Integrated Search
2012-08-30
Preventing unauthorized intrusions on pipeline Right of Ways (ROWs) and mechanical damage due to third party strikes by machinery is a constant challenge for the pipeline industry. Equally important for safety and environmental protection is the dete...
An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and their utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at a fast speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise, enabling them to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets. It has been developed in the Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software.
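One concrete element of such a pipeline, the polymorphism information content (PIC) value reported for each SNP, can be computed from allele frequencies with the standard formula PIC = 1 - sum(p_i^2) - sum over i < j of 2 p_i^2 p_j^2. The sketch below is illustrative only; the function name and inputs are assumptions, not ISMU's actual code:

```python
# Hedged sketch: polymorphism information content (PIC) for a SNP marker,
# the summary statistic a pipeline like ISMU reports alongside flanking
# sequences. Not the actual ISMU implementation.
from itertools import combinations

def pic(allele_freqs):
    """PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    het = 1.0 - sum(p * p for p in allele_freqs)
    correction = sum(2 * (pi ** 2) * (pj ** 2)
                     for pi, pj in combinations(allele_freqs, 2))
    return het - correction

# A balanced biallelic SNP (p = q = 0.5) gives PIC = 0.375.
print(round(pic([0.5, 0.5]), 4))  # 0.375
```

Markers with PIC above roughly 0.25 are generally considered informative enough for genotyping assay design.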
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Dyke, G.D.; Shem, L.M.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way management practices. This report presents the results of a survey conducted on August 22, 1991, in an emergent intertidal estuarine wetland in Terrebonne Parish, Louisiana. The site includes three pipelines installed between 1958 and 1969. Vegetation within the site comprises three native tidal marsh grasses: Spartina alterniflora, Spartina patens, and Distichlis spicata. All three species occurred over the pipelines, within the right-of-way and in both natural areas. Vegetative differences attributable to the installation or presence of the pipelines were not obvious over the pipelines or in the habitat east of the pipelines. However, because of the presence of a canal west of the 1969 pipeline, vegetation was less abundant in that area, and D. spicata was absent from all but the most distant plots of the transects. Data obtained in the study indicate that when rights-of-way through brackish marsh are restored to their original elevations, they are revegetated with native vegetation similar to that in surrounding areas.
PanWeb: A web interface for pan-genomic analysis.
Pantoja, Yan; Pinheiro, Kenny; Veras, Allan; Araújo, Fabrício; Lopes de Sousa, Ailton; Guimarães, Luis Carlos; Silva, Artur; Ramos, Rommel T J
2017-01-01
With increased production of genomic data since the advent of next-generation sequencing (NGS), there has been a need to develop new bioinformatics tools and areas, such as comparative genomics. In comparative genomics, the genetic material of an organism is directly compared to that of another organism to better understand biological species. Moreover, the exponentially growing number of deposited prokaryote genomes has enabled the investigation of several genomic characteristics that are intrinsic to certain species. Thus, a new approach to comparative genomics, termed pan-genomics, was developed. In pan-genomics, various organisms of the same species or genus are compared. Currently, there are many tools that can perform pan-genomic analyses, such as PGAP (Pan-Genome Analysis Pipeline), Panseq (Pan-Genome Sequence Analysis Program) and PGAT (Prokaryotic Genome Analysis Tool). Among these software tools, PGAP was developed in the Perl scripting language and its reliance on UNIX platform terminals and its requirement for an extensive parameterized command line can become a problem for users without previous computational knowledge. Thus, the aim of this study was to develop a web application, known as PanWeb, that serves as a graphical interface for PGAP. In addition, using the output files of the PGAP pipeline, the application generates graphics using custom-developed scripts in the R programming language. PanWeb is freely available at http://www.computationalbiology.ufpa.br/panweb.
The Very Large Array Data Processing Pipeline
NASA Astrophysics Data System (ADS)
Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako
2018-01-01
We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface in verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500 hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline.
Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science ready data products.The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomical Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
Master-slave mixed arrays for data-flow computations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, T.L.; Fisher, P.D.
1983-01-01
Control cells (masters) and computation cells (slaves) are mixed in regular geometric patterns to form reconfigurable arrays known as master-slave mixed arrays (MSMAs). Interconnections of the corners and edges of the hexagonal control cells and the edges of the hexagonal computation cells are used to construct synchronous and asynchronous communication networks, which support local computation and local communication. Data-driven computations result in self-directed ring pipelines within the MSMA, and composite data-flow computations are executed in a pipelined fashion. By viewing an MSMA as a computing network of tightly-linked ring pipelines, data-flow programs can be uniformly distributed over these pipelines for efficient resource utilisation.
About U.S. Natural Gas Pipelines
2007-01-01
This information product provides the interested reader with a broad and non-technical overview of how the U.S. natural gas pipeline network operates, along with some insights into the many individual pipeline systems that make up the network. While the focus of the presentation is the transportation of natural gas over the interstate and intrastate pipeline systems, information on subjects related to pipeline development, such as system design and pipeline expansion, is also included.
FutureGen 2.0 Pipeline and Regional Carbon Capture Storage Project - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, Chris; Wortman, David; Brown, Chris
The U.S. Department of Energy's (DOE) FutureGen 2.0 Program involves two projects: (1) the Oxy-Combustion Power Plant Project and (2) the CO2 Pipeline and Storage Project. This Final Technical Report is focused on the CO2 Pipeline and Storage Project. The FutureGen 2.0 CO2 Pipeline and Storage Project evolved from an initial siting and project definition effort in Phase I into a Phase II activity consisting of permitting, design development, the acquisition of land rights, facility design, and licensing and regulatory approvals. Phase II also progressed into construction packaging, construction procurement, and targeted early preparatory activities in the field. The CO2 Pipeline and Storage Project accomplishments were significant, and in some cases unprecedented. The engineering, permitting, legal, stakeholder, and commercial learnings substantially advance the nation's understanding of commercial-scale CO2 storage in deep saline aquifers. Voluminous and significant information was obtained from the drilling and testing program of the subsurface, and sophisticated modeling was performed that held up to a wide range of scrutiny. All designs progressed to the point of securing construction contracts, or comfort letters attesting to successful negotiation of all contract terms and willing execution at the appropriate time, for all major project elements: pipeline, surface facilities, and subsurface, as well as operations. While the physical installation of the planned facilities did not proceed, in part due to insufficient time to complete the project prior to the expiration of federal funding, the project met significant objectives prior to DOE's closeout decision. Had additional time been available, there were no known, insurmountable obstacles that would have precluded successful construction and operation of the project. Due to the suspension of the project, site restoration activities were developed and the work was accomplished.
The site restoration efforts are also documented in this report. All permit applications had been submitted to all agencies for those permits or approvals required prior to the start of project construction. Most of the requisite permits were received during Phase II. This report includes information on each permitting effort. Successes and lessons learned are included in this report that will add value to the next generation of carbon storage efforts.
Design and Operation of the World's First Long Distance Bauxite Slurry Pipeline
NASA Astrophysics Data System (ADS)
Gandhi, Ramesh; Weston, Mike; Talavera, Maru; Brittes, Geraldo Pereira; Barbosa, Eder
Mineração Bauxita Paragominas (MBP) is the first long distance slurry pipeline transporting bauxite slurry. Bauxite had developed a reputation for being difficult to transport hydraulically through long distance pipelines; this myth has now been proven wrong. The 245-km-long, 13.5 MTPY capacity MBP pipeline was designed and commissioned by PSI for CVRD. The pipeline is located in the State of Para, Brazil. The Miltonia bauxite mine is in a remote location with no other efficient means of transport. The bauxite slurry is delivered to the Alunorte alumina refinery located near Barcarena. This first-of-its-kind pipeline required significant development work in order to assure technical and economic feasibility. This paper describes the technical aspects of the design of the pipeline. It also summarizes the operating experience gained during the first year of operation.
ERIC Educational Resources Information Center
Grineski, Sara; Daniels, Heather; Collins, Timothy; Morales, Danielle X.; Frederick, Angela; Garcia, Marilyn
2018-01-01
Research on the science, technology, engineering, and math (STEM) student development pipeline has largely ignored social class and instead examined inequalities based on gender and race. We investigate the role of social class in undergraduate student research publications. Data come from a sample of 213 undergraduate research participants…
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process in which users compose and test their workflows on desktops and scale up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (a mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
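The template idea behind such libraries can be illustrated with a toy sequence/parallel pair. This sketch only mimics the style of a template-based workflow API; it is not the actual Tigres interface, and all names are illustrative:

```python
# Toy illustration of workflow templates (sequence, parallel) in the style
# of a template-based workflow library. NOT the actual Tigres API.
from concurrent.futures import ThreadPoolExecutor

def sequence(data, tasks):
    """Run tasks one after another, feeding each output to the next."""
    for task in tasks:
        data = task(data)
    return data

def parallel(inputs, task):
    """Run the same task over many inputs concurrently (a split/merge pair)."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, inputs))

# A two-stage toy pipeline: square each value in parallel, then sum in sequence.
squared = parallel([1, 2, 3, 4], lambda x: x * x)
total = sequence(squared, [sum])
print(total)  # 30
```

The appeal of the template approach is that the same composition runs unchanged whether the executor maps tasks to local threads or to batch jobs on an HPC system.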
NMRPipe: a multidimensional spectral processing system based on UNIX pipes.
Delaglio, F; Grzesiek, S; Vuister, G W; Zhu, G; Pfeifer, J; Bax, A
1995-11-01
The NMRPipe system is a UNIX software environment of processing, graphics, and analysis tools designed to meet current routine and research-oriented multidimensional processing requirements, and to anticipate and accommodate future demands and developments. The system is based on UNIX pipes, which allow programs running simultaneously to exchange streams of data under user control. In an NMRPipe processing scheme, a stream of spectral data flows through a pipeline of processing programs, each of which performs one component of the overall scheme, such as Fourier transformation or linear prediction. Complete multidimensional processing schemes are constructed as simple UNIX shell scripts. The processing modules themselves maintain and exploit accurate records of data sizes, detection modes, and calibration information in all dimensions, so that schemes can be constructed without the need to explicitly define or anticipate data sizes or storage details of real and imaginary channels during processing. The asynchronous pipeline scheme provides other substantial advantages, including high flexibility, favorable processing speeds, choice of both all-in-memory and disk-bound processing, easy adaptation to different data formats, simpler software development and maintenance, and the ability to distribute processing tasks on multi-CPU computers and computer networks.
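The pipe concept underlying NMRPipe, independent processes that each perform one step and stream their output to the next, can be sketched with generic commands. The commands below are stand-ins for illustration, not NMRPipe processing modules:

```python
# Generic illustration of a UNIX pipeline driven from Python: three
# processes run simultaneously and exchange a stream of data, as in an
# NMRPipe processing scheme. The commands are stand-ins, not NMRPipe tools.
import subprocess

# Equivalent of the shell pipeline:  printf 'b\na\nb\n' | sort | uniq
p1 = subprocess.Popen(["printf", "b\\na\\nb\\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sort"], stdin=p1.stdout, stdout=subprocess.PIPE)
p3 = subprocess.Popen(["uniq"], stdin=p2.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # allow upstream stages to receive SIGPIPE
p2.stdout.close()
out = p3.communicate()[0].decode()
print(out, end="")  # the sorted, deduplicated lines: a, b
```

Because each stage starts as soon as data arrives, the scheme is asynchronous: no stage waits for the whole dataset, which is the source of the flexibility and speed advantages described above.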
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process in which users compose and test their workflows on desktops before scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
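The template idea can be illustrated with a toy sketch. The functions below only mirror the spirit of the sequence/parallel templates named in the abstract; they are not the actual Tigres API.

```python
# Toy sketch in the spirit of Tigres's programming templates; these
# function names and signatures are illustrative, not the Tigres API.
from concurrent.futures import ThreadPoolExecutor

def sequence(data, tasks):
    """Run tasks one after another, feeding each the previous result."""
    for task in tasks:
        data = task(data)
    return data

def parallel(inputs, task):
    """Apply the same task to every input concurrently."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, inputs))

# Split the inputs, process them in parallel, then merge (here: sum).
scaled = parallel([1, 2, 3], lambda x: x * 10)   # [10, 20, 30]
merged = sequence(scaled, [sum])                 # 60
```

The point of such templates is that the same composition runs unchanged whether the executor is a laptop thread pool or a cluster backend.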
NASA Astrophysics Data System (ADS)
Goldoni, P.
2011-03-01
The X-shooter data reduction pipeline is an integral part of the X-shooter project; it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium, with contributions from France, the Netherlands, and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for quick-look control, and in Garching, for a more thorough evaluation. The second is to allow optimized data reduction for a scientific user. In the following, I first outline the main steps of data reduction with the pipeline and then briefly show two examples of optimization of the results for science reduction.
NASA Astrophysics Data System (ADS)
Wen, Shipeng; Xu, Jishang; Hu, Guanghai; Dong, Ping; Shen, Hong
2015-08-01
The safety of submarine pipelines is largely influenced by free spans and corrosion. Previous studies of free spans caused by seabed scour are mainly based on a stable environment, in which the background seabed scour is in equilibrium and the soil is homogeneous. To study the effects of background erosion on the free span development of subsea pipelines, a submarine pipeline located at the abandoned Yellow River subaqueous delta lobe was investigated with an integrated surveying system that included a multibeam bathymetric system, a dual-frequency side-scan sonar, a high-resolution sub-bottom profiler, and a Magnetic Flux Leakage (MFL) sensor. We found that seabed homogeneity has a great influence on the free span development of the pipeline. More specifically, for homogeneous background scour, the morphology of the scour hole below the pipeline is quite similar to that without the background scour, whereas for inhomogeneous background scour, the nature of spanning depends mainly on the evolution of seabed morphology near the pipeline. MFL detection results also reveal a possible connection between long free spans and accelerated corrosion of the pipeline.
NASA Sounding Rocket Program Educational Outreach
NASA Technical Reports Server (NTRS)
Rosanova, G.
2013-01-01
Educational and public outreach is a major focus area for the National Aeronautics and Space Administration (NASA). The NASA Sounding Rocket Program (NSRP) shares in the belief that NASA plays a unique and vital role in inspiring future generations to pursue careers in science, mathematics, and technology. To fulfill this vision, the NSRP engages in a variety of educator training workshops and student flight projects that provide unique and exciting hands-on rocketry and space flight experiences. Specifically, the Wallops Rocket Academy for Teachers and Students (WRATS) is a one-week tutorial laboratory experience for high school teachers to learn the basics of rocketry, as well as build an instrumented model rocket for launch and data processing. The teachers are thus armed with the knowledge and experience to subsequently inspire the students at their home institution. Additionally, the NSRP has partnered with the Colorado Space Grant Consortium (COSGC) to provide a "pipeline" of space flight opportunities to university students and professors. Participants begin by enrolling in the RockOn! Workshop, which guides fledgling rocketeers through the construction and functional testing of an instrumentation kit. This is then integrated into a sealed canister and flown on a sounding rocket payload, which is recovered for the students to retrieve and process their data post flight. The next step in the "pipeline" involves unique, user-defined RockSat-C experiments in a sealed canister that allow participants more independence in developing, constructing, and testing spaceflight hardware. These experiments are flown and recovered on the same payload as the RockOn! Workshop kits. Ultimately, the "pipeline" culminates in the development of an advanced, user-defined RockSat-X experiment that is flown on a payload which provides full exposure to the space environment (not in a sealed canister), and includes telemetry and attitude control capability. The RockOn! 
and RockSat-C elements of the "pipeline" have been successfully demonstrated by five annual flights thus far from Wallops Flight Facility. RockSat-X has successfully flown twice, also from Wallops. The NSRP utilizes launch vehicles comprised of military surplus rocket motors (Terrier-Improved Orion and Terrier-Improved Malemute) to execute these missions. The NASA Sounding Rocket Program is proud of its role in inspiring the "next generation of explorers" and is working to expand its reach to all regions of the United States and the international community as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Zimmerman, R.E.; Alsum, S.K.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents results of a survey conducted over the period of August 5--7, 1991, at the Little Timber Creek crossing in Gloucester County, New Jersey, where three pipelines, constructed in 1950, 1960, and 1990, cross the creek and associated wetlands. The old side of the ROW, created by the installation of the 1960 pipeline, was designed to contain a raised peat bed over the 1950 pipeline and an open-water ditch over the 1960 pipeline. The new portion of the ROW, created by installation of the 1990 pipeline, has an open-water ditch over the pipeline (resulting from settling of the backfill) and a raised peat bed (resulting from rebound of compacted peat). Both the old and new ROWs contain dense stands of herbs; the vegetation on the old ROW was more similar to that in the adjacent natural area than was vegetation in the new ROW. The ROW increased species and habitat diversity in the wetlands. It may contribute to the spread of purple loosestrife and affect species sensitive to habitat fragmentation.
The SCUBA Data Reduction Pipeline: ORAC-DR at the JCMT
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
The ORAC data reduction pipeline, developed for UKIRT, has been designed to be a completely general approach to writing data reduction pipelines. This generality has enabled the JCMT to adapt the system for use with SCUBA with minimal development time using the existing SCUBA data reduction algorithms (Surf).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-02-01
Upper East Fork Poplar Creek Operable Unit 2 consists of the Abandoned Nitric Acid Pipeline (ANAP). This pipeline was installed in 1951 to transport liquid wastes approximately 4800 ft from Buildings 9212, 9215, and 9206 to the S-3 Ponds. Materials known to have been discharged through the pipeline include nitric acid, depleted and enriched uranium, various metal nitrates, salts, and lead skimmings. During the mid-1980s, sections of the pipeline were removed during various construction projects. A total of 19 locations along the pipeline were chosen to be investigated for the first phase of this Remedial Investigation. Sampling consisted of drilling down to obtain a soil sample at a depth immediately below the pipeline. Additional samples were obtained deeper in the subsurface depending upon the depth of the pipeline, the depth of the water table, and the point of auger refusal. The 19 samples collected below the pipeline were analyzed by the Oak Ridge Y-12 Plant's laboratory for metals, nitrate/nitrite, and isotopic uranium. Samples collected from three boreholes were also analyzed for volatile organic compounds because these samples produced a response with organic vapor monitoring equipment. Uranium activities in the soil samples ranged from 0.53 to 13.0 pCi/g for ²³⁸U, from 0.075 to 0.75 pCi/g for ²³⁵U, and from 0.71 to 5.0 pCi/g for ²³⁴U. Maximum total values for lead, chromium, and nickel were 75.1 mg/kg, 56.3 mg/kg, and 53.0 mg/kg, respectively. The maximum nitrate/nitrite value detected was 32.0 mg-N/kg. One sample obtained adjacent to a sewer line contained various organic compounds, at least some of which were tentatively identified as fragrance chemicals commonly associated with soaps and cleaning solutions. The results of the baseline human health risk assessment for the ANAP contaminants of potential concern show no unacceptable risks to human health.
Diagnostic Inspection of Pipelines for Estimating the State of Stress in Them
NASA Astrophysics Data System (ADS)
Subbotin, V. A.; Kolotilov, Yu. V.; Smirnova, V. Yu.; Ivashko, S. K.
2017-12-01
The diagnostic inspection used to estimate the technical state of a pipeline is described. The problems of inspection works are listed, and a functional-structural scheme is developed to estimate the state of stress in a pipeline. Final conclusions regarding the actual loading of a pipeline section are drawn upon a cross analysis of the entire information obtained during pipeline inspection.
The Health Equity Scholars Program: Innovation in the Leaky Pipeline.
Upshur, Carole C; Wrighting, Diedra M; Bacigalupe, Gonzalo; Becker, Joan; Hayman, Laura; Lewis, Barbara; Mignon, Sylvia; Rokop, Megan E; Sweet, Elizabeth; Torres, Marie Idali; Watanabe, Paul; Woods, Cedric
2018-04-01
Despite attempts to increase enrollment of under-represented minorities (URMs: primarily Black/African American, Hispanic/Latino, and Native American students) in health professional programs, limited progress has been made. Compelling reasons to rectify this situation include equity for URMs, better prepared health professionals when programs are diverse, better quality and access to health care for URM populations, and the need for diverse talent to tackle difficult questions in health science and health care delivery. However, many students who initiate traditional "pipeline" programs designed to link URMs to professional schools in health professions and the sciences do not complete them. In addition, program requirements often restrict entry to highly qualified students while not expanding opportunities for promising, but potentially less well-prepared, candidates. The current study describes innovations in an undergraduate pipeline program, the Health Equity Scholars Program (HESP), designed to address barriers URMs experience in more traditional programs, and provides evaluative outcomes and qualitative feedback from participants. A primary outcome was timely college graduation. Eighty percent (80%) of participants, both transfer students and first-time students, have so far achieved this outcome, with 91% on track, compared to the campus average of 42% for all first-time students and 58-67% for transfers. Grade point averages also improved (p = 0.056) after program participation. Graduates (94%) were working in health care/human services positions and three were in health-related graduate programs. Creating a more flexible program that admits a broader range of URMs has the potential to expand the number of URM students interested in and prepared to make a contribution to health equity research and clinical care.
Granatum: a graphical single-cell RNA-Seq analysis pipeline for genomics scientists.
Zhu, Xun; Wolfgruber, Thomas K; Tasato, Austin; Arisdakessian, Cédric; Garmire, David G; Garmire, Lana X
2017-12-05
Single-cell RNA sequencing (scRNA-Seq) is an increasingly popular platform to study heterogeneity at the single-cell level. Computational methods to process scRNA-Seq data are not very accessible to bench scientists as they require a significant amount of bioinformatic skills. We have developed Granatum, a web-based scRNA-Seq analysis pipeline to make analysis more broadly accessible to researchers. Without a single line of programming code, users can click through the pipeline, setting parameters and visualizing results via the interactive graphical interface. Granatum conveniently walks users through various steps of scRNA-Seq analysis. It has a comprehensive list of modules, including plate merging and batch-effect removal, outlier-sample removal, gene-expression normalization, imputation, gene filtering, cell clustering, differential gene expression analysis, pathway/ontology enrichment analysis, protein network interaction visualization, and pseudo-time cell series construction. Granatum enables broad adoption of scRNA-Seq technology by empowering bench scientists with an easy-to-use graphical interface for scRNA-Seq data analysis. The package is freely available for research use at http://garmiregroup.org/granatum/app.
DEVELOPMENT OF AN ENVIRONMENTALLY BENIGN MICROBIAL INHIBITOR TO CONTROL INTERNAL PIPELINE CORROSION
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Robert Paterek; Gemma Husmillo
The overall program objective is to develop and evaluate environmentally benign agents or products that are effective in the prevention, inhibition, and mitigation of microbially influenced corrosion (MIC) on the internal surfaces of metallic natural gas pipelines. The goal is one or more environmentally benign, a.k.a. ''green,'' products that can be applied to maintain the structure and dependability of the natural gas infrastructure. Capsicum sp. extracts and pure compounds were screened for their antimicrobial activity against MIC-causing bacteria. Studies on the ability of these compounds to dissociate biofilm from the substratum were conducted using microtiter plate assays. Tests using laboratory-scale pipeline simulators continued. Preliminary results showed that the natural extracts possess strong antimicrobial activity, comparable to or even better than the pure compounds tested against strains of sulfate reducers. Their minimum inhibitory concentrations have been determined. It was also found that they possess bactericidal properties at minimal concentrations. Biofilm dissociation activity as assessed by microtiter plate assays demonstrated varying degrees of difference between the treated and untreated groups, with the extracts performing better than the pure compounds. This is an indication of the possible benefits that could be obtained from these natural products. Confirmatory experiments are underway.
An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of the command line interface, massive computational resources, and expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools along with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and their utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC), and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing, and transcriptome sequencing data, at a fast speed. The pipeline is very useful for the plant genetics and breeding community, allowing those without computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge datasets of next generation sequencing.
It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software. PMID:25003610
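As a rough illustration of the kind of post-processing such a pipeline performs (confidence filtering plus flanking-sequence extraction for assay design), here is a hedged Python sketch; the field names, threshold, and data are hypothetical, not part of ISMU.

```python
# Hypothetical sketch of SNP post-processing: keep high-confidence
# calls and report each kept SNP with flanking sequence, as genotyping
# assay design requires. Not the ISMU implementation (which is Java).

def flanking(ref, pos, width=3):
    """Return the SNP base in brackets with `width` flanking bases."""
    left = ref[max(0, pos - width):pos]
    right = ref[pos + 1:pos + 1 + width]
    return f"{left}[{ref[pos]}]{right}"

def high_quality(snps, min_score=0.8):
    """Keep only SNPs at or above the confidence threshold."""
    return [s for s in snps if s["score"] >= min_score]

reference = "ACGTACGTACGT"
candidates = [
    {"pos": 4, "score": 0.95},
    {"pos": 7, "score": 0.40},  # low confidence, dropped
]
kept = high_quality(candidates)
report = [flanking(reference, s["pos"]) for s in kept]
# report == ["CGT[A]CGT"]
```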
DOT National Transportation Integrated Search
2010-08-01
Significant financial and environmental consequences often result from line leakage of oil product pipelines. Product can escape into the surrounding soil, and even the smallest leak can lead to rupture of the pipeline. From a health perspective, water...
Snow as building material for construction of technological along-the-route roads of main pipelines
NASA Astrophysics Data System (ADS)
Merdanov, S. M.; Egorov, A. L.; Kostyrchenko, V. A.; Madyarov, T. M.
2018-05-01
The article deals with the process of compacting snow in a closed volume with the use of vacuum processing for the construction of technological along-the-route roads of main pipelines. The relevance of the chosen study is substantiated; methods and designs for snow compaction are considered. The publication activity and defenses of doctoral and candidate dissertations on the research subject are analyzed. Patent analysis of existing methods and equipment for snow compaction with indication of their disadvantages is carried out. A design calculation was carried out using computer programs in which a strength calculation was performed to identify the most stressed places in the construction of a vibrating hydraulic tyre-type roller. A 3D model of the experimental setup was developed.
Freight pipelines: Current status and anticipated future use
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-07-01
This report is issued by the Task Committee on Freight Pipelines, Pipeline Division, ASCE. Freight pipelines of various types (including slurry pipeline, pneumatic pipeline, and capsule pipeline) have been used throughout the world for over a century for transporting solid and sometimes even package products. Recent advancements in pipeline technology, aided by advanced computer control systems and trenchless technologies, have greatly facilitated the transportation of solids by pipelines. Today, in many situations, freight pipelines are not only the most economical and practical means for transporting solids, they are also the most reliable, safest and most environmentally friendly transportation mode. Increased use of underground pipelines to transport freight is anticipated in the future, especially as the technology continues to improve and surface transportation modes such as highways become more congested. This paper describes the state of the art and expected future uses of various types of freight pipelines. Obstacles hindering the development and use of the most advanced freight pipeline systems, such as the pneumatic capsule pipeline for interstate transport of freight, are discussed.
King, Andrew J; Fisher, Arielle M; Becich, Michael J; Boone, David N
2017-01-01
The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.
"Pushed" to Teach: Pedagogies and Policies for a Black Women Educator Pipeline
ERIC Educational Resources Information Center
Gist, Conra D.; White, Terrenda; Bianco, Margarita
2018-01-01
This research study examines the learning experiences of 11th- and 12th-grade Black girls participating in a precollegiate program committed to increasing the number of Teachers of Color entering the profession by viewing a teaching career as an act of social justice committed to educational equity. The pipeline functions as an education reform…
The Challenge of Creating a More Diverse Economics: Lessons from the UCR Minority Pipeline Project
ERIC Educational Resources Information Center
Dymski, Gary A.
2017-01-01
This paper reflects on the experience of the 1999-2002 minority pipeline program (MPP) at the University of California, Riverside. With support from the American Economic Association, the MPP identified students of color interested in economics, let them explore economic issues affecting minority communities, and encouraged them to consider…
The Vulnerability Formation Mechanism and Control Strategy of the Oil and Gas Pipeline City
NASA Astrophysics Data System (ADS)
Chen, Y. L.; Han, L.
2017-12-01
Most oil and gas pipelines in our country have been in service for more than 25 years. These pipes are buried underground and are difficult to inspect routinely; in addition, they are vulnerable to environmental effects, corrosion, and natural disasters, so accidents have a hidden character. Rapid urbanization, population accumulation, dense building, and insufficient safety margins are all reasons for the frequent accidents on oil and gas pipelines. Therefore, appraising and understanding the safety condition of oil and gas pipelines across a city's various districts is vitally important. To ensure the safety of the oil and gas pipeline city, this paper first defines the connotation of oil and gas pipeline city vulnerability based on previous vulnerability research. Then, from the three perspectives of environment, structure, and behavior, and based on the "structure-vulnerability conduct-performance" analytical paradigm, the indicators influencing the vulnerability of oil and gas pipelines are analyzed, and a vulnerability mechanism framework for the oil and gas pipeline city is constructed. Finally, the paper proposes a strategy for regulating the vulnerability of the oil and gas pipeline city to decrease its vulnerability index, which enables the city's vulnerability evaluation and provides new ideas for the sustainable development of the city.
Chery, Joyce G; Sass, Chodon; Specht, Chelsea D
2017-09-01
We developed a bioinformatic pipeline that leverages a publicly available genome and published transcriptomes to design primers in conserved coding sequences flanking targeted introns of single-copy nuclear loci. Paullinieae (Sapindaceae) is used to demonstrate the pipeline. Transcriptome reads phylogenetically closer to the lineage of interest are aligned to the closest genome. Single-nucleotide polymorphisms are called, generating a "pseudoreference" closer to the lineage of interest. Several filters are applied to meet the criteria of single-copy nuclear loci with introns of a desired size. Primers are designed in conserved coding sequences flanking introns. Using this pipeline, we developed nine single-copy nuclear intron markers for Paullinieae. This pipeline is highly flexible and can be used for any group with available genomic and transcriptomic resources. This pipeline led to the development of nine variable markers for phylogenetic study without generating sequence data de novo.
NASA Astrophysics Data System (ADS)
Gilchrist, Pamela O.; Carpenter, Eric D.; Gray-Battle, Asia
2014-07-01
A hybrid program combining teacher professional development with a student science, technology, engineering, and mathematics pipeline enrichment program has been operated by the reporting research group for the past 3 years. Overall, the program has reached 69 students from 13 counties in North Carolina and 57 teachers from 30 counties spread over a total of five states. Quantitative analysis of oral presentations given by participants at a program event is provided. Scores from multiple raters were averaged and used as a criterion in several regression analyses. Overall, it was revealed that student grade point averages, most advanced science course taken, extra quality points earned in their most advanced science course taken, and posttest scores on a pilot research design survey were significant predictors of student oral presentation scores. Rationale for findings, opportunities for future research, and implications for the iterative development of the program are discussed.
The High Level Data Reduction Library
NASA Astrophysics Data System (ADS)
Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.
2015-09-01
The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments, and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience has shown that the independently developed pipelines include significant overlap, duplication, and slight variations of similar algorithms. In order to reduce the cost of development, verification, and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions that are to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C, using the Common Pipeline Library (CPL) functionality. HDRL adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.
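The error propagation mentioned above follows standard first-order rules. The sketch below shows the generic textbook math in Python for illustration only; it is not the HDRL API, which is written in C.

```python
import math

# Generic first-order (Gaussian) error propagation, of the kind a
# reduction library must implement for every arithmetic operation.
# Illustrative only; not the HDRL API.

def add(v1, e1, v2, e2):
    """Sum of two values with independent Gaussian errors."""
    return v1 + v2, math.sqrt(e1**2 + e2**2)

def mul(v1, e1, v2, e2):
    """Product of two values; relative errors add in quadrature."""
    v = v1 * v2
    return v, abs(v) * math.sqrt((e1 / v1)**2 + (e2 / v2)**2)

s, es = add(10.0, 3.0, 20.0, 4.0)   # s == 30.0, es == 5.0
p, ep = mul(3.0, 0.3, 4.0, 0.4)     # p == 12.0
```

Centralizing such rules in one library is what lets every downstream pipeline inherit correct uncertainties without re-deriving them.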
NASA Astrophysics Data System (ADS)
Pérez-López, F.; Vallejo, J. C.; Martínez, S.; Ortiz, I.; Macfarlane, A.; Osuna, P.; Gill, R.; Casale, M.
2015-09-01
BepiColombo is an interdisciplinary ESA mission to explore the planet Mercury in cooperation with JAXA. The mission consists of two separate orbiters: ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (MMO), which are dedicated to the detailed study of the planet and its magnetosphere. The MPO scientific payload comprises eleven instrument packages covering different disciplines, developed by several European teams. This paper describes the design and development approach of the framework required to support the operation of the distributed BepiColombo MPO instrument pipelines, developed and operated from different locations but designed as a single entity. An architecture based on a primary-redundant configuration, fully integrated into the BepiColombo Science Operations Control System (BSCS), has been selected: some instrument pipelines will be operated from the instrument teams' data processing centres, with a pipeline replica that can be run from the Science Ground Segment (SGS), while others will be executed as primary pipelines from the SGS, with the SGS adopting the pipeline orchestration role.
Boatright, Dowin; Tunson, Java; Caruso, Emily; Angerhofer, Christy; Baker, Brooke; King, Renee; Bakes, Katherine; Oberfoell, Stephanie; Lowenstein, Steven; Druck, Jeffrey
2016-11-01
In 2008, the Council of Emergency Medicine Residency Directors (CORD) developed a set of recruitment strategies designed to increase the number of under-represented minorities (URMs) in Emergency Medicine (EM) residency. We conducted a survey of United States (US) EM residency program directors to: describe the racial and ethnic composition of residents; ascertain whether each program had instituted CORD recruitment strategies; and identify program characteristics associated with recruitment of a high proportion of URM residents. The survey was distributed to accredited, nonmilitary US EM residency programs during 2013. Programs were dichotomized into high URM and low URM by the percentage of URM residents. High- and low-URM programs were compared with respect to size, geography, percentage of URM faculty, importance assigned to common applicant selection criteria, and CORD recruitment strategies utilized. Odds ratios and 95% confidence limits were calculated. Of 154 residency programs, 72% responded. The median percentage of URM residents per program was 9%. Only 46% of EM programs engaged in at least two recruitment strategies. Factors associated with higher resident diversity (high-URM) included: diversity of EM faculty (high-URM) (odds ratio [OR] 5.3; 95% confidence interval [CI] 2.1-13.0); applicant's URM status considered important (OR 4.9; 95% CI 2.1-11.9); engaging in pipeline activities (OR 4.8; 95% CI 1.4-15.7); and extracurricular activities considered important (OR 2.6; 95% CI 1.2-6.0). Less than half of EM programs have instituted two or more recruitment strategies from the 2008 CORD diversity panel. EM faculty diversity, active pipeline programs, and attention paid to applicants' URM status and extracurricular activities were associated with higher resident diversity. Copyright © 2016 Elsevier Inc. All rights reserved.
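Odds ratios and 95% confidence limits of the kind reported above are computed from a 2x2 table. The sketch below uses the standard Woolf (log) method; the counts are invented for illustration and are not the study's data.

```python
import math

# Odds ratio and Woolf 95% CI from a 2x2 table [[a, b], [c, d]]:
# a, b = outcome present/absent in the exposed group;
# c, d = outcome present/absent in the unexposed group.

def odds_ratio_ci(a, b, c, d):
    """Return (OR, lower 95% limit, upper 95% limit)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Made-up example: 20/10 in one group vs. 10/30 in the other.
or_, lower, upper = odds_ratio_ci(20, 10, 10, 30)   # or_ == 6.0
```

A CI whose lower limit exceeds 1.0, as in the associations the survey reports, indicates a statistically significant positive association at the 95% level.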
Bailit, Howard L
2010-10-01
Disparities in access to dental care are a major problem in the United States. Effectively run community-based dental education programs can make a significant contribution to reducing access disparities and at the same time enrich the educational experiences of dental students and residents. For complex historical reasons, dental schools did not base their clinical training programs in community hospitals and clinics like the other health professions. Now, because of trends in school finances, changes in societal values, and limitations in current educational experiences, schools are increasing the time students spend in community clinics. This is likely to continue. The chapters in the first section of the report on the Pipeline, Profession, and Practice: Community-Based Dental Education program--for which this chapter serves as an introduction--provide detailed information on the operation of community-based education programs.
Development of a robotic system of nonstripping pipeline repair by reinforced polymeric compositions
NASA Astrophysics Data System (ADS)
Rybalkin, LA
2018-03-01
The article considers the possibility of creating a robotic system for pipeline repair. The repair is performed by forming an inner layer of special polyurethane compositions reinforced with short glass-fiber strands. This approach makes it possible to repair pipelines without excavation work or pipe replacement.
Expansion of the U.S. Natural Gas Pipeline Network
2009-01-01
Additions in 2008 and Projects through 2011. This report examines new natural gas pipeline capacity added to the U.S. natural gas pipeline system during 2008. In addition, it discusses and analyzes proposed natural gas pipeline projects that may be developed between 2009 and 2011, and the market factors supporting these initiatives.
Fernandes, M
1999-04-01
This highly interactive meeting effectively covered critical issues on every transaction from drug discovery through to development and commercialization. The program included company-specific descriptions of new discovery products, together with seminars by clinical research and site management organizations on the acceleration of development, pharmaco-economics, branding of products, direct-to-consumer advertising, global marketing, management, information technology and business strategy. There were approximately 50 sessions covered by 70 speakers.
Rand, Hugh; Shumway, Martin; Trees, Eija K.; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E.; Defibaugh-Chavez, Stephanie; Carleton, Heather A.; Klimke, William A.; Katz, Lee S.
2017-01-01
Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. 
Discussion These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools—we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines. PMID:29372115
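The automated download script and the descriptive spreadsheet format live in the GitHub repository cited above. As an illustrative sketch only (the column names and layout here are assumptions, not the repository's actual format), a downloader might parse each row of the spreadsheet into an sra-tools fetch command:

```python
import csv
import io

# Hypothetical excerpt of a dataset spreadsheet; the real format is
# defined in the WGS-standards-and-analysis GitHub repository.
TSV = """sample\tsrr_accession\tdataset
sample01\tSRR0000001\tListeria_outbreak
sample02\tSRR0000002\tListeria_outbreak
"""

def download_commands(tsv_text):
    """Turn each spreadsheet row into an sra-tools fetch command
    (column names are assumed for illustration)."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return [f"fasterq-dump {row['srr_accession']} -O {row['dataset']}"
            for row in reader]

for cmd in download_commands(TSV):
    print(cmd)
```

Driving downloads from a machine-readable manifest like this is what lets the benchmark sets stay reproducible across institutions: the spreadsheet, not the script, is the source of truth for which accessions belong to each outbreak.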
Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S
2017-01-01
As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens ( Listeria monocytogenes , Salmonella enterica , Escherichia coli , and Campylobacter jejuni ) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. 
These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools-we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted August 17--19, 1992, at the Norris Brook crossing in the town of Peabody, Essex County, Massachusetts. The pipeline at this site was installed during September and October 1990. A backhoe was used to install the pipeline. The pipe was assembled on the adjacent upland and slid into the trench, after which the backhoe was used again to fill the trench and cover the pipeline. Within two years after pipeline construction, a dense vegetative community, composed predominantly of native perennial species, had become established on the ROW. Compared with adjacent natural areas undisturbed by pipeline installation, there was an increase in purple loosestrife and cattail within the ROW, while large woody species were excluded from the ROW. As a result of the ROW's presence, habitat diversity, edge-type habitat, and species diversity increased within the site. Crooked-stem aster, Aster prenanthoides (a species on the Massachusetts list of plants of special concern), occurred in low numbers in the adjacent natural areas and had reinvaded the ROW in low numbers.
Using modern imaging techniques to old HST data: a summary of the ALICE program.
NASA Astrophysics Data System (ADS)
Choquet, Elodie; Soummer, Remi; Perrin, Marshall; Pueyo, Laurent; Hagan, James Brendan; Zimmerman, Neil; Debes, John Henry; Schneider, Glenn; Ren, Bin; Milli, Julien; Wolff, Schuyler; Stark, Chris; Mawet, Dimitri; Golimowski, David A.; Hines, Dean C.; Roberge, Aki; Serabyn, Eugene
2018-01-01
Direct imaging of extrasolar systems is a powerful technique to study the physical properties of exoplanetary systems and understand their formation and evolution mechanisms. The detection and characterization of these objects are challenged by their high contrast with their host star. Several observing strategies and post-processing algorithms have been developed for ground-based high-contrast imaging instruments, enabling the discovery of directly-imaged and spectrally-characterized exoplanets. The Hubble Space Telescope (HST), a pioneer in directly imaging extrasolar systems, has nonetheless often been limited to the detection of bright debris disk systems, with sensitivity limited by the difficulty of implementing an optimal PSF subtraction strategy, which is readily offered on ground-based telescopes in pupil tracking mode. The Archival Legacy Investigations of Circumstellar Environments (ALICE) program is a consistent re-analysis of the 10-year-old coronagraphic archive of HST's NICMOS infrared imager. Using post-processing methods developed for ground-based observations, we used the whole archive to calibrate PSF temporal variations and improve NICMOS's detection limits. We have now delivered ALICE-reprocessed science products for the whole NICMOS archival data back to the community. These science products, as well as the ALICE pipeline, were used to prototype the JWST coronagraphic data reduction pipeline. The ALICE program has enabled the detection of 10 faint debris disk systems never imaged before in the near-infrared and several substellar companion candidates, all of which we are characterizing through follow-up observations with both ground-based facilities and HST-STIS coronagraphy. In this publication, we provide a summary of the results of the ALICE program, advertise its science products and discuss the prospects of the program.
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing the corrosion rate on the external side of an underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system performance algorithm and program are elaborated. A comparative analysis of methods for calculating corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce the calculation burden while considering a wider range of corrosion factors.
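The abstract does not give the authors' rule base, but the general shape of fuzzy-logic corrosion estimation can be sketched: fuzzify each factor with membership functions, combine rules with min/max, and defuzzify to a crisp rate. The breakpoints and representative rates below are illustrative assumptions, not field-calibrated values:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def corrosion_rate(soil_resistivity_ohm_m, moisture_pct):
    """Toy Mamdani-style inference with two inputs and two rules.
    Membership breakpoints and output rates are illustrative only."""
    low_res = tri(soil_resistivity_ohm_m, 0, 5, 20)  # low resistivity -> corrosive
    wet = tri(moisture_pct, 10, 30, 60)              # wet soil -> corrosive
    high_rate = min(low_res, wet)        # rule 1: low resistivity AND wet
    low_rate = 1.0 - max(low_res, wet)   # rule 2: otherwise
    # defuzzify: weighted average of representative rates (mm/year)
    return (high_rate * 0.5 + low_rate * 0.05) / (high_rate + low_rate or 1.0)
```

The appeal of this approach, as the abstract notes, is that adding another corrosion factor only adds membership functions and rules; it does not require a new closed-form rate equation.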
ERIC Educational Resources Information Center
Russell, Melody L.; Atwater, Mary M.
2005-01-01
This study focuses on 11 African American undergraduate seniors in a biology degree program at a predominantly white research institution in the southeastern United States. These 11 respondents shared their journeys throughout the high school and college science pipeline. Participants described similar precollege factors and experiences that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stencel, J.M.; Ochsenbein, M.P.
2003-04-14
The KY DOE EPSCoR Program included efforts to impact positively the pipeline of science and engineering students and to establish research, education and business infrastructure, sustainable beyond DOE EPSCoR funding.
Natural Gas Compressor Stations on the Interstate Pipeline Network: Developments Since 1996
2007-01-01
This special report looks at the use of natural gas pipeline compressor stations on the interstate natural gas pipeline network that serves the lower 48 states. It examines the compression facilities added over the past 10 years and how the expansions have supported pipeline capacity growth intended to meet the increasing demand for natural gas.
Current biodefense vaccine programs and challenges.
Wolfe, Daniel N; Florence, William; Bryant, Paula
2013-07-01
The Defense Threat Reduction Agency's Joint Science and Technology Office manages the Chemical and Biological Defense Program's Science and Technology portfolio. The Joint Science and Technology Office's mission is to invest in transformational ideas, innovative people and actionable technology development for Chemical and Biological Defense solutions, with the primary goal to deliver Science and Technology products and capabilities to the warfighter and civilian population that outpace the threat. This commentary focuses on one thrust area within this mission: the Vaccine program of the Joint Science and Technology Office's Translational Medical Division. Here, we will describe candidate vaccines currently in the S&T pipeline, enabling technologies that should facilitate advanced development of these candidates into FDA licensed vaccines, and how the ever-changing biological threat landscape impacts the future of biodefense vaccines.
The Snapshot A Star SurveY (SASSY)
NASA Astrophysics Data System (ADS)
Garani, Jasmine I.; Nielsen, Eric; Marchis, Franck; Liu, Michael C.; Macintosh, Bruce; Rajan, Abhijith; De Rosa, Robert J.; Jinfei Wang, Jason; Esposito, Thomas M.; Best, William M. J.; Bowler, Brendan; Dupuy, Trent; Ruffio, Jean-Baptiste
2018-01-01
The Snapshot A Star Survey (SASSY) is an adaptive optics survey conducted using NIRC2 on the Keck II telescope to search for young, self-luminous planets and brown dwarfs (M > 5MJup) around high mass stars (M > 1.5 M⊙). We present the results of a custom data reduction pipeline developed for the coronagraphic observations of our 200 target stars. Our data analysis method includes basic near infrared data processing (flat-field correction, bad pixel removal, distortion correction) as well as PSF subtraction through a Reference Differential Imaging algorithm based on a library of PSFs derived from the observations using the pyKLIP routine. We present the results from the pipeline for a few stars from the survey with analysis of candidate companions. SASSY is sensitive to companions 600,000 times fainter than the host star within the inner few arcseconds, allowing us to detect companions with masses ~8MJup at age 110 Myr. This work was supported by the Leadership Alliance's Summer Research Early Identification Program at Stanford University, the NSF REU program at the SETI Institute and NASA grant NNX14AJ80G.
The ORAC-DR data reduction pipeline
NASA Astrophysics Data System (ADS)
Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.
2008-03-01
The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.
The American Science Pipeline: Sustaining Innovation in a Time of Economic Crisis
Hue, Gillian; Sales, Jessica; Comeau, Dawn; Lynn, David G.
2010-01-01
Significant limitations have emerged in America's science training pipeline, including inaccessibility, inflexibility, financial limitations, and lack of diversity. We present three effective programs that collectively address these challenges. The programs are grounded in rigorous science, integrate diverse disciplines across undergraduate, graduate, and postdoctoral training, and resonate with the broader community. We discuss these models in the context of current economic constraints on higher education and the urgent need for our institutions to recruit and retain diverse student populations and sustain the successful American record in scientific education and innovation. PMID:21123689
Stability of subsea pipelines during large storms
Draper, Scott; An, Hongwei; Cheng, Liang; White, David J.; Griffiths, Terry
2015-01-01
On-bottom stability design of subsea pipelines transporting hydrocarbons is important to ensure safety and reliability but is challenging to achieve in the onerous metocean (meteorological and oceanographic) conditions typical of large storms (such as tropical cyclones, hurricanes or typhoons). This challenge is increased by the fact that industry design guidelines presently give no guidance on how to incorporate the potential benefits of seabed mobility, which can lead to lowering and self-burial of the pipeline on a sandy seabed. In this paper, we demonstrate recent advances in experimental modelling of pipeline scour and present results investigating how pipeline stability can change in a large storm. An emphasis is placed on the initial development of the storm, where scour is inevitable on an erodible bed as the storm velocities build up to peak conditions. During this initial development, we compare the rate at which peak near-bed velocities increase in a large storm (typically less than 10⁻³ m s⁻²) to the rate at which a pipeline scours and subsequently lowers (which is dependent not only on the storm velocities, but also on the mechanism of lowering and the pipeline properties). We show that the relative magnitude of these rates influences pipeline embedment during a storm and the stability of the pipeline. PMID:25512592
Bioinformatic pipelines in Python with Leaf
2013-01-01
Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of a rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce the pipeline formality on top of a dynamical development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user’s Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
NASA Technical Reports Server (NTRS)
1976-01-01
McDonnell Douglas Corporation is using a heat-pipe device, developed through the space program, to transport oil from Alaska's rich North Slope fields. It is being used to keep the ground frozen along the 798-mile pipeline, saving hundreds of millions of dollars and protecting the tundra environment. Heat pipes are totally automatic: they sense and respond to climatic conditions with no moving parts, require no external power, and never need adjustment or servicing.
ERIC Educational Resources Information Center
Flowers, Kenneth W.
2015-01-01
With nearly every industry predicting severe employee shortages, the available worker pipeline, including the employed, may need to upgrade their skills. In addition, the number of jobs available will soon exceed the number of available workers, even if all the workers were skilled. This study investigated the perceptions held by key individuals…
Viability of using different types of main oil pipelines pump drives
NASA Astrophysics Data System (ADS)
Zakirzakov, A. G.; Zemenkov, Yu D.; Akulov, K. A.
2018-05-01
The choice of drive for the pumping units of main oil pipelines is of great importance both in the design of new pipelines and in the modernization of existing ones. At the beginning of oil pipeline transport development, the limited number and types of available energy sources made the choice simple: the combustion energy of the pumped product was often the only energy resource available for its transportation. In this regard, pipelines with autonomous energy sources differed favorably from other energy consumers in the sector. Over time, as the country's electricity supply system developed, the electric drive became the dominant type of pumping station drive. Nowadays, tradition remains an essential factor in the choice of drive type: for many years, oil companies have used electric drives for pumps, while gas transport enterprises prefer self-contained gas turbines.
NASA Astrophysics Data System (ADS)
Stefan Devlin, Benjamin; Nakura, Toru; Ikeda, Makoto; Asada, Kunihiro
We detail a self synchronous field programmable gate array (SSFPGA) with dual-pipeline (DP) architecture to conceal pre-charge time for dynamic logic, and its throughput optimization by pipeline alignment implemented on benchmark circuits. A self synchronous LUT (SSLUT) consists of a three-input tree-type structure with 8 bits of SRAM for programming. A self synchronous switch box (SSSB) consists of both pass transistors and buffers to route signals, with 12 bits of SRAM. One common block with one SSLUT and one SSSB occupies 2.2 Mλ² area with 35 bits of SRAM, and the prototype SSFPGA with 34 × 30 (1020) blocks is designed and fabricated using 65 nm CMOS. Measured results at 1.2 V show 430 MHz and 647 MHz operation for a 3-bit ripple carry adder, without and with throughput optimization, respectively. Using the proposed pipeline alignment techniques we achieve a maximum throughput of 647 MHz in various benchmarks on the SSFPGA, demonstrating up to 56.1 times throughput improvement. The pipeline alignment is carried out within the number of logic elements in the array and pipeline buffers in the switching matrix.
Drive Control System for Pipeline Crawl Robot Based on CAN Bus
NASA Astrophysics Data System (ADS)
Chen, H. J.; Gao, B. T.; Zhang, X. H.; Deng, Z. Q.
2006-10-01
The drive control system plays an important role in a pipeline robot. In order to inspect flaws and corrosion in seabed crude oil pipelines, an original mobile pipeline robot with a crawler drive unit, power and monitor unit, central control unit, and ultrasonic wave inspection device is developed. The CAN bus connects these different function units and provides a reliable information channel. Considering the limited space, a compact hardware system is designed based on an ARM processor with two CAN controllers. With a made-to-order CAN protocol for the crawl robot, an intelligent drive control system is developed. The implementation of the crawl robot demonstrates that the presented drive control scheme can meet the motion control requirements of the underwater pipeline crawl robot.
Young, Meredith E; Thomas, Aliki; Varpio, Lara; Razack, Saleem I; Hanson, Mark D; Slade, Steve; Dayem, Katharine L; McKnight, David J
2017-04-01
Several national level calls have encouraged reconsideration of diversity issues in medical education. Particular interest has been placed on admissions, as decisions made here shape the nature of the future physician workforce. Critical analysis of current practices paired with evidence-informed policies may counter some of the barriers impeding access for underrepresented groups. We present a framework for diversity-related program development and evaluation grounded within a knowledge translation framework, and supported by the initiation of longitudinal collection of diversity-related data. We provide an illustrative case study for each component of the framework. Descriptive analyses are presented of pre/post intervention diversity metrics if applicable and available. The framework's focal points are: 1) data-driven identification of underrepresented groups, 2) pipeline development and targeted recruitment, 3) ensuring an inclusive process, 4) ensuring inclusive assessment, 5) ensuring inclusive selection, and 6) iterative use of diversity-related data. Case studies ranged from wording changes on admissions websites to the establishment of educational and administrative offices addressing needs of underrepresented populations. We propose that diversity-related data must be collected on a variety of markers, developed in partnership with stakeholders who are most likely to facilitate implementation of best practices and new policies. These data can facilitate the design, implementation, and evaluation of evidence-informed diversity initiatives and provide a structure for continued investigation into 'interventions' supporting diversity-related initiatives.
Status of the TESS Science Processing Operations Center
NASA Astrophysics Data System (ADS)
Jenkins, Jon Michael; Caldwell, Douglas A.; Davies, Misty; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill
2018-06-01
The Transiting Exoplanet Survey Satellite (TESS) was selected by NASA’s Explorer Program to conduct a search for Earth’s closest cousins starting in 2018. TESS will conduct an all-sky transit survey of F, G and K dwarf stars between 4 and 12 magnitudes and M dwarf stars within 200 light years. TESS is expected to discover 1,000 small planets less than twice the size of Earth, and to measure the masses of at least 50 of these small worlds. The TESS science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler science pipeline. Like the Kepler pipeline, the TESS pipeline provides calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline searches through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline generates a suite of diagnostic metrics for each transit-like signature, and then extracts planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search are modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports), all of which will be archived to MAST. Synthetic sample data products are available at https://archive.stsci.edu/tess/ete-6.html. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
49 CFR 198.37 - State one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false State one-call damage prevention program. 198.37... REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS Adoption of One-Call Damage Prevention Program § 198.37 State one-call damage prevention program. A State must adopt a one-call damage prevention...
Users Manual for the Dynamic Student Flow Model.
1981-07-31
populations within each pipeline are reasonably homogeneous and the pipeline curriculum provides a structured path along which the student must progress...curriculum is structured, student populations are non-homogeneous. They are drawn from diverse sources such as the Naval Academy, NROTC and the Aviation...Officer Candidate program in numbers subjectively determined to provide the best population for subsequent flight training. Historically, different
Pipeline Optimization Program (PLOP)
2006-08-01
the framework of the Dredging Operations Decision Support System (DODSS, https://dodss.wes.army.mil/wiki/0). PLOP compiles industry standards and...efficiency point (BEP). In the interest of an acceptable wear rate on the pump, industrial standards dictate that the flow...percentage of the flow rate corresponding to the BEP. [Figure 2. Pump class as a function of...] Pump Acceptability Rules. The facts for pump performance, industrial standards and pipeline and
49 CFR 192.913 - When may an operator deviate its program from certain requirements of this subpart?
Code of Federal Regulations, 2011 CFR
2011-10-01
... management program. An operator that uses a performance-based approach that satisfies the requirements for... to demonstrate the exceptional performance of its integrity management program through the following... to the operator's pipeline system and to the operator's integrity management program; (vi) A...
Langlois, Lillie A; Drohan, Patrick J; Brittingham, Margaret C
2017-07-15
Large, continuous forest provides critical habitat for some species of forest dependent wildlife. The rapid expansion of shale gas development within the northern Appalachians results in direct loss of such habitat at well sites, pipelines, and access roads; however the resulting habitat fragmentation surrounding such areas may be of greater importance. Previous research has suggested that infrastructure supporting gas development is the driver for habitat loss, but knowledge of what specific infrastructure affects habitat is limited by a lack of spatial tracking of infrastructure development in different land uses. We used high-resolution aerial imagery, land cover data, and well point data to quantify shale gas development across four time periods (2010, 2012, 2014, 2016), including: the number of wells permitted, drilled, and producing gas (a measure of pipeline development); land use change; and forest fragmentation on both private and public land. As of April 2016, the majority of shale gas development was located on private land (74% of constructed well pads); however, the number of wells drilled per pad was lower on private compared to public land (3.5 and 5.4, respectively). Loss of core forest was more than double on private than public land (4.3 and 2.0%, respectively), which likely results from better management practices implemented on public land. Pipelines were by far the largest contributor to the fragmentation of core forest due to shale gas development. Forecasting future land use change resulting from gas development suggests that the greatest loss of core forest will occur with pads constructed farthest from pre-existing pipelines (new pipelines must be built to connect pads) and in areas with greater amounts of core forest. To reduce future fragmentation, our results suggest new pads should be placed near pre-existing pipelines and methods to consolidate pipelines with other infrastructure should be used. 
Without these mitigation practices, we will continue to lose core forest as a result of new pipelines and infrastructure particularly on private land. Copyright © 2017 Elsevier Ltd. All rights reserved.
Evaluation of fishing gear induced pipeline damage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellinas, C.P.; King, B.; Davies, R.
1995-12-31
Impact and damage to pipelines due to fishing activities is one of the hazards faced by North Sea pipelines during their operating lives. Available data indicate that about one in ten reported incidents is due to fishing activities. This paper is concerned with one such occurrence: the assessment of the resulting damage, the methods used to confirm pipeline integrity, and the approaches developed for its repair.
KAnalyze: a fast versatile pipelined K-mer toolkit
Audano, Peter; Vannberg, Fredrik
2014-01-01
Motivation: Converting nucleotide sequences into short overlapping fragments of uniform length, k-mers, is a common step in many bioinformatics applications. While existing software packages count k-mers, few are optimized for speed, offer an application programming interface (API), a graphical interface or contain features that make it extensible and maintainable. We designed KAnalyze to compete with the fastest k-mer counters, to produce reliable output and to support future development efforts through well-architected, documented and testable code. Currently, KAnalyze can output k-mer counts in a sorted tab-delimited file or stream k-mers as they are read. KAnalyze can process large datasets with 2 GB of memory. This project is implemented in Java 7, and the command line interface (CLI) is designed to integrate into pipelines written in any language. Results: As a k-mer counter, KAnalyze outperforms Jellyfish, DSK and a pipeline built on Perl and Linux utilities. Through extensive unit and system testing, we have verified that KAnalyze produces the correct k-mer counts over multiple datasets and k-mer sizes. Availability and implementation: KAnalyze is available on SourceForge: https://sourceforge.net/projects/kanalyze/ Contact: fredrik.vannberg@biology.gatech.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24642064
KAnalyze: a fast versatile pipelined k-mer toolkit.
Audano, Peter; Vannberg, Fredrik
2014-07-15
Converting nucleotide sequences into short overlapping fragments of uniform length, k-mers, is a common step in many bioinformatics applications. While existing software packages count k-mers, few are optimized for speed, offer an application programming interface (API), a graphical interface or contain features that make it extensible and maintainable. We designed KAnalyze to compete with the fastest k-mer counters, to produce reliable output and to support future development efforts through well-architected, documented and testable code. Currently, KAnalyze can output k-mer counts in a sorted tab-delimited file or stream k-mers as they are read. KAnalyze can process large datasets with 2 GB of memory. This project is implemented in Java 7, and the command line interface (CLI) is designed to integrate into pipelines written in any language. As a k-mer counter, KAnalyze outperforms Jellyfish, DSK and a pipeline built on Perl and Linux utilities. Through extensive unit and system testing, we have verified that KAnalyze produces the correct k-mer counts over multiple datasets and k-mer sizes. KAnalyze is available on SourceForge: https://sourceforge.net/projects/kanalyze/. © The Author 2014. Published by Oxford University Press.
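The k-mer extraction step that the KAnalyze abstracts describe (converting a sequence into short overlapping fragments of uniform length, then counting them) can be sketched in a few lines. This is an illustrative Python toy only, not the KAnalyze implementation, which is written in Java and optimized for multi-gigabyte datasets:

```python
from collections import Counter

def count_kmers(sequence, k):
    """Count overlapping k-mers of length k in a nucleotide sequence (toy example)."""
    sequence = sequence.upper()
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

counts = count_kmers("GATTACAGATTACA", 4)
# Sorted tab-delimited output, mirroring the output format described above
for kmer, n in sorted(counts.items()):
    print(f"{kmer}\t{n}")
```

A production counter differs mainly in memory handling (encoding k-mers as integers, spilling sorted runs to disk) rather than in this basic sliding-window logic.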
ATI SAA Annex 3 Button Tensile Test Report I
NASA Technical Reports Server (NTRS)
Tang, Henry H.
2013-01-01
This report documents the results of a study carried out under Space Act Agreement SAA-EA-10-004 between the National Aeronautics and Space Administration (NASA) and Astro Technology Incorporated (ATI). NASA and ATI entered into this agreement to collaborate on the development of technologies that can benefit both US government space programs and the oil and gas industry. The report documents the results of a test of an adhesive system for attaching new monitoring sensor devices to pipelines under Annex III of SAA-EA-10-004: "Proof-of-Concept Design and Testing of a Post Installed Sensing Device on Subsea Risers and Pipelines". The tasks of Annex III are to design and test a proof-of-concept sensing device for in-situ installation on pipelines, risers, or other structures deployed in deep water. The function of the sensor device is to measure various signals such as strain, stress, and temperature. This study complements the work done, in Annex I of the SAA, on attaching a fiber optic sensing device to pipe via adhesive bonding. Both the Annex I and Annex III studies were conducted in the Crew and Thermal Systems Division (CTSD) at the Johnson Space Center (JSC) in collaboration with ATI.
Yu, Dongliang; Meng, Yijun; Zuo, Ziwei; Xue, Jie; Wang, Huizhong
2016-01-01
Nat-siRNAs (small interfering RNAs originating from natural antisense transcripts) are a class of functional small RNA (sRNA) species discovered in both plants and animals. These siRNAs are highly enriched within the annealed regions of NAT (natural antisense transcript) pairs. To date, considerable research effort has been devoted to the systematic identification of NATs in various organisms. However, a freely available, easy-to-use program for NAT prediction is still in strong demand. Here, we propose an integrative pipeline named NATpipe for the systematic discovery of NATs from de novo assembled transcriptomes. By utilizing sRNA sequencing data, the pipeline also allows users to search for phase-distributed nat-siRNAs within the perfectly annealed regions of NAT pairs. Additionally, more reliable nat-siRNA loci can be identified based on degradome sequencing data. A case study on the non-model plant Dendrobium officinale was performed to illustrate the utility of NATpipe. We hope that NATpipe will be a useful tool for NAT prediction, nat-siRNA discovery, and related functional studies. NATpipe is available at www.bioinfolab.cn/NATpipe/NATpipe.zip. PMID:26858106
Automated Laser Ultrasonic Testing (ALUT) of Hybrid Arc Welds for Pipeline Construction, #272
DOT National Transportation Integrated Search
2009-12-22
One challenge in developing new gas reserves is the high cost of pipeline construction. Welding costs are a major component of overall construction costs. Industry continues to seek advanced pipeline welding technologies to improve productivity and s...
Cyberforce 2025: Crafting a Selection Program for Tomorrow’s Cyber Warriors
2013-02-14
One of the more current and commonly used adult IQ tests is the Wechsler Adult Intelligence Scale – fourth edition (WAIS- IV), which currently...LeMay Center for Doctrine Development and Education, 2011. David J. Kay, Terry J. Pudas, and Brett Young. Preparing the Pipeline: The...34 May/June 1973: 30-34. Times, The New York. Foreign Intelligence Surveillance Act (FISA). New York, September 13, 2012. Trollman, Capt David
Parallel processing in a host plus multiple array processor system for radar
NASA Technical Reports Server (NTRS)
Barkan, B. Z.
1983-01-01
Host plus multiple array processor architecture is demonstrated to yield a modular, fast, and cost-effective system for radar processing. Software methodology for programming such a system is developed. Parallel processing with pipelined data flow among the host, array processors, and discs is implemented. Theoretical analysis of performance is made and experimentally verified. The broad class of problems to which the architecture and methodology can be applied is indicated.
CCDLAB: A Graphical User Interface FITS Image Data Reducer, Viewer, and Canadian UVIT Data Pipeline
NASA Astrophysics Data System (ADS)
Postma, Joseph E.; Leahy, Denis
2017-11-01
CCDLAB was originally developed as a FITS image data reducer and viewer, and was later extended to provide ground support for the development of the UVIT detector system provided by the Canadian Space Agency to the Indian Space Research Organization's ASTROSAT satellite and UVIT telescopes. After the launch of ASTROSAT, during UVIT's first-light and PV phase starting in 2015 December, it became necessary to develop a data pipeline to produce scientific images out of the Level 1 format data produced for UVIT by ISRO. Given the previous development of CCDLAB for UVIT ground support, the author provided a pipeline for the new Level 1 format data to be run through CCDLAB with the additional satellite-dependent reduction operations required to produce scientific data. Features of the pipeline are discussed with a focus on the data-reduction challenges intrinsic to UVIT data.
NASA Astrophysics Data System (ADS)
Iturbe, Rosario; Castro, Alejandrina; Perez, Guillermina; Flores, Carlos; Torres, Luis G.
2008-10-01
For the year 1996, 366 incidents related to clandestine tapping of oil products were reported in Mexico; there were 159 in 1997 and 240 in 1998. For the year 2003 (the most recently reported figure), there were 136 events. Petroleos Mexicanos (PEMEX), very concerned with the environmental agenda, has developed programs aimed at reducing contamination levels in all of its oil facilities. This work was aimed at characterizing zones around polyduct segments, pipelines, pumping stations, and right-of-way pipelines located in the center of Mexico. The TPH-contaminated sites were, in decreasing order, polyduct km 39 + 150 > polyduct km 25 + 020 > Zoquital > Tepetitlan > Catalina > Venta Prieta > Ceiba. Most of the sampled points showed the presence of more than one of the 16 PAHs considered by USEPA as priority pollutants. Except for point TEPE 2A, where no PAHs were detected, all the sampled points showed values from low to medium concentrations; however, values found at the sites did not exceed the limits of either the Mexican or the American legislation. The place with the largest contaminated area corresponded to the polyduct km 39 + 150, with 130 m2 and 260 m3 to be treated. The least contaminated area was that around the JUAN 4 point at Juandho station, with 20 m2 and 22 m3 of contaminated soil. The total area to be treated is about 230 m2 and 497 m3.
Stepping Stones to Research: Providing Pipelines from Middle School through PhD
NASA Astrophysics Data System (ADS)
Noel-Storr, Jacob; Baum, S. A.; RIT Insight Lab SSR Team; Chester F. Carlson Center for Imaging Science Faculty
2014-01-01
We present a decade's worth of strategies designed to promote and provide "Stepping Stones to Research": a realistic pipeline of educational opportunities, with multiple gateways and exit points, for students moving towards STEM careers along the "STEM pipeline". We also illustrate how the Stepping Stones are designed to coincide with related external opportunities through which we can also guide and support our mentees on their paths. We present programs such as middle school family science programs, high school research opportunities, high school internships, undergraduate research pathways, research experiences for undergraduates, and other opportunities. We will highlight the presentations being made at this very meeting -- from the first presentation of a high school student, to a dissertation presentation of a PhD graduate -- that have benefited from this stepping stone principle. We also reflect on the essential nature of building a "researcher-trust", even as a young student, of advocates and mentors who can support the continuation of a scientific career.
NASA Astrophysics Data System (ADS)
Osland, Anna Christine
Hazardous liquid and natural gas transmission pipelines have received limited attention from planning scholars even though local development decisions can have broad consequences if a rupture occurs. In this dissertation, I evaluated the implications of land-use planning for reducing risk from transmission pipeline hazards in North Carolina via three investigations. First, using a survey of planning directors in jurisdictions with transmission pipeline hazards, I investigated the land use planning tools used to mitigate pipeline hazards and the factors associated with tool adoption. Planning scholars have documented the difficulty of inducing planning in hazardous areas, yet there remain gaps in knowledge about the factors associated with tool adoption. Despite the risks associated with pipeline ruptures, I found most localities use few mitigation tools, and the adoption of regulatory and informational tools appears to be influenced by divergent factors. Whereas risk perception, commitment, capacity, and community context were associated with total tool and information tool use, only risk perception and capacity factors were associated with regulatory tool use. Second, using interviews of emergency managers and planning directors, I examined the role of agency collaboration in building mitigation capacity. Scholars have highlighted the potential of technical collaboration, yet less research has investigated how inter-agency collaboration shapes mitigation capacity. I identify three categories of technical collaboration, discuss how collaborative spillovers can occur from one planning area to another, and challenge the notion that all technical collaborations result in equal mitigation outcomes. Third, I evaluated characteristics of the population near pipelines to address equity concerns. Surprisingly, I did not find broad support for differences in exposure of vulnerable populations.
Nonetheless, my analyses uncovered statistically significant clusters of vulnerable groups within the hazard area. Interestingly, development closer to pipelines was newer than areas farther away, illustrating the failure of land-use planning to reduce development encroachment. Collectively, these results highlight the potential of land-use planning to keep people and development from encroaching on pipeline hazards. While this study indicates that planners in many areas address pipeline hazards, it also illustrates how changes to local practices can further reduce risks to human health, homeland security, and the environment.
Adaptations to a New Physical Training Program in the Combat Controller Training Pipeline
2010-09-01
...education regarding optimizing recovery through hydration and nutrition. We designed and implemented a short class that explained the benefits of pre...to poor nutrition and hydration practices. Finally, many of the training methods employed throughout the pipeline were outdated, non-periodized, and...contributing to overtraining. Creation of a nutrition and hydration class. Apart from being told to drink copious amounts of water, trainees had little
Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report
DOT National Transportation Integrated Search
2008-11-26
The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...
MEGAnnotator: a user-friendly pipeline for microbial genomes assembly and annotation.
Lugli, Gabriele Andrea; Milani, Christian; Mancabelli, Leonardo; van Sinderen, Douwe; Ventura, Marco
2016-04-01
Genome annotation is one of the key actions that must be undertaken in order to decipher the genetic blueprint of organisms. Thus, a correct and reliable annotation is essential in rendering genomic data valuable. Here, we describe a bioinformatics pipeline based on freely available software programs coordinated by a multithreaded script named MEGAnnotator (Multithreaded Enhanced prokaryotic Genome Annotator). This pipeline allows the generation of multiple annotated formats fulfilling the NCBI guidelines for assembled microbial genome submission, based on DNA shotgun sequencing reads, and minimizes manual intervention, while also reducing waiting times between software program executions and improving final quality of both assembly and annotation outputs. MEGAnnotator provides an efficient way to pre-arrange the assembly and annotation work required to process NGS genome sequence data. The script improves the final quality of microbial genome annotation by reducing ambiguous annotations. Moreover, the MEGAnnotator platform allows the user to perform a partial annotation of pre-assembled genomes and includes an option to accomplish metagenomic data set assemblies. MEGAnnotator platform will be useful for microbiologists interested in genome analyses of bacteria as well as those investigating the complexity of microbial communities that do not possess the necessary skills to prepare their own bioinformatics pipeline. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
General-Purpose Electronic System Tests Aircraft
NASA Technical Reports Server (NTRS)
Glover, Richard D.
1989-01-01
Versatile digital equipment supports research, development, and maintenance. Extended aircraft interrogation and display system is general-purpose assembly of digital electronic equipment on ground for testing of digital electronic systems on advanced aircraft. Many advanced features, including multiple 16-bit microprocessors, pipeline data-flow architecture, advanced operating system, and resident software-development tools. Basic collection of software includes program for handling many types of data and for displays in various formats. User easily extends basic software library. Hardware and software interfaces to subsystems provided by user designed for flexibility in configuration to meet user's requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of surveys conducted July 14-18, 1992, at the Deep Creek and the Brandy Branch crossings of a pipeline installed during May 1991 in Nassau County, Florida. Both floodplains supported bottomland hardwood forests. The pipeline at the Deep Creek crossing was installed by means of horizontal directional drilling after the ROW had been clear-cut, while the pipeline at the Brandy Branch crossing was installed by means of conventional open trenching. Neither site was seeded or fertilized. At the time of sampling, a dense vegetative community, made up primarily of native perennial herbaceous species, occupied the ROW within the Deep Creek floodplain. The Brandy Branch ROW was vegetated by a less dense stand of primarily native perennial herbaceous plants. Plant diversity was also lower at the Brandy Branch crossing than at the Deep Creek crossing. The results suggest that some of the differences in plant communities are related to the more hydric conditions at the Brandy Branch floodplain.
Telluric currents: A meeting of theory and observation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boteler, D.H.; Seager, W.H.
Pipe-to-soil (P/S) potential variations resulting from telluric currents have been observed on pipelines in many locations. However, it has never been clear which parts of a pipeline will experience the worst effects. Two studies were conducted to answer this question. Distributed-source transmission line (DSTL) theory was applied to the problem of modeling geomagnetic induction in pipelines. This theory predicted that the largest P/S potential variations would occur at the ends of the pipeline. The theory also predicted that large P/S potential variations, of opposite sign, should occur on either side of an insulating flange. Independently, an observation program was conducted to determine the change in telluric current P/S potential variations and to design counteractive measures along a pipeline in northern Canada. Observations showed that the amplitude of P/S potential fluctuations had maxima at the northern and southern ends of the pipeline. A further set of recordings around an insulating flange showed large P/S potential variations, of opposite sign, on either side of the flange. Agreement between the observations and theoretical predictions was remarkable. While the observations confirmed the theory, the theory explains how P/S potential variations are produced by telluric currents and provides the basis for design of cathodic protection systems for pipelines that can counteract any adverse telluric effects.
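The qualitative DSTL prediction in this abstract (extreme P/S potentials of opposite sign at the two pipeline ends, near zero mid-pipeline) can be sketched numerically. This is a simplified illustration, assuming a uniform inducing field E along a pipeline of length L with insulated ends and propagation constant gamma; the parameter values are invented for illustration and are not from the study:

```python
import math

# Hypothetical parameters (not from the paper)
E = 5e-4      # uniform inducing electric field, V/m
gamma = 1e-4  # propagation constant of the pipeline, 1/m
L = 100e3     # pipeline length, m

def ps_potential(x):
    """Pipe-to-soil potential under a uniform field, insulated ends (simplified):
    V(x) = (E/gamma) * (exp(-gamma*(L-x)) - exp(-gamma*x))."""
    return (E / gamma) * (math.exp(-gamma * (L - x)) - math.exp(-gamma * x))

# Reproduces the paper's qualitative result: maxima of opposite sign at the
# two ends, potential near zero at mid-pipeline.
for x in (0.0, L / 2, L):
    print(f"x = {x / 1e3:6.1f} km   V = {ps_potential(x):+.3f} V")
```

The same closed form also suggests why an insulating flange produces large potentials of opposite sign on either side: each isolated section behaves like a short pipeline with its own pair of end extremes.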
The ALMA Science Pipeline: Current Status
NASA Astrophysics Data System (ADS)
Humphreys, Elizabeth; Miura, Rie; Brogan, Crystal L.; Hibbard, John; Hunter, Todd R.; Indebetouw, Remy
2016-09-01
The ALMA Science Pipeline is being developed for the automated calibration and imaging of ALMA interferometric and single-dish data. The calibration Pipeline for interferometric data was accepted for use by ALMA Science Operations in 2014, and for single-dish data end-to-end processing in 2015. However, work is ongoing to expand the use cases for which the Pipeline can be used, e.g., for higher-frequency and lower signal-to-noise datasets, and for new observing modes. A current focus includes the commissioning of science target imaging for interferometric data. For the Single Dish Pipeline, the line-finding algorithm used in baseline subtraction and the baseline flagging heuristics have been greatly improved since the prototype used for data from the previous cycle. These algorithms, unique to the Pipeline, produce better results than standard manual processing in many cases. In this poster, we report on the current status of the Pipeline capabilities, present initial results from the Imaging Pipeline, and describe the smart line-finding and flagging algorithms used in the Single Dish Pipeline. The Pipeline is released as part of CASA (the Common Astronomy Software Applications package).
Fisher, Jill A; Cottingham, Marci D; Kalbaugh, Corey A
2015-04-01
In spite of a growing literature on pharmaceuticalization, little is known about the pharmaceutical industry's investments in research and development (R&D). Information about the drugs being developed can provide important context for existing case studies detailing the expanding--and often problematic--role of pharmaceuticals in society. To access the pharmaceutical industry's pipeline, we constructed a database of drugs for which pharmaceutical companies reported initiating clinical trials over a five-year period (July 2006-June 2011), capturing 2477 different drugs in 4182 clinical trials. Comparing drugs in the pipeline that target diseases in high-income and low-income countries, we found that the number of drugs for diseases prevalent in high-income countries was 3.46 times higher than drugs for diseases prevalent in low-income countries. We also found that the plurality of drugs in the pipeline was being developed to treat cancers (26.2%). Interpreting our findings through the lens of pharmaceuticalization, we illustrate how investigating the entire drug development pipeline provides important information about patterns of pharmaceuticalization that are invisible when only marketed drugs are considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
TESS Data Processing and Quick-look Pipeline
NASA Astrophysics Data System (ADS)
Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office
2018-01-01
We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
Mason, Bonnie S; Ross, William; Ortega, Gezzer; Chambers, Monique C; Parks, Michael L
2016-09-01
Women and minorities remain underrepresented in orthopaedic surgery. In an attempt to increase the diversity of those entering the physician workforce, Nth Dimensions implemented a targeted pipeline curriculum that includes the Orthopaedic Summer Internship Program. The program exposes medical students to the specialty of orthopaedic surgery and equips students to be competitive applicants to orthopaedic surgery residency programs. The effect of this program on women and underrepresented minority applicants to orthopaedic residencies is highlighted in this article. (1) For women we asked: is completing the Orthopaedic Summer Internship Program associated with higher odds of applying to orthopaedic surgery residency? (2) For underrepresented minorities, is completing the Orthopaedic Summer Internship Program associated with higher odds of applying to orthopaedic residency? Between 2005 and 2012, 118 students completed the Nth Dimensions/American Academy of Orthopaedic Surgeons Orthopaedic Summer Internship Program. The summer internship consisted of an 8-week clinical and research program between the first and second years of medical school and included a series of musculoskeletal lectures, hands-on, practical workshops, presentation of a completed research project, ongoing mentoring, professional development, and counselling through each participant's subsequent years of medical school. In correlation with available national application data, residency application data were obtained for those Orthopaedic Summer Internship Program participants who applied to the match between 2011 through 2014. For these 4 cohort years, we evaluated whether this program was associated with increased odds of applying to orthopaedic surgery residency compared with national controls. 
For the same four cohorts, we evaluated whether underrepresented minority students who completed the program had increased odds of applying to an orthopaedic surgery residency compared with national controls. Fifty Orthopaedic Summer Internship scholars applied for an orthopaedic residency position. For women, completion of the Orthopaedic Summer Internship was associated with increased odds of applying to orthopaedic surgery residency (after summer internship: nine of 17 [35%]; national controls: 800 of 78,316 [1%]; odds ratio [OR], 51.3; 95% confidence interval [CI], 21.1-122.0; p < 0.001). Similarly, for underrepresented minorities, Orthopaedic Summer Internship completion was also associated with increased odds of orthopaedic applications from 2011 to 2014 (after Orthopaedic Summer Internship: 15 of 48 [31%]; non-Orthopaedic Summer Internship applicants nationally: 782 of 25,676 [3%]; OR, 14.5 [7.3-27.5]; p < 0.001). Completion of the Nth Dimensions Orthopaedic Summer Internship Program has a positive impact on increasing the odds of each student participant applying to an orthopaedic surgery residency program. This program may be a key factor in contributing to the pipeline of women and underrepresented minorities into orthopaedic surgery. Level III, therapeutic study.
Guidelines for riser splash zone design and repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-02-01
Many years of offshore oil and gas development have established the subsea pipeline as a reliable and cost-effective means of transportation for produced hydrocarbons. The requirement for subsea pipeline systems will continue to move into deeper water and more remote locations with the future development of oil and gas exploration. The integrity of subsea pipeline and riser systems, throughout their operating lifetime, is an important area for operators to consider in maximizing reliability and serviceability for economic, contractual, and environmental reasons. Adequate design and installation are the basis for ensuring the integrity of any subsea pipeline and riser systems. In the event of system damage, from any source, quick and accurate repair and reinstatement of the pipeline system is essential. This report has been developed to provide guidelines for riser and splash zone design, to perform a detailed overview of existing riser repair techniques and products, and to prepare comprehensive guidelines identifying the capabilities and limits of riser reinstatement systems.
Development of the updated system of city underground pipelines based on Visual Studio
NASA Astrophysics Data System (ADS)
Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong
2009-10-01
Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the database for storing data. In this system, ArcGIS SDE 9.1 serves as the spatial data engine, and the system is a comprehensive management application developed with Visual Studio visual development tools. Because the system's pipeline update function was slow and sometimes lost data, we developed and added a new update module to ensure that the underground pipeline data can be updated conveniently and frequently in real time, preserving the currency and integrity of the underground pipeline data. The module provides powerful data update functions, supporting data input, output, and rapid bulk updates. The new module was developed with Visual Studio visual development tools and uses Access as its underlying database. Graphics can be edited in AutoCAD, and the database is updated through a link between the graphics and the system. Practice shows that the update module is well compatible with the original system, reliable, and efficient at updating the database.
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components depreciate rapidly because of the evolving technology and analysis methods, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program-call wrapping by higher-level languages implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes overhead for set up and processing of new projects, yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Denis.Bauer@csiro.au Supplementary data are available at Bioinformatics online.
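The hot-swappable module idea the NGSANE abstract describes (pipeline stages looked up by name and chained, so a stage can be replaced without rewriting the driver) can be sketched generically. This Python toy is only an analogy for NGSANE's bash module mechanism; the stage names and operations are invented placeholders:

```python
# Registry of interchangeable pipeline stages. Swapping a stage means
# re-pointing a name in this table, not editing the pipeline driver.
STAGES = {
    "trim":  lambda reads: [r[:50] for r in reads],     # placeholder read trimmer
    "upper": lambda reads: [r.upper() for r in reads],  # normalize case
    "dedup": lambda reads: list(dict.fromkeys(reads)),  # drop exact duplicates
}

def run_pipeline(stage_names, data):
    """Apply the named stages in order; each stage is hot-swappable via STAGES."""
    for name in stage_names:
        data = STAGES[name](data)
    return data

result = run_pipeline(["trim", "upper", "dedup"], ["acgtacgt", "ACGTACGT", "ttttgggg"])
print(result)  # → ['ACGTACGT', 'TTTTGGGG']
```

In NGSANE itself the "registry" is the filesystem of bash modules and the driver is an HPC-aware shell framework, but the decoupling of stage identity from stage implementation is the same design.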
NASA Astrophysics Data System (ADS)
Rui, Zhenhua
This study analyzes historical cost data of 412 pipelines and 220 compressor stations. On the basis of this analysis, the study also evaluates the feasibility of an Alaska in-state gas pipeline using Monte Carlo simulation techniques. Analysis of pipeline construction costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary by diameter, length, volume, year, and location. Overall average learning rates for pipeline material and labor costs are 6.1% and 12.4%, respectively. Overall average cost shares for pipeline material, labor, miscellaneous, and right of way (ROW) are 31%, 40%, 23%, and 7%, respectively. Regression models are developed to estimate pipeline component costs for different lengths, cross-sectional areas, and locations. An analysis of inaccuracy in pipeline cost estimation demonstrates that the cost estimation of pipeline cost components is biased except in the case of total costs. Overall overrun rates for pipeline material, labor, miscellaneous, ROW, and total costs are 4.9%, 22.4%, -0.9%, 9.1%, and 6.5%, respectively, and project size, capacity, diameter, location, and year of completion have different degrees of impact on cost overruns of pipeline cost components. Analysis of compressor station costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary in terms of capacity, year, and location. Average learning rates for compressor station material and labor costs are 12.1% and 7.48%, respectively. Overall average cost shares of material, labor, miscellaneous, and ROW are 50.6%, 27.2%, 21.5%, and 0.8%, respectively. Regression models are developed to estimate compressor station component costs in different capacities and locations. An investigation into inaccuracies in compressor station cost estimation demonstrates that the cost estimation for compressor stations is biased except in the case of material costs.
Overall average overrun rates for compressor station material, labor, miscellaneous, land, and total costs are 3%, 60%, 2%, -14%, and 11%, respectively, and cost overruns for cost components are influenced by location and year of completion to different degrees. Monte Carlo models are developed and simulated to evaluate the feasibility of an Alaska in-state gas pipeline by assigning triangular distributions to the values of economic parameters. Simulated results show that the construction of an Alaska in-state natural gas pipeline is feasible under three scenarios: 500 million cubic feet per day (mmcfd), 750 mmcfd, and 1000 mmcfd.
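The triangular-distribution Monte Carlo approach described above can be sketched as follows. All parameter ranges below are illustrative placeholders, not the study's actual inputs; only the mechanics (triangular draws, discounted cash flow, share of profitable trials) reflect the method.

```python
import random

def feasibility_fraction(mmcfd=500, n_trials=10_000, seed=1):
    """Monte Carlo feasibility sketch: draw economic parameters from
    triangular distributions and count trials with positive NPV.
    Note: random.triangular takes (low, high, mode)."""
    rng = random.Random(seed)
    mcf_per_year = mmcfd * 1_000 * 365            # 1 mmcf = 1,000 mcf
    profitable = 0
    for _ in range(n_trials):
        capex = rng.triangular(6e9, 12e9, 9e9)        # capital cost, $ (placeholder)
        price = rng.triangular(3.0, 9.0, 6.0)         # $ per mcf delivered (placeholder)
        opex  = rng.triangular(4e8, 1.2e9, 8e8)       # annual O&M cost, $ (placeholder)
        rate  = rng.triangular(0.06, 0.12, 0.08)      # discount rate (placeholder)
        cash  = price * mcf_per_year - opex           # annual net cash flow
        npv   = sum(cash / (1 + rate) ** t for t in range(1, 21)) - capex
        profitable += npv > 0
    return profitable / n_trials

share_500 = feasibility_fraction(500)   # fraction of trials where the 500 mmcfd case is profitable
```

Running the same function at 500, 750, and 1000 mmcfd mirrors the three scenarios compared in the study.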
New Software for Ensemble Creation in the Spitzer-Space-Telescope Operations Database
NASA Technical Reports Server (NTRS)
Laher, Russ; Rector, John
2004-01-01
Some of the computer pipelines used to process digital astronomical images from NASA's Spitzer Space Telescope require multiple input images in order to generate high-level science and calibration products. The images are grouped into ensembles according to well-documented ensemble-creation rules by making explicit associations in the operations Informix database at the Spitzer Science Center (SSC). The advantage of this approach is that a simple database query can retrieve the required ensemble of pipeline input images. New and improved software for ensemble creation has been developed. The new software is much faster than the existing software because it uses pre-compiled database stored procedures written in Informix SPL (Stored Procedure Language). The new software is also more flexible because the ensemble-creation rules are now stored in and read from newly defined database tables. This table-driven approach was implemented so that ensemble rules can be inserted, updated, or deleted without modifying software.
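The table-driven idea can be illustrated with a small sketch. The rule and column names below are hypothetical stand-ins; in the real system the rules live in Informix tables and are applied by SPL stored procedures.

```python
from collections import defaultdict

# Hypothetical "rules table": each row says how one ensemble type groups
# images. Editing these rows changes behavior without changing code.
RULES_TABLE = [
    {"ensemble_type": "dark_cal", "group_by": ("channel", "exposure_time")},
    {"ensemble_type": "mosaic",   "group_by": ("channel", "target_id")},
]

def build_ensembles(images, ensemble_type):
    """Group image records into ensembles per the rule looked up in the table."""
    rule = next(r for r in RULES_TABLE if r["ensemble_type"] == ensemble_type)
    groups = defaultdict(list)
    for img in images:
        key = tuple(img[k] for k in rule["group_by"])
        groups[key].append(img["image_id"])
    return dict(groups)

images = [
    {"image_id": 1, "channel": 1, "exposure_time": 30, "target_id": "A"},
    {"image_id": 2, "channel": 1, "exposure_time": 30, "target_id": "B"},
    {"image_id": 3, "channel": 2, "exposure_time": 30, "target_id": "A"},
]
print(build_ensembles(images, "dark_cal"))  # {(1, 30): [1, 2], (2, 30): [3]}
```

A downstream pipeline stage can then fetch one ensemble with a single keyed query, which is the point of pre-grouping in the database.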
Fuhrmann, C. N.; Halme, D. G.; O’Sullivan, P. S.; Lindstaedt, B.
2011-01-01
Today's doctoral programs continue to prepare students for a traditional academic career path despite the inadequate supply of research-focused faculty positions. We advocate for a broader doctoral curriculum that prepares trainees for a wide range of science-related career paths. In support of this argument, we describe data from our survey of doctoral students in the basic biomedical sciences at University of California, San Francisco (UCSF). Midway through graduate training, UCSF students are already considering a broad range of career options, with one-third intending to pursue a non–research career path. To better support this branching career pipeline, we recommend that national standards for training and mentoring include emphasis on career planning and professional skills development to ensure the success of PhD-level scientists as they contribute to a broadly defined global scientific enterprise. PMID:21885820
Technology Cost and Schedule Estimation (TCASE) Final Report
NASA Technical Reports Server (NTRS)
Wallace, Jon; Schaffer, Mark
2015-01-01
During the 2014-2015 project year, the focus of the TCASE project shifted from collecting historical data from many sources to securing a data pipeline between TCASE and NASA's widely used TechPort system. TCASE v1.0 implements a data import solution that was achievable within the project scope, while still providing the basis for a long-term ability to keep TCASE in sync with TechPort. Conclusion: TCASE data quantity is adequate, and the established data pipeline will enable future growth. Data quality is now highly dependent on the quality of data in TechPort. Recommendation: Technology development organizations within NASA should continue to work closely with project/program data tracking and archiving efforts (e.g., TechPort) to ensure that the right data is being captured at the appropriate quality level. TCASE would greatly benefit, for example, if project cost/budget information were included in TechPort in the future.
Song, Yan; Dhodda, Raj; Zhang, Jun; Sydor, Jens
2014-05-01
In the recent past, we have seen an increase in the outsourcing of bioanalysis by pharmaceutical companies in support of their drug development pipelines. This trend is largely driven by the effort to reduce internal cost, especially in support of late-stage pipeline assets, where established bioanalytical assays are used to analyze a large volume of samples. This article highlights our perspective on how bioanalytical laboratories within pharmaceutical companies can be developed into the best partners in the advancement of drug development pipelines, providing high-quality support at competitive cost.
Impacting the Science Community through Teacher Development: Utilizing Virtual Learning.
Boulay, Rachel; van Raalte, Lisa
2014-01-01
Commitment to the STEM (science, technology, engineering, math) pipeline is slowly declining despite the need for professionals in the medical field. Addressing this, the John A. Burns School of Medicine developed a summer teacher-training program with a supplemental technology-learning component to improve science teachers' knowledge and skills in molecular biology and, in turn, students' skills, techniques, and application of molecular biology. Science teachers require training that will prepare them for educating future professionals and foster interest in the medical field. After participating in the program and gaining full access to the virtual material, twelve high school science teachers completed a final written reflective statement to evaluate their experiences. Using thematic analysis, knowledge and classroom application were investigated in this study. Results were two-fold: teachers identified different areas of knowledge gained from the teacher-training program, and teachers reported various benefits for curricula development after participating in the program. It is concluded that participation in the program and access to the virtual material will impact the science community by updating teacher knowledge and positively influencing students' experience with science.
Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.
2013-01-01
SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (e.g., microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains 3 analysis modules along with a fourth control module that can automate analyses of large volumes of data. The modules are used to 1) identify the subset of paired-end sequences that pass Illumina quality standards, 2) align paired-end reads into a single composite DNA sequence, and 3) identify sequences that possess microsatellites (both simple and compound) conforming to user-specified parameters. The microsatellite search algorithm is extremely efficient, and we have used it to identify repeats with motifs from 2 to 25bp in length. Each of the 3 analysis modules can also be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc.). We demonstrate use of the program with data from the brine fly Ephydra packardi (Diptera: Ephydridae) and provide empirical timing benchmarks to illustrate program performance on a common desktop computer environment. We further show that the Illumina platform is capable of identifying large numbers of microsatellites, even when using unenriched sample libraries and a very small percentage of the sequencing capacity from a single DNA sequencing run. All modules from SSR_pipeline are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, and Windows).
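A tandem-repeat search of this kind can be sketched in a few lines of Python using a back-referencing regular expression. This is an illustration of the idea, not SSR_pipeline's actual algorithm, and a lookahead is used so overlapping runs are all reported.

```python
import re

def find_microsatellites(seq, min_motif=2, max_motif=6, min_repeats=3):
    """Report (start, motif, copies) for motifs of length min_motif..max_motif
    repeated at least min_repeats times in tandem. Overlapping runs may be
    reported more than once; a real tool would merge or filter them."""
    pattern = re.compile(
        r"(?=((\w{%d,%d})\2{%d,}))" % (min_motif, max_motif, min_repeats - 1)
    )
    hits = []
    for m in pattern.finditer(seq):
        run, motif = m.group(1), m.group(2)
        hits.append((m.start(), motif, len(run) // len(motif)))
    return hits

hits = find_microsatellites("GGATATATATGG")   # contains an (AT)4 repeat
```

Raising `max_motif` to 25 covers the motif range mentioned above, at the cost of more regex backtracking on long reads.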
An acceleration system for Laplacian image fusion based on SoC
NASA Astrophysics Data System (ADS)
Gao, Liwen; Zhao, Hongtu; Qu, Xiujie; Wei, Tianbo; Du, Peng
2018-04-01
Based on an analysis of the Laplacian image fusion algorithm, this paper proposes a partial-pipelining, modular processing architecture, and a SoC-based acceleration system is implemented accordingly. Full pipelining is used in the design of each module, and modules connected in series form the partial pipeline with a unified data format, which eases management and reuse. Integrated with an ARM processor, DMA, and an embedded bare-metal program, the system achieves 4 layers of Laplacian pyramid on the Zynq-7000 board. Experiments show that, with small resource consumption, a pair of 256×256 images can be fused within 1 ms while maintaining a fine fusion effect.
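The underlying decompose-fuse-reconstruct steps can be sketched in one dimension with pure Python. This is a minimal illustration of the algorithm's structure, not the paper's hardware design, which operates on 2-D images with proper low-pass filtering.

```python
def down(x):
    # 2:1 decimation with pair averaging (a crude low-pass filter)
    return [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]

def up(x):
    # nearest-neighbour upsampling back to twice the length
    out = []
    for v in x:
        out += [v, v]
    return out

def laplacian_pyramid(x, levels):
    # each level stores the detail lost by down/up; the coarse residual is kept last
    pyr = []
    for _ in range(levels):
        lo = down(x)
        pyr.append([a - b for a, b in zip(x, up(lo))])
        x = lo
    pyr.append(x)
    return pyr

def fuse(pa, pb):
    # keep the coefficient with larger magnitude at each position and level
    return [[a if abs(a) >= abs(b) else b for a, b in zip(la, lb)]
            for la, lb in zip(pa, pb)]

def reconstruct(pyr):
    x = pyr[-1]
    for lap in reversed(pyr[:-1]):
        x = [l + u for l, u in zip(lap, up(x))]
    return x

a = [0, 0, 4, 4, 0, 0, 2, 2]
b = [3, 3, 0, 0, 1, 1, 0, 0]
fused = reconstruct(fuse(laplacian_pyramid(a, 2), laplacian_pyramid(b, 2)))
```

Because each stage (downsample, difference, fuse, upsample-add) consumes a stream and produces a stream, the stages map naturally onto the fully pipelined hardware modules the paper chains in series.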
MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.
Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei
2016-03-15
MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs), as well as small molecules. Moreover, multi-threading and support for Sun Grid Engine (SGE) are implemented to significantly boost computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves, and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, along with a solution methodology for computing an input schedule that yields the minimum total violation of due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. The dissertation also extends the study to examine operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced; it too runs in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the problems tested were more than 30% above optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.
Development of DKB ETL module in case of data conversion
NASA Astrophysics Data System (ADS)
Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.
2018-05-01
Modern scientific experiments produce huge volumes of data that require new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue requiring unconventional solutions. One of the tasks is to integrate metadata from different repositories into some kind of central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: metadata aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
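The aggregate-transform-load chain of independent modules can be sketched as follows. Function, field, and source names here are hypothetical illustrations, not the DKB project's actual module interfaces.

```python
# Each step is an independent module with a uniform record-in/record-out
# interface, so steps can be developed, tested, and replaced individually.
def aggregate(rec):
    # gather provenance from the (mock) source repository field
    rec["meta"] = {"source": rec.pop("src", "unknown")}
    return rec

def transform(rec):
    # normalize to the target data model (here: canonicalize the name)
    rec["name"] = rec.get("name", "").strip().lower()
    return rec

def make_loader(store):
    # the final step writes to central storage (a dict stands in for it)
    def load(rec):
        store[rec["name"]] = rec
        return rec
    return load

def run_pipeline(records, steps):
    for rec in records:
        for step in steps:
            rec = step(rec)

store = {}
run_pipeline([{"src": "AMI", "name": "  MC16_Dataset "}],
             [aggregate, transform, make_loader(store)])
```

Because every step shares one signature, the pipeline definition is just an ordered list, which matches the description above of a dataflow assembled from interchangeable program modules.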
The Henry Cecil Ranson McBay Chair in Space Science
NASA Technical Reports Server (NTRS)
Bota, Kofi B.; King, James, Jr.
1999-01-01
The goals and objectives of the Henry Cecil Ransom McBay Chair in Space Sciences were to: (1) provide leadership in developing and expanding Space Science curriculum; (2) contribute to the research and education endeavors of NASA's Mission to Planet Earth program; (3) expand opportunities for education and hands-on research in Space and Earth Sciences; (4) enhance scientific and technological literacy at all educational levels and to increase awareness of opportunities in the Space Sciences; and (5) develop a pipeline, starting with high school, of African American students who will develop into a cadre of well-trained scientists with interest in Space Science Research and Development.
2015 Stewardship Science Academic Programs Annual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, Terri; Mischo, Millicent
The Stockpile Stewardship Academic Programs (SSAP) are essential to maintaining a pipeline of professionals to support the technical capabilities that reside at the National Nuclear Security Administration (NNSA) national laboratories, sites, and plants. Since 1992, the United States has observed the moratorium on nuclear testing while significantly decreasing the nuclear arsenal. To accomplish this without nuclear testing, NNSA and its laboratories developed a science-based Stockpile Stewardship Program to maintain and enhance the experimental and computational tools required to ensure the continued safety, security, and reliability of the stockpile. NNSA launched its academic program portfolio more than a decade ago to engage students skilled in specific technical areas of relevance to stockpile stewardship. The success of this program is reflected by the large number of SSAP students choosing to begin their careers at NNSA national laboratories.
Teachers-in-Residence: New Pathways into the Profession. Ask the Team
ERIC Educational Resources Information Center
Han, Grace; Doyle, Daniela
2013-01-01
Teacher residency programs are a relatively new method for building stronger teacher pipelines. Research assessing the impact of these programs is still limited, but some early reports suggest that residency programs hold promise for improving teacher effectiveness and retention rates (Barrett, Hovde, Hahn, & Rosqueta, 2011; Papay, West,…
Emory U. Trains Its Own Leaders
ERIC Educational Resources Information Center
Selingo, Jeffrey J.
2009-01-01
This article describes Emory University's Excellence Through Leadership program. Started in 2006, the yearlong program is designed to help up to 20 administrators and faculty members annually improve their leadership skills, as well as create a pipeline to eventually replace senior leaders at the institution. Emory's leadership program is just one…
The Herschel Data Processing System - Hipe And Pipelines - During The Early Mission Phase
NASA Astrophysics Data System (ADS)
Ardila, David R.; Herschel Science Ground Segment Consortium
2010-01-01
The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on 14 May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55-672 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Here we summarize the state of the Herschel Data Processing System and give an overview of future development milestones and plans. The development of the Herschel Data Processing System started seven years ago to support the data analysis for Instrument Level Tests. Resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution, and scientific analysis in one single environment. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. The Herschel Interactive Processing Environment (HIPE) is the user-friendly face of Herschel Data Processing. The first PACS preview observation of M51 was processed with HIPE, using basic pipeline scripts, into a fantastic image within 30 minutes of data reception. The first HIFI observations of DR-21 were also successfully reduced to high-quality spectra, followed by SPIRE observations of M66 and M74. The Herschel Data Processing System is a joint development by the Herschel Science Ground Segment Consortium, consisting of ESA, the NASA Herschel Science Center, and the HIFI, PACS, and SPIRE consortium members.
30 CFR 817.180 - Utility installations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PERMANENT PROGRAM PERFORMANCE STANDARDS PERMANENT PROGRAM PERFORMANCE STANDARDS-UNDERGROUND MINING ACTIVITIES § 817.180 Utility installations. All underground mining activities shall be conducted in a manner...; oil, gas, and coal-slurry pipelines, railroads; electric and telephone lines; and water and sewage...
Internal Corrosion Detection in Liquids Pipelines
DOT National Transportation Integrated Search
2012-01-01
PHMSA project DTRS56-05-T-0005, "Development of ICDA for Liquid Petroleum Pipelines," led to the development of a Direct Assessment (DA) protocol to prioritize locations of possible internal corrosion. The underlying basis of LP-ICDA is simple; corrosion ...
Lee, Chang Jun
2015-01-01
In research on plant layout optimization, the main goal is to minimize the costs of pipelines and pumping between connected equipment under various constraints. What previous studies have lacked, however, is the transformation of various heuristics and safety regulations into mathematical equations. For example, proper safety distances between equipment items must be maintained to prevent dangerous accidents in a complex plant. Moreover, most studies have handled single-floor plants, yet many multi-floor plants have been constructed over the last decade. Therefore, a proper algorithm handling various regulations and multi-floor plants should be developed. In this study, a Mixed Integer Non-Linear Programming (MINLP) problem including safety distances, maintenance spaces, etc., is formulated from mathematical equations. The objective function is the sum of pipeline and pumping costs, and various safety and maintenance issues are transformed into inequality or equality constraints. However, this problem is very hard to solve because of its complex nonlinear constraints, which rules out conventional derivative-based MINLP solvers. In this study, the Particle Swarm Optimization (PSO) technique is employed instead. An ethylene oxide plant is used to illustrate and verify the efficacy of this study.
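A derivative-free PSO with constraints folded into the objective as penalty terms can be sketched as follows. This is a generic illustration, not the paper's implementation; the cost function and the 3 m safety-distance figure are hypothetical.

```python
import random

def pso(obj, dim, bounds, n_particles=30, iters=200, seed=0):
    """Minimize obj over [lo, hi]^dim with a basic particle swarm.
    Constraints (e.g. safety distances) are handled as penalties in obj."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [obj(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = obj(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

def cost(x):
    # hypothetical 1-D layout: piping cost plus a penalty if two units
    # sit closer than a 3 m safety distance
    x1, x2 = x
    pipe = abs(x1 - 0.0) + abs(x2 - 10.0)
    penalty = 1e3 * max(0.0, 3.0 - abs(x2 - x1))
    return pipe + penalty

best, val = pso(cost, dim=2, bounds=(0.0, 10.0))
```

The penalty term is what lets a non-differentiable safety rule coexist with a smooth cost, which is exactly why a population method like PSO fits this MINLP better than gradient-based solvers.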
TCGA-assembler 2: software pipeline for retrieval and processing of TCGA/CPTAC data.
Wei, Lin; Jin, Zhilin; Yang, Shengjie; Xu, Yanxun; Zhu, Yitan; Ji, Yuan
2018-05-01
The Cancer Genome Atlas (TCGA) program has produced huge amounts of cancer genomics data, providing unprecedented opportunities for research. In 2014, we developed TCGA-Assembler, a software pipeline for retrieval and processing of public TCGA data. In 2016, TCGA data were transferred from the TCGA data portal to the Genomic Data Commons (GDC), which is supported by a different set of data storage and retrieval mechanisms. In addition, new proteomics data of TCGA samples have been generated by the Clinical Proteomic Tumor Analysis Consortium (CPTAC) program, which were not available for download through TCGA-Assembler. It is desirable to acquire and integrate data from both GDC and CPTAC. We developed TCGA-Assembler 2 (TA2) to automatically download and integrate data from GDC and CPTAC, with substantial improvements to functionality that enhance user experience and software performance. TA2, together with its previous version, has helped more than 2000 researchers from 64 countries to access and utilize TCGA and CPTAC data in their research. Availability of TA2 will continue to allow existing and new users to conduct reproducible research based on TCGA and CPTAC data. http://www.compgenome.org/TCGA-Assembler/ or https://github.com/compgenome365/TCGA-Assembler-2. zhuyitan@gmail.com or koaeraser@gmail.com. Supplementary data are available at Bioinformatics online.
The Kepler Science Data Processing Pipeline Source Code Road Map
NASA Technical Reports Server (NTRS)
Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima;
2016-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.
Landslide and Land Subsidence Hazards to Pipelines
Baum, Rex L.; Galloway, Devin L.; Harp, Edwin L.
2008-01-01
Landslides and land subsidence pose serious hazards to pipelines throughout the world. Many existing pipeline corridors and more and more new pipelines cross terrain that is affected by either landslides, land subsidence, or both. Consequently the pipeline industry recognizes a need for increased awareness of methods for identifying and evaluating landslide and subsidence hazard for pipeline corridors. This report was prepared in cooperation with the U.S. Department of Transportation Pipeline and Hazardous Materials Safety Administration, and Pipeline Research Council International through a cooperative research and development agreement (CRADA) with DGH Consulting, Inc., to address the need for up-to-date information about current methods to identify and assess these hazards. Chapters in this report (1) describe methods for evaluating landslide hazard on a regional basis, (2) describe the various types of land subsidence hazard in the United States and available methods for identifying and quantifying subsidence, and (3) summarize current methods for investigating individual landslides. In addition to the descriptions, this report provides information about the relative costs, limitations and reliability of various methods.
Iturbe, Rosario; Flores, Carlos; Castro, Alejandrina; Torres, Luis G
2007-10-01
Oil spills from oil pipelines are a very frequent problem in Mexico. Petroleos Mexicanos (PEMEX), very concerned with the environmental agenda, has been developing inspection and correction plans for zones around oil pipeline pumping stations and pipeline rights-of-way. These stations are located at regular intervals of kilometres along the pipelines. In this study, two sections of an oil pipeline and two pipeline pumping station zones are characterized in terms of the presence of Total Petroleum Hydrocarbons (TPHs) and Polycyclic Aromatic Hydrocarbons (PAHs). The study comprises sampling of the areas, delimitation of contamination in the vertical and horizontal extension, analysis of the sampled soils for TPH content and, in some cases, for the 16 PAHs considered priority pollutants by USEPA, calculation of contaminated areas and volumes (according to Mexican legislation, specifically NOM-EM-138-ECOL-2002) and, finally, a proposal for the remediation techniques best suited to the contamination levels and the localization of contaminants.
Prime the Pipeline Project (P[cube]): Putting Knowledge to Work
ERIC Educational Resources Information Center
Greenes, Carole; Wolfe, Susan; Weight, Stephanie; Cavanagh, Mary; Zehring, Julie
2011-01-01
With funding from NSF, the Prime the Pipeline Project (P[cube]) is responding to the need to strengthen the science, technology, engineering, and mathematics (STEM) pipeline from high school to college by developing and evaluating the scientific village strategy and the culture it creates. The scientific village, a community of high school…
Dhanyalakshmi, K H; Naika, Mahantesha B N; Sajeevan, R S; Mathew, Oommen K; Shafi, K Mohamed; Sowdhamini, Ramanathan; N Nataraja, Karaba
2016-01-01
Modern sequencing technologies are generating large volumes of information at the transcriptome and genome level. Translation of this information into biological meaning lags far behind, so a significant portion of the proteins discovered remain proteins of unknown function (PUFs). Attempts to uncover the functional significance of PUFs are limited by the lack of easy, high-throughput functional annotation tools. Here, we report an approach to assign putative functions to PUFs identified in the transcriptome of mulberry, a perennial tree commonly cultivated as the host of the silkworm. We utilized the mulberry PUFs generated from leaf tissues exposed to drought stress at the whole-plant level. A sequence- and structure-based computational analysis predicted the probable function of the PUFs. For rapid and easy annotation of PUFs, we developed an automated pipeline, designated PUFs Annotation Server (PUFAS), by integrating diverse bioinformatics tools; it also provides a web service API (Application Programming Interface) for large-scale analysis up to a genome. Expression analysis of three selected PUFs annotated by the pipeline revealed abiotic stress responsiveness of the genes, and hence their potential role in stress acclimation pathways. The automated pipeline developed here could be extended to assign functions to PUFs from any organism. The PUFAS web server is available at http://caps.ncbs.res.in/pufas/ and the web service is accessible at http://capservices.ncbs.res.in/help/pufas.
Developing Cancer Informatics Applications and Tools Using the NCI Genomic Data Commons API.
Wilson, Shane; Fitzsimons, Michael; Ferguson, Martin; Heath, Allison; Jensen, Mark; Miller, Josh; Murphy, Mark W; Porter, James; Sahni, Himanso; Staudt, Louis; Tang, Yajing; Wang, Zhining; Yu, Christine; Zhang, Junjun; Ferretti, Vincent; Grossman, Robert L
2017-11-01
The NCI Genomic Data Commons (GDC) was launched in 2016 and makes available over 4 petabytes (PB) of cancer genomic and associated clinical data to the research community. This dataset continues to grow and currently includes over 14,500 patients. The GDC is an example of a biomedical data commons, which collocates biomedical data with storage and computing infrastructure and commonly used web services, software applications, and tools to create a secure, interoperable, and extensible resource for researchers. The GDC is (i) a data repository for downloading data that have been submitted to it, and also a system that (ii) applies a common set of bioinformatics pipelines to submitted data; (iii) reanalyzes existing data when new pipelines are developed; and (iv) allows users to build their own applications and systems that interoperate with the GDC using the GDC Application Programming Interface (API). We describe the GDC API and how it has been used both by the GDC itself and by third parties. Cancer Res; 77(21); e15-18. ©2017 AACR.
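As an example of building on the GDC API, a query against its public `files` endpoint can be assembled from a JSON filter expression. The field names below follow the GDC filter convention, but the specific fields and values are illustrative assumptions, and the request itself is left as a comment.

```python
import json

GDC_FILES_ENDPOINT = "https://api.gdc.cancer.gov/files"  # public GDC API endpoint

def gdc_filter(field, values):
    """Build one GDC-style filter clause (field/value names are examples)."""
    return {"op": "in", "content": {"field": field, "value": list(values)}}

def build_query(filters, fields, size=10):
    # the GDC API expects the filter tree serialized as a JSON string
    return {
        "filters": json.dumps({"op": "and", "content": filters}),
        "fields": ",".join(fields),
        "format": "json",
        "size": str(size),
    }

params = build_query(
    [gdc_filter("cases.project.project_id", ["TCGA-BRCA"]),
     gdc_filter("data_category", ["Transcriptome Profiling"])],
    fields=["file_id", "file_name", "cases.submitter_id"],
)
# The query could then be sent with urllib.request, e.g. by appending
# urllib.parse.urlencode(params) to GDC_FILES_ENDPOINT and reading the JSON reply.
```

Composing queries this way is what item (iv) above enables: third-party tools interoperate with the GDC purely through such API requests, with no local copy of the repository.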
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted over the period of August 3-4, 1992, at the Cassadaga wetlands crossing in Gerry Township, Chautauqua County, New York. The pipeline at this site was installed during February and March 1981. After completion of pipeline installation, the ROW was fertilized, mulched, and seeded with annual ryegrass. Two adjacent sites were surveyed in this study: a forested wetland and an emergent wetland. Eleven years after pipeline installation, the ROW at both sites supported diverse vegetative communities. Although devoid of large woody species, the ROW within the forested wetland had a dense vegetative cover. The ROW within the emergent wetland had a slightly less dense and more diverse vegetative community compared with that in the adjacent natural areas (NAs). The ROW within the emergent wetland also had a large number of introduced species that were not present in the adjacent NAs. The ROW, with its emergent marsh plant community, provided habitat diversity within the forested wetland. Because the ROW contained species not found within the adjacent NAs, overall species diversity was increased.
FirebrowseR: an R client to the Broad Institute’s Firehose Pipeline
Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven
2017-01-01
With its Firebrowse service (http://firebrowse.org/) the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programming Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute’s RESTful Firehose Pipeline, is provided as a working example, which is built by means of the presented workflow. The package’s features are demonstrated by an example analysis of cancer gene expression data. Database URL: https://github.com/mariodeng/ PMID:28062517
Yoon, Jun-Hee; Kim, Thomas W; Mendez, Pedro; Jablons, David M; Kim, Il-Jin
2017-01-01
The development of next-generation sequencing (NGS) technology makes it possible to sequence whole exomes or genomes. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analyses, which translates into a delay and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for an automatic and easy-to-use NGS data analysis system. We developed a comprehensive, automatic genetic analysis controller named Mobile Genome Express (MGE) that works on smartphones and other mobile devices. MGE can handle all the steps of genetic analysis, such as sample information submission, sequencing run quality check from the sequencer, secured data transfer, and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel, and the whole analysis was managed by MGE and its data reviewing program called ELECTRO. All steps were processed automatically except for the final sequencing review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours. We confirmed that the mutations we identified were consistent with our previous results obtained by using multi-step, manual pipelines.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
ERIC Educational Resources Information Center
Dixon, John; Girifalco, Tony; Yakabosky, Walt
2008-01-01
This article describes the Applied Engineering Technology (AET) Career and Educational Pathways Program, which helps local manufacturers find quality workers. The program features 32 high schools, three community colleges, and 10 four-year institutions offering an integrated regional system of applied engineering education. The goal is to enroll…
Program for At-Risk Students Helps College, Too
ERIC Educational Resources Information Center
Carlson, Scott
2012-01-01
The author introduces a new program that brings city kids who really need college to a private rural campus that really needs kids. Under the program, called Pipelines Into Partnership, a handful of urban high schools and community organizations--the groups that know their kids beyond the black and white of their transcripts--determine which…
18 CFR 357.5 - Cash management programs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Cash management...: CARRIERS SUBJECT TO PART I OF THE INTERSTATE COMMERCE ACT § 357.5 Cash management programs. Oil pipeline... and § 357.2 of this title that participate in cash management programs must file these agreements with...
18 CFR 357.5 - Cash management programs.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Cash management...: CARRIERS SUBJECT TO PART I OF THE INTERSTATE COMMERCE ACT § 357.5 Cash management programs. Oil pipeline... and § 357.2 of this title that participate in cash management programs must file these agreements with...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Plan for the American Burying Beetle for Pipelines and Well Field Development in Oklahoma and Texas..., operation, and repair of oil and gas pipelines, and related well field activities. Individual oil and gas... pipelines and related well field activities, and will include measures necessary to minimize and mitigate...
Nearing, Kathryn A; Hunt, Cerise; Presley, Jessica H; Nuechterlein, Bridget M; Moss, Marc; Manson, Spero M
2015-10-01
This paper is the first in a five-part series on the clinical and translational science educational pipeline and presents strategies to support recruitment and retention to create diverse pathways into clinical and translational research (CTR). The strategies address multiple levels or contexts of persistence decisions and include: (1) creating a seamless pipeline by forming strategic partnerships to achieve continuity of support for scholars and collective impact; (2) providing meaningful research opportunities to support identity formation as a scientist and sustain motivation to pursue and persist in CTR careers; (3) fostering an environment for effective mentorship and peer support to promote academic and social integration; (4) advocating for institutional policies to alleviate environmental pull factors; and, (5) supporting program evaluation-particularly, the examination of longitudinal outcomes. By combining institutional policies that promote a culture and climate for diversity with quality, evidence-based programs and integrated networks of support, we can create the environment necessary for diverse scholars to progress successfully and efficiently through the pipeline to achieve National Institutes of Health's vision of a robust CTR workforce. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Frust, Tobias; Wagner, Michael; Stephan, Jan; Juckeland, Guido; Bieberle, André
2017-10-01
Ultrafast X-ray tomography is an advanced imaging technique for the study of dynamic processes based on the principles of electron beam scanning. A typical application of this technique is the study of multiphase flows, that is, flows of mixtures of substances such as gas-liquid flows in pipelines or chemical reactors. At Helmholtz-Zentrum Dresden-Rossendorf (HZDR) a number of such tomography scanners are operated. Currently, two main points limit their application in some fields. First, after each CT scan sequence the data of the radiation detector must be downloaded from the scanner to a data processing machine. Second, the current data processing is time-consuming compared to the CT scan sequence interval. To enable online observations or use this technique to control actuators in real time, a modular and scalable data processing tool has been developed, consisting of user-definable stages working together in a so-called data processing pipeline, that keeps up with the CT scanner's maximal frame rate of up to 8 kHz. The newly developed data processing stages are freely programmable and combinable. In order to achieve the highest processing performance, all relevant data processing steps required for a standard slice image reconstruction were individually implemented in separate stages using Graphics Processing Units (GPUs) and NVIDIA's CUDA programming language. Data processing performance tests on different high-end GPUs (Tesla K20c, GeForce GTX 1080, Tesla P100) showed excellent performance. Program Files doi:http://dx.doi.org/10.17632/65sx747rvm.1 Licensing provisions: LGPLv3 Programming language: C++/CUDA Supplementary material: Test data set, used for the performance analysis. Nature of problem: Ultrafast computed tomography is performed with a scan rate of up to 8 kHz. To obtain cross-sectional images from projection data, computer-based image reconstruction algorithms must be applied.
The objective of the presented program is to reconstruct a data stream of around 1.3 GB s-1 in a minimum time period. Thus, the program opens up new fields of application and allows the future use of even more compute-intensive algorithms, especially for data post-processing, to improve the quality of data analysis. Solution method: The program solves the given problem using a two-step process: first, by a generic, expandable and widely applicable template library implementing the streaming paradigm (GLADOS); second, by optimized processing stages for ultrafast computed tomography implementing the required algorithms in a performance-oriented way using CUDA (RISA). Thereby, task parallelism between the processing stages as well as data parallelism within one processing stage is realized.
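The streaming paradigm of independent, user-definable stages connected into a pipeline can be sketched in Python. This is a minimal analogue of the GLADOS/RISA design, not the actual C++/CUDA implementation; the stage functions and the example "frames" are purely illustrative.

```python
import threading
import queue

SENTINEL = object()  # marks the end of the data stream

class Stage(threading.Thread):
    """One pipeline stage: applies `fn` to every item flowing through it."""
    def __init__(self, fn, inbox, outbox):
        super().__init__()
        self.fn, self.inbox, self.outbox = fn, inbox, outbox

    def run(self):
        while True:
            item = self.inbox.get()
            if item is SENTINEL:           # propagate shutdown downstream
                self.outbox.put(SENTINEL)
                return
            self.outbox.put(self.fn(item))

def run_pipeline(fns, items):
    """Chain stages with queues; each stage runs in its own thread."""
    queues = [queue.Queue() for _ in range(len(fns) + 1)]
    stages = [Stage(fn, queues[i], queues[i + 1]) for i, fn in enumerate(fns)]
    for s in stages:
        s.start()
    for item in items:                      # feed the stream
        queues[0].put(item)
    queues[0].put(SENTINEL)
    results = []
    while True:                             # drain the last queue
        out = queues[-1].get()
        if out is SENTINEL:
            break
        results.append(out)
    for s in stages:
        s.join()
    return results

# Mock "detector frames" flow through a normalization and a reconstruction stage.
frames = [[2, 4], [6, 8]]
result = run_pipeline(
    [lambda f: [x / 2 for x in f],   # stage 1: normalize
     lambda f: sum(f)],              # stage 2: stand-in for reconstruction
    frames,
)
print(result)  # -> [3.0, 7.0]
```

Because every stage runs concurrently, stage 2 can process frame *n* while stage 1 is already working on frame *n+1*, which is the property that lets the real system keep up with the scanner's frame rate.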
NASA Astrophysics Data System (ADS)
Artana, K. B.; Pitana, T.; Dinariyana, D. P.; Ariana, M.; Kristianto, D.; Pratiwi, E.
2018-06-01
The aim of this research is to develop an algorithm and application that can perform real-time monitoring of the safety operation of offshore platforms and subsea gas pipelines as well as determine the need for ship inspection using data obtained from the automatic identification system (AIS). The research also focuses on the integration of the shipping database, AIS data, and others to develop a prototype for designing a real-time monitoring system of offshore platforms and pipelines. A simple concept is used in the development of this prototype, which is achieved by using an overlaying map that outlines the coordinates of the offshore platform and subsea gas pipeline with the ship's coordinates (longitude/latitude) as detected by AIS. Using such information, we can then build an early warning system (EWS) relayed through short message service (SMS), email, or other means when the ship enters the restricted and exclusion zones of platforms and pipelines. The ship inspection system is developed by combining several attributes. Decision analysis software is employed to prioritize the vessel's four attributes, namely ship age, ship type, classification, and flag state. Results show that the EWS can increase the safety level of offshore platforms and pipelines, as well as the efficient use of patrol boats in monitoring the safety of the facilities. Meanwhile, ship inspection enables the port to prioritize ships for inspection in accordance with the priority-ranking inspection score.
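The core EWS check described above can be sketched as a distance test between an AIS-reported position and a facility. The abstract does not give the actual algorithm, so this is an assumed logic: the haversine formula, the 500 m radius, the function names, and the coordinates are all hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def check_exclusion(ship_pos, facility_pos, radius_m=500):
    """Flag an alert when an AIS-reported ship is inside the exclusion zone."""
    d = haversine_m(*ship_pos, *facility_pos)
    return {"distance_m": round(d), "alert": d <= radius_m}

# Hypothetical AIS fix near a platform (coordinates are illustrative only).
platform = (-5.5500, 112.7000)
ship = (-5.5520, 112.7010)
status = check_exclusion(ship, platform)
print(status)
```

In a real EWS this check would run on every incoming AIS message, with the SMS/email relay triggered whenever `alert` becomes true.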
Numerical Modeling of Mechanical Behavior for Buried Steel Pipelines Crossing Subsidence Strata
Han, C. J.
2015-01-01
This paper addresses the mechanical behavior of buried steel pipelines crossing subsidence strata. The investigation is based on numerical simulation of the nonlinear response of the pipeline-soil system through the finite element method, considering large strain and displacement, inelastic material behavior of the buried pipeline and the surrounding soil, as well as contact and friction on the pipeline-soil interface. Effects of key parameters on the mechanical behavior of the buried pipeline were investigated, such as strata subsidence, diameter-thickness ratio, buried depth, internal pressure, friction coefficient and soil properties. The results show that the maximum strain appears on the outer transition subsidence section of the pipeline, and its cross section is concave shaped. As strata subsidence and the diameter-thickness ratio increase, the out-of-roundness, longitudinal strain and equivalent plastic strain increase gradually. As buried depth increases, the deflection, out-of-roundness and strain of the pipeline decrease. Internal pressure and friction coefficient have little effect on the deflection of the buried pipeline. With increasing internal pressure, out-of-roundness is reduced and strain increases gradually. The physical properties of the soil have a great influence on the mechanical behavior of the buried pipeline. The results from the present study can be used for the development of optimization design and preventive maintenance for buried steel pipelines. PMID:26103460
U.S. pipeline industry enters new era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnsen, M.R.
1999-11-01
The largest construction project in North America this year and next--the Alliance Pipeline--marks some advances for the US pipeline industry. With the Alliance Pipeline system (Alliance), mechanized welding and ultrasonic testing are making their debuts in the US as primary mainline construction techniques. Particularly in Canada and Europe, mechanized welding technology has been used for both onshore and offshore pipeline construction for at least 15 years. However, it has never before been used to build a cross-country pipeline in the US, although it has been tested on short segments. This time, however, an accelerated construction schedule, among other reasons, necessitated the use of mechanized gas metal arc welding (GMAW). The $3-billion pipeline will deliver natural gas from northwestern British Columbia and northeastern Alberta in Canada to a hub near Chicago, Ill., where it will connect to the North American pipeline grid. Once the pipeline is completed and buried, crews will return the topsoil. Corn and other crops will reclaim the land. While the casual passerby probably won't know the Alliance pipeline is there, it may have a far-reaching effect on the way mainline pipelines are built in the US. For even though mechanized welding and ultrasonic testing are being used for the first time in the United States on this project, some US workers had already gained experience with the technology on projects elsewhere. And work on this pipeline has certainly developed a much larger pool of experienced workers for industry to draw from. The Alliance project could well signal the start of a new era in US pipeline construction.
NASA Astrophysics Data System (ADS)
Hafich, K. A.; Hannigan, M.; Martens, W.; McDonald, J. E.; Knight, D.; Gardiner, L. S.; Collier, A. M.; Fletcher, H.; Polmear, M.
2015-12-01
Hydraulic fracturing is a highly contentious issue, and trusted sources of information about the impacts and benefits are difficult to find. Scientific research is making strides to catch up with rapidly expanding unconventional oil and gas development, in part, to meet the need for information for policy, regulation, and public interest. A leader in hydraulic fracturing research, the AirWaterGas Sustainability Research Network is a multi-institution, multi-disciplinary team of researchers working to understand the environmental, economic, and social tradeoffs of oil and gas development. AirWaterGas recently restructured and implemented its education and outreach program around a partnership with the CU-Boulder Office for Outreach and Engagement that leverages existing campus infrastructure, networks, and expertise to disseminate research results and engage the public. The education and outreach team is working with formal and informal K-12 educators through several programs: a yearlong teacher professional development program, a rural classroom air quality monitoring program, and a community partnership grant program. Each program brings together scientists and educators in different environments such as the classroom, online learning, in-person workshops, and community lectures. We will present best practices for developing and implementing a viable outreach and education program through building and fostering mutually beneficial partnerships that bridge the gap between scientists and the public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Zimmerman, R.E.; Hayes, D.
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted over the period of August 12-13, 1991, at the Bayou Grand Cane crossing in De Soto Parish, Louisiana, where a pipeline constructed three years prior to the survey crosses the bayou through mature bottomland hardwoods. The site was not seeded or fertilized after construction activities. At the time of sampling, a dense herb stratum (composed of mostly native species) covered the 20-m-wide ROW, except within drainage channels. As a result of the creation of the ROW, new habitat was created, plant diversity increased, and forest habitat became fragmented. The ROW must be maintained at an early stage of succession to allow access to the pipeline; however, impacts to the wetland were minimized by decreasing the width of the ROW to 20 m and recreating the drainage channels across the ROW. The canopy trees on the ROW's edge shaded part of the ROW, which helped to minimize the effects of the ROW.
Study of stress-strain state of pipeline under permafrost conditions
NASA Astrophysics Data System (ADS)
Tarasenko, A. A.; Redutinskiy, M. N.; Chepur, P. V.; Gruchenkova, A. A.
2018-05-01
In this paper, the dependences of the stress-strain state (SSS) and subsidence of pipelines on the dimensions of the subsidence zone are obtained for the pipe sizes that have become most widespread in the construction of main oil pipelines (530x10, 820x12, 1020x12, 1020x14, 1020x16, 1220x14, 1220x16, 1220x18 mm). True values of stresses in the pipeline wall, as well as the exact location of maximum stresses for subsidence zones from 5 to 60 meters, are determined. For this purpose, the authors developed a finite element model of the pipeline that takes into account the actual interaction of the pipeline with the subgrade and allows calculating the SSS of the structure for a variable subsidence zone. Based on the obtained dependences for the underground laying of oil pipelines in permafrost areas, it is proposed to artificially limit the zone of possible subsidence by placing separation supports on soil with better building properties and physical-mechanical parameters. This technical solution would significantly reduce costs when constructing new oil pipelines in permafrost areas.
Urban Underground Pipelines Mapping Using Ground Penetrating Radar
NASA Astrophysics Data System (ADS)
Jaw, S. W.; M, Hashim
2014-02-01
Underground space is now being exploited for transportation, utilities, and public usage, and the underground has become a spider's web of utility networks. Mapping underground utility pipelines has therefore become a challenging and difficult task; in practice it is often a "hit-and-miss" affair that results in many catastrophic damages, particularly in urban areas. Therefore, this study was conducted to extract locational information on urban underground utility pipelines using a trenchless measuring tool, namely ground penetrating radar (GPR). The focus of this study was to conduct underground utility pipeline mapping for retrieval of the geometric properties of the pipelines using GPR. In doing this, a series of tests was first conducted at a preferred test site and in a real-life experiment, followed by field-based modeling using the Finite-Difference Time-Domain (FDTD) method. Results provide the locational information of underground utility pipelines together with its mapping accuracy. This locational information of the underground utility pipelines is beneficial to civil infrastructure management and maintenance, which in the long term is time-saving and critically important for the development of metropolitan areas.
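Although the paper's own processing chain is not detailed in the abstract, the standard GPR relation it relies on for pipe depth can be illustrated: the wave velocity in the ground is v = c / sqrt(εr), and reflector depth is v·t/2 for a two-way travel time t. The permittivity and travel time below are assumed example values, not data from the study.

```python
import math

C = 0.2998  # speed of light in m/ns

def gpr_depth_m(two_way_time_ns, rel_permittivity):
    """Depth of a reflector from GPR two-way travel time.

    Uses the standard GPR relations: v = c / sqrt(eps_r), depth = v * t / 2.
    """
    v = C / math.sqrt(rel_permittivity)  # wave velocity in the ground, m/ns
    return v * two_way_time_ns / 2

# Illustrative values: a pipe echo at 20 ns in soil with assumed eps_r = 9.
depth = gpr_depth_m(20.0, 9.0)
print(f"{depth:.2f} m")  # ~1.00 m
```

The strong dependence on εr is why site calibration (or FDTD modeling, as in the study) matters for mapping accuracy.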
Research on prognostics and health management of underground pipeline
NASA Astrophysics Data System (ADS)
Zhang, Guangdi; Yang, Meng; Yang, Fan; Ni, Na
2018-04-01
With the development of the city, the construction of underground pipelines is more and more complex. These pipelines bear on the safety and normal operation of the city and are known as "the lifeline of the city". This paper first introduces the principles of PHM (Prognostics and Health Management) technology, then proposes a fault diagnosis, prognostics, and health management scheme for underground pipelines: faults appearing in the operation of the underground pipeline are diagnosed and predicted, and a health assessment of the whole underground pipe network is then made in order to ensure that the pipeline operates safely. Finally, future research directions are summarized and discussed.
78 FR 43263 - Paperless Hazard Communications Pilot Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
... Research Division (PHH-23), Pipeline and Hazardous Materials Safety Administration, 1200 New Jersey Avenue... materials by air, highway, rail, and water) to test the feasibility and then evaluate both the feasibility... times and locations.'' On September 12, 2001, the Research and Special Programs Administration (the...
Historical analysis of US pipeline accidents triggered by natural hazards
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2015-04-01
Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records.
This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
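The automated data-mining step described above can be sketched as keyword-based triage of incident narratives ahead of peer review. The actual classification scheme used with the PHMSA data is not given in the source; the hazard categories and keyword lists below are illustrative assumptions.

```python
# Illustrative natural-hazard categories and trigger keywords (assumed, not
# the actual scheme used in the study).
HAZARD_KEYWORDS = {
    "flood": ("flood", "floodwater", "heavy rain", "river"),
    "earthquake": ("earthquake", "seismic"),
    "landslide": ("landslide", "slope failure", "ground movement"),
    "lightning": ("lightning",),
}

def classify_incident(narrative):
    """Return the set of natural-hazard categories suggested by a narrative."""
    text = narrative.lower()
    return {cat for cat, words in HAZARD_KEYWORDS.items()
            if any(w in text for w in words)}

report = ("Flooding of the river undermined the pipeline; "
          "the released product ignited.")
print(classify_incident(report))  # -> {'flood'}
```

In the study's workflow, records flagged this way would then go to the peer-review stage to confirm whether each candidate is a genuine Natech event.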
Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard
NASA Astrophysics Data System (ADS)
Voronin, K. S.
2016-10-01
Main gas pipelines in operation are subject to permanent pressure drops, which lead to their lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under the influence of dynamic loads caused by pressure surges, and the process of its geometric shape deformation. The frequency of vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can be the basis for the development of a method of monitoring the technical condition of the gas pipeline, and forecasting possible emergency situations allows planning and carrying out timely reconstruction works on sections of the gas pipeline with a possible deviation from the design position.
NASA Astrophysics Data System (ADS)
Russell, Melody L.; Atwater, Mary M.
2005-08-01
This study focuses on 11 African American undergraduate seniors in a biology degree program at a predominantly white research institution in the southeastern United States. These 11 respondents shared their journeys throughout the high school and college science pipeline. Participants described similar precollege factors and experiences that contributed to their academic success and persistence at a predominantly white institution. One of the most critical factors in their academic persistence was participation in advanced science and mathematics courses as part of their high school college preparatory program. Additional factors that had a significant impact on their persistence and academic success were family support, teacher encouragement, intrinsic motivation, and perseverance.
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in modern experimental high-energy physics, and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to a hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single-bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
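The idea of simulating a pipelined algorithm's cycle-level behavior in software before committing to HDL can be sketched with a toy example. This is a Python analogue for illustration only, not the paper's C++ framework; the 2-stage adder and its latency are assumed.

```python
class PipelinedAdder:
    """Cycle-accurate sketch of a 2-stage pipelined adder.

    Stage 1 latches the operands; stage 2 produces the sum. Inputs presented
    at cycle t appear at the output at cycle t + 2, mimicking how a hardware
    pipeline is simulated down to the register level.
    """
    LATENCY = 2

    def __init__(self):
        self.stage1 = None  # register after the input stage
        self.stage2 = None  # register feeding the output

    def clock(self, a=None, b=None):
        """Advance one clock cycle; return the output register's value."""
        out = self.stage2
        self.stage2 = None if self.stage1 is None else sum(self.stage1)
        self.stage1 = None if a is None else (a, b)
        return out

adder = PipelinedAdder()
inputs = [(1, 2), (3, 4), (5, 6)]
outputs = []
# Clock long enough to flush the pipeline after the last input.
for cycle in range(len(inputs) + PipelinedAdder.LATENCY):
    a, b = inputs[cycle] if cycle < len(inputs) else (None, None)
    outputs.append(adder.clock(a, b))
print(outputs)  # -> [None, None, 3, 7, 11]
```

A framework like the one in the paper automates exactly this kind of bit- and cycle-accurate bookkeeping while also emitting the corresponding HDL, so the software model and the firmware stay in sync.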
NASA Astrophysics Data System (ADS)
Ding, Wenhua; Li, Shaopo; Li, Jiading; Li, Qun; Chen, Tieqiang; Zhang, Hai
In recent years, several significant pipeline projects have been developed for the transmission of oil and gas from deep-water environments. Gas transmission pipelines for such applications demand heavy wall thickness, high strength, good low-temperature toughness, and good weldability. To overcome the difficulty of producing consistent mechanical properties in heavy-wall pipe, research was conducted by Shougang Steel Research in cooperation with the Shougang Qinhuangdao (Shouqin) 4.3 m heavy wide-plate mill in Qinhuangdao, China.
Pipeline enhances Norman Wells potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Approval of an oil pipeline from halfway down Canada's MacKenzie River Valley at Norman Wells to N. Alberta has raised the potential for development of large reserves along with controversy over native claims. The project involves 2 closely related proposals. One, by Esso Resources, the exploration and production unit of Imperial Oil, will increase oil production from the Norman Wells field from 3000 bpd currently to 25,000 bpd. The other proposal, by Interprovincial Pipeline (N.W.) Ltd., calls for construction of an underground pipeline to transport the additional production from Norman Wells to Alberta. The 560-mile, 12-in. pipeline will extend from Norman Wells, which is 90 miles south of the Arctic Circle on the north shore of the Mackenzie River, south to the end of an existing line at Zama in N. Alberta. There will be 3 pumping stations en route. This work also discusses recovery, potential, drilling limitations, the processing plant, positive impact, and further development of the Norman Wells project.
Fincher, Ruth-Marie E; Sykes-Brown, Wilma; Allen-Noble, Rosie
2002-07-01
The objective of the Health Professions Partnership Initiative is to increase the number of underrepresented minority Georgia residents who become health care professionals by (1) creating a pipeline of well-qualified high school and college students interested in health care careers, (2) increasing the number of well-qualified applicants to medical and other health professions schools, and (3) increasing the number of underrepresented minority students at the Medical College of Georgia (MCG). The Health Professions Partnership Initiative at MCG was created in 1996 by collaboration among the MCG Schools of Medicine and Nursing, two Augusta high schools attended primarily by underrepresented minority students, three historically black colleges and universities, the Fort Discovery National Science Center of Augusta, community service organizations, and MCG student organizations. The project was funded by the Association of American Medical Colleges and The Robert Wood Johnson Foundation. The high school component, the Health Science Learning Academy (HSLA), was designed to strengthen the students' educational backgrounds and interest in professional careers as evidenced by increased standardized test scores and numbers of students entering college and health professions schools. Additional goals included a system to track students' progress throughout the pipeline as well as professional development sessions to enrich faculty members' knowledge and enhance their teaching expertise. The HSLA began with ninth-grade students from the two high schools. During its second year, funding from the Health 1st Foundation allowed inclusion of another high school and expansion to ninth grade through twelfth grade. 
The HSLA's enrichment classes meet for three hours on 18 Saturday mornings during the academic year and include computer-interactive SAT preparation and English composition (tenth grade); biology, algebra, calculus, and English composition (eleventh grade); and advanced mathematics and biology (twelfth grade). The ultimate solution to the paucity of underrepresented minority physicians resides largely in successful pipeline programs that expand the pool of well-qualified applicants, matriculants, and graduates from medical schools. Intermediate results of the HSLA support the success of the program. Since its creation in the 1996-1997 academic year, 203 students have participated in the HSLA and all 38 (from the original two schools) who completed the four-year program have enrolled in college. The mean SAT score for students who completed the HSLA program was 1,066, compared with a mean of 923 for all college-bound students in the participating schools. The mean increases in SAT scores for students who completed the four-year program were 0.5% (1,100 to 1,105) for students attending a magnet high school and 18% (929 to 1,130) for students attending the comprehensive high school. The mean overall increases in SAT scores for students in the two high schools were 1% (1,044 to 1,048) and 9.1% (765 to 834), respectively. The HSLA is accomplishing its goals and, while it is too early to know if these students will participate in MCAT preparatory programs and apply to medical and other health professions schools, their sustained commitment and enthusiasm bode well for continued success.
Using steady-state equations for transient flow calculation in natural gas pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddox, R.N.; Zhou, P.
1984-04-02
Maddox and Zhou have extended their technique for calculating the unsteady-state behavior of straight gas pipelines to complex pipeline systems and networks. After developing the steady-state flow rate and pressure profile for each pipe in the network, analysts can perform the transient-state analysis in the real-time step-wise manner described for this technique.
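The general idea of treating each time step as a steady-state problem and marching forward in real time can be sketched for a single pipe. The code below is not Maddox and Zhou's procedure; it uses a hypothetical Weymouth-type flow relation and an invented lumped line-pack model purely to show the step-wise structure.

```python
import math

def steady_flow(p_up, p_dn, C=1.0e-4):
    """Steady-state gas flow relation (Weymouth-type): Q = C*sqrt(p1^2 - p2^2)."""
    return C * math.sqrt(max(p_up ** 2 - p_dn ** 2, 0.0))

# Step-wise transient sketch: within each time step the flow is taken from
# the steady-state relation, then a lumped line-pack pressure is updated
# from the mass imbalance. All parameters are invented for illustration.
p_in = 6.0e6     # fixed inlet pressure, Pa
p_line = 5.5e6   # lumped line-pack pressure, Pa
demand = 8.0     # outlet withdrawal, kg/s
k = 2.0          # toy constant: Pa of line-pack pressure per kg stored
dt = 60.0        # time step, s
for _ in range(100):
    q_in = steady_flow(p_in, p_line)
    p_line += k * (q_in - demand) * dt
# After enough steps the inflow settles at the demand: the transient has decayed.
```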
A Critique of the STEM Pipeline: Young People's Identities in Sweden and Science Education Policy
ERIC Educational Resources Information Center
Mendick, Heather; Berge, Maria; Danielsson, Anna
2017-01-01
In this article, we develop critiques of the pipeline model which dominates Western science education policy, using discourse analysis of interviews with two Swedish young women focused on "identity work". We argue that it is important to unpack the ways that the pipeline model fails to engage with intersections of gender, ethnicity,…
The Minimal Preprocessing Pipelines for the Human Connectome Project
Glasser, Matthew F.; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark
2013-01-01
The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinates spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP’s acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements for the pipelines. PMID:23668970
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction, producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules, that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
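The harness-and-modules architecture described above (a controller running an ordered set of processing modules, all parametrized by a shared configuration) is a common pipeline pattern. A minimal Python sketch, with hypothetical module names and configuration keys, might look like this:

```python
class Module:
    """One processing step, named and bound to a function of (data, config)."""
    def __init__(self, name, func):
        self.name = name
        self.func = func

class Harness:
    """Toy harness driving registered modules in order, each parametrized
    by a shared configuration (loosely inspired by the OPERA design;
    the module names and config keys below are hypothetical)."""
    def __init__(self, config):
        self.config = config
        self.modules = []

    def register(self, name, func):
        self.modules.append(Module(name, func))

    def run(self, data):
        for m in self.modules:
            data = m.func(data, self.config)
        return data

h = Harness({"gain": 2.0, "bias": 1.0})
h.register("calibrate", lambda d, cfg: [x * cfg["gain"] for x in d])
h.register("extract", lambda d, cfg: [x + cfg["bias"] for x in d])
result = h.run([1.0, 2.0])
# result == [3.0, 5.0]: each module transformed the data in turn.
```

The value of the pattern is that users can extend the pipeline by registering new modules without touching the harness, which matches the open-ended extensibility goal stated in the abstract.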
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of the ANN, while "the number of monthly breaks" was used for the prediction output. This study is the first attempt to incorporate earthquake data in a break-event ANN prediction model. The spatial distribution of the pipeline break-event data was analyzed and visualized by GIS, allowing users to swiftly identify leakage hotspots. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to the traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where break-event records are unavailable.
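The prediction setup, three input factors feeding a neural network that outputs expected monthly breaks, can be sketched as a one-hidden-layer forward pass. The weights below are invented for illustration; a real model would be trained on the break-event records described in the abstract.

```python
import numpy as np

def mlp_predict(x, W1, b1, W2, b2):
    """One-hidden-layer network: sigmoid hidden units, linear output."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    return float(W2 @ h + b2)

# Hypothetical weights for illustration only. Input vector:
# [pipe diameter (mm), material code, monthly magnitude-3(+) quake count].
W1 = np.array([[0.01, 0.5, 0.8],
               [0.02, -0.3, 1.1]])
b1 = np.array([-1.0, -0.5])
W2 = np.array([1.5, 2.0])
b2 = 0.1

quiet = mlp_predict(np.array([100.0, 1.0, 0.0]), W1, b1, W2, b2)  # no quakes
shaky = mlp_predict(np.array([100.0, 1.0, 5.0]), W1, b1, W2, b2)  # five quakes
# With these (positive) earthquake weights, the predicted monthly break
# count rises with seismic activity: shaky > quiet.
```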
Assessing the Impact of a Research-Based STEM Program on STEM Majors' Attitudes and Beliefs
ERIC Educational Resources Information Center
Huziak-Clark, Tracy; Sondergeld, Toni; Staaden, Moira; Knaggs, Christine; Bullerjahn, Anne
2015-01-01
The Science, Engineering, and Technology Gateway of Ohio (SETGO) program has a three-pronged approach to meeting the needs at different levels of students in the science, technology, engineering, and mathematics (STEM) pipeline. The SETGO program was an extensive collaboration between a two-year community college and a nearby four-year…
The ABCs of the US Broad Spectrum Antimicrobials Program: Antibiotics, Biosecurity, and Congress
2015-01-01
Antibiotic resistance has been increasing at an alarming rate in the United States and globally for decades, but the problem has only recently gained broad attention at the highest levels of the US government. More and more patients are dying of infections that do not respond to antibiotics that are currently available. Meanwhile, the antibacterial product pipeline remains fragile in part because of a lack of commercial interest from pharmaceutical companies. The Biomedical Advanced Research and Development Authority (BARDA) Broad Spectrum Antimicrobials (BSA) program leads the US government's effort to bridge this gap by advancing new antibacterials through late stages of clinical development. Other commentators have described in detail BARDA's structure, process, and role in antibacterial development. This commentary offers a public policy perspective on the emerging politics of antibiotic resistance in the context of US biosecurity politics and medical countermeasure (MCM) development. It identifies promising developments and difficult challenges that together will ultimately determine whether BARDA can become a global leader for antibiotic development. PMID:26569379
ERIC Educational Resources Information Center
Abdul-Alim, Jamaal
2012-01-01
This article features the Ronald E. McNair Postbaccalaureate Achievement Program at the University of Memphis. The McNair program is named after Ronald E. McNair, the second African-American in space, who died in the Space Shuttle Challenger explosion in 1986. Approximately 200 campuses across the nation host the program. Whereas the program…
ERIC Educational Resources Information Center
Hitt, Dallas Hambrick; Tucker, Pamela D.; Young, Michelle D.
2012-01-01
The professional pipeline represents a developmental perspective for fostering leadership capacity in schools and districts, from identification of potential talent during the recruitment phase to ensuring career-long learning through professional development. An intentional and mindful approach to supporting the development of educational leaders…
Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline
NASA Astrophysics Data System (ADS)
Ruiz, M.; Mujica, L. E.; Quintero, M.; Florez, J.; Quintero, S.
2015-07-01
Safety and reliability of hydrocarbon transportation pipelines are critical for the oil and gas industry. Pipeline failures caused by corrosion, external agents, and other factors can develop into leaks or even rupture, with negative impacts on population, the natural environment, infrastructure, and the economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose its integrity. Accordingly, over the last few years, different techniques under the concept of structural health monitoring (SHM) have been under continuous development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Component Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls; sensors record the leakage field in segments with loss of metal caused by cracking, corrosion, and similar damage. The data come from a gas pipeline with approximately 15 years of operation, a diameter of 20 inches, and a total length of 110 km (with several changes in topography). PCA, in turn, is a well-known technique that compresses the information and extracts the most relevant features, facilitating the detection of damage in a variety of structures. The goal of this work is to detect and localize critical metal loss in a pipeline currently in operation.
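A common way PCA is used for damage detection in SHM, consistent with the compression-and-extraction role described above, is to build a principal subspace from baseline signals and flag measurements with a large reconstruction error (a Q-statistic). The sketch below uses synthetic data and illustrates only that generic approach, not the authors' exact processing chain.

```python
import numpy as np

rng = np.random.default_rng(1)
# Rows = pipeline segments, columns = MFL sensor channels (synthetic data).
baseline = rng.normal(0.0, 1.0, size=(200, 32))
mean = baseline.mean(axis=0)
X = baseline - mean

# PCA via SVD: retain the leading components of the healthy baseline.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:5].T  # 32 x 5 projection matrix onto the principal subspace

def reconstruction_error(x):
    """Q-statistic: signal energy outside the retained principal subspace.
    Large values flag segments that depart from the healthy baseline."""
    centered = x - mean
    residual = centered - P @ (P.T @ centered)
    return float(residual @ residual)

healthy = baseline[0]
defect = baseline[0] + 5.0  # simulated metal-loss signature (constant offset)
```

Ranking segments by this error statistic gives a simple detection-and-localization map along the pipeline.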
Modelling Time and Length Scales of Scour Around a Pipeline
NASA Astrophysics Data System (ADS)
Smith, H. D.; Foster, D. L.
2002-12-01
The scour and burial of submarine objects is an area of interest for engineers, oceanographers and military personnel. Given the limited availability of field observations, there exists a need to accurately describe the hydrodynamics and sediment response around an obstacle using numerical models. In this presentation, we will compare observations of submarine pipeline scour with model predictions. The research presented here uses the computational fluid dynamics (CFD) model FLOW-3D. FLOW-3D, developed by Flow Science in Santa Fe, NM, is a 3-dimensional finite-difference model that solves the Navier-Stokes and continuity equations. Using the Volume of Fluid (VOF) technique, FLOW-3D is able to resolve fluid-fluid and fluid-air interfaces. The FAVOR technique allows for complex geometry to be resolved with rectangular grids. FLOW-3D uses a bulk transport method to describe sediment transport, and feedback to the hydrodynamic solver is accomplished by morphology evolution and fluid viscosity due to sediment suspension. Previous investigations by the authors have shown FLOW-3D to predict well the hydrodynamics around five static scoured bed profiles and a stationary pipeline ("Modelling of Flow Around a Cylinder Over a Scoured Bed," submitted to the Journal of Waterway, Port, Coastal, and Ocean Engineering). Following experiments performed by Mao (1986, Dissertation, Technical University of Denmark), we will be performing model-data comparisons of length and time scales for scour around a pipeline. Preliminary investigations with LES and k-ɛ closure schemes have shown that the model predicts shorter time scales in scour hole development than those observed by Mao. Predicted time and length scales of scour hole development are shown to be a function of turbulence closure scheme, grain size, and hydrodynamic forcing. Subsequent investigations consider variable wave-current flow regimes and object burial.
This investigation will allow us to identify different regimes for the scour process based on dimensionless parameters such as the Reynolds number, the Keulegan-Carpenter number, and the sediment mobility number. This research is sponsored by the Office of Naval Research - Mine Burial Program.
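The dimensionless parameters named above can be computed directly. The sketch below uses common textbook definitions (Re = UD/ν, KC = U T/D, and a Shields-type mobility number θ = U²/((s−1)g d₅₀)); the flow and sediment values are hypothetical, chosen only to show the calculation.

```python
def scour_regime_numbers(U, T, D, d50, nu=1.0e-6, s=2.65, g=9.81):
    """Dimensionless parameters often used to classify pipeline scour:
    Reynolds number Re = U*D/nu,
    Keulegan-Carpenter number KC = U*T/D (wave orbital velocity U, period T),
    and a Shields-type sediment mobility number
    theta = U**2 / ((s - 1) * g * d50)."""
    return {
        "Re": U * D / nu,
        "KC": U * T / D,
        "theta": U ** 2 / ((s - 1.0) * g * d50),
    }

# Hypothetical wave-current conditions over a 0.5 m pipeline on fine sand.
nums = scour_regime_numbers(U=0.8, T=8.0, D=0.5, d50=0.0002)
```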
Interim report projects funded by EEC (European Economic Community) for offshore research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parrott, M.
1978-10-01
In Nov. 1973, the EEC first adopted the principle of subsidizing work in the offshore field to improve technological development activities; since 1973, the EEC has awarded three programs (for 1974, 1975, and 1977) for 95 projects. The most recent grants, in 1977, for 40 projects at a subsidy cost of $66.8 million are tabulated, showing project category, company, estimated investment, and project description. Project categories include pipelaying, pipeline transport, underwater storage, LNG storage and transport, etc. Under the grant system, the community contributes 40% to projects on exploration and production techniques, 35% for development of production equipment and machinery, and 25-30% for technological development projects for transport and storage of hydrocarbons. According to the EEC commission, 65 projects have been submitted for a fourth program for which the total amount provided in the budget is $43.8 million.
Pipeline transport and simultaneous saccharification of corn stover.
Kumar, Amit; Cameron, Jay B; Flynn, Peter C
2005-05-01
Pipeline transport of corn stover delivered by truck from the field is evaluated against a range of truck transport costs. Corn stover transported by pipeline at 20% solids concentration (wet basis) or higher could directly enter an ethanol fermentation plant, and hence the investment in the pipeline inlet end processing facilities displaces comparable investment in the plant. At 20% solids, pipeline transport of corn stover costs less than trucking at capacities in excess of 1.4 M dry tonnes/yr when compared to a mid range of truck transport cost (excluding any credit for economies of scale achieved in the ethanol fermentation plant from larger scale due to multiple pipelines). Pipelining of corn stover gives the opportunity to conduct simultaneous transport and saccharification (STS). If current enzymes are used, this would require elevated temperature. Heating of the slurry for STS, which in a fermentation plant is achieved from waste heat, is a significant cost element (more than 5 cents/l of ethanol) if done at the pipeline inlet unless waste heat is available, for example from an electric power plant located adjacent to the pipeline inlet. Heat loss in a 1.26 m pipeline carrying 2 M dry tonnes/yr is about 5 degrees C at a distance of 400 km in typical prairie clay soils, and would not likely require insulation; smaller pipelines or different soil conditions might require insulation for STS. Saccharification in the pipeline would reduce the need for investment in the fermentation plant, saving about 0.2 cents/l of ethanol. Transport of corn stover in multiple pipelines offers the opportunity to develop a large ethanol fermentation plant, avoiding some of the diseconomies of scale that arise from smaller plants whose capacities are limited by issues of truck congestion.
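The capacity argument above, that a pipeline carries a high fixed cost spread over throughput while truck costs scale roughly linearly, can be sketched with toy cost functions. All coefficients below are invented for illustration and are not the paper's cost data.

```python
def truck_cost_per_tonne(distance_km, fixed=5.0, rate=0.12):
    """Toy truck cost: loading/unloading charge plus a per-km haul rate."""
    return fixed + rate * distance_km

def pipeline_cost_per_tonne(distance_km, capacity_Mt, capital=40.0, rate=0.02):
    """Toy pipeline cost: a capital charge spread over annual throughput
    (the source of economies of scale) plus a small per-km operating cost."""
    return capital / capacity_Mt + rate * distance_km

d = 400.0  # haul distance, km
small = pipeline_cost_per_tonne(d, capacity_Mt=0.5)  # low-throughput pipeline
large = pipeline_cost_per_tonne(d, capacity_Mt=2.0)  # high-throughput pipeline
truck = truck_cost_per_tonne(d)
# large < truck < small: with these toy numbers, the pipeline beats
# trucking only above some break-even capacity, as the paper argues.
```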
Ognibene, Frederick P.; Gallin, John I.; Baum, Bruce J.; Wyatt, Richard G.; Gottesman, Michael M.
2017-01-01
Purpose: Clinician-scientists are considered an endangered species for many reasons, including challenges with establishing and maintaining a career pipeline. Career outcomes from year-long medical and dental students’ research enrichment programs have not been well determined. Therefore, the authors assessed career and research outcome data from a cohort of participants in the National Institutes of Health (NIH) Clinical Research Training Program (CRTP). Method: The CRTP provided a year-long mentored clinical or translational research opportunity for 340 medical and dental students. Of these, 135 completed their training, including fellowships, from 1997 to January 2014. Data for 130 of 135 were analyzed, including time conducting research, types of public funding (NIH grants), and publications from self-reported surveys that were verified via NIH RePORT and PubMed. Results: Nearly two-thirds (84 of 130) indicated that they were conducting research, and over half of the 84 (approximately one-third of the total cohort) spent more than 25% of time devoted to research. Of those 84, over 25% received grant support from the NIH, and those further in their careers published more scholarly manuscripts. Conclusions: Data suggest that the CRTP helped foster the careers of research-oriented medical and dental students as measured by time conducting research, successful competition for federal funding, and the publication of their research. Longer follow-up is warranted to assess the impact of these mentored research experiences. Investments in mentored research programs for health professional students are invaluable to support the dwindling pipeline of biomedical researchers and clinician-scientists. PMID:27224296
Ognibene, Frederick P; Gallin, John I; Baum, Bruce J; Wyatt, Richard G; Gottesman, Michael M
2016-12-01
Clinician-scientists are considered an endangered species for many reasons, including challenges with establishing and maintaining a career pipeline. Career outcomes from yearlong medical and dental students' research enrichment programs have not been well determined. Therefore, the authors assessed career and research outcome data from a cohort of participants in the National Institutes of Health (NIH) Clinical Research Training Program (CRTP). The CRTP provided a yearlong mentored clinical or translational research opportunity for 340 medical and dental students. Of these, 135 completed their training, including fellowships, from 1997 to January 2014. Data for 130 of 135 were analyzed: time conducting research, types of public funding (NIH grants), and publications from self-reported surveys that were verified via the NIH Research Portfolio Online Reporting Tools Web site and PubMed. Nearly two-thirds (84 of 130) indicated that they were conducting research, and over half of the 84 (approximately one-third of the total cohort) spent more than 25% of time conducting research. Of those 84, over 25% received grant support from the NIH, and those further in their careers published more scholarly manuscripts. Data suggest that the CRTP helped foster the careers of research-oriented medical and dental students as measured by time conducting research, successful competition for federal funding, and the publication of their research. Longer follow-up is warranted to assess the impact of these mentored research experiences. Investments in mentored research programs for health professional students are invaluable to support the dwindling pipeline of biomedical researchers and clinician-scientists.
75 FR 15613 - Hazardous Materials Transportation; Registration and Fee Assessment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 107 [Docket No. PHMSA-2009-0201 (HM-208H)] RIN 2137-AE47 Hazardous Materials Transportation... registration program are to gather information about the transportation of hazardous materials, and to fund the...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2012 CFR
2012-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2011 CFR
2011-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
Diversifying the STEM Pipeline: The Model Replication Institutions Program
ERIC Educational Resources Information Center
Cullinane, Jenna
2009-01-01
In 2006, the National Science Foundation (NSF) began funding the Model Replication Institutions (MRI) program, which sought to improve the quality, availability, and diversity of science, technology, engineering, and mathematics (STEM) education. Faced with pressing national priorities in the STEM fields and chronic gaps in postsecondary…
Impact of formulary restriction with prior authorization by an antimicrobial stewardship program
Reed, Erica E.; Stevenson, Kurt B.; West, Jessica E.; Bauer, Karri A.; Goff, Debra A.
2013-01-01
In an era of increasing antimicrobial resistance and few antimicrobials in the developmental pipeline, many institutions have developed antimicrobial stewardship programs (ASPs) to help implement evidence-based (EB) strategies for ensuring appropriate utilization of these agents. EB strategies for accomplishing this include formulary restriction with prior authorization. Potential limitations to this particular strategy include delays in therapy, prescriber pushback, and unintended increases in use of un-restricted antimicrobials; however, our ASP found that implementing prior authorization for select antimicrobials along with making a significant effort to educate clinicians on criteria for use ensured more appropriate prescribing of these agents, hopefully helping to preserve their utility for years to come. PMID:23154323
Impact of formulary restriction with prior authorization by an antimicrobial stewardship program.
Reed, Erica E; Stevenson, Kurt B; West, Jessica E; Bauer, Karri A; Goff, Debra A
2013-02-15
In an era of increasing antimicrobial resistance and few antimicrobials in the developmental pipeline, many institutions have developed antimicrobial stewardship programs (ASPs) to help implement evidence-based (EB) strategies for ensuring appropriate utilization of these agents. EB strategies for accomplishing this include formulary restriction with prior authorization. Potential limitations to this particular strategy include delays in therapy, prescriber pushback, and unintended increases in use of un-restricted antimicrobials; however, our ASP found that implementing prior authorization for select antimicrobials along with making a significant effort to educate clinicians on criteria for use ensured more appropriate prescribing of these agents, hopefully helping to preserve their utility for years to come.
NASA Astrophysics Data System (ADS)
Pierre, Jon Paul; Young, Michael H.; Wolaver, Brad D.; Andrews, John R.; Breton, Caroline L.
2017-11-01
Spatio-temporal trends in infrastructure footprints, energy production, and landscape alteration were assessed for the Eagle Ford Shale of Texas. The period of analysis was over four 2-year periods (2006-2014). Analyses used high-resolution imagery, as well as pipeline data to map EF infrastructure. Landscape conditions from 2006 were used as baseline. Results indicate that infrastructure footprints varied from 94.5 km2 in 2008 to 225.0 km2 in 2014. By 2014, decreased land-use intensities (ratio of land alteration to energy production) were noted play-wide. Core-area alteration by period was highest (3331.6 km2) in 2008 at the onset of play development, and increased from 582.3 to 3913.9 km2 by 2014, though substantial revegetation of localized core areas was observed throughout the study (i.e., alteration improved in some areas and worsened in others). Land-use intensity in the eastern portion of the play was consistently lower than that in the western portion, while core alteration remained relatively constant east to west. Land alteration from pipeline construction was 65 km2 for all time periods, except in 2010 when alteration was recorded at 47 km2. Percent of total alteration from well-pad construction increased from 27.3% in 2008 to 71.5% in 2014. The average number of wells per pad across all 27 counties increased from 1.15 to 1.7. This study presents a framework for mapping landscape alteration from oil and gas infrastructure development. However, the framework could be applied to other energy development programs, such as wind or solar fields, or any other regional infrastructure development program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
George C. Vradis
2003-07-01
This development program is a joint effort among the Northeast Gas Association (formerly New York Gas Group), Foster-Miller, Inc., and the US Department of Energy (DOE) through the National Energy Technology Laboratory (NETL). The DOE's contribution to this project is $572,525 out of a total of $772,525. The present report summarizes the accomplishments of the project during its third three-month period (from April 2003 through June 2003). The project was initiated with delay in February 2003 due to contractual issues that emerged between NGA and Foster-Miller, Inc. The two organizations are working diligently to maintain the program's pace and expect to complete it in time. The efforts of the project focused during this period on finalizing the assessment of the tether technology, which is intended to be used as the means of communication between robot and operator. Results indicate that the tether is a viable option under certain pipeline operating conditions, but not all. Concerns also exist regarding the abrasion resistance of the tether, this issue being the last studied. Substantial work was also conducted on the design of the robotic platform, which has progressed very well. Finally, work on the MFL sensor, able to negotiate all pipeline obstacles (including plug valves), was initiated by PII following the successful completion of the subcontract negotiations between Foster-Miller and PII. The sensor design is at this point the critical path in the project's timetable.
NASA Astrophysics Data System (ADS)
Dudin, S. M.; Novitskiy, D. V.
2018-05-01
Studies modeling heterogeneous medium flows in pipelines under laboratory conditions include the works of researchers at VNIIgaz, Giprovostokneft, Kuibyshev NIINP, the Grozny Petroleum Institute, and others. Taken together, the empirical relationships obtained and the calculation procedures for pipelines transporting multiphase products constitute a bank of experimental data on the problem of pipeline transportation of multiphase systems. Based on an analysis of the published works, the main design requirements for experimental installations intended to study the flow regimes of gas-liquid flows in pipelines were formulated and taken into account by the authors when creating the experimental stand. The article describes the results of experimental studies of the flow regimes of a gas-liquid mixture in a pipeline and gives a methodological description of the experimental installation. It also describes the software of the experimental scientific and educational stand developed with the participation of the authors.
Feng, Qingshan; Li, Rui; Nie, Baohua; Liu, Shucong; Zhao, Lianyu; Zhang, Hong
2016-01-01
Girth weld cracking is one of the main failure modes in oil and gas pipelines, so girth weld crack inspection has great economic and social significance for the intrinsic safety of pipelines. This paper introduces the typical girth weld defects of oil and gas pipelines and the common nondestructive testing methods, and systematically reviews progress in the technical principles, signal analysis, defect sizing methods, and inspection reliability of magnetic flux leakage (MFL) inspection, liquid ultrasonic inspection, electromagnetic acoustic transducer (EMAT) inspection, and remote field eddy current (RFEC) inspection for oil and gas pipeline girth weld defects. Additionally, it introduces the new technologies for composite ultrasonic, laser ultrasonic, and magnetostriction inspection, and provides a reference for the development and application of in-line inspection technology for oil and gas pipeline girth weld defects. PMID:28036016
Virtual Instrumentation Corrosion Controller for Natural Gas Pipelines
NASA Astrophysics Data System (ADS)
Gopalakrishnan, J.; Agnihotri, G.; Deshpande, D. M.
2012-12-01
Corrosion is an electrochemical process, and corrosion in natural gas (methane) pipelines leads to leakages. Corrosion occurs when an anode and a cathode are connected through an electrolyte. The rate of corrosion in a metallic pipeline can be controlled by impressing current onto it, thereby making it act as the cathode of the corrosion cell. A technologically advanced and energy-efficient corrosion controller is required to protect natural gas pipelines. The proposed virtual instrumentation (VI) based corrosion controller precisely controls external corrosion in underground metallic pipelines, enhances their life, and ensures safety. The design and development of a proportional-integral-derivative (PID) corrosion controller using VI (LabVIEW) is carried out. When the designed controller is deployed in the field, it maintains the pipe-to-soil potential (PSP) within the safe operating limit without entering the over- or under-protection zones. The technique can be horizontally deployed to protect any metallic structure needing corrosion protection, such as oil pipelines.
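The control loop described above can be sketched outside LabVIEW. The following minimal Python sketch drives a toy first-order plant so that the pipe-to-soil potential settles at a protection setpoint; the plant model, gain values, and potentials (volts vs. a Cu/CuSO4 reference) are illustrative assumptions, not the paper's implementation.

```python
# Illustrative discrete PID loop holding pipe-to-soil potential (PSP) at a
# setpoint. All gains, the plant model, and the potential values are
# hypothetical; the paper's controller is built in LabVIEW.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def simulate(steps=2000, dt=0.1):
    """Toy first-order plant: the PSP relaxes toward a native corrosion
    potential of -0.65 V unless the (signed) control drive pulls it toward
    the protection setpoint."""
    psp = -0.65          # native potential, V (assumed)
    setpoint = -0.95     # target inside the safe protection window (assumed)
    pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=dt)
    for _ in range(steps):
        drive = pid.update(setpoint, psp)
        # plant response: control drive shifts PSP, plant relaxes to -0.65 V
        psp += dt * (0.8 * drive - (psp + 0.65))
    return psp
```

With the integral term in the loop, the simulated PSP settles at the setpoint rather than drifting into the under-protection region.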
PRADA: pipeline for RNA sequencing data analysis.
Torres-García, Wandaliz; Zheng, Siyuan; Sivachenko, Andrey; Vegesna, Rahulsimham; Wang, Qianghu; Yao, Rong; Berger, Michael F; Weinstein, John N; Getz, Gad; Verhaak, Roel G W
2014-08-01
Technological advances in high-throughput sequencing necessitate improved computational tools for processing and analyzing large-scale datasets in a systematic, automated manner. For that purpose, we have developed PRADA (Pipeline for RNA-Sequencing Data Analysis), a flexible, modular and highly scalable software platform that provides many different types of information through multifaceted analysis of raw paired-end RNA-seq data: gene expression levels, quality metrics, detection of unsupervised and supervised fusion transcripts, detection of intragenic fusion variants, homology scores and fusion frame classification. PRADA uses a dual-mapping strategy that increases sensitivity and refines the analytical endpoints. PRADA has been used extensively and successfully in the glioblastoma and renal clear cell projects of The Cancer Genome Atlas program. http://sourceforge.net/projects/prada/ gadgetz@broadinstitute.org or rverhaak@mdanderson.org Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Haller, Toomas; Leitsalu, Liis; Fischer, Krista; Nuotio, Marja-Liisa; Esko, Tõnu; Boomsma, Dorothea Irene; Kyvik, Kirsten Ohm; Spector, Tim D; Perola, Markus; Metspalu, Andres
2017-01-01
Ancestry information at the individual level can be a valuable resource for personalized medicine and for medical, demographic, and historical research, as well as for tracing back personal history. We report a new method for quantitatively determining personal genetic ancestry based on genome-wide data. Numerical ancestry component scores are assigned to individuals based on comparisons with reference populations. These comparisons are conducted with an existing analytical pipeline making use of genotype phasing, similarity matrix computation, and our addition: multidimensional best fitting by MixFit. The method is demonstrated by studying the Estonian and Finnish populations in geographical context. We show the main differences in the genetic composition of these otherwise close European populations and how they have influenced each other. The components of our analytical pipeline are freely available computer programs and scripts, one of which was developed in-house (available at: www.geenivaramu.ee/en/tools/mixfit).
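The "best fitting" step can be illustrated with a toy version of the idea: express an individual's marker-frequency vector as a convex mixture of reference-population vectors and choose the weights with the smallest squared error. The brute-force sketch below is not MixFit itself; the two-population restriction, the data, and the grid step are all invented for illustration.

```python
# Toy ancestry decomposition: find convex weights (w, 1-w) over two
# reference vectors minimizing the squared residual to an individual's
# vector. A 1-D grid search; real methods handle many populations.

def mix_fit(x, refs, step=0.01):
    """Return (w, 1-w) minimizing sum((x - w*r1 - (1-w)*r2)**2)."""
    r1, r2 = refs
    best_w, best_err = 0.0, float("inf")
    n = int(round(1 / step))
    for i in range(n + 1):
        w = i * step
        err = sum((xi - w * a - (1 - w) * b) ** 2
                  for xi, a, b in zip(x, r1, r2))
        if err < best_err:
            best_w, best_err = w, err
    return best_w, 1 - best_w

# An individual who is exactly 30% population A and 70% population B
# (invented reference frequencies):
popA = [0.9, 0.1, 0.8, 0.2]
popB = [0.1, 0.9, 0.2, 0.8]
x = [0.3 * a + 0.7 * b for a, b in zip(popA, popB)]
```

Calling `mix_fit(x, (popA, popB))` recovers the 0.3/0.7 mixture because the generating weights lie on the search grid.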
Astrometry with A-Track Using Gaia DR1 Catalogue
NASA Astrophysics Data System (ADS)
Kılıç, Yücel; Erece, Orhan; Kaplan, Murat
2018-04-01
In this work, we built all-sky index files from the Gaia DR1 catalogue for high-precision astrometric field solutions and precise WCS coordinates of moving objects. For this, we used the build-astrometry-index program, part of the astrometry.net code suite. Additionally, we added astrometry.net's WCS solution tool to our previously developed software A-Track, a fast and robust pipeline for detecting moving objects such as asteroids and comets in sequential FITS images. Moreover, an MPC module was added to A-Track; this module is linked to an asteroid database to name the objects found and to prepare the MPC file for reporting the results. After these additions, we tested the new version of the A-Track code on photometric data taken with the SI-1100 CCD on the 1-meter telescope at the TÜBİTAK National Observatory, Antalya. The pipeline can be used to analyse large data archives or daily sequential data. The code is hosted on GitHub under the GNU GPL v3 license.
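A-Track's core test, whether a chain of detections advances linearly with time across sequential frames, can be sketched as follows. This is an illustrative reimplementation of the idea only; the function name, threshold values, and units are assumptions, not A-Track code.

```python
# Moving-object test: a candidate track (list of timestamped detections)
# is accepted if x(t) and y(t) are well fit by straight lines and the
# implied speed is non-zero. Thresholds are invented for illustration.

def fits_linear_motion(track, max_resid=0.5):
    """track: list of (t_seconds, x_pix, y_pix) detections."""
    n = len(track)
    ts = [p[0] for p in track]
    t_mean = sum(ts) / n

    def fit(vals):
        # least-squares line fit, returning slope and worst residual
        v_mean = sum(vals) / n
        denom = sum((t - t_mean) ** 2 for t in ts)
        slope = sum((t - t_mean) * (v - v_mean)
                    for t, v in zip(ts, vals)) / denom
        icept = v_mean - slope * t_mean
        resid = max(abs(v - (slope * t + icept)) for t, v in zip(ts, vals))
        return slope, resid

    vx, rx = fit([p[1] for p in track])
    vy, ry = fit([p[2] for p in track])
    moving = (vx ** 2 + vy ** 2) ** 0.5 > 1e-3   # pixels/s (assumed cutoff)
    return moving and rx < max_resid and ry < max_resid
```

A detection chain drifting steadily between frames passes; a stationary star with small centroid jitter fails the minimum-speed cut.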
2017-03-01
Contribution to Project: Ian primarily focuses on developing the tissue imaging pipeline and performs imaging data analysis. Funding Support: Partially...3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous... analysis pipeline, to observe and quantify phenotypic metastatic landscape heterogeneity in situ with spatial and molecular resolution. Our implementation
Building the Pipeline for Hubble Legacy Archive Grism data
NASA Astrophysics Data System (ADS)
Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Lombardi, M.; Micol, A.; Rosa, M.; Stoehr, F.; Walsh, J. R.
2008-10-01
The Pipeline for Hubble Legacy Archive Grism data (PHLAG) is currently being developed as an end-to-end pipeline for the Hubble Legacy Archive (HLA). The inputs to PHLAG are slitless spectroscopic HST data with only the basic calibrations from the standard HST pipelines applied; the outputs are fully calibrated, Virtual Observatory-compatible spectra, which will be made available through a static HLA archive. We give an overview of the various aspects of PHLAG. The pipeline consists of several subcomponents -- data preparation, data retrieval, image combination, object detection, spectral extraction using the aXe software, and quality control -- which are discussed in detail. As a pilot project, PHLAG is currently being applied to NICMOS G141 grism data, and examples of G141 spectra reduced with PHLAG are shown.
Sequanix: a dynamic graphical interface for Snakemake workflows.
Desvillechabrol, Dimitri; Legendre, Rachel; Rioualen, Claire; Bouchier, Christiane; van Helden, Jacques; Kennedy, Sean; Cokelaer, Thomas
2018-06-01
We designed a PyQt graphical user interface-Sequanix-aimed at democratizing the use of Snakemake pipelines in the NGS space and beyond. By default, Sequanix includes Sequana NGS pipelines (Snakemake format) (http://sequana.readthedocs.io), and is also capable of loading any external Snakemake pipeline. New users can easily, visually, edit configuration files of expert-validated pipelines and can interactively execute these production-ready workflows. Sequanix will be useful to both Snakemake developers in exposing their pipelines and to a wide audience of users. Source on http://github.com/sequana/sequana, bio-containers on http://bioconda.github.io and Singularity hub (http://singularity-hub.org). dimitri.desvillechabrol@pasteur.fr or thomas.cokelaer@pasteur.fr. Supplementary data are available at Bioinformatics online.
ENVIRONMENTALLY BENIGN MITIGATION OF MICROBIOLOGICALLY INFLUENCED CORROSION (MIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Robert Paterek; Gemma Husmillo; Amrutha Daram
The overall program objective is to develop and evaluate environmentally benign agents or products that are effective in the prevention, inhibition, and mitigation of microbially influenced corrosion (MIC) on the internal surfaces of metallic natural gas pipelines. The goal is to develop one or more environmentally benign (a.k.a. "green") products that can be applied to maintain the structure and dependability of the natural gas infrastructure. The technical approach for this quarter includes the application of new methods of Capsicum sp. (pepper) extraction by the soxhlet method and analysis of a new set of extracts by thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC); isolation and cultivation of MIC-causing microorganisms from corroded pipeline samples; and evaluation of the antimicrobial activities of the old set of pepper extracts in comparison with major components of known biocides and corrosion inhibitors. Twelve new extracts from three varieties of Capsicum sp. (Serrano, Habanero, and Chile de Arbol) were obtained by soxhlet extraction using 4 different solvents. Results of TLC done on these extracts showed the presence of capsaicin and some phenolic compounds, while HPLC detected capsaicin and dihydrocapsaicin peaks. More tests will be done to determine specific components. Additional isolates from the group of heterotrophic, acid-producing, denitrifying and sulfate-reducing bacteria were obtained from the pipeline samples submitted by gas companies. Isolates of interest will be used in subsequent antimicrobial testing and test-loop simulation system experiments. Results of antimicrobial screening of Capsicum sp. extracts and components of known commercial biocides showed comparable activities when tested against two strains of sulfate-reducing bacteria.
NASA Astrophysics Data System (ADS)
Schuster, G.
2006-05-01
New NASA-funded educational initiatives have created a pipeline of products meeting the needs of today's educators in inner-city schools, in NASA Explorer Schools, and across the nation. All three projects include training: 1) WDLC (Weather Data Learning Center), a math achievement program with data entry, inquiry-based investigations, and the application of math using weather maps and imagery for Grade 4; 2) Project 3D-VIEW, where students in Grades 5 and 6 become experts in air, life, water, land, and Earth systems using 3D technologies requiring 3D glasses; formal literacy and math components are included, and 1200 teachers will be provided training and materials free of charge beginning in Fall 2006; and 3) Signals of Spring, where students in Grades 7 to 8, or high school, use NASA data to explain the movement of dozens of birds and land and marine animals that are tracked by satellite. Comprehensive content in life and Earth science is taught with curricular activities, interactive mapping, image interpretation, and online journals, and common misconceptions are dispelled. Scientist involvement and support are essential for students who are developing process skills and performing science activities. Current research partners include Columbia University's Teachers College and Stanford University's School of Education.
NASA Astrophysics Data System (ADS)
Barry, N.; Beardsley, A.; Bowman, J.; Briggs, F.; Byrne, R.; Carroll, P.; Hazelton, B.; Jacobs, D.; Jordan, C.; Kittiwisit, P.; Lanman, A.; Lenc, E.; Li, W.; Line, J.; McKinley, B.; Mitchell, D.; Morales, M.; Murray, S.; Paul, S.; Pindor, B.; Pober, J.; Rahimi, M.; Riding, J.; Sethi, S.; Shankar, U.; Subrahmanyan, R.; Sullivan, I.; Takahashi, K.; Thyagarajan, N.; Tingay, S.; Trott, C.; Wayth, R.; Webster, R.; Wyithe, S.
2017-01-01
The Murchison Widefield Array is designed to measure the fluctuations in the 21-cm emission from neutral hydrogen during the Epoch of Reionisation. The new hex configuration is explicitly designed to test the predicted increase in sensitivity of redundant baselines. However, the challenge of the new array is to understand calibration with the new configuration. We have developed two new pipelines to reduce the hex data, and will compare the results with previous datasets from the Phase 1 array. We have now processed 80 hours of data, refining the data analysis through our two established Phase 1 pipelines. This proposal requests as much observing time as possible in semester 2017-A to (1) obtain a comparable hex dataset to test the sensitivity and systematic limits of redundant arrays, (2) establish the optimal observing strategy for an EoR detection, and (3) continue to explore observational strategies in the three EoR fields to advise the design of SKA-low experiments. Due to the proposed changes in the array during the upcoming semester, we have not requested a specific number of hours, but will optimise our observing program as availability of the telescope becomes clear. We note that this observing proposal implements the key scientific program that can benefit from the new hex configuration.
Cohen Freue, Gabriela V.; Meredith, Anna; Smith, Derek; Bergman, Axel; Sasaki, Mayu; Lam, Karen K. Y.; Hollander, Zsuzsanna; Opushneva, Nina; Takhar, Mandeep; Lin, David; Wilson-McManus, Janet; Balshaw, Robert; Keown, Paul A.; Borchers, Christoph H.; McManus, Bruce; Ng, Raymond T.; McMaster, W. Robert
2013-01-01
Recent technical advances in the field of quantitative proteomics have stimulated a large number of biomarker discovery studies of various diseases, providing avenues for new treatments and diagnostics. However, inherent challenges have limited the successful translation of candidate biomarkers into clinical use, thus highlighting the need for a robust analytical methodology to transition from biomarker discovery to clinical implementation. We have developed an end-to-end computational proteomic pipeline for biomarker studies. At the discovery stage, the pipeline emphasizes different aspects of experimental design, appropriate statistical methodologies, and quality assessment of results. At the validation stage, the pipeline focuses on the migration of the results to a platform appropriate for external validation, and the development of a classifier score based on corroborated protein biomarkers. At the last stage, towards clinical implementation, the main aims are to develop and validate an assay suitable for clinical deployment, and to calibrate the biomarker classifier using the developed assay. The proposed pipeline was applied to a biomarker study in cardiac transplantation aimed at developing a minimally invasive clinical test to monitor acute rejection. Starting with an untargeted screening of the human plasma proteome, five candidate biomarker proteins were identified. Rejection-regulated proteins reflect cellular and humoral immune responses, acute-phase inflammatory pathways, and lipid metabolism biological processes. A multiplex multiple reaction monitoring mass spectrometry (MRM-MS) assay was developed for the five candidate biomarkers and validated by enzyme-linked immunosorbent assay (ELISA) and immunonephelometric assay (INA). A classifier score based on corroborated proteins demonstrated that the developed MRM-MS assay provides an appropriate methodology for an external validation, which is still in progress.
Plasma proteomic biomarkers of acute cardiac rejection may offer a relevant post-transplant monitoring tool to effectively guide clinical care. The proposed computational pipeline is highly applicable to a wide range of biomarker proteomic studies. PMID:23592955
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallesen, T.R.; Braestrup, M.W.; Jorgensen, O.
Development of Danish North Sea hydrocarbon resources includes the 17-km Rolf pipeline installed in 1985, which consists of an insulated 8-in. two-phase-flow product line with a 3-in. piggyback gas lift line. A practical solution to the design of this insulated pipeline, including the small-diameter piggyback injection line, was corrosion coating of fusion-bonded epoxy (FBE) and a polyethylene (PE) sleeve pipe. The insulation design prevents hydrate formation under the most conservative flow regime during gas lift production, and the required minimum flow rate during the initial natural lift period is well below the value anticipated at the initiation of gas lift. The weight coating design ensures stability on the seabed during the summer months only; thus trenching was required during the same installation season. Installation of insulated flowlines serving marginal fields is a significant feature of North Sea hydrocarbon development projects. The Skjold field is connected to Gorm by a 6-in., two-phase-flow line; the 11-km line was installed in 1982 as the first insulated pipeline in the North Sea. The Rolf field, located 17 km west of Gorm, went on stream Jan. 2. The development includes an unmanned wellhead platform and an insulated, two-phase-flow pipeline to the Gorm E riser platform. After separation on the Gorm C process platform, the oil and condensate are transported to shore through the 20-in. oil pipeline, and the natural gas is piped to Tyra for transmission through the 30-in. gas pipeline. Oil production at Rolf is assisted by the injection of lift gas, transported from Gorm through a 3-in. pipeline installed piggyback on the insulated 8-in. product line. The seabed is smooth and sandy, with the water depth varying between 33.7 m (110.5 ft) at Rolf and 39.1 m (128 ft) at Gorm.
NASA Astrophysics Data System (ADS)
Wingate, Lory Mitchell
2017-01-01
The National Radio Astronomy Observatory's (NRAO) National and International Non-Traditional Exchange (NINE) Program teaches concepts of project management and systems engineering to selected participants within a nine-week program held at NRAO in New Mexico. Participants are typically graduate-level students or professionals, and participation in the NINE Program is through a competitive process. The program includes a hands-on service project designed to increase the participants' knowledge of radio astronomy. The approach clearly demonstrates to learners the positive net effects of following methodical approaches to achieving optimal science results. The NINE Program teaches participants important, sustainable skills associated with constructing, operating, and maintaining radio astronomy observatories. NINE Program learners are expected to return to their host sites and implement the program in their own locations as NINE Hubs. This requires forming a committed relationship (through a formal Letter of Agreement), establishing a site location, and developing a program that takes into consideration the needs of the community they represent. The anticipated outcome of this program is worldwide partnerships with fast-growing radio astronomy communities designed to facilitate the exchange of staff and the mentoring of under-represented groups of learners, thereby developing a strong pipeline of global talent to construct, operate, and maintain radio astronomy observatories.
2013-01-01
Background Accurate and complete identification of mobile elements is a challenging task in the current era of sequencing, given their large numbers and frequent truncations. Group II intron retroelements, which consist of a ribozyme and an intron-encoded protein (IEP), are usually identified in bacterial genomes through their IEP; however, the RNA component that defines the intron boundaries is often difficult to identify because of a lack of strong sequence conservation corresponding to the RNA structure. Compounding the problem of boundary definition is the fact that a majority of group II intron copies in bacteria are truncated. Results Here we present a pipeline of 11 programs that collect and analyze group II intron sequences from GenBank. The pipeline begins with a BLAST search of GenBank using a set of representative group II IEPs as queries. Subsequent steps download the corresponding genomic sequences and flanks, filter out non-group II introns, assign introns to phylogenetic subclasses, filter out incomplete and/or non-functional introns, and assign IEP sequences and RNA boundaries to the full-length introns. In the final step, the redundancy in the data set is reduced by grouping introns into sets of ≥95% identity, with one example sequence chosen to be the representative. Conclusions These programs should be useful for comprehensive identification of group II introns in sequence databases as data continue to rapidly accumulate. PMID:24359548
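The final redundancy-reduction step described above can be sketched as a greedy clustering: each intron joins the first representative it matches at ≥95% identity, otherwise it founds a new group. The naive positional identity measure below is a stand-in for the alignment-based comparison a real pipeline would use, and all names here are illustrative.

```python
# Greedy dereplication at a 95% identity threshold. identity() compares
# sequences position-by-position, which only makes sense for sequences of
# similar length; real pipelines would align first.

def identity(a, b):
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def dereplicate(seqs, threshold=0.95):
    """Each sequence joins the first existing representative it matches
    at >= threshold identity; otherwise it becomes a new representative."""
    reps = []
    for s in seqs:
        for r in reps:
            if identity(s, r) >= threshold:
                break
        else:
            reps.append(s)
    return reps
```

Two sequences differing at 1 of 20 positions (95% identity) collapse into one representative, while an unrelated sequence starts its own group.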
Diversifying the STEM Pipeline: Recommendations from the Model Replication Institutions Program
ERIC Educational Resources Information Center
Institute for Higher Education Policy, 2010
2010-01-01
Launched in 2006 to address issues of national competitiveness and equity in science, technology, engineering, and mathematics (STEM) fields, the National Science Foundation-funded Model Replication Institutions (MRI) program sought to improve the quality, availability, and diversity of STEM education. The project offered technical assistance to…
The Challenges in Providing Needed Transition Programming to Juvenile Offenders
ERIC Educational Resources Information Center
Platt, John S.; Bohac, Paul D.; Wade, Wanda
2015-01-01
The transition to and from juvenile justice settings is a complex and challenging process. Effectively preparing juvenile justice personnel to address the transition needs of incarcerated students is an essential aspect of reducing the negative effects of the school-to-prison pipeline. This article examines program and professional development…
Building a Pipeline: One Company's Holistic Approach to College Relations
ERIC Educational Resources Information Center
Pratt, Joseph
2003-01-01
This article describes how Fidelity, the largest mutual fund company in the United States, has transformed a traditional college recruiting program into a holistic college partnership that emphasizes the interdependence of its parts. Fidelity's enhanced internship program embraces the "try before you buy" philosophy, which benefits both the firm…
49 CFR 192.909 - How can an operator change its integrity management program?
Code of Federal Regulations, 2011 CFR
2011-10-01
... Transmission Pipeline Integrity Management § 192.909 How can an operator change its integrity management program? 49 Transportation 3 2011-10-01 false Section 192.909 Transportation Other Regulations Relating to Transportation...
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer.
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-05-30
External corrosion has been identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment, for application to existing exposed pipelines, has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments, and an accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design.
Canada seeks US financing waiver to clear Alaska Gas Pipeline's path
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corrigan, R.
1981-09-26
A Canadian official outlines in an interview his government's hope that the US will proceed with the financing and construction of the Alaska Highway natural gas pipeline. The Canadian portion of the pipeline was begun under good faith because Canada sees her best interests served when US supply needs are met and when both countries have the energy to develop and prosper. Canada asks the Reagan administration to present Congress with a waiver package that will facilitate financing by eliminating a prohibition against pipeline share ownership by the owners of gas in Alaska. (DCK)
Pipeline bottoming cycle study. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-06-01
The technical and economic feasibility of applying bottoming cycles to the prime movers that drive the compressors of natural gas pipelines was studied. These bottoming cycles convert some of the waste heat from the exhaust gas of the prime movers into shaft power and conserve gas. Three typical compressor station sites were selected, each on a different pipeline. Although the prime movers were different, they were similar enough in exhaust gas flow rate and temperature that a single bottoming cycle system could be designed, with some modifications, for all three sites. Preliminary design included selection of the bottoming cycle working fluid, optimization of the cycle, and design of the components, such as the turbine, vapor generator, and condensers. Installation drawings were made, and hardware and installation costs were estimated. The results of the economic assessment of retrofitting bottoming cycle systems at the three selected sites indicated that profitability was strongly dependent upon the site-specific installation costs, how the energy was used, and the yearly utilization of the apparatus. The study indicated that bottoming cycles are a competitive investment alternative for certain applications in the pipeline industry. Bottoming cycles are technically feasible, and it was concluded that proper design and operating practices would reduce the environmental and safety hazards to acceptable levels. The amount of gas that could be saved through the year 2000 by the adoption of bottoming cycles was estimated at 0.296 trillion ft³ for a low supply projection and 0.734 trillion ft³ for a high supply projection. The potential market for bottoming cycle equipment for the two supply projections varied from 170 to 500 units of varying size. Finally, a demonstration program plan was developed.
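The quantity a bottoming cycle recovers can be estimated from first principles: the exhaust heat flow times a cycle efficiency gives shaft power. The sketch below illustrates that arithmetic only; the flow rate, temperatures, specific heat, and efficiency are assumed values, not figures from the study.

```python
# Back-of-envelope estimate of shaft power recoverable from a compressor
# prime mover's exhaust. All numbers are illustrative assumptions.

def bottoming_power_kw(m_dot, cp, t_exhaust, t_stack, eta_cycle):
    """m_dot in kg/s, cp in kJ/(kg*K), temperatures in K.
    Available exhaust heat times cycle efficiency gives shaft power in kW."""
    q_available = m_dot * cp * (t_exhaust - t_stack)   # heat flow, kW
    return q_available * eta_cycle

# Assumed: 50 kg/s of exhaust at 750 K cooled to 450 K, 20% cycle efficiency.
p = bottoming_power_kw(m_dot=50, cp=1.05, t_exhaust=750, t_stack=450,
                       eta_cycle=0.20)   # -> 3150.0 kW
```

Under these assumptions roughly 3 MW of shaft power is recoverable per unit, which is the kind of figure that must then be weighed against site-specific installation cost and yearly utilization.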
Designing health promotion programs by watching the market.
Gelb, B D; Bryant, J M
1992-03-01
More health care providers and payors are beginning to see health promotion programs as a significant tool for attracting patients, reducing costs, or both. To help design programs that take into account the values and lifestyles of the target group, naturalistic observation can be useful. The authors illustrate the approach in a study of pipeline workers that provided input for the design of nutrition and smoking cessation programs.
Impacts of Chandra X-ray Observatory Public Communications and Engagement
NASA Astrophysics Data System (ADS)
Arcand, Kimberly K.; Watzke, Megan; Lestition, Kathleen; Edmonds, Peter
2015-01-01
The Chandra X-ray Observatory Center runs a multifaceted Public Communications & Engagement program encompassing press relations, public engagement, and education. Our goals include reaching a large and diverse audience of national and international scope; establishing direct connections and working relationships with the scientists whose research forms the basis for all products; creating peer-reviewed materials and activities that evolve from an integrated pipeline design and encourage users toward deeper engagement; and developing materials that target underserved audiences such as women, Spanish speakers, and the sight- and hearing-impaired. This talk will highlight some of the key features of our program, from its high-quality curated digital presence to the cycle of research and evaluation that informs our practice at all points of program creation. We will also discuss the main impacts of the program, from the tens of millions of participants reached to the establishment and sustainability of a network of science 'volunpeers.'
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.
2015-02-01
Specific features relating to development of the information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the lifetime to reaching inadmissible wall thinning in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.
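The lifetime determination the system performs can be sketched with a linear wear-rate model: the time to inadmissible thinning is the remaining wall-thickness allowance divided by the wear rate. Real flow-accelerated-corrosion models are considerably more elaborate (rates depend on chemistry, geometry, and temperature); the function and all values below are assumed for illustration.

```python
# Hedged linear-wear sketch of remaining-life estimation for a pipeline
# element thinning under flow-accelerated corrosion. Invented values.

def remaining_life_years(t_current_mm, t_min_mm, rate_mm_per_year):
    """Linear wear assumption: life = remaining allowance / wear rate."""
    if rate_mm_per_year <= 0:
        raise ValueError("wear rate must be positive")
    return (t_current_mm - t_min_mm) / rate_mm_per_year

# Assumed: 8.0 mm measured wall, 5.5 mm minimum allowable, 0.25 mm/yr wear.
life = remaining_life_years(8.0, 5.5, 0.25)   # -> 10.0 years
```

An information-analytical system of the kind described would apply such a calculation element by element, using measured thicknesses and operating-condition-dependent wear rates.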
NGSPanPipe: A Pipeline for Pan-genome Identification in Microbial Strains from Experimental Reads.
Kulsum, Umay; Kapil, Arti; Singh, Harpreet; Kaur, Punit
2018-01-01
Recent advancements in sequencing technologies have decreased both the time span and the cost of sequencing a whole bacterial genome. High-throughput Next-Generation Sequencing (NGS) technology has led to the generation of enormous data concerning microbial populations, publicly available across various repositories. As a consequence, it has become possible to study and compare the genomes of different bacterial strains within a species or genus in terms of evolution, ecology and diversity. Studying the pan-genome provides insights into deciphering microevolution, global composition and diversity in virulence and pathogenesis of a species. It can also assist in identifying drug targets and proposing vaccine candidates. The effective analysis of these large genome datasets necessitates the development of robust tools. Current methods to develop a pan-genome do not support direct input of raw reads from the sequencer but require preprocessing of reads into an assembled protein/gene sequence file or a binary matrix of orthologous genes/proteins. We have designed an easy-to-use integrated pipeline, NGSPanPipe, which can directly identify the pan-genome from short reads. The output from the pipeline is compatible with other pan-genome analysis tools. We evaluated our pipeline against other methods of pan-genome development, i.e., reference-based assembly and de novo assembly, using simulated reads of Mycobacterium tuberculosis. The single-script pipeline (pipeline.pl) is applicable to all bacterial strains; it integrates multiple in-house Perl scripts and is freely accessible from https://github.com/Biomedinformatics/NGSPanPipe.
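Once a pipeline such as NGSPanPipe has produced per-strain gene content, the basic pan-genome bookkeeping reduces to set operations: the pan-genome is the union of all strains' gene sets, the core genome their intersection, and accessory genes the difference. The sketch below illustrates that arithmetic; the strain and gene names are invented, not output of the tool.

```python
# Pan-/core-/accessory-genome decomposition from per-strain gene sets.
# Toy data; real inputs would come from a pipeline's orthologue matrix.

def pan_core_accessory(strain_genes):
    """strain_genes: dict mapping strain name -> set of gene identifiers.
    Returns (pan-genome, core genome, accessory genes)."""
    sets = list(strain_genes.values())
    pan = set().union(*sets)            # genes in any strain
    core = set.intersection(*sets)      # genes in every strain
    return pan, core, pan - core

strains = {
    "strainA": {"dnaA", "gyrB", "katG", "pks12"},
    "strainB": {"dnaA", "gyrB", "katG", "esxA"},
    "strainC": {"dnaA", "gyrB", "rpoB"},
}
pan, core, accessory = pan_core_accessory(strains)
```

Here the core genome is {dnaA, gyrB}: only genes shared by all three toy strains, with the remaining four genes accessory.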
Design Optimization of Innovative High-Level Waste Pipeline Unplugging Technologies - 13341
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pribanic, T.; Awwad, A.; Varona, J.
2013-07-01
Florida International University (FIU) is currently working on the development and optimization of two innovative pipeline unplugging methods: the asynchronous pulsing system (APS) and the peristaltic crawler system (PCS). Experiments were conducted on the APS to determine how air in the pipeline influences the system's performance, as well as the effectiveness of air mitigation techniques in a pipeline. The results obtained during the experimental phase of the project, including data from pipeline pressure pulse tests along with air bubble compression tests, are presented. Single-cycle pulse amplification caused by a fast-acting cylinder piston pump in 21.8, 30.5, and 43.6 m pipelines was evaluated. Experiments were conducted on fully flooded pipelines as well as pipelines that contained various amounts of air to evaluate the system's performance when air is present in the pipeline. Also presented are details of the improvements implemented in the third-generation crawler system (PCS). The improvements include a redesign of the rims of the unit to accommodate a camera system that provides visual feedback on the conditions inside the pipeline. Visual feedback allows the crawler to be used as both a pipeline unplugging and inspection tool. Tests conducted previously demonstrated a significant reduction of the crawler speed with increasing tether length. Current improvements include positioning a pneumatic valve manifold system in close proximity to the crawler, rendering the crawler's speed independent of tether length. Additional improvements to increase the crawler's speed were also investigated and are presented. Descriptions of the test beds, which were designed to emulate possible scenarios present in Department of Energy (DOE) pipelines, are presented. Finally, conclusions and recommendations for the systems are provided. (authors)
NASA Astrophysics Data System (ADS)
Mohamed, Adel M. E.; Mohamed, Abuo El-Ela A.
2013-06-01
Ground vibrations induced by blasting in cement quarries are one of the fundamental problems in the quarrying industry and may cause severe damage to nearby utilities and pipelines. Therefore, a vibration control study plays an important role in minimizing the environmental effects of blasting in quarries. The current paper presents the influence of the quarry blasts at the National Cement Company (NCC) on the two oil pipelines of SUMED Company southeast of Helwan City, by measuring the ground vibrations in terms of Peak Particle Velocity (PPV). The compressional wave velocities deduced from a shallow seismic refraction survey and the shear wave velocities obtained from the Multichannel Analysis of Surface Waves (MASW) technique are used to evaluate the site of the two pipelines closest to the quarry blasts. The results demonstrate that the closest site of the two pipelines is of class B, according to the National Earthquake Hazard Reduction Program (NEHRP) classification, and that the safe distance to avoid any environmental effects is 650 m, following the deduced Peak Particle Velocity-scaled distance relationship PPV = 700.08 × SD^(-1.225) (PPV in mm/s) and the air overpressure (air blast) formula air blast = 170.23 × SD^(-0.071) (in dB). In light of the prediction analysis, the maximum allowable charge weight per delay was found to be 591 kg, with a damage criterion of 12.5 mm/s, at the closest site of the SUMED pipelines.
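The reported 591 kg charge limit can be reproduced from the quoted attenuation law if the scaled distance is taken as SD = D/√W (square-root charge scaling, an assumption here, since the abstract does not state the scaling form):

```python
import math

def ppv(distance_m, charge_kg):
    # Peak particle velocity (mm/s) from the site-specific attenuation law
    sd = distance_m / math.sqrt(charge_kg)
    return 700.08 * sd ** -1.225

def max_charge(distance_m, ppv_limit_mm_s=12.5):
    # Invert the law: largest charge per delay keeping PPV below the limit
    sd = (700.08 / ppv_limit_mm_s) ** (1.0 / 1.225)
    return (distance_m / sd) ** 2

print(round(max_charge(650.0)))  # → 591, matching the reported value
```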
Closing the race and gender gaps in computer science education
NASA Astrophysics Data System (ADS)
Robinson, John Henry
Life in a technological society brings new paradigms and pressures to bear on education. These pressures are magnified for underrepresented students and must be addressed if they are to play a vital part in society. Educational pipelines need to be established to provide at-risk students with the means and opportunity to succeed in science, technology, engineering, and mathematics (STEM) majors. STEM educational pipelines are programs consisting of components that seek to facilitate students' completion of a college degree by providing access to higher education, intervention, mentoring, support infrastructure, and programs that encourage academic success. Success in the STEM professions means that more educators, scientists, engineers, and researchers will be available to add diversity to the professions and to provide role models for future generations. The issues that the educational pipelines must address are improving at-risk groups' perceptions and awareness of the math, science, and engineering professions. Additionally, the educational pipelines must provide intervention in math preparation, overcome gender and race socialization, and provide mentors and counseling to help students achieve better self-perceptions and positive role models. This study was designed to explore the underrepresentation of minorities and women in the computer science major at Rowan University through a multilayered action research methodology.
The purpose of this research study was to define and understand the needs of underrepresented students in computer science; to examine current policies and enrollment data for Rowan University; to develop a historical profile of the Computer Science program in terms of ethnicity and gender enrollment, in order to ascertain trends in students' choice of computer science as a major; and to determine whether raising awareness about computer science among incoming freshmen and providing an alternate route into the computer science major would entice more women and minorities to pursue a degree in computer science at Rowan University. Finally, this study examined my espoused leadership theories and my leadership theories-in-use through reflective practice as I progressed through the cycles of this project. The outcomes of this study indicated a large downward trend in women's enrollment in computer science and a relatively flat trend in minority enrollment. The enrollment data at Rowan University were found to follow a nationwide trend for underrepresented students' enrollment in STEM majors. The study also indicated that students' mental models are based upon their race and gender socialization and their understanding of the world and society. The mental models were shown to play a large role in the students' choice of major. Finally, a computer science pipeline was designed and piloted as part of this study in an attempt to entice more students into the major and facilitate their success. Additionally, the mental models of the participants were challenged through interactions to make them aware of the possibilities available with a degree in computer science. The entire study was wrapped in my leadership, which was practiced and studied over the course of this work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This is a short paper on the history and development of the Platte Pipe Line, which stretches 1,156 miles from Byron, Wyoming, to Wood River, Illinois. It discusses the development and significance of one of the most heavily used crude oil pipelines in the United States. It also discusses its role in advanced pipeline control technology and the future of the system.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
... oil and gas pipelines. While subsequent efforts by industry to develop infrastructure such as oil and gas pipelines and their associated components are reasonably foreseeable, these elements are not... Highway to Umiat, to increase access to potential oil and gas resources for exploration and development...
Career Development of Women in Academia: Traversing the Leaky Pipeline
ERIC Educational Resources Information Center
Gasser, Courtney E.; Shaffer, Katharine S.
2014-01-01
Women's experiences in academia are laden with a fundamental set of issues pertaining to gender inequalities. A model reflecting women's career development and experiences around their academic pipeline (or career in academia) is presented. This model further conveys a new perspective on the experiences of women academicians before, during and…
Evidence-Based Professional Development Considerations along the School-to-Prison Pipeline
ERIC Educational Resources Information Center
Houchins, David E.; Shippen, Margaret E.; Murphy, Kristin M.
2012-01-01
This article addresses professional development (PD) issues for those who provide services to students in the school-to-prison pipeline (STPP). Emphasis is on implementing evidence-based practices. The authors use a modified version of Desimone's PD framework for the structure of this article involving (a) collective participation and common…
Gaps of Decision Support Models for Pipeline Renewal and Recommendations for Improvement
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less w...
GAPS OF DECISION SUPPORT MODELS FOR PIPELINE RENEWAL AND RECOMMENDATIONS FOR IMPROVEMENT (SLIDE)
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less wor...
Commercial Mobile Alert Service (CMAS) Alerting Pipeline Taxonomy
2012-03-01
... for the consumer at the moment but will soon become a commoditized, basic requirement. For example, as the baby boomers grow older, mobile services... Commercial Mobile Alert Service (CMAS) Alerting Pipeline Taxonomy. The WEA Project Team, March 2012. Special Report CMU/SEI-2012-TR-019, CERT. This report presents a taxonomy developed for the Commercial Mobile Alert Service (CMAS). The CMAS Alerting Pipeline Taxonomy is a hierarchical classification
DEVELOPMENT OF AN ENVIRONMENTALLY BENIGN MICROBIAL INHIBITOR TO CONTROL INTERNAL PIPELINE CORROSION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill W. Bogan; Brigid M. Lamb; John J. Kilbane II
2004-10-30
The overall program objective is to develop and evaluate environmentally benign agents or products that are effective in the prevention, inhibition, and mitigation of microbially influenced corrosion (MIC) on the internal surfaces of metallic natural gas pipelines. The goal is to develop one or more environmentally benign (a.k.a. ''green'') products that can be applied to maintain the structure and dependability of the natural gas infrastructure. Previous testing indicated that the growth of, and the metal corrosion caused by, pure cultures of sulfate-reducing bacteria were inhibited by hexane extracts of some pepper plants. This quarter, tests were performed to determine whether chemical compounds other than pepper extracts could inhibit the growth of corrosion-associated microbes, and whether pepper extracts and other compounds can inhibit corrosion when mature biofilms are present. Several chemical compounds were shown to be capable of inhibiting the growth of corrosion-associated microorganisms, and all of these compounds limited the amount of corrosion caused by mature biofilms to a similar extent. It is difficult to control corrosion caused by mature biofilms, but any compound that disrupts the metabolism of any of the major microbial groups present in corrosion-associated biofilms shows promise in limiting the amount and rate of corrosion.
Makropoulos, Antonios; Robinson, Emma C; Schuh, Andreas; Wright, Robert; Fitzgibbon, Sean; Bozek, Jelena; Counsell, Serena J; Steinweg, Johannes; Vecchiato, Katy; Passerat-Palmbach, Jonathan; Lenz, Gregor; Mortari, Filippo; Tenev, Tencho; Duff, Eugene P; Bastiani, Matteo; Cordero-Grande, Lucilio; Hughes, Emer; Tusor, Nora; Tournier, Jacques-Donald; Hutter, Jana; Price, Anthony N; Teixeira, Rui Pedro A G; Murgasova, Maria; Victor, Suresh; Kelly, Christopher; Rutherford, Mary A; Smith, Stephen M; Edwards, A David; Hajnal, Joseph V; Jenkinson, Mark; Rueckert, Daniel
2018-06-01
The Developing Human Connectome Project (dHCP) seeks to create the first 4-dimensional connectome of early life. Understanding this connectome in detail may provide insights into normal as well as abnormal patterns of brain development. Following established best practices adopted by the WU-MINN Human Connectome Project (HCP), and pioneered by FreeSurfer, the project utilises cortical surface-based processing pipelines. In this paper, we propose a fully automated processing pipeline for the structural Magnetic Resonance Imaging (MRI) of the developing neonatal brain. This proposed pipeline consists of a refined framework for cortical and sub-cortical volume segmentation, cortical surface extraction, and cortical surface inflation, which has been specifically designed to address considerable differences between adult and neonatal brains, as imaged using MRI. Using the proposed pipeline, our results demonstrate that images collected from 465 subjects ranging from 28 to 45 weeks post-menstrual age (PMA) can be processed fully automatically, generating cortical surface models that are topologically correct and correspond well with manual evaluations of tissue boundaries in 85% of cases. Results improve on state-of-the-art neonatal tissue segmentation models, and significant errors were found in only 2% of cases, where these corresponded to subjects with high motion. Downstream, these surfaces will enhance comparisons of functional and diffusion MRI datasets, supporting the modelling of emerging patterns of brain connectivity. Copyright © 2018 Elsevier Inc. All rights reserved.
Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.
Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N
2018-05-28
The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
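As a minimal illustration of the kind of automated quality-control gate such a pipeline applies (a generic sketch, not the dHCP implementation), one can flag acquisitions whose average motion estimate exceeds a threshold so they are routed for review rather than silently processed:

```python
def qc_flag_motion(framewise_displacement_mm, threshold_mm=0.5):
    # Flag an acquisition whose mean framewise displacement is too high;
    # the 0.5 mm threshold is an illustrative choice, not the dHCP's
    mean_fd = sum(framewise_displacement_mm) / len(framewise_displacement_mm)
    return {"mean_fd_mm": round(mean_fd, 3), "flagged": mean_fd > threshold_mm}

print(qc_flag_motion([0.1, 0.2, 0.15]))  # low motion: not flagged
print(qc_flag_motion([0.8, 1.2, 0.9]))   # high motion: flagged for review
```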
NASA Astrophysics Data System (ADS)
Blažek, M.; Kabáth, P.; Klocová, T.; Skarka, M.
2018-04-01
As the volume of astronomical data continues to grow, automating its processing has become necessary. State-of-the-art instruments are capable of producing tens of thousands of images during a single night. One of them is HAWK-I, a near-infrared instrument that is part of the Very Large Telescope of the European Southern Observatory. In my Master's thesis, I developed a pipeline to process data obtained by this instrument. It is written in the Python programming language using commands of the IRAF astronomical software and is developed specifically for the "Fast Photometry Mode" of HAWK-I. In this mode, large numbers of images have been obtained during secondary eclipses of exoplanets by their host stars. The pipeline was tested on a data set through all stages, from sorting the images to producing a light curve. The WASP-18 data set contained almost 40,000 images observed using a filter centered at a wavelength of 2.09 μm, and there is a plan to process other data sets. The goal of processing WASP-18 and the other data sets is the subsequent analysis of the exoplanetary atmospheres of the observed systems.
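The core of such a photometric reduction can be sketched as differential photometry: dividing the target's flux by a comparison star's removes systematics common to both frames. A simplified illustration with made-up flux values, not the thesis pipeline itself:

```python
def differential_light_curve(target_flux, comparison_flux):
    # Divide out systematics shared by target and comparison star,
    # then normalise by the median ratio
    ratio = [t / c for t, c in zip(target_flux, comparison_flux)]
    median = sorted(ratio)[len(ratio) // 2]
    return [r / median for r in ratio]

# Simulated frames: a shallow eclipse in the target (frames 3-4) plus a
# slow trend shared with the comparison star -- values are illustrative
target = [1.00, 1.01, 0.99, 0.97, 0.97, 1.00, 1.02]
comp   = [1.00, 1.01, 1.00, 1.00, 1.00, 1.01, 1.02]
lc = differential_light_curve(target, comp)
print(lc)
```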
NASA Astrophysics Data System (ADS)
Lan, G.; Jiang, J.; Li, D. D.; Yi, W. S.; Zhao, Z.; Nie, L. N.
2013-12-01
The calculation of water-hammer pressure for a single-phase liquid in a pipeline of uniform characteristics is relatively mature, but little research has addressed the calculation of water-hammer pressure in complex pipelines carrying slurries with solid particles. In this paper, based on developments in slurry pipelines at home and abroad, the fundamental principles and methods of numerical simulation of transient processes are presented, and several boundary conditions are given. A model for calculating the water-hammer interaction of the solid and fluid phases is established for a practical long-distance slurry pipeline transportation system. After numerical simulation of the transient process and analysis and comparison of the results, effective protection measures and operating advice are recommended, which provide guidance for the design and operational management of practical long-distance slurry pipeline transportation systems.
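The classical single-phase starting point that slurry models extend is the Joukowsky relation, Δp = ρ·a·Δv, for an instantaneous flow change. A sketch with illustrative pure-water values (a slurry would require an effective mixture density and wavespeed):

```python
def joukowsky_pressure_rise(rho_kg_m3, wavespeed_m_s, delta_v_m_s):
    # Joukowsky relation: pressure surge from an instantaneous velocity change
    return rho_kg_m3 * wavespeed_m_s * delta_v_m_s  # Pa

# Water (1000 kg/m^3), pressure-wave speed 1000 m/s, flow stopped from 2 m/s
dp = joukowsky_pressure_rise(1000.0, 1000.0, 2.0)
print(dp / 1e5)  # surge in bar → 20.0
```

This bound is what protection measures (slow valve closure, surge tanks, relief valves) are designed to avoid reaching.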
SNPhylo: a pipeline to construct a phylogenetic tree from huge SNP data.
Lee, Tae-Ho; Guo, Hui; Wang, Xiyin; Kim, Changsoo; Paterson, Andrew H
2014-02-26
Phylogenetic trees are widely used for genetic and evolutionary studies in various organisms. Advanced sequencing technology has dramatically enriched data available for constructing phylogenetic trees based on single nucleotide polymorphisms (SNPs). However, massive SNP data makes it difficult to perform reliable analysis, and there has been no ready-to-use pipeline to generate phylogenetic trees from these data. We developed a new pipeline, SNPhylo, to construct phylogenetic trees based on large SNP datasets. The pipeline may enable users to construct a phylogenetic tree from three representative SNP data file formats. In addition, in order to increase reliability of a tree, the pipeline has steps such as removing low quality data and considering linkage disequilibrium. A maximum likelihood method for the inference of phylogeny is also adopted in generation of a tree in our pipeline. Using SNPhylo, users can easily produce a reliable phylogenetic tree from a large SNP data file. Thus, this pipeline can help a researcher focus more on interpretation of the results of analysis of voluminous data sets, rather than manipulations necessary to accomplish the analysis.
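One of the reliability steps mentioned, reducing linkage disequilibrium, can be illustrated with a greedy pruning pass over a genotype matrix; this is a simplified sketch with assumed inputs, not SNPhylo's actual implementation:

```python
def r_squared(a, b):
    # Squared Pearson correlation between two genotype vectors
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return (cov * cov) / (va * vb) if va and vb else 0.0

def ld_prune(snps, cutoff=0.8):
    """snps: dict snp_id -> list of 0/1/2 genotype codes. Greedily keeps the
    first SNP of any pair whose r^2 exceeds the cutoff."""
    kept = []
    for sid, geno in snps.items():
        if all(r_squared(geno, snps[k]) <= cutoff for k in kept):
            kept.append(sid)
    return kept

snps = {
    "snp1": [0, 0, 1, 1, 2, 2],
    "snp2": [0, 0, 1, 1, 2, 2],   # identical to snp1 -> pruned
    "snp3": [2, 0, 1, 0, 2, 1],   # weakly correlated -> kept
}
print(ld_prune(snps))  # → ['snp1', 'snp3']
```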
Slaying Hydra: A Python-Based Reduction Pipeline for the Hydra Multi-Object Spectrograph
NASA Astrophysics Data System (ADS)
Seifert, Richard; Mann, Andrew
2018-01-01
We present a Python-based data reduction pipeline for the Hydra Multi-Object Spectrograph on the WIYN 3.5 m telescope, an instrument which enables simultaneous spectroscopy of up to 93 targets. The reduction steps carried out include flat-fielding, dynamic fiber tracing, wavelength calibration, optimal fiber extraction, and sky subtraction. The pipeline also supports the use of sky lines to correct for zero-point offsets between fibers. To account for the moving parts on the instrument and telescope, fiber positions and wavelength solutions are derived in real-time for each dataset. The end result is a one-dimensional spectrum for each target fiber. Quick and fully automated, the pipeline enables on-the-fly reduction while observing, and has been known to outperform the IRAF pipeline by more accurately reproducing known RVs. While Hydra has many configurations in both high- and low-resolution, the pipeline was developed and tested with only one high-resolution mode. In the future we plan to expand the pipeline to work in most commonly used modes.
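One of the listed reduction steps, wavelength calibration, amounts to fitting a pixel-to-wavelength solution to identified arc-lamp lines. A minimal linear least-squares sketch with made-up line positions (real pipelines typically fit higher-order polynomials per fiber):

```python
def fit_linear(pixels, wavelengths):
    # Closed-form least-squares fit: wavelength = slope * pixel + intercept
    n = len(pixels)
    sx, sy = sum(pixels), sum(wavelengths)
    sxx = sum(p * p for p in pixels)
    sxy = sum(p * w for p, w in zip(pixels, wavelengths))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Identified arc lines: (pixel, known wavelength in Angstroms) -- illustrative
lines = [(100, 6500.0), (500, 6540.0), (900, 6580.0)]
slope, intercept = fit_linear([p for p, _ in lines], [w for _, w in lines])
print(slope, intercept)  # → 0.1 6490.0
```

Deriving this solution in real time for each dataset, as the pipeline does, absorbs the shifts introduced by the instrument's moving parts.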
NASA Astrophysics Data System (ADS)
Lee, Rebecca Elizabeth
Despite the proliferation of women in higher education and the workforce, they have yet to achieve parity with men in many of the science, technology, engineering, and math (STEM) majors and careers. The gap is even greater in the representation of women from lower socioeconomic backgrounds. This study examined pre-college intervention strategies provided by the University of Southern California's Math, Engineering, Science Achievement (MESA) program, as well as the relationships and experiences that contributed to the success of underrepresented female high school students in the STEM pipeline. A social capital framework provided the backdrop to the study. This qualitative study takes an ethnographic approach, incorporating 11 interviews, 42 hours of observation, and document analysis to address the research questions: How does involvement in the MESA program impact female students' decisions to pursue a mathematics or science major in college? What is the role of significant others in supporting and encouraging student success? The findings revealed a continuous cycle of support for these students. The cycle started in the home environment, where parents were integral in the early influence on the students' decisions to pursue higher education. Relationships with teachers, counselors, and peers provided critical networks of support in helping these students to achieve their academic goals. Participation in the MESA program empowered the students and provided additional connections to knowledge-based resources. This study highlights the interplay among family, school, and the MESA program in the overall support of underrepresented female students in the STEM pipeline.
Commercial Nuclear Power Industry: Assessing and Meeting the Radiation Protection Workforce Needs.
Hiatt, Jerry W
2017-02-01
This paper will provide an overview of the process used by the commercial nuclear power industry in assessing the status of existing industry staffing and projecting future supply demand needs. The most recent Nuclear Energy Institute-developed "Pipeline Survey Results" will be reviewed with specific emphasis on the radiation protection specialty. Both radiation protection technician and health physicist specialties will be discussed. The industry-initiated Nuclear Uniform Curriculum Program will be reviewed as an example of how the industry has addressed the need for developing additional resources. Furthermore, the reality of challenges encountered in maintaining the needed number of health physicists will also be discussed.
The JCSG high-throughput structural biology pipeline.
Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A
2010-10-01
The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.
Kepler Science Operations Center Architecture
NASA Technical Reports Server (NTRS)
Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal;
2010-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
NASA Astrophysics Data System (ADS)
Toropov, V. S.
2018-05-01
The paper proposes a set of measures for selecting equipment and its components to reduce energy costs when pulling a pipeline into the well during construction of trenchless pipeline crossings of various materials using horizontal directional drilling technology. A methodology for reducing energy costs has been developed by regulating the operating modes of the equipment while pulling the working pipeline into a drilled and pre-expanded well. Since the power of the drilling rig is the most important criterion in selecting equipment for the construction of a trenchless crossing, an algorithm is proposed for calculating the required capacity of the rig when operating in different modes during the process of pulling the pipeline into the well.
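The link between operating mode and energy demand can be illustrated with the basic relation power = pulling force × pulling speed; the numbers and the simple form below are assumptions for illustration, not the paper's algorithm:

```python
def rig_power_kw(pull_force_kn, pull_speed_m_min):
    # Required rig power: force times speed (kN * m/s = kW)
    speed_m_s = pull_speed_m_min / 60.0
    return pull_force_kn * speed_m_s

# Same hypothetical 400 kN pulling force at two operating speeds:
fast = rig_power_kw(400.0, 12.0)  # 12 m/min
slow = rig_power_kw(400.0, 6.0)   # half the speed, half the peak power
print(fast, slow)
```

Slowing the pull through high-force stages is one way a mode-regulation scheme can cap the peak power, and hence the rig capacity, that must be procured.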
The Analysis of the T+X Program and a Proposal for a New Pilot
2013-07-01
analyze how suitable the T+X ratings are for expansion from a 4-year obligation (4YO) to a 5YO. We start by looking at these ratings in 2008 through... a training pipeline of 7.8 months, on average. In combination with a 60-month PST, there was a 20-month deficit between the sailors' 4YO and the... had a training pipeline of 6.5 months before June 2011 and about 6.0 months since then. Previously, PSTs varied from 54 to 60 months; thus, 4YO
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... the best available knowledge and expertise, and considers stakeholder perspectives. Specifically the... rooms. All public spaces are ADA accessible. Contact the Westin for more information. Refer to the...
Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang
2011-01-01
I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and FG-MD, were added to the I-TASSER pipeline to improve the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from PDB structures to guide molecular dynamics simulation and improve the local structure of the predicted model, including hydrogen-bonding networks, torsion angles and steric clashes. Despite considerable progress in both template-based and template-free structure modeling, significant improvements in protein target classification, domain parsing, model selection, and ab initio folding of beta-proteins are still needed to further improve the I-TASSER pipeline. PMID:22069036
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
Algorithms for parallel flow solvers on message passing architectures
NASA Technical Reports Server (NTRS)
Vanderwijngaart, Rob F.
1995-01-01
The purpose of this project has been to identify and test suitable technologies for implementation of fluid flow solvers -- possibly coupled with structures and heat equation solvers -- on MIMD parallel computers. In the course of this investigation much attention has been paid to efficient domain decomposition strategies for ADI-type algorithms. Multi-partitioning derives its efficiency from the assignment of several blocks of grid points to each processor in the parallel computer. A coarse-grain parallelism is obtained, and a near-perfect load balance results. In uni-partitioning every processor receives responsibility for exactly one block of grid points instead of several. This necessitates fine-grain pipelined program execution in order to obtain a reasonable load balance. Although fine-grain parallelism is less desirable on many systems, especially high-latency networks of workstations, uni-partition methods are still in wide use in production codes for flow problems. Consequently, it remains important to achieve good efficiency with this technique that has essentially been superseded by multi-partitioning for parallel ADI-type algorithms. Another reason for the concentration on improving the performance of pipeline methods is their applicability in other types of flow solver kernels with stronger implied data dependence. Analytical expressions can be derived for the size of the dynamic load imbalance incurred in traditional pipelines. From these it can be determined what is the optimal first-processor retardation that leads to the shortest total completion time for the pipeline process. Theoretical predictions of pipeline performance with and without optimization match experimental observations on the iPSC/860 very well. Analysis of pipeline performance also highlights the effect of uncareful grid partitioning in flow solvers that employ pipeline algorithms. 
If grid blocks at boundaries are not at least as large in the wall-normal direction as those immediately adjacent to them, then the first processor in the pipeline will receive a computational load that is less than that of subsequent processors, magnifying the pipeline slowdown effect. Extra compensation is needed for grid boundary effects, even if all grid blocks are equally sized.
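The fill-and-drain overhead of pipelined execution discussed above can be illustrated with a minimal simulation; the processor count, chunk count, and unit stage cost below are hypothetical, and the model ignores communication latency and the first-processor retardation optimization analyzed in the abstract.

```python
def pipelined_time(P, C, t=1.0):
    """Completion time of a P-stage software pipeline over C chunks.

    Each processor handles its chunks in order, and processor i may start
    chunk j only after processor i-1 has finished chunk j.
    """
    finish = [[0.0] * C for _ in range(P)]
    for j in range(C):
        for i in range(P):
            after_prev_chunk = finish[i][j - 1] if j > 0 else 0.0
            after_upstream = finish[i - 1][j] if i > 0 else 0.0
            finish[i][j] = max(after_prev_chunk, after_upstream) + t
    return finish[P - 1][C - 1]

# With equal chunk costs the simulation reproduces the closed form
# (P + C - 1) * t: the pipeline needs P - 1 steps to fill before all
# processors are busy.
```

With 8 processors and 32 chunks the pipeline finishes in 39 time units versus 256 sequentially, but the fill/drain penalty grows with P, which is one reason fine-grain pipelining suffers on high-latency networks of workstations.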
Kepler: A Search for Terrestrial Planets - SOC 9.3 DR25 Pipeline Parameter Configuration Reports
NASA Technical Reports Server (NTRS)
Campbell, Jennifer R.
2017-01-01
This document describes the manner in which the pipeline and algorithm parameters for the Kepler Science Operations Center (SOC) science data processing pipeline were managed. This document is intended for scientists and software developers who wish to better understand the software design for the final Kepler codebase (SOC 9.3) and the effect of the software parameters on the Data Release (DR) 25 archival products.
Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines
Tuck, Jeffrey; Lee, Pedro
2013-01-01
Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptibility of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated.
It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important to the accuracy of the inverse analysis procedure and can be used to differentiate the observed transient behaviour caused by changes in wall thickness from that caused by other known faults such as leaks. Further application of the method to real pipelines is discussed.
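A heavily simplified sketch of the inverse-analysis idea: a real-coded genetic-style search fits two parameters (a wavespeed and an amplitude standing in for the diameter change) of a toy reflected-pulse model to an "observed" trace. The forward model, parameter ranges, and GA settings are all hypothetical; the paper matches a full one-dimensional method-of-characteristics transient model, not this toy.

```python
import math
import random

random.seed(42)
TIMES = [i * 0.005 for i in range(60)]

def forward(wavespeed, amp):
    # toy forward model: a single reflected pulse whose arrival time depends
    # on wavespeed and whose amplitude stands in for the diameter change
    t0 = 100.0 / wavespeed
    return [amp * math.exp(-((t - t0) ** 2) / 0.001) for t in TIMES]

observed = forward(1200.0, 0.4)   # synthetic "measurement" with known truth

def misfit(ind):
    return sum((s - o) ** 2 for s, o in zip(forward(*ind), observed))

# real-coded GA: elitism + midpoint crossover + Gaussian mutation
pop = [[random.uniform(900, 1500), random.uniform(0.1, 0.8)] for _ in range(40)]
for _ in range(80):
    pop.sort(key=misfit)
    elite = pop[:10]
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(elite, 2)
        child = [(a + b) / 2 for a, b in zip(p1, p2)]
        child[0] += random.gauss(0, 20.0)   # mutate wavespeed (m/s)
        child[1] += random.gauss(0, 0.02)   # mutate amplitude
        children.append(child)
    pop = elite + children
best = min(pop, key=misfit)
```

Because elitism never discards the best individual, the misfit is non-increasing across generations, and with a well-behaved toy landscape the search recovers parameters near the truth.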
Detection of underground pipeline based on Golay waveform design
NASA Astrophysics Data System (ADS)
Dai, Jingjing; Xu, Dazhuan
2017-08-01
The detection of underground pipelines is an important problem in the development of cities, but research on it is not yet mature. In this paper, based on the principle of waveform design in wireless communication, we design an acoustic signal detection system to detect the location of underground pipelines. According to the principle of acoustic localization, we chose the DSP-F28335 as the development board and master control chip, together with DA and AD modules. The DA module uses a complementary Golay sequence as the emission signal. The AD module acquires data synchronously, so that the echo signals containing the position information of the target are recovered through signal processing. The test results show that the method in this paper can not only calculate the sound velocity in the soil but also locate underground pipelines accurately.
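The key property behind probing with a complementary Golay pair is that the sum of the two sequences' autocorrelations is an ideal delta, so matched filtering of the echoes yields a sidelobe-free peak at the target delay. A small sketch using the standard recursive construction (this is illustrative, not the paper's DSP code):

```python
import numpy as np

def golay_pair(n):
    # standard recursive construction of a complementary Golay pair
    # of length 2**n: (a, b) -> (a|b, a|-b)
    a, b = np.array([1]), np.array([1])
    for _ in range(n):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(6)                      # length-64 pair
ra = np.correlate(a, a, mode="full")
rb = np.correlate(b, b, mode="full")
s = ra + rb
# s is 2N at zero lag and exactly 0 at every other lag: an ideal delta,
# which is why echoes probed with both sequences localize without sidelobes
```

In practice the two sequences are transmitted separately, each echo is correlated against its own sequence, and the two correlations are summed to obtain the clean delay estimate.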
The LCOGT Science Archive and Data Pipeline
NASA Astrophysics Data System (ADS)
Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.
2013-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
A Model for Oil-Gas Pipelines Cost Prediction Based on a Data Mining Process
NASA Astrophysics Data System (ADS)
Batzias, Fragiskos A.; Spanidis, Phillip-Mark P.
2009-08-01
This paper addresses the problems associated with the cost estimation of oil/gas pipelines during the elaboration of feasibility assessments. Techno-economic parameters, i.e., cost, length and diameter, are critical for such studies at the preliminary design stage. A methodology for the development of a cost prediction model based on a Data Mining (DM) process is proposed. The design and implementation of a Knowledge Base (KB), maintaining data collected from various disciplines of the pipeline industry, are presented. The formulation of a cost prediction equation is demonstrated by applying multiple regression analysis using data sets extracted from the KB. Following the proposed methodology, a learning context is inductively developed as background pipeline data are acquired, grouped, and stored in the KB; a linear regression model then provides statistically substantial results, useful for project managers and decision makers.
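The multiple-regression step can be sketched as an ordinary least-squares fit of cost against length and diameter. The data below are synthetic and the coefficients hypothetical, standing in for records that would be extracted from the KB.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 80
length = rng.uniform(10.0, 500.0, n)    # km, synthetic
diameter = rng.uniform(8.0, 48.0, n)    # inches, synthetic
# hypothetical "true" cost relation plus noise, in arbitrary monetary units
cost = 2.0 + 0.9 * length + 0.5 * diameter + rng.normal(0.0, 1.0, n)

# ordinary least squares: cost ~ intercept + length + diameter
X = np.column_stack([np.ones(n), length, diameter])
coef, *_ = np.linalg.lstsq(X, cost, rcond=None)
```

With enough records the recovered coefficients approach the underlying relation, and the fitted equation can then be applied to new (length, diameter) pairs at the feasibility stage.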
ERIC Educational Resources Information Center
Bernstein, Hamutal; Martin, Carlos; Eyster, Lauren; Anderson, Theresa; Owen, Stephanie; Martin-Caughey, Amanda
2015-01-01
The Urban Institute conducted an implementation and participant-outcomes evaluation of the Alaska Native Science & Engineering Program (ANSEP). ANSEP is a multi-stage initiative designed to prepare and support Alaska Native students from middle school through graduate school to succeed in science, technology, engineering, and math (STEM)…
ERIC Educational Resources Information Center
Palmer, Mark H.; Elmore, R. Douglas; Watson, Mary Jo; Kloesel, Kevin; Palmer, Kristen
2009-01-01
Very few Native American students pursue careers in the geosciences. To address this national problem, several units at the University of Oklahoma are implementing a geoscience "pipeline" program that is designed to increase the number of Native American students entering geoscience disciplines. One of the program's strategies includes…
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2011 CFR
2011-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2012 CFR
2012-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2013 CFR
2013-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2014 CFR
2014-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer
Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul
2017-01-01
External corrosion was identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment for existing exposed pipelines has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments. An accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design. PMID:28556805
A VLSI pipeline design of a fast prime factor DFT on a finite field
NASA Technical Reports Server (NTRS)
Truong, T. K.; Hsu, I. S.; Shao, H. M.; Reed, I. S.; Shyu, H. C.
1986-01-01
A conventional prime factor discrete Fourier transform (DFT) algorithm is used to realize a discrete Fourier-like transform on the finite field GF(q^n). A pipeline structure is used to implement this prime factor DFT over GF(q^n). This algorithm is developed to compute cyclic convolutions of complex numbers and to decode Reed-Solomon codes. Such a pipeline fast prime factor DFT algorithm over GF(q^n) is regular, simple, expandable, and naturally suitable for VLSI implementation. An example illustrating the pipeline aspect of a 30-point transform over GF(q^n) is presented.
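A minimal illustration of a DFT over a finite field (here the prime field GF(17) with a length-4 transform, much smaller than the paper's GF(q^n) prime-factor pipeline): an element of multiplicative order N replaces the complex root of unity, and inversion follows from Fermat's little theorem.

```python
P = 17   # small prime; arithmetic is in GF(17) for illustration
N = 4    # transform length, divides P - 1
W = 4    # element of multiplicative order N in GF(17): powers are 4, 16, 13, 1

def ff_dft(x):
    # forward transform: X[k] = sum_n x[n] * W^(k*n)  (mod P)
    return [sum(x[n] * pow(W, k * n, P) for n in range(N)) % P for k in range(N)]

def ff_idft(X):
    # inverse uses W^-1 and N^-1, both computed via Fermat's little theorem
    w_inv, n_inv = pow(W, P - 2, P), pow(N, P - 2, P)
    return [n_inv * sum(X[k] * pow(w_inv, k * n, P) for k in range(N)) % P
            for n in range(N)]
```

As with the complex DFT, pointwise multiplication of two transforms corresponds to cyclic convolution of the inputs, which is the property exploited for Reed-Solomon decoding.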
Development of the Write Process for Pipeline-Ready Heavy Oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee Brecher; Charles Mones; Frank Guffey
Work completed under this program advances the goal of demonstrating Western Research Institute's (WRI's) WRITE{trademark} process for upgrading heavy oil at field scale. MEG Energy Corporation (MEG), located in Calgary, Alberta, Canada, supported efforts at WRI to develop the WRITE{trademark} process as an oil sands, field-upgrading technology through this Task 51 Jointly Sponsored Research project. The project consisted of 6 tasks: (1) optimization of the distillate recovery unit (DRU), (2) demonstration and design of a continuous coker, (3) conceptual design and cost estimate for a commercial facility, (4) design of a WRITE{trademark} pilot plant, (5) hydrotreating studies, and (6) establishment of a petroleum analysis laboratory. WRITE{trademark} is a heavy oil and bitumen upgrading process that produces residuum-free, pipeline ready oil from heavy material with undiluted density and viscosity that exceed prevailing pipeline specifications. WRITE{trademark} uses two processing stages to achieve low and high temperature conversion of heavy oil or bitumen. The first stage DRU operates at mild thermal cracking conditions, yielding a light overhead product and a heavy residuum or bottoms material. These bottoms flow to the second stage continuous coker that operates at severe pyrolysis conditions, yielding light pyrolyzate and coke. The combined pyrolyzate and mildly cracked overhead streams form WRITE{trademark}'s synthetic crude oil (SCO) production. The main objectives of this project were to (1) complete testing and analysis at bench scale with the DRU and continuous coker reactors and provide results to MEG for process evaluation and scale-up determinations and (2) complete a technical and economic assessment of WRITE{trademark} technology to determine its viability. The DRU test program was completed and a processing envelope developed. These results were used for process assessment and for scale-up.
Tests in the continuous coker were intended to determine the throughput capability of the coker so a scaled design could be developed that maximized feed rate for a given size of reactor. These tests were only partially successful because of equipment problems. A redesigned coker, which addressed the problems, has been built but not operated. A preliminary economic analysis conducted by MEG and their engineering consultant concluded that the WRITE{trademark} process is a technically feasible method for upgrading bitumen and that it produces SCO that meets pipeline specifications for density. When compared to delayed coking, the industry benchmark for thermal upgrading of bitumen, WRITE{trademark} produced more SCO, less coke, and less CO{sub 2} per barrel of bitumen fed, and had lower capital and operating costs. On the other hand, WRITE{trademark}'s lower processing severity yielded crude with higher density and a different product distribution for naphtha, light gas oil and vacuum oil that, taken together, might reduce the value of the SCO. These issues, plus the completion of more detailed process evaluation and economics, need to be resolved before WRITE{trademark} is deployed as a field-scale pilot.
Bad Actors Criticality Assessment for Pipeline system
NASA Astrophysics Data System (ADS)
Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee
2015-04-01
Failure of a pipeline system could bring huge economic loss. In order to mitigate such catastrophic loss, it is required to evaluate and rank the impact of each bad actor of the pipeline system. In this study, bad actors are the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute most to pipeline system failure, at 48.0%, 12.4%, and 6.0%, respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results could also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
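For a single OR gate, the top-event probability and the Birnbaum/criticality importance measures used to rank bad actors reduce to closed forms. The basic-event probabilities below are hypothetical, not the study's data, and the tree is collapsed to one OR gate for brevity.

```python
from math import prod

# hypothetical basic-event probabilities for pipeline bad actors
probs = {"internal corrosion": 0.050, "external corrosion": 0.020,
         "construction damage": 0.010, "third-party damage": 0.005}

def top_event(p):
    # OR gate: the system fails if any independent bad actor occurs
    return 1.0 - prod(1.0 - v for v in p.values())

def birnbaum(p, cause):
    # Birnbaum importance = dP(top)/dp_cause; for an OR gate this is the
    # probability that no *other* cause has already failed the system
    return prod(1.0 - v for k, v in p.items() if k != cause)

def criticality(p, cause):
    # criticality importance weights Birnbaum by the cause's own probability
    # relative to the top-event probability
    return birnbaum(p, cause) * p[cause] / top_event(p)
```

Ranking causes by criticality then identifies where a small probability reduction buys the largest reliability gain, mirroring the study's conclusion about internal corrosion.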
NASA Astrophysics Data System (ADS)
Hosseini, Mahmood; Salek, Shamila; Moradi, Masoud
2008-07-01
The effect of the corrosion phenomenon has been investigated by performing some sets of 3-Dimensional Nonlinear Time History Analysis (3-D NLTHA) in which soil-structure interaction as well as wave propagation effects have been taken into consideration. The 3-D NLTHA has been performed by using a finite element computer program, and both states of overall and local corrosion have been considered for the study. The corrosion has been modeled in the computer program by introducing decreased values of either pipe wall thickness or modulus of elasticity and Poisson ratio. Three sets of 3-component accelerograms have been used in analyses, and appropriate numbers of zeros have been added at the beginning of records to take into account the wave propagation in soil and its multi-support excitation effect. The soil has been modeled by nonlinear springs in longitudinal, lateral, and vertical directions. A relatively long segment of the pipeline has been considered for the study, and the effect of end conditions has been investigated by assuming different kinds of end supports for the segment. After studying the corroded pipeline, a remedy has been considered for the seismic retrofit of the corroded pipe by using a kind of Fiber Reinforced Polymer (FRP) cover. The analyses have been repeated for the retrofitted pipeline to assess the adequacy of the FRP cover. Numerical results show that if the length of the pipeline segment is large enough compared to the wavelength of the shear wave in the soil, the end conditions do not have any major effect on the maximum stress and strain values in the pipe. Results also show that corrosion can increase plastic strain values in the pipe by up to 4 times in the case of overall corrosion and up to 20 times in the case of local corrosion. The satisfactory effect of using the FRP cover is also shown by the analysis results, which confirm the decrease of strain values to 1/3.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
Development of an Automated Imaging Pipeline for the Analysis of the Zebrafish Larval Kidney
Westhoff, Jens H.; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L.; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen
2013-01-01
The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems. PMID:24324758
Nayor, Jennifer; Borges, Lawrence F; Goryachev, Sergey; Gainer, Vivian S; Saltzman, John R
2018-07-01
Adenoma detection rate (ADR) is a widely used colonoscopy quality indicator. Calculation of ADR is labor-intensive and cumbersome using current electronic medical databases. Natural language processing (NLP) is a method used to extract meaning from unstructured or free text data. The aim was to develop and validate an accurate automated process for calculation of ADR and serrated polyp detection rate (SDR) on data stored in widely used electronic health record systems, specifically the Epic electronic health record system, Provation® endoscopy reporting system, and Sunquest PowerPath pathology reporting system. Screening colonoscopies performed between June 2010 and August 2015 were identified using the Provation® reporting tool. An NLP pipeline was developed to identify adenomas and sessile serrated polyps (SSPs) on pathology reports corresponding to these colonoscopy reports. The pipeline was validated using a manual search. Precision, recall, and effectiveness of the natural language processing pipeline were calculated. ADR and SDR were then calculated. We identified 8032 screening colonoscopies that were linked to 3821 pathology reports (47.6%). The NLP pipeline had an accuracy of 100% for adenomas and 100% for SSPs. Mean total ADR was 29.3% (range 14.7-53.3%); mean male ADR was 35.7% (range 19.7-62.9%); and mean female ADR was 24.9% (range 9.1-51.0%). Mean total SDR was 4.0% (0-9.6%). We developed and validated an NLP pipeline that accurately and automatically calculates ADRs and SDRs using data stored in Epic, Provation®, and Sunquest PowerPath. This NLP pipeline can be used to evaluate colonoscopy quality parameters at both individual and practice levels.
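A toy version of such an NLP step: regular expressions flag adenomas and sessile serrated polyps in pathology text, from which ADR and SDR follow. The report snippets and patterns are illustrative only, not the validated pipeline from the study.

```python
import re

# illustrative pathology snippets; not the study's data
reports = [
    "Colon, polyp, biopsy: tubular adenoma.",
    "Colon, sigmoid: hyperplastic polyp.",
    "Colon, cecum: sessile serrated polyp/adenoma.",
    "Colon, ascending: tubulovillous adenoma with low-grade dysplasia.",
    "Terminal ileum: normal mucosa.",
]

SSP = re.compile(r"\bsessile serrated (?:polyp|adenoma|lesion)(?:/adenoma)?", re.I)
ADENOMA = re.compile(r"\badenoma", re.I)

def classify(text):
    has_ssp = bool(SSP.search(text))
    # strip SSP phrases first so "sessile serrated polyp/adenoma" is not
    # double-counted as a conventional adenoma
    has_adenoma = bool(ADENOMA.search(SSP.sub("", text)))
    return has_adenoma, has_ssp

n = len(reports)  # one report per screening colonoscopy in this toy example
adr = sum(classify(r)[0] for r in reports) / n
sdr = sum(classify(r)[1] for r in reports) / n
```

Real pathology reports need far richer patterns (negation, history sections, multi-specimen reports), which is why the study validated its pipeline against a manual search.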
Kepler Science Operations Center Pipeline Framework
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Middour, Christopher; Caldwell, Douglas A.; Jenkins, Jon M.
2010-01-01
The Kepler mission is designed to continuously monitor up to 170,000 stars at a 30 minute cadence for 3.5 years searching for Earth-size planets. The data are processed at the Science Operations Center (SOC) at NASA Ames Research Center. Because of the large volume of data and the memory and CPU-intensive nature of the analysis, significant computing hardware is required. We have developed generic pipeline framework software that is used to distribute and synchronize the processing across a cluster of CPUs and to manage the resulting products. The framework is written in Java and is therefore platform-independent, and scales from a single, standalone workstation (for development and research on small data sets) to a full cluster of homogeneous or heterogeneous hardware with minimal configuration changes. A plug-in architecture provides customized control of the unit of work without the need to modify the framework itself. Distributed transaction services provide for atomic storage of pipeline products for a unit of work across a relational database and the custom Kepler DB. Generic parameter management and data accountability services are provided to record the parameter values, software versions, and other meta-data used for each pipeline execution. A graphical console allows for the configuration, execution, and monitoring of pipelines. An alert and metrics subsystem is used to monitor the health and performance of the pipeline. The framework was developed for the Kepler project based on Kepler requirements, but the framework itself is generic and could be used for a variety of applications where these features are needed.
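The plug-in idea can be sketched generically: units of work are registered with a pipeline that executes them in order and records provenance metadata, loosely mirroring the framework's data-accountability services. All names here are hypothetical (the actual framework is written in Java and is far more elaborate).

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class PipelineModule:
    # a plug-in "unit of work": a name plus a function transforming shared state
    name: str
    run: Callable[[Dict], Dict]

@dataclass
class Pipeline:
    modules: List[PipelineModule] = field(default_factory=list)

    def register(self, module: PipelineModule) -> None:
        self.modules.append(module)

    def execute(self, context: Dict) -> Dict:
        # run each module in order and record which modules ran,
        # mimicking data-accountability metadata
        for m in self.modules:
            context = m.run(context)
            context.setdefault("provenance", []).append(m.name)
        return context

pipe = Pipeline()
pipe.register(PipelineModule("calibrate",
                             lambda c: {**c, "flux": [v * 2 for v in c["flux"]]}))
pipe.register(PipelineModule("detrend",
                             lambda c: {**c, "flux": [v - 1 for v in c["flux"]]}))
result = pipe.execute({"flux": [1, 2, 3]})
```

Because modules only see the shared context, new processing stages can be added without modifying the pipeline driver, which is the essence of the plug-in architecture described above.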
East Spar: Alliance approach for offshore gasfield development
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-01
East Spar is a gas/condensate field 25 miles west of Barrow Island, offshore Western Australia. Proved plus probable reserves at the time of development were estimated at 430 Bcf gas and 28 million bbl of condensate. The field was discovered in early 1993, when the Western Australia gas market was deregulated and the concept of a gas pipeline to the gold fields was proposed. This created a window of opportunity for East Spar, but only if plans could be established quickly. A base-case development plan was established to support gas marketing while alternative plans were developed in parallel. The completed East Spar facilities comprise two subsea wells, a subsea gathering system, and a multiphase (gas/condensate/water) pipeline to new gas-processing facilities. The subsea facilities are controlled through a navigation, communication, and control (NCC) buoy. The control room and gas-processing plant are 39 miles east of the field on Varanus Island. Sales gas is exported through a pre-existing gas-sales pipeline to the Dampier-Bunbury and Goldfields Gas Transmission pipelines. Condensate is stored in and exported by use of pre-existing facilities on Varanus Island. Field development from approval to first production took 22 months. The paper describes the field development.
NASA Astrophysics Data System (ADS)
Razak, K. Abdul; Othman, M. I. H.; Mat Yusuf, S.; Fuad, M. F. I. Ahmad; Yahaya, Effah
2018-05-01
Oil and gas today are developed at water depths characterized as shallow, deep, and ultra-deep. Among the major components involved in offshore installation are pipelines, which transport material through a pipe. In the oil and gas industry, a pipeline is assembled from lengths of line pipe welded together, and pipelines can be divided into two categories: gas pipelines and oil pipelines. Pipeline installation requires a pipe-laying barge or pipe-laying vessel, of which there are two types: S-lay vessels and J-lay vessels. A pipe-lay vessel does more than install pipelines; it also installs umbilicals and electrical cables. In simple terms, a pipe-lay vessel installs all of the connecting subsea infrastructure. The installation process requires special attention for it to succeed; for instance, heavy pipelines may exceed the lay vessel's tension capacity at certain water depths. Pipelines are characterized and differentiated by parameters such as material grade, material type, diameter, wall thickness, and strength. For instance, wall-thickness parameter studies indicate that using a higher steel grade contributes significantly to reducing pipeline wall thickness. During pipe laying, water depth is the most critical factor to monitor: water depth cannot be controlled, but pipe characteristics can, such as selecting line pipe with a wall thickness suited to the water depth in order to avoid failure during installation. This research analyses whether the pipeline parameters meet the requirement limits and minimum yield stress.
The study simulates pipe grade API 5L X60 with wall thicknesses from 8 to 20 mm at water depths of 50 to 300 m. Results show that pipeline installation fails from a wall thickness of 18 mm onwards, as the critical yield percentage is exceeded.
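A simplified screening check in the spirit of this analysis: Barlow's thin-wall hoop-stress formula compares the external head pressure at depth against the specified minimum yield strength of API 5L X60 (60 ksi, about 413.7 MPa). The diameter below is hypothetical, and a real installation assessment must also cover collapse, buckling, and lay tension, which static pressure alone does not capture.

```python
SMYS_X60 = 413.7e6         # Pa; API 5L X60 specified minimum yield strength
RHO_SEA, G = 1025.0, 9.81  # seawater density (kg/m^3), gravity (m/s^2)
D = 0.4064                 # m; hypothetical 16-inch outer diameter

def yield_utilization(depth_m, wall_m):
    """Fraction of SMYS consumed by hoop stress from external head alone."""
    pressure = RHO_SEA * G * depth_m        # hydrostatic pressure, Pa
    hoop = pressure * D / (2.0 * wall_m)    # Barlow thin-wall formula
    return hoop / SMYS_X60
```

Even at 300 m with an 8 mm wall, utilization from pressure alone stays under 20%, which underlines that the reported failure of thicker (18 mm and above) pipe stems from installation loads such as bending and tension in the lay catenary rather than from static pressure.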
Rights, Bunche, Rose and the "pipeline".
Marks, Steven R.; Wilkinson-Lee, Ada M.
2006-01-01
We address education "pipelines" and their social ecology, drawing on the 1930s writings of Ralph J. Bunche, a Nobel peacemaker whose war against systematic second-class education for the poor, minority and nonminority alike, is nearly forgotten; and of the epidemiologist Geoffrey Rose, whose 1985 paper spotlighted the difficulty of shifting health status and risks in a "sick society." From the perspective of human rights and human development, we offer suggestions toward the paired "ends" of the pipeline: equality of opportunity for individuals, and equality of health for populations. We offer a national "to do" list to improve pipeline flow and then reconsider the merits of the "pipeline" metaphor, which neither matches the reality of lived education pathways nor supports notions of human rights, freedoms and capabilities, but rather reflects a commoditizing stance toward free persons. PMID:17019927
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs, etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how it can be used to find differences between populations.
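The stage ordering described in this abstract can be sketched as a chain of functions. The function names mirror the steps in the text, but the bodies are string-transform placeholders, not the actual Midas server or registration tool APIs:

```python
def rigid_register(subjects):
    # Align each subject image to a common space (placeholder).
    return [f"reg({s})" for s in subjects]

def skull_strip(subjects):
    # Remove non-brain tissue from each image (placeholder).
    return [f"strip({s})" for s in subjects]

def compute_average(subjects):
    # Build the cohort average image (placeholder).
    return "avg[" + "+".join(subjects) + "]"

def parcellate(average):
    # Label anatomical regions on the average image (placeholder).
    return f"labels({average})"

def propagate_labels(labels, subjects):
    # Map the average-space parcellation back onto each subject.
    return {s: f"{labels}->{s}" for s in subjects}

def region_stats(label_maps):
    # One statistics record per subject (placeholder values).
    return {s: {"mean_intensity": 0.0} for s in label_maps}

def run_pipeline(raw_subjects):
    subjects = skull_strip(rigid_register(raw_subjects))
    average = compute_average(subjects)
    labels = parcellate(average)
    return region_stats(propagate_labels(labels, subjects))

stats = run_pipeline(["rat01.nii", "rat02.nii"])
print(len(stats), "subjects processed")
```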
Research & Discover: A Pipeline of the Next Generation of Earth System Scientists
NASA Astrophysics Data System (ADS)
Hurtt, G. C.; Einaudi, F.; Moore, B.; Salomonson, V.; Campbell, J.
2006-12-01
In 2002, the University of New Hampshire (UNH) and NASA Goddard Space Flight Center (GSFC) started the educational initiative Research & Discover with the goals to: (i) recruit outstanding young scientists into research careers in Earth science and Earth remote sensing (broadly defined), and (ii) support Earth science graduate students enrolled at UNH through a program of collaborative partnerships with GSFC scientists and UNH faculty. To meet these goals, the program consists of a linked set of educational opportunities that begins with a paid summer research internship at UNH for students following their junior year of college, followed by a second paid summer internship at GSFC for students following their senior year. These summer internships are then followed by two-year fellowship opportunities at UNH for graduate studies jointly supervised by UNH faculty and GSFC scientists. After 5 years of implementation, the program has awarded summer research internships to 22 students and graduate research fellowships to 6 students. These students have produced more than 78 scientific research presentations, 5 undergraduate theses, 2 Masters theses, and 4 peer-reviewed publications. More than 80% of alumni are now actively pursuing careers in the Earth sciences. In the process, the program has engaged 19 faculty from UNH and 15 scientists from GSFC as advisors/mentors. New collaborations between these scientists have resulted in new joint research proposals, and in the development, delivery, and assessment of a new course in Earth System Science at UNH. Research & Discover represents an educational model of collaboration between a national lab and a university to create a pipeline of the next generation of Earth system scientists.
Digital Mapping of Buried Pipelines with a Dual Array System
DOT National Transportation Integrated Search
2003-06-06
The objective of this research is to develop a non-invasive system for detecting, mapping, and inspecting ferrous and plastic pipelines in place using technology that combines and interprets measurements from ground penetrating radar and electromagne...
2011-01-01
Background: Many plants have large and complex genomes with an abundance of repeated sequences. Many plants are also polyploid. Both of these attributes typify the genome architecture in the tribe Triticeae, whose members include the economically important wheat, rye and barley. Large genome sizes, an abundance of repeated sequences, and polyploidy present challenges to genome-wide SNP discovery using next-generation sequencing (NGS) of total genomic DNA by making alignment and clustering of the short reads generated by NGS platforms difficult, particularly in the absence of a reference genome sequence. Results: An annotation-based, genome-wide SNP discovery pipeline is reported using NGS data for large and complex genomes without a reference genome sequence. Roche 454 shotgun reads with low genome coverage of one genotype are annotated in order to distinguish single-copy sequences and repeat junctions from repetitive sequences and sequences shared by paralogous genes. Multiple genome equivalents of shotgun reads of another genotype generated with SOLiD or Solexa are then mapped to the annotated Roche 454 reads to identify putative SNPs. A pipeline program package, AGSNP, was developed and used for genome-wide SNP discovery in Aegilops tauschii, the diploid source of the wheat D genome, which has a genome size of 4.02 Gb, of which 90% is repetitive sequence. Genomic DNA of Ae. tauschii accession AL8/78 was sequenced with the Roche 454 NGS platform. Genomic DNA and cDNA of Ae. tauschii accession AS75 were sequenced primarily with SOLiD, although some Solexa and Roche 454 genomic sequences were also generated. A total of 195,631 putative SNPs were discovered in gene sequences, 155,580 putative SNPs were discovered in uncharacterized single-copy regions, and another 145,907 putative SNPs were discovered in repeat junctions. These SNPs were dispersed across the entire Ae. tauschii genome.
To assess the false-positive SNP discovery rate, DNA containing putative SNPs was amplified by PCR from AL8/78 and AS75 and resequenced with the ABI 3730xl. In a sample of 302 randomly selected putative SNPs, 84.0% in gene regions, 88.0% in repeat junctions, and 81.3% in uncharacterized regions were validated. Conclusion: An annotation-based genome-wide SNP discovery pipeline for NGS platforms was developed. The pipeline is suitable for SNP discovery in genomic libraries of complex genomes and does not require a reference genome sequence. The pipeline is applicable to all current NGS platforms, provided that at least one such platform generates relatively long reads. The pipeline package, AGSNP, and the discovered 497,118 Ae. tauschii SNPs can be accessed at http://avena.pw.usda.gov/wheatD/agsnp.shtml. PMID:21266061
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Ian Harris
2004-04-12
The two broad categories of deposited weld metal repair and fiber-reinforced composite liner repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repair and for fiber-reinforced composite liner repair. Evaluation trials have been conducted using a modified fiber-reinforced composite liner provided by RolaTube and pipe sections without liners. All pipe section specimens failed in areas of simulated damage. Pipe sections containing fiber-reinforced composite liners failed at pressures marginally greater than the pipe sections without liners. The next step is to evaluate a liner material with a modulus of elasticity approximately 95% of the modulus of elasticity for steel. Preliminary welding parameters were developed for deposited weld metal repair in preparation for the receipt of Pacific Gas & Electric's internal pipeline welding repair system (designed specifically for 559 mm (22 in.) diameter pipe) and the receipt of 559 mm (22 in.) pipe sections from Panhandle Eastern. The next steps are to transfer welding parameters to the PG&E system and to pressure test repaired pipe sections to failure. A survey of pipeline operators was conducted to better understand the needs and performance requirements of the natural gas transmission industry regarding internal repair. Completed surveys supported the following principal conclusions: (1) Use of internal weld repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings.
(2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling (HDD) when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) Pipe diameter sizes range from 50.8 mm (2 in.) through 1,219.2 mm (48 in.). The most common size range for 80% to 90% of operators surveyed is 508 mm to 762 mm (20 in. to 30 in.), with 95% using 558.8 mm (22 in.) pipe. An evaluation of potential repair methods clearly indicates that the project should continue to focus on the development of a repair process involving the use of GMAW welding and on the development of a repair process involving the use of fiber-reinforced composite liners.
Milchenko, Mikhail; Snyder, Abraham Z; LaMontagne, Pamela; Shimony, Joshua S; Benzinger, Tammie L; Fouke, Sarah Jost; Marcus, Daniel S
2016-07-01
Neuroimaging research often relies on clinically acquired magnetic resonance imaging (MRI) datasets that can originate from multiple institutions. Such datasets are characterized by high heterogeneity of modalities and variability of sequence parameters. This heterogeneity complicates the automation of image processing tasks such as spatial co-registration and physiological or functional image analysis. Given this heterogeneity, conventional processing workflows developed for research purposes are not optimal for clinical data. In this work, we describe an approach called Heterogeneous Optimization Framework (HOF) for developing image analysis pipelines that can handle the high degree of clinical data non-uniformity. HOF provides a set of guidelines for configuration, algorithm development, deployment, interpretation of results and quality control for such pipelines. At each step, we illustrate the HOF approach using the implementation of an automated pipeline for Multimodal Glioma Analysis (MGA) as an example. The MGA pipeline computes tissue diffusion characteristics of diffusion tensor imaging (DTI) acquisitions, hemodynamic characteristics using a perfusion model of susceptibility contrast (DSC) MRI, and spatial cross-modal co-registration of available anatomical, physiological and derived patient images. Developing MGA within HOF enabled the processing of neuro-oncology MR imaging studies to be fully automated. MGA has been successfully used to analyze over 160 clinical tumor studies to date within several research projects. Introduction of the MGA pipeline improved image processing throughput and, most importantly, effectively produced co-registered datasets that were suitable for advanced analysis despite high heterogeneity in acquisition protocols.
ExoDat Information System at CeSAM
NASA Astrophysics Data System (ADS)
Agneray, F.; Moreau, C.; Chabaud, P.; Damiani, C.; Deleuil, M.
2014-05-01
CoRoT (Convection Rotation and planetary Transits) is a space-based mission led by the French space agency (CNES) in association with French and international laboratories. One of CoRoT's goals is to detect exoplanets by the transit method. The Exoplanet Database (ExoDat) is a VO-compliant information system for the CoRoT exoplanet program. The main functions of ExoDat are to provide a source catalog for observation-field and target selection; to characterize the CoRoT targets (spectral type, variability, contamination, ...); and to support follow-up programs. ExoDat is built using the AstroNomical Information System (ANIS) developed by CeSAM (Centre de donneeS Astrophysique de Marseille). It offers download of observation catalogs and additional services such as search, extraction, and display of data using a combination of criteria, object-list, and cone-search interfaces. Web services have been developed to provide easy access for users' software and pipelines.
Continuous Turbidity Monitoring in the Indian Creek Watershed, Tazewell County, Virginia, 2006-08
Moyer, Douglas; Hyer, Kenneth
2009-01-01
Thousands of miles of natural gas pipelines are installed annually in the United States. These pipelines commonly cross streams, rivers, and other water bodies during construction. A major concern associated with pipelines crossing water bodies is increased sediment loading and the subsequent impact on the ecology of the aquatic system. Several studies have investigated the techniques used to install pipelines across surface-water bodies and their effect on downstream suspended-sediment concentrations. These studies frequently rely on the evaluation of suspended-sediment or turbidity data collected using discrete sample-collection methods. No studies, however, have evaluated the utility of continuous turbidity monitoring for identifying real-time sediment input and providing a robust dataset for the evaluation of long-term changes in suspended-sediment concentration as it relates to a pipeline crossing. In 2006, the U.S. Geological Survey, in cooperation with East Tennessee Natural Gas and the U.S. Fish and Wildlife Service, began a study to monitor the effects of construction of the Jewell Ridge Lateral natural gas pipeline on turbidity conditions below pipeline crossings of Indian Creek and an unnamed tributary to Indian Creek, in Tazewell County, Virginia. The potential for increased sediment loading to Indian Creek is of major concern for watershed managers because Indian Creek is listed as one of Virginia's Threatened and Endangered Species Waters and contains critical habitat for two freshwater mussel species, the purple bean (Villosa perpurpurea) and the rough rabbitsfoot (Quadrula cylindrica strigillata). Additionally, Indian Creek contains the last known reproducing population of the tan riffleshell (Epioblasma florentina walkeri). Therefore, the objectives of the U.S.
Geological Survey monitoring effort were to (1) develop a continuous turbidity monitoring network that attempted to measure real-time changes in suspended sediment (using turbidity as a surrogate) downstream from the pipeline crossings, and (2) provide continuous turbidity data that enable the development of a real-time turbidity-input warning system and assessment of long-term changes in turbidity conditions. Water-quality conditions were assessed using continuous water-quality monitors deployed upstream and downstream from the pipeline crossings in Indian Creek and the unnamed tributary. These paired upstream and downstream monitors were outfitted with turbidity, pH (for Indian Creek only), specific-conductance, and water-temperature sensors. Water-quality data were collected continuously (every 15 minutes) during three phases of the pipeline construction: pre-construction, during construction, and post-construction. Continuous turbidity data were evaluated at various time steps to determine whether the construction of the pipeline crossings had an effect on downstream suspended-sediment conditions in Indian Creek and the unnamed tributary. These continuous turbidity data were analyzed in real time with the aid of a turbidity-input warning system. A warning occurred when turbidity values downstream from the pipeline were 6 Formazin Nephelometric Units or 15 percent (depending on the observed range) greater than turbidity upstream from the pipeline crossing. Statistical analyses also were performed on monthly and phase-of-construction turbidity data to determine if the pipeline crossing served as a long-term source of sediment. Results of this intensive water-quality monitoring effort indicate that values of turbidity in Indian Creek increased significantly between the upstream and downstream water-quality monitors during the construction of the Jewell Ridge pipeline. 
The magnitude of the significant turbidity increase, however, was small (less than 2 Formazin Nephelometric Units). Patterns in the continuous turbidity data indicate that the actual pipeline crossing of Indian Creek had little influence on downstream water quality; co
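The turbidity-input warning rule described in this abstract (a warning at 6 FNU or 15 percent above upstream, depending on the observed range) can be sketched as follows; the 10-FNU boundary between the two criteria is an assumption for illustration, since the report's exact range cutoff is not given here:

```python
RANGE_CUTOFF_FNU = 10.0  # assumed boundary between the two criteria

def turbidity_warning(upstream_fnu, downstream_fnu):
    """Flag a 15-minute reading when downstream turbidity exceeds
    upstream by more than 6 FNU (low range) or by more than 15%
    (higher range)."""
    if upstream_fnu < RANGE_CUTOFF_FNU:
        return downstream_fnu - upstream_fnu > 6.0
    return downstream_fnu > upstream_fnu * 1.15

# A +7 FNU rise in the low range triggers a warning; a +10% rise in
# the higher range does not.
print(turbidity_warning(4.0, 11.0), turbidity_warning(40.0, 44.0))
```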
Cottingham, Marci D.; Kalbaugh, Corey A.
2014-01-01
In spite of a growing literature on pharmaceuticalization, little is known about the pharmaceutical industry’s investments in research and development (R&D). Information about the drugs being developed can provide important context for existing case studies detailing the expanding – and often problematic – role of pharmaceuticals in society. To access the pharmaceutical industry’s pipeline, we constructed a database of drugs for which pharmaceutical companies reported initiating clinical trials over a five-year period (July 2006-June 2011), capturing 2,477 different drugs in 4,182 clinical trials. Comparing drugs in the pipeline that target diseases in high-income and low-income countries, we found that the number of drugs for diseases prevalent in high-income countries was 3.46 times higher than drugs for diseases prevalent in low-income countries. We also found that the plurality of drugs in the pipeline were being developed to treat cancers (26.2%). Interpreting our findings through the lens of pharmaceuticalization, we illustrate how investigating the entire drug development pipeline provides important information about patterns of pharmaceuticalization that are invisible when only marketed drugs are considered. PMID:25159693
Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha
2016-02-27
Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
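A cost/benefit comparison in the spirit described above can be sketched as follows. These are not the paper's actual formulae; the per-node-hour rate and the whole-wave parallelism model are illustrative assumptions:

```python
import math

def local_serial_hours(n_jobs: int, hours_per_job: float) -> float:
    """Wall-clock time to run every job serially on one machine."""
    return n_jobs * hours_per_job

def cloud_wall_hours(n_jobs: int, hours_per_job: float, n_nodes: int) -> float:
    """Wall-clock time with jobs spread over n_nodes in whole waves."""
    return math.ceil(n_jobs / n_nodes) * hours_per_job

def cloud_cost_usd(n_jobs, hours_per_job, n_nodes, usd_per_node_hour):
    """Pay-for-use cost if every node is billed for the full wall time."""
    return cloud_wall_hours(n_jobs, hours_per_job, n_nodes) * n_nodes * usd_per_node_hour

# Example: 100 two-hour jobs on 20 nodes at an assumed $0.40/node-hour.
serial = local_serial_hours(100, 2.0)      # 200 h on one machine
wall = cloud_wall_hours(100, 2.0, 20)      # 10 h of cloud wall time
cost = cloud_cost_usd(100, 2.0, 20, 0.40)  # $80 under these assumptions
print(serial, wall, cost)
```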
Improved satellite and geospatial tools for pipeline operator decision support systems.
DOT National Transportation Integrated Search
2017-01-06
Under Cooperative Agreement No. OASRTRS-14-H-CAL, California Polytechnic State University San Luis Obispo (Cal Poly), partnered with C-CORE, MDA, PRCI, and Electricore to design and develop improved satellite and geospatial tools for pipeline operato...
Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A
2005-04-01
The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
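The decision-maker idea described above, selecting among encoded structure-solution paths as the evidence evolves, can be sketched generically. The path names and scoring functions below are illustrative placeholders, not the platform's actual interfaces:

```python
# Hedged sketch: encode several solution paths, let a decision
# function score each against the current evidence (e.g. data quality
# metrics), and run the best-scoring path first.

def choose_path(paths, evidence):
    """paths: dict name -> (score_fn, run_fn). Return best path name."""
    scored = {name: score(evidence) for name, (score, _run) in paths.items()}
    return max(scored, key=scored.get)

# Illustrative paths: favor SAD phasing when the anomalous signal is
# strong, multiple isomorphous replacement when derivatives abound.
paths = {
    "SAD": (lambda e: e["anomalous_signal"], lambda: "run SAD"),
    "MIR": (lambda e: 0.5 * e["n_derivatives"], lambda: "run MIR"),
}
best = choose_path(paths, {"anomalous_signal": 0.8, "n_derivatives": 1})
print("selected path:", best)
```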
Fuchs, Jonathan; Kouyate, Aminta; Kroboth, Liz; McFarland, Willi
2016-01-01
Structured, mentored research programs for high school and undergraduate students from underrepresented minority (URM) backgrounds are needed to increase the diversity of our nation's biomedical research workforce. In particular, a robust pipeline of investigators from the communities disproportionately affected by the HIV epidemic is needed not only for fairness and equity but for insights and innovations to address persistent racial and ethnic disparities in new infections. We created the Summer HIV/AIDS Research Program (SHARP) at the San Francisco Department of Public Health for URM undergraduates as a 12-week program of hands-on research experience, one-on-one mentoring by a senior HIV investigator, didactic seminars for content and research methods, and networking opportunities. The first four cohorts (2012-2015) of SHARP gained research skills, built confidence in their abilities and self-identified as scientists. In addition, the majority of program alumni are employed in research positions and have been admitted to or are pursuing graduate degree programs in fields related to HIV prevention. While we await empirical studies of specific mentoring strategies at early educational stages, programs that engage faculty who are sensitive to the unique challenges facing diverse students and who draw lessons from established mentoring frameworks can help build an inclusive generation of HIV researchers. PMID:27066986
Characterization and photometric performance of the Hyper Suprime-Cam Software Pipeline
NASA Astrophysics Data System (ADS)
Huang, Song; Leauthaud, Alexie; Murata, Ryoma; Bosch, James; Price, Paul; Lupton, Robert; Mandelbaum, Rachel; Lackner, Claire; Bickerton, Steven; Miyazaki, Satoshi; Coupon, Jean; Tanaka, Masayuki
2018-01-01
The Subaru Strategic Program (SSP) is an ambitious multi-band survey using the Hyper Suprime-Cam (HSC) on the Subaru telescope. The Wide layer of the SSP is both wide and deep, reaching a detection limit of i ~ 26.0 mag. At these depths, it is challenging to achieve accurate, unbiased, and consistent photometry across all five bands. The HSC data are reduced using a pipeline that builds on the prototype pipeline for the Large Synoptic Survey Telescope. We have developed a Python-based, flexible framework to inject synthetic galaxies into real HSC images, called SynPipe. Here we explain the design and implementation of SynPipe and generate a sample of synthetic galaxies to examine the photometric performance of the HSC pipeline. For stars, we achieve 1% photometric precision at i ~ 19.0 mag and 6% precision at i ~ 25.0 mag in the i band (corresponding to statistical scatters of ~0.01 and ~0.06 mag, respectively). For synthetic galaxies with single-Sérsic profiles, forced CModel photometry achieves 13% photometric precision at i ~ 20.0 mag and 18% precision at i ~ 25.0 mag in the i band (corresponding to statistical scatters of ~0.15 and ~0.22 mag, respectively). We show that both forced point-spread-function and CModel photometry yield unbiased color estimates that are robust to seeing conditions. We identify several caveats that apply to the version of the HSC pipeline used for the first public HSC data release (DR1) that need to be taken into consideration. First, the degree to which an object is blended with other objects impacts the overall photometric performance; this is especially true for point sources. Highly blended objects tend to have larger photometric uncertainties, systematically underestimated fluxes, and slightly biased colors. Second, more than 20% of stars at 22.5 < i < 25.0 mag can be misclassified as extended objects. Third, the current CModel algorithm tends to strongly underestimate the half-light radius and ellipticity of galaxies with i > 21.5 mag.
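The approximate correspondence quoted in this abstract between percent flux precision and magnitude scatter follows from the logarithmic magnitude scale m = -2.5 log10(F). A small sketch of the standard conversion (the relation is textbook astronomy; the code is illustrative):

```python
def mag_scatter_to_flux_precision(sigma_mag: float) -> float:
    """Convert a magnitude scatter to an approximate fractional flux
    precision, using m = -2.5 * log10(F): dF/F ~ 10**(0.4*sigma) - 1."""
    return 10.0 ** (0.4 * sigma_mag) - 1.0

# ~0.01 mag scatter corresponds to roughly 1% flux precision;
# ~0.06 mag corresponds to roughly 6%.
for sigma in (0.01, 0.06):
    print(f"{sigma:.2f} mag -> {100 * mag_scatter_to_flux_precision(sigma):.1f}%")
```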
Pipelines subject to slow landslide movements: Structural modeling vs field measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruschi, R.; Glavina, S.; Spinazze, M.
1996-12-01
In recent years finite element techniques have been increasingly used to investigate the behavior of buried pipelines subject to soil movements. The use of these tools provides a rational basis for the definition of minimum wall thickness requirements in landslide crossings. Furthermore, the design of mitigation measures or monitoring systems that control the development of undesirable strains in the pipe wall over time requires detailed structural modeling. The scope of this paper is to discuss the use of dedicated structural modeling with calibration to field measurements. The strain measurements used were regularly gathered from pipe sections at two different sites over a period of time long enough to record changes of axial strain due to soil movement. Detailed structural modeling of the pipeline layout at both sites, and for operating conditions, is applied. Numerical simulations show the influence of the distribution of soil movement acting on the pipeline with regard to the state of strain that can develop at certain locations. The role of soil nature and the direction of relative movements in the definition of loads transferred to the pipeline is also discussed.
Song, Jia; Zheng, Sisi; Nguyen, Nhung; Wang, Youjun; Zhou, Yubin; Lin, Kui
2017-10-03
Because phylogenetic inference is an important basis for answering many evolutionary problems, a large number of algorithms have been developed. Some of these algorithms have been improved by integrating gene evolution models with the expectation of accommodating the hierarchy of evolutionary processes. To the best of our knowledge, however, there still is no single unifying model or algorithm that can take all evolutionary processes into account through a stepwise or simultaneous method. On the basis of three existing phylogenetic inference algorithms, we built an integrated pipeline for inferring the evolutionary history of a given gene family; this pipeline can model gene sequence evolution, gene duplication-loss, gene transfer and multispecies coalescent processes. As a case study, we applied this pipeline to the STIMATE (TMEM110) gene family, which has recently been reported to play an important role in store-operated Ca2+ entry (SOCE) mediated by ORAI and STIM proteins. We inferred their phylogenetic trees in 69 sequenced chordate genomes. By integrating three tree reconstruction algorithms with diverse evolutionary models, a pipeline for inferring the evolutionary history of a gene family was developed, and its application was demonstrated.
Implementation of Cloud based next generation sequencing data analysis in a clinical laboratory.
Onsongo, Getiria; Erdmann, Jesse; Spears, Michael D; Chilton, John; Beckman, Kenneth B; Hauge, Adam; Yohe, Sophia; Schomaker, Matthew; Bower, Matthew; Silverstein, Kevin A T; Thyagarajan, Bharat
2014-05-23
The introduction of next generation sequencing (NGS) has revolutionized molecular diagnostics, though several challenges remain that limit the widespread adoption of NGS testing in clinical practice. One such difficulty is the development of a robust bioinformatics pipeline that can handle the volume of data generated by high-throughput sequencing in a cost-effective manner. Analysis of sequencing data typically requires a substantial level of computing power that is often cost-prohibitive for most clinical diagnostics laboratories. To address this challenge, our institution has developed a Galaxy-based data analysis pipeline which relies on a web-based, cloud-computing infrastructure to process NGS data and identify genetic variants. It provides the additional flexibility needed to control storage costs, resulting in a pipeline that is cost-effective on a per-sample basis, and it does not require EBS disks to run a sample. We demonstrate the validation and feasibility of implementing this bioinformatics pipeline in a molecular diagnostics laboratory. Four samples were analyzed in duplicate pairs and showed 100% concordance in the mutations identified. This pipeline is currently being used in the clinic, and all identified pathogenic variants were confirmed using Sanger sequencing, further validating the software.
GIS characterization of spatially distributed lifeline damage
Toprak, Selcuk; O'Rourke, Thomas; Tutuncu, Ilker
1999-01-01
This paper describes the visualization of spatially distributed water pipeline damage following an earthquake using geographical information systems (GIS). Pipeline damage is expressed as a repair rate (RR). Repair rate contours are developed with GIS by dividing the study area into grid cells (n × n), determining the number of pipeline repairs in each grid cell, and dividing the number of repairs by the length of pipeline in that cell. The resulting contour plot is a two-dimensional visualization of point source damage. High damage zones are defined herein as areas with an RR value greater than the mean RR for the entire study area of interest. A hyperbolic relationship between the visual display of high pipeline damage zones and grid size, n, was developed. The relationship is expressed in terms of two dimensionless parameters, threshold area coverage (TAC) and dimensionless grid size (DGS). The relationship is valid over a wide range of map scales, spanning approximately 1,200 km2 for the largest portion of the Los Angeles water distribution system down to 1 km2 for the Marina in San Francisco. This relationship can help GIS users produce sufficiently refined, yet easily visualized, maps of damage patterns.
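The grid-cell computation described in this abstract is simple enough to sketch directly. The following is a minimal illustration, not the authors' code; the function name, the use of NumPy, and the per-cell pipeline-length input are assumptions made for the example:

```python
import numpy as np

def repair_rate_grid(repair_xy, pipe_length_km, extent, n):
    """Compute pipeline repair rate (repairs/km) on an n x n grid.

    repair_xy      : (N, 2) array of repair coordinates
    pipe_length_km : (n, n) array of pipeline length in each cell
    extent         : (xmin, xmax, ymin, ymax) of the study area
    """
    xmin, xmax, ymin, ymax = extent
    # Count repairs falling in each grid cell
    counts, _, _ = np.histogram2d(
        repair_xy[:, 0], repair_xy[:, 1],
        bins=n, range=[[xmin, xmax], [ymin, ymax]])
    # Repair rate = repairs / pipeline length; avoid division by zero
    # in cells that contain no pipeline
    rr = np.divide(counts, pipe_length_km,
                   out=np.zeros_like(counts), where=pipe_length_km > 0)
    # High-damage zones: cells whose RR exceeds the mean RR over cells
    # that actually contain pipeline (the paper's definition)
    high = rr > rr[pipe_length_km > 0].mean()
    return rr, high
```

The `high` mask follows the definition quoted above: areas with an RR value greater than the mean RR for the study area.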
Mary Beth Adams; Pamela J. Edwards; W. Mark Ford; Joshua B. Johnson; Thomas M. Schuler; Melissa Thomas-Van Gundy; Frederica Wood
2011-01-01
Development of a natural gas well and pipeline on the Fernow Experimental Forest, WV, raised concerns about the effects on the natural and scientific resources of the Fernow, set aside in 1934 for long-term research. A case study approach was used to evaluate effects of the development. This report includes results of monitoring projects as well as observations...
Theory and Application of Magnetic Flux Leakage Pipeline Detection.
Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei
2015-12-10
Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted.
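The defect-identification step surveyed in this abstract can be illustrated with a toy amplitude-threshold detector. This is a hedged sketch only: practical MFL identification uses the methods the paper reviews (including trained classifiers and expert systems), and the robust-threshold approach below is an assumption for illustration:

```python
import numpy as np

def detect_mfl_anomalies(signal, k=3.0):
    """Flag samples whose leakage amplitude deviates from the baseline
    by more than k robust standard deviations -- a toy stand-in for the
    defect-identification step of MFL inspection."""
    baseline = np.median(signal)
    # Median absolute deviation is insensitive to the defect spikes
    # themselves, unlike a plain standard deviation
    mad = np.median(np.abs(signal - baseline)) or 1e-12
    sigma = 1.4826 * mad  # robust sigma estimate from the MAD
    return np.flatnonzero(np.abs(signal - baseline) > k * sigma)
```

A usage sketch: on a sensor trace dominated by background noise, a single large leakage spike is returned as the only anomalous index.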
Simulation of pipeline in the area of the underwater crossing
NASA Astrophysics Data System (ADS)
Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.
2014-08-01
The article studies the stress-strain behavior of the Alexandrovskoye-Anzhero-Sudzhensk section of a main oil pipeline using the ANSYS software system. This method of examining and assessing the technical condition of pipeline transport facilities studies the objects and the processes that affect their technical condition, including research based on computer simulation. Such an approach supports the development of theory, calculation methods and design practice for pipeline transport facilities and for machine units and parts, regardless of industry or purpose, with a view to improving existing structures and creating new ones with high performance, durability, reliability and maintainability, low material consumption and cost, and competitiveness on the world market.
NASA Astrophysics Data System (ADS)
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is monitoring data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, offering good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.
Philanthropy funding for neurosurgery research and program development.
Zusman, Edie E; Heary, Robert F; Stroink, Ann R; Berger, Mitchel S; Popp, A John; Friedlander, Robert M; Martin, Neil A; Lonser, Russell R; Asthagiri, Ashok R
2013-07-01
In times of fiscal and political uncertainty, philanthropy has become an increasingly important mechanism for building, maintaining, and expanding neurosurgical research programs. Although philanthropy has historically helped launch many hospital systems, scientists and clinicians have generally relied on government grants and industry investment to support research and program infrastructure. However, competition for funds from all sources has increased at the same time as the pipelines for those funds have eroded. Philanthropy can provide salary support to allow neurosurgeons to pursue research and, ultimately, advance the field to improve outcomes for patients. Funds raised can fill financial gaps to recruit and pay for needed research staff, equipment, and facilities. To foster charitable giving, institutions can develop both a culture and processes to promote and support philanthropy. Furthermore, it is essential to ensure that donor relationships are properly nurtured with ongoing stewardship. In addition to cultivating grateful patients, there are numerous creative models of fundraising for research that can be explored, including venture philanthropy, in which voluntary health organizations or individuals partner with academia and industry to invest in early-stage drug development and other innovations. Other approaches include formation of nonprofit foundations and partnerships with other entities to work jointly on shared development goals.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the structural information they predict can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data- and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present a runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, and is particularly amenable to small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
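The task-level parallelism described in this abstract can be sketched with a simple worker pool. This is not the SAGA-Pilot API: the pool-based scheduling below is a hedged stand-in for a pilot job, and `thread_sequence` is a hypothetical placeholder for one compute-intensive threading invocation:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def thread_sequence(seq):
    # Placeholder for one compute-intensive threading job (e.g. a
    # single eThread run); here it just returns a mock score.
    return seq, len(seq)

def run_pipeline(sequences, max_workers=4):
    """Task-level parallelism in the spirit of a pilot job: a fixed
    pool of workers pulls threading tasks as workers become free,
    absorbing large variation in per-task workload."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(thread_sequence, s) for s in sequences]
        for fut in as_completed(futures):
            seq, score = fut.result()
            results[seq] = score
    return results
```

Because results are collected with `as_completed`, short tasks return early instead of waiting behind long ones, which is the point of decoupling task submission from resource acquisition.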
Code of Federal Regulations, 2013 CFR
2013-10-01
...: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge... to the integrity management program possesses and maintains a thorough knowledge of the integrity... 49 Transportation 3 2013-10-01 2013-10-01 false What knowledge and training must personnel have to...
Code of Federal Regulations, 2014 CFR
2014-10-01
...: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge... to the integrity management program possesses and maintains a thorough knowledge of the integrity... 49 Transportation 3 2014-10-01 2014-10-01 false What knowledge and training must personnel have to...
Code of Federal Regulations, 2011 CFR
2011-10-01
...: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge... to the integrity management program possesses and maintains a thorough knowledge of the integrity... 49 Transportation 3 2011-10-01 2011-10-01 false What knowledge and training must personnel have to...
Code of Federal Regulations, 2012 CFR
2012-10-01
...: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge... to the integrity management program possesses and maintains a thorough knowledge of the integrity... 49 Transportation 3 2012-10-01 2012-10-01 false What knowledge and training must personnel have to...
ERIC Educational Resources Information Center
Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.
2014-01-01
African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…
ERIC Educational Resources Information Center
Wesley Schultz, P.; Hernandez, Paul R.; Woodcock, Anna; Estrada, Mica; Chance, Randie C.; Aguilar, Maria; Serpe, Richard T.
2011-01-01
For more than 40 years, there has been a concerted national effort to promote diversity among the scientific research community. Yet given the persistent national-level disparity in educational achievements of students from various ethnic and racial groups, the efficacy of these programs has come into question. The current study reports results…
Learning Systematically from Experience through a Research-to-Practice Pipeline in Chicago
ERIC Educational Resources Information Center
Fine, Wendy; Lansing, Jiffy; Bacon, Marshaun
2018-01-01
The Becoming A Man (BAM) program is a school-based group counseling and mentoring program run by Youth Guidance (YG), a community organization that serves children in Chicago schools who are at risk. BAM guides young men to learn, internalize, and practice social cognitive skills, make responsible decisions for their future, and become positive…
Advanced Technological Education (ATE) Program: Building a Pipeline of Skilled Workers. Policy Brief
ERIC Educational Resources Information Center
American Youth Policy Forum, 2010
2010-01-01
In the Fall of 2008, the American Youth Policy Forum hosted a series of three Capitol Hill forums showcasing the Advanced Technological Education (ATE) program supported by the National Science Foundation (NSF). The goal of these forums was to educate national policymakers about the importance of: (1) improving the science and math competencies of…
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines for both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25 PB system.
Fuchs, Helmut; Aguilar-Pimentel, Juan Antonio; Amarie, Oana V; Becker, Lore; Calzada-Wack, Julia; Cho, Yi-Li; Garrett, Lillian; Hölter, Sabine M; Irmler, Martin; Kistler, Martin; Kraiger, Markus; Mayer-Kuckuk, Philipp; Moreth, Kristin; Rathkolb, Birgit; Rozman, Jan; da Silva Buttkus, Patricia; Treise, Irina; Zimprich, Annemarie; Gampe, Kristine; Hutterer, Christine; Stöger, Claudia; Leuchtenberger, Stefanie; Maier, Holger; Miller, Manuel; Scheideler, Angelika; Wu, Moya; Beckers, Johannes; Bekeredjian, Raffi; Brielmeier, Markus; Busch, Dirk H; Klingenspor, Martin; Klopstock, Thomas; Ollert, Markus; Schmidt-Weber, Carsten; Stöger, Tobias; Wolf, Eckhard; Wurst, Wolfgang; Yildirim, Ali Önder; Zimmer, Andreas; Gailus-Durner, Valérie; Hrabě de Angelis, Martin
2017-09-29
For decades, model organisms have provided an important approach for understanding the mechanistic basis of human diseases. The German Mouse Clinic (GMC) was the first phenotyping facility to establish a collaboration-based platform for phenotype characterization of mouse lines. In order to address individual projects with a tailor-made phenotyping strategy, the GMC has developed a series of pipelines with tests for the analysis of specific disease areas. For a general broad analysis, there is a screening pipeline that covers the key parameters for the most relevant disease areas. For hypothesis-driven phenotypic analyses, there are thirteen additional pipelines focused on neurological and behavioral disorders, metabolic dysfunction, respiratory system malfunctions, immune-system disorders and imaging techniques. In this article, we give an overview of the pipelines and describe the scientific rationale behind the different test combinations. Copyright © 2017 Elsevier B.V. All rights reserved.
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
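The proper-motion estimates mentioned in this abstract reduce, in the simplest linear case, to fitting a component's core separation against epoch. The sketch below is illustrative only, not the pipeline's STRIP implementation; the function name and inputs are assumed for the example:

```python
import numpy as np

def proper_motion(epochs_yr, sep_mas):
    """Fit a linear trajectory r(t) = r0 + mu*(t - t_mean) to a jet
    component's radial separation from the core, returning the proper
    motion mu (mas/yr) and its formal 1-sigma uncertainty."""
    t = np.asarray(epochs_yr, float)
    r = np.asarray(sep_mas, float)
    # Center the epochs so slope and intercept are uncorrelated
    tc = t - t.mean()
    A = np.vstack([tc, np.ones_like(t)]).T
    coef, res, *_ = np.linalg.lstsq(A, r, rcond=None)
    mu = coef[0]
    dof = len(t) - 2
    # Residual variance, then the standard error of the slope
    sigma2 = res[0] / dof if res.size and dof > 0 else 0.0
    mu_err = np.sqrt(sigma2 / np.sum(tc ** 2))
    return mu, mu_err
```

Non-linear (e.g. accelerating or curved) trajectories, which the abstract's regression STRIP algorithm is designed to detect, would require a richer model than this straight-line fit.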
Simulation and Experiment Research on Fatigue Life of High Pressure Air Pipeline Joint
NASA Astrophysics Data System (ADS)
Shang, Jin; Xie, Jianghui; Yu, Jian; Zhang, Deman
2017-12-01
The high pressure air pipeline joint is an important part of a high pressure air system, and its reliability bears on the safety and stability of the system. This work developed a new type of high pressure air pipeline joint, carried out dynamics research on the CB316-1995 joint and the new-type joint using the finite element method, and analysed in depth the joint forms of different design schemes and the effect of materials on the stress, tightening torque and fatigue life of the joint. The research team set up a vibration/pulse test bench and carried out a comparative joint fatigue life test. The results show that the maximum stress of the joint occurs on the inner side of the outer sleeve nut, which is consistent with the failure mode of cracking of the outer sleeve nut observed in practice. In both simulation and experiment, the fatigue life and tightening torque of the new-type joint are better than those of the CB316-1995 joint.
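For context on how a computed stress feeds a fatigue-life estimate, a common stress-life relation is Basquin's law, sigma_a = sigma_f' (2N)^b, solved for the cycles to failure N. The sketch below is illustrative only; the material constants are assumed and are not taken from the joint studied in the paper:

```python
def basquin_cycles(stress_amplitude_mpa, sigma_f=900.0, b=-0.1):
    """Estimate cycles to failure N from Basquin's stress-life relation
    sigma_a = sigma_f' * (2N)^b, solved for N.

    sigma_f (fatigue strength coefficient, MPa) and b (fatigue strength
    exponent) are illustrative placeholder constants, not values from
    the joint studied in the paper."""
    # Invert the power law: 2N = (sigma_a / sigma_f')^(1/b)
    two_n = (stress_amplitude_mpa / sigma_f) ** (1.0 / b)
    return two_n / 2.0
```

Because b is negative, halving the stress amplitude raises the predicted life by orders of magnitude, which is why reducing the peak stress at the outer sleeve nut matters so much for joint durability.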
2016-09-01
natural gas pipelines, water pipelines, and metallic USTs. The full and complete data sets for curve-fit development were not provided to ERDC...Dunmire (OUSD(AT&L)), Bernie Rodriguez (IMPW-E), and Valerie D. Hines (DAIM-ODF). The work was performed by the Materials and Structures Branch...of structures being tested increases, as in the case of pipelines that run many miles or the case when a structure's coating quality
ORAC-DR: Pipelining With Other People's Code
NASA Astrophysics Data System (ADS)
Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy
As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed of execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of ``buy-in'' effort) engine or tool.