40 CFR 49.4166 - Monitoring requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... burning pilot flame, electronically controlled automatic igniters, and monitoring system failures, using a... failure, electronically controlled automatic igniter failure, or improper monitoring equipment operation... and natural gas emissions in the event that natural gas recovered for pipeline injection must be...
40 CFR 49.4166 - Monitoring requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... burning pilot flame, electronically controlled automatic igniters, and monitoring system failures, using a... failure, electronically controlled automatic igniter failure, or improper monitoring equipment operation... and natural gas emissions in the event that natural gas recovered for pipeline injection must be...
NASA Astrophysics Data System (ADS)
Artana, K. B.; Pitana, T.; Dinariyana, D. P.; Ariana, M.; Kristianto, D.; Pratiwi, E.
2018-06-01
The aim of this research is to develop an algorithm and application that can perform real-time monitoring of the safety operation of offshore platforms and subsea gas pipelines, as well as determine the need for ship inspection, using data obtained from the automatic identification system (AIS). The research also focuses on integrating the shipping database, AIS data, and other sources to develop a prototype for a real-time monitoring system of offshore platforms and pipelines. The prototype rests on a simple concept: an overlay map that matches the coordinates of the offshore platforms and subsea gas pipelines against the ship coordinates (longitude/latitude) detected by AIS. Using such information, we can build an early warning system (EWS) that issues alerts through short message service (SMS), email, or other means when a ship enters the restricted and exclusion zones of platforms and pipelines. The ship inspection system is developed by combining several attributes; decision analysis software is employed to prioritize four vessel attributes, namely ship age, ship type, classification, and flag state. Results show that the EWS can increase the safety level of offshore platforms and pipelines, as well as the efficiency of patrol boats in monitoring the safety of the facilities. Meanwhile, ship inspection enables the port to prioritize ships for inspection according to their priority ranking score.
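A minimal sketch of the geofencing check at the core of such an EWS follows: each AIS-reported position is tested against exclusion zones around platform coordinates, and an alert is raised on entry. All names, coordinates, and the zone radius below are illustrative assumptions, not values from the study.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def check_exclusion(ship, platforms, radius_km=0.5):
    """Return the platforms whose exclusion zone the ship has entered."""
    return [p for p in platforms
            if haversine_km(ship["lat"], ship["lon"], p["lat"], p["lon"]) < radius_km]

ship = {"mmsi": 525000001, "lat": -5.600, "lon": 112.650}            # one decoded AIS message
platforms = [{"name": "PLATFORM-A", "lat": -5.601, "lon": 112.652}]  # hypothetical facility
for hit in check_exclusion(ship, platforms):
    # here the real system would dispatch the SMS/email alert
    print(f"EWS alert: ship {ship['mmsi']} inside exclusion zone of {hit['name']}")
```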
Monitoring and localization hydrocarbon and sulfur oxides emissions by SRS-lidar
NASA Astrophysics Data System (ADS)
Zhevlakov, A. P.; Konopelko, L. P.; Bespalov, V. G.; Elizarov, V. V.; Grishkanich, A. S.; Redka, D. N.; Bogoslovsky, S. A.; Il'inskiy, A. A.; Chubchenko, Y. K.
2017-10-01
We developed a Raman lidar with ultraspectral resolution for automatic airborne monitoring of pipeline leaks and for oil and gas exploration. Test flights indicate that a sensitivity of 6 ppm for methane and 2 ppm for hydrogen sulfide has been reached for leakage detection.
Monitoring of pipeline ruptures by means of a Robust Satellite Technique (RST)
NASA Astrophysics Data System (ADS)
Filizzola, C.; Baldassarre, G.; Corrado, R.; Mazzeo, G.; Marchese, F.; Paciello, R.; Pergola, N.; Tramutoli, V.
2009-04-01
Pipeline ruptures have deep economic and ecological consequences, so pipeline networks represent critical infrastructures to be carefully monitored, particularly in areas frequently affected by natural disasters such as earthquakes, hurricanes, and landslides. To minimize damage, the detection of harmful events along pipelines should be as rapid as possible and, at the same time, what is detected should be an actual incident and not a false alarm. In this work, a Robust Satellite Technique (RST), already applied to the forecast and NRT (Near Real Time) monitoring of major natural and environmental hazards (such as seismically active areas, volcanic activity, hydrological risk, forest fires and oil spills), has been employed to automatically identify, from satellite, anomalous Thermal Infrared (TIR) transients related to explosions of oil/gas pipelines. In this context, combining the RST approach with the high temporal resolution offered by geostationary satellites seems to assure both reliable and timely detection of such events. The potential of the technique (applied to MSG-SEVIRI data) was tested over Iraq, a region sadly known for its numerous (mainly man-made) pipeline accidents, in order to simulate the effects of natural disasters (such as fires or explosions near or directly involving a pipeline facility).
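RST decides whether a pixel's current TIR signal is anomalous by comparing it with the mean and variability of historical, homogeneous (same place, month and hour) observations. Below is a rough sketch of such a standardized-anomaly index, assuming a pre-built stack of co-located historical scenes; the threshold k is an illustrative choice, not the operational setting.

```python
import numpy as np

def rst_anomaly(current_tir, historical_stack, k=3.0):
    """Flag pixels whose TIR signal deviates from multi-year statistics.

    historical_stack: (n_scenes, ny, nx) co-located homogeneous TIR scenes;
    current_tir: (ny, nx) scene to screen for thermal transients.
    """
    mu = np.nanmean(historical_stack, axis=0)     # pixel reference field
    sigma = np.nanstd(historical_stack, axis=0)   # pixel natural variability
    index = (current_tir - mu) / np.maximum(sigma, 1e-6)
    return index > k                              # candidate TIR anomalies
```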
Capacitive system detects and locates fluid leaks
NASA Technical Reports Server (NTRS)
1966-01-01
Electronic monitoring system automatically detects and locates minute leaks in seams of large fluid storage tanks and pipelines covered with thermal insulation. The system uses a capacitive tape-sensing element that is adhesively bonded over seams where fluid leaks are likely to occur.
Automatic Measuring System for Oil Stream Paraffin Deposits Parameters
NASA Astrophysics Data System (ADS)
Kopteva, A. V.; Koptev, V. Yu
2018-03-01
This paper describes a new method for monitoring oil pipelines, in the form of a highly efficient, automated paraffin-deposit monitoring method. When operating oil pipelines, paraffin, resin and salt carried by the oil stream deposit on the pipeline walls. This ultimately results in frequent transportation suspensions to clean or even replace pipes and other equipment, shortening the operation periods between repairs, creating emergency situations and increasing production expenses, and it harms the environment: oil spills contaminate rivers, lakes and ground waters and kill animals, birds and other wildlife. Oil transportation monitoring is therefore still a subject for further study, and there is a need for a radically new automated process control and management system, together with intelligent measurement means. The measurement principle is based on the Lambert-Beer law, which relates the attenuation of gamma-radiation intensity to the density and linear attenuation coefficient of the substance it passes through. Using the measuring system, with a high accuracy of ±0.2%, one can measure the thickness of paraffin deposits with an absolute accuracy of ±5 mm, which is sufficient to ensure reliable operation of the pipeline system. Safety is a key advantage of the proposed control system.
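The stated principle reduces to the exponential attenuation relation I = I0 exp(-mu d), so the deposit thickness follows from the measured drop in count rate. A schematic sketch, assuming a deposit-free reference count rate and a known linear attenuation coefficient (the numbers are illustrative, not instrument values):

```python
import numpy as np

def deposit_thickness_mm(i_measured, i_clean, mu_per_mm):
    """Invert I = I0 * exp(-mu * d) for the deposit thickness d.

    i_clean is the count rate through the clean pipe, so the ratio
    isolates the attenuation contributed by the deposit alone.
    """
    return np.log(i_clean / i_measured) / mu_per_mm

# e.g. a 12% drop in count rate with mu = 0.02 /mm -> about 6.4 mm of deposit
print(deposit_thickness_mm(0.88, 1.00, 0.02))
```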
Statistical analysis on the signals monitoring multiphase flow patterns in pipeline-riser system
NASA Astrophysics Data System (ADS)
Ye, Jing; Guo, Liejin
2013-07-01
Signals monitoring petroleum transmission pipelines in the offshore oil industry usually contain abundant information about the multiphase flow that is relevant to flow assurance, which includes avoiding the most undesirable flow patterns. Extracting reliable features from these signals is therefore an alternative way to examine potential risks to an oil platform. This paper focuses on characterizing multiphase flow patterns in the pipeline-riser system that often appears in the offshore oil industry, and on finding an objective criterion to describe the transition between flow patterns. Statistical analysis of the pressure signal at the riser top is proposed, instead of the usual prediction methods based on inlet and outlet flow conditions, which cannot be easily determined in most situations. In addition, a machine learning method (least-squares support vector machine) is used to classify the different flow patterns automatically. Experimental results from a small-scale loop show that the proposed method is effective for analyzing multiphase flow patterns.
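As a rough illustration of the classification step, the sketch below derives simple statistics from riser-top pressure windows and trains an RBF-kernel SVM. Scikit-learn's standard SVC stands in for the least-squares SVM variant used in the paper, and the data and labels are synthetic placeholders.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC

def pressure_features(window):
    """Simple statistical features of one riser-top pressure window."""
    return [np.mean(window), np.std(window), skew(window), kurtosis(window),
            np.ptp(window)]

# X: windows of the pressure signal; y: flow-pattern labels (e.g. 0 = stable,
# 1 = severe slugging) that would come from the experimental loop.
rng = np.random.default_rng(0)
X = np.array([pressure_features(rng.normal(size=256)) for _ in range(100)])
y = rng.integers(0, 2, size=100)          # placeholder labels for the sketch

clf = SVC(kernel="rbf").fit(X[:80], y[:80])
print("held-out accuracy:", clf.score(X[80:], y[80:]))
```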
NASA Astrophysics Data System (ADS)
Chen, Xiangqun; Huang, Rui; Shen, Liman; Chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng
2018-03-01
In this paper, semi-active RFID watt-hour meters are applied to automatic test lines and intelligent warehouse management. Spanning the transmission system, test system, auxiliary system and monitoring system, the solution realizes scheduling, binding, control and data exchange for watt-hour meters, among other functions, providing more accurate positioning, more efficient management, fast data updates and all information at a glance. It effectively improves the quality, efficiency and automation of verification, and realizes more efficient data and warehouse management.
Development and Application of On-line Monitor for the ZLW-1 Axis Cracks
NASA Astrophysics Data System (ADS)
Shi-jun, Yang; Qian-hui, Yang; Jian-guo, Jin
2018-03-01
This article mainly introduces a method that uses acoustic emission techniques to achieve on-line monitoring of shaft cracks and crack growth. Based on this method, an axis crack monitor was built using acoustic emission techniques. The instrument can be applied to all pressure vessels, pipelines and rotating machines that bear buckling loads. It provides online real-time monitoring, automatic recording, printing, sound and light alarms, and crack information collection. A series of tests in both laboratory and field shows that the instrument is very versatile and possesses broad prospects for development and application.
Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network
NASA Astrophysics Data System (ADS)
Zulfikar, Can; Kariptas, Cagatay; Biyikoglu, Hikmet; Ozarpa, Cevat
2017-04-01
Istanbul Natural Gas Distribution Corporation (IGDAS) is one of the end users of the Istanbul Earthquake Early Warning (EEW) signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 750 district regulators and 474,000 service boxes. Natural gas arrives at the Istanbul city border at 70 bar in a 30-inch-diameter steel pipeline. The gas pressure is reduced to 20 bar at RMS stations and the gas is distributed to district regulators inside the city. 110 of the 750 district regulators are instrumented with strong-motion accelerometers in order to cut the gas flow during an earthquake if ground motion parameters exceed certain threshold levels. In addition, state-of-the-art protection systems automatically cut the natural gas flow when breaks in the gas pipelines are detected. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information on quantities related to pipeline monitoring, including input-output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at the 750 district regulator sites. The IGDAS real-time earthquake risk reduction algorithm follows four stages: 1) Real-time ground motion data are transmitted from 110 IGDAS and 110 KOERI (Kandilli Observatory and Earthquake Research Institute) acceleration stations to the IGDAS SCADA centre and the KOERI data centre. 2) During an earthquake, EEW information is sent from the IGDAS SCADA centre to the IGDAS stations. 3) Automatic shut-off is applied at IGDAS district regulators, and the calculated parameters are sent from the stations to the IGDAS SCADA centre and KOERI. 4) Integrated building and gas pipeline damage maps are prepared immediately after the earthquake. Today's technology allows the expected level of shaking to be estimated rapidly once an earthquake starts to occur. However, in the Istanbul case, for a potential Marmara Sea earthquake, the time is very limited even for estimating the level of shaking. A robust threshold-based EEW system is the only algorithm that, for such a near-source event, can activate automatic shut-off mechanisms in critical infrastructure before the damaging waves arrive. This safety measure, even with only a few seconds of early warning time, will help to mitigate potential damage and secondary hazards.
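A minimal sketch of the threshold-based local shut-off logic of stages 2 and 3, under an assumed PGA trigger level; the threshold, class and function names are illustrative, not IGDAS operational parameters.

```python
PGA_THRESHOLD_G = 0.12  # hypothetical trigger level, not an IGDAS value

class Valve:
    def shut_off(self):
        print("slam-shut: gas flow cut")

def report_to_scada(station_id, pga_g):
    """Stage 3: send the calculated parameters to the SCADA centre/KOERI."""
    print(f"station {station_id}: PGA {pga_g:.3f} g reported to SCADA centre")

def on_acceleration_sample(station_id, pga_g, valve):
    """Act locally on a threshold exceedance, then report upstream."""
    if pga_g >= PGA_THRESHOLD_G:
        valve.shut_off()
        report_to_scada(station_id, pga_g)

on_acceleration_sample("DR-042", 0.18, Valve())
```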
IDCDACS: IDC's Distributed Application Control System
NASA Astrophysics Data System (ADS)
Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena
2015-04-01
The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose, time-series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted to the International Data Centre (IDC) in Vienna in near-real time, where they are processed to locate events that may be nuclear explosions. We newly designed the distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly scalable solution preserves the existing architecture of the IDC processing system, which proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom-developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time-series data received from monitoring stations are organized in segments based on the time when the data were recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker and a relational database. All processes communicate through message queues hosted on the message broker. Persistent state information is stored in the database. A configurable processing controller instantiates and monitors all data processing applications. Due to decoupling by message queues, the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform, which is based upon the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and open industry standard. IDCDACS uses the high-availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in (Olivieri et al. 2012, Kværna et al. 2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data and trigger logic. A possible future development would be to use the IDCDACS framework for different scientific domains, e.g. for processing of Earth observation satellite data, extending the one-dimensional time-series intervals to spatio-temporal data cubes. REFERENCES Olivieri, M., J. Clinton (2012) An almost fair comparison between Earthworm and SeisComp3, Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, D. A. Dodge (2012) Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions, Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
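To make the queue-decoupled design concrete, here is a small sketch using the pika client for RabbitMQ: a data-monitor side queues a time interval, and a controller side consumes it. The queue name and message payload are invented for the example and are not IDCDACS internals.

```python
import json
import pika

# Producer side: a data monitor queues a time interval for a pipeline.
conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
ch = conn.channel()
ch.queue_declare(queue="pipeline.seismic", durable=True)
ch.basic_publish(exchange="",
                 routing_key="pipeline.seismic",
                 body=json.dumps({"start": "2015-04-25T06:00:00",
                                  "end": "2015-04-25T06:10:00"}))

# Consumer side: a processing controller picks intervals off the queue.
def handle(channel, method, properties, body):
    interval = json.loads(body)
    print("processing interval", interval)   # launch the processing step here
    channel.basic_ack(delivery_tag=method.delivery_tag)

ch.basic_consume(queue="pipeline.seismic", on_message_callback=handle)
ch.start_consuming()   # blocks; Ctrl-C to stop the sketch
```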
Design of cylindrical pipe automatic welding control system based on STM32
NASA Astrophysics Data System (ADS)
Chen, Shuaishuai; Shen, Weicong
2018-04-01
The development of the modern economy makes the demand for pipeline construction rise rapidly, and pipeline welding has become an important link in pipeline construction. At present, manual welding methods are still widely used at home and abroad, and field pipe welding in particular lacks miniature, portable automatic welding equipment. An automated welding system consists of a control system, comprising a lower-computer control panel and a host-computer operating interface, together with the automatic welding machine mechanisms and welding power systems coordinated by the control system. In this paper, a new control system for automatic pipe welding based on the lower-computer control panel and the host-computer interface is proposed, which has many advantages over traditional automatic welding machines.
NASA Astrophysics Data System (ADS)
Karpov, S.; Beskin, G.; Biryukov, A.; Bondar, S.; Ivanov, E.; Katkova, E.; Perkov, A.; Sasyuk, V.
2016-12-01
Here we present the summary of first years of operation and the first results of a novel 9-channel wide-field optical monitoring system with sub-second temporal resolution, Mini-MegaTORTORA (MMT-9), which is in operation now at Special Astrophysical Observatory on Russian Caucasus. The system is able to observe the sky simultaneously in either wide (˜900 square degrees) or narrow (˜100 square degrees) fields of view, either in clear light or with any combination of color (Johnson-Cousins B, V or R) and polarimetric filters installed, with exposure times ranging from 0.1 s to hundreds of seconds. The real-time system data analysis pipeline performs automatic detection of rapid transient events, both near-Earth and extragalactic. The objects routinely detected by MMT include faint meteors and artificial satellites. The pipeline for longer-time-scale variability analysis is still in development.
NASA Astrophysics Data System (ADS)
Karpov, S.; Beskin, G.; Biryukov, A.; Bondar, S.; Ivanov, E.; Katkova, E.; Perkov, A.; Sasyuk, V.
2016-06-01
Here we present a summary of first years of operation and first results of a novel 9-channel wide-field optical monitoring system with sub-second temporal resolution, Mini-MegaTORTORA (MMT-9), which is in operation now at Special Astrophysical Observatory on Russian Caucasus. The system is able to observe the sky simultaneously in either wide (~900 square degrees) or narrow (~100 square degrees) fields of view, either in clear light or with any combination of color (Johnson-Cousins B, V or R) and polarimetric filters installed, with exposure times ranging from 0.1 s to hundreds of seconds. The real-time system data analysis pipeline performs automatic detection of rapid transient events, both near-Earth and extragalactic. The objects routinely detected by MMT include faint meteors and artificial satellites. The pipeline for longer-time-scale variability analysis is still in development.
Automatic Generalizability Method of Urban Drainage Pipe Network Considering Multi-Features
NASA Astrophysics Data System (ADS)
Zhu, S.; Yang, Q.; Shao, J.
2018-05-01
Urban drainage systems are an indispensable dataset for storm-flood simulation. Given data availability and current computing power, the structure and complexity of urban drainage systems need to be simplified. To date, however, the simplification procedure has depended mainly on manual operation, which leads to mistakes and low work efficiency. This work draws on the classification methodology used for road systems and proposes the concept of a pipeline stroke. The length of a pipeline, the angle between two pipelines, the road level to which a pipeline belongs, and the diameter of a pipeline are chosen as similarity criteria to generate pipeline strokes. Finally, an automatic method is designed to generalize drainage systems with these multiple features taken into account. The technique can improve the efficiency and accuracy of the generalization of drainage systems, and it benefits the study of urban storm floods.
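A rough sketch of the stroke-building idea: chain connected segments into one stroke while the deflection angle stays under a threshold. The threshold and data layout are illustrative, and a full implementation would also compare length, diameter and road level, per the multi-feature criterion.

```python
import math

def heading(seg):
    """Heading of a segment ((x1, y1), (x2, y2)), in radians."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def deflection_deg(seg_a, seg_b):
    """Absolute heading change between two connected segments, in degrees."""
    d = abs(math.degrees(heading(seg_b) - heading(seg_a))) % 360
    return min(d, 360 - d)

def build_stroke(segments, max_deflection=30.0):
    """Grow a stroke from an ordered chain of connected pipe segments."""
    stroke = [segments[0]]
    for seg in segments[1:]:
        if deflection_deg(stroke[-1], seg) <= max_deflection:
            stroke.append(seg)   # smooth continuation: same stroke
        else:
            break                # sharp turn: the stroke ends here
    return stroke

chain = [((0, 0), (1, 0)), ((1, 0), (2, 0.2)), ((2, 0.2), (2.1, 1.5))]
print(len(build_stroke(chain)))  # 2: the third segment turns too sharply
```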
NASA Astrophysics Data System (ADS)
Kitov, I.; Bobrov, D.; Rozhkov, M.
2016-12-01
Aftershocks of larger earthquakes represent an important source of information on the distribution and evolution of stresses and deformations in the pre-seismic, co-seismic and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The rate of events recorded by two or more seismic stations of the International Monitoring System (IMS) from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site-specific, as first P-waves results in the creation of numerous wrong event hypotheses in the automatic IDC pipeline. In turn, interactive review of such wrong hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool to separate coda phases from actual P-wave arrivals and to fully utilize the repeating character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of WCC in two important aspects: they reduce the detection threshold and effectively suppress arrivals from all sources except master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC traces into event hypotheses; building events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis. Since global monitoring of underground nuclear tests is based on historical and synthetic data, each aftershock sequence can be tested for CTBT violations, with big earthquakes serving as an evasion scenario.
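The core WCC operation is to slide a master-event template along a station trace and flag windows of high normalized correlation. A compact, unoptimized sketch follows; operational systems use FFT-based correlation and stacking over array channels.

```python
import numpy as np

def cc_detect(trace, template, threshold=0.6):
    """Sample offsets where the trace correlates strongly with the template."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)  # pre-normalized
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        cc = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)  # Pearson CC
        if cc > threshold:
            hits.append((i, cc))
    return hits
```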
Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera
NASA Astrophysics Data System (ADS)
Dziri, Aziz; Duranton, Marc; Chapuis, Roland
2016-07-01
Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.
Bat detective-Deep learning tools for bat acoustic signal detection.
Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E
2018-03-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
Bat detective—Deep learning tools for bat acoustic signal detection
Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.
2018-01-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076
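As a crude stand-in for the CNN detection stage, the sketch below flags candidate call windows as bursts of energy in the ultrasonic band of a full-spectrum recording. The band edges and threshold factor are illustrative assumptions; the actual pipeline scores spectrogram windows with a trained network rather than by band energy.

```python
import numpy as np
from scipy.signal import spectrogram

def candidate_calls(audio, fs, f_lo=20e3, f_hi=120e3, k=5.0):
    """Return start times (s) of candidate ultrasonic call windows.

    Flags time bins whose 20-120 kHz energy exceeds k times the
    median band energy; a simple energy detector, not the CNN itself.
    """
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=256, noverlap=192)
    band = sxx[(f >= f_lo) & (f <= f_hi)].sum(axis=0)
    return t[band > k * np.median(band)]

fs = 500_000                       # full-spectrum sampling rate
audio = np.random.randn(fs)        # 1 s of noise as a placeholder
print(candidate_calls(audio, fs))  # likely empty for pure noise
```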
Aozan: an automated post-sequencing data-processing pipeline.
Perrin, Sandrine; Firmo, Cyril; Lemoine, Sophie; Le Crom, Stéphane; Jourdren, Laurent
2017-07-15
Data management and quality control of output from Illumina sequencers is a disk space- and time-consuming task. Thus, we developed Aozan to automatically handle data transfer, demultiplexing, conversion and quality control once a run has finished. This software greatly improves run data management and the monitoring of run statistics via automatic emails and HTML web reports. Aozan is implemented in Java and Python, supported on Linux systems, and distributed under the GPLv3 License at: http://www.outils.genomique.biologie.ens.fr/aozan/ . Aozan source code is available on GitHub: https://github.com/GenomicParisCentre/aozan . Contact: aozan@biologie.ens.fr.
Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming
2015-01-01
Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using the HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents.
NASA Astrophysics Data System (ADS)
Pozo Nuñez, Francisco; Chelouche, Doron; Kaspi, Shai; Niv, Saar
2017-09-01
We present the first results of an ongoing variability monitoring program of active galactic nuclei (AGNs) using the 46 cm telescope of the Wise Observatory in Israel. The telescope has a field of view of 1.25° × 0.84° and is specially equipped with five narrowband filters at 4300, 5200, 5700, 6200, and 7000 Å to perform photometric reverberation mapping studies of the central engine of AGNs. The program aims to observe a sample of 27 AGNs (V < 17 mag) selected according to tentative continuum and line time delay measurements obtained in previous works. We describe the autonomous operation of the telescope together with the fully automatic pipeline used to achieve high-performance unassisted observations, data reduction, and light-curve extraction using different photometric methods. The science verification data presented here demonstrate the performance of the monitoring program, in particular for efficient photometric reverberation mapping of AGNs, with additional capabilities to carry out complementary studies of other transient and variable phenomena such as variable stars.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunton, Steven
Optical systems provide valuable information for evaluating interactions and associations between organisms and MHK energy converters and for capturing potentially rare encounters between marine organisms and MHK devices. The deluge of optical data from cabled monitoring packages makes expert review time-consuming and expensive. We propose algorithms and a processing framework to automatically extract events of interest from underwater video. The open-source software framework consists of background subtraction, filtering, feature extraction and hierarchical classification algorithms. This classification pipeline was validated on real-world data collected with an experimental underwater monitoring package. An event detection rate of 100% was achieved using robust principal components analysis (RPCA), Fourier feature extraction and a support vector machine (SVM) binary classifier. The detected events were then further classified into more complex classes: algae | invertebrate | vertebrate, one species | multiple species of fish, and interest rank. Greater than 80% accuracy was achieved using a combination of machine learning techniques.
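A toy sketch of the detection stage: a median background model stands in for the RPCA low-rank component, low-order Fourier magnitudes serve as features, and a linear SVM performs the event versus no-event split. Frames and labels are random placeholders, not the experimental data.

```python
import numpy as np
from sklearn.svm import SVC

def fourier_features(frame, background, n=32):
    """Magnitudes of the lowest 2-D Fourier components of the foreground."""
    fg = np.abs(frame.astype(float) - background)
    spec = np.abs(np.fft.fft2(fg))
    return spec[:n, :n].ravel()

# frames: (n_frames, ny, nx) grayscale video; labels: 1 = event of interest.
rng = np.random.default_rng(1)
frames = rng.random((200, 64, 64))
labels = rng.integers(0, 2, 200)          # placeholder annotations
background = np.median(frames, axis=0)    # stands in for the RPCA low-rank part

X = np.array([fourier_features(f, background) for f in frames])
clf = SVC(kernel="linear").fit(X[:150], labels[:150])
print("event vs. no-event accuracy:", clf.score(X[150:], labels[150:]))
```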
ERIC Educational Resources Information Center
Elias, Marilyn
2013-01-01
Policies that encourage police presence at schools, harsh tactics including physical restraint, and automatic punishments that result in suspensions and out-of-class time are huge contributors to the school-to-prison pipeline, but the problem is more complex than that. The school-to-prison pipeline starts (or is best avoided) in the classroom.…
Remote laser spectroscopy of oil and gas deposits
NASA Astrophysics Data System (ADS)
Zhevlakov, A. P.; Bespalov, V. G.; Elizarov, V. V.; Grishkanich, A. S.; Kascheev, S. V.; Makarov, E. A.; Bogoslovsky, S. A.; Il'inskiy, A. A.
2014-06-01
We developed a Raman lidar with ultraspectral resolution for automatic airborne monitoring of pipeline leaks and for oil and gas exploration. Test flights indicate that a sensitivity of 6 ppm for methane and 2 ppm for hydrogen sulfide has been reached for leakage detection. The lidar is based on the CARS method with a Ti:sapphire pump laser and a frequency-doubled Nd:YLF probe beam whose frequency is shifted by a BBO crystal. In ground-based experiments, a detection level of 3 to 10 molecules has been reached.
A graph-based approach for designing extensible pipelines
2012-01-01
Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error-prone and time-consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software tools is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
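With components as edges between format nodes, composing a pipeline reduces to path-finding. A small sketch using breadth-first search to compile a conversion pipeline on demand; the format and tool names are invented, and the published system itself is implemented in Java rather than Python.

```python
from collections import deque

# Components are edges: (input_format, output_format, tool_name); all invented.
EDGES = [("ped", "vcf", "ped2vcf"), ("vcf", "plink", "vcf2plink"),
         ("plink", "eigenstrat", "plink2eig")]

def compose_pipeline(src, dst):
    """BFS over format nodes: a shortest path of tools converting src to dst."""
    graph = {}
    for a, b, tool in EDGES:
        graph.setdefault(a, []).append((b, tool))
    queue, seen = deque([(src, [])]), {src}
    while queue:
        fmt, path = queue.popleft()
        if fmt == dst:
            return path
        for nxt, tool in graph.get(fmt, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [tool]))
    return None   # no chain of tools connects the two formats

print(compose_pipeline("ped", "eigenstrat"))  # ['ped2vcf', 'vcf2plink', 'plink2eig']
```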
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
NASA Astrophysics Data System (ADS)
Gibbons, Steven J.; Kværna, Tormod; Harris, David B.; Dodge, Douglas A.
2016-04-01
Aftershock sequences following very large earthquakes present enormous challenges to the near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase association algorithms and a significant deterioration in the quality of the underlying fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams which are then scanned by a phase association algorithm to form event hypotheses. We consider the scenario where a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located using a separate, specially targeted semi-automatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid search algorithm which may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove over half of the original detections which could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Further reductions in the number of detections in the parametric data streams are likely using correlation and subspace detectors and/or empirical matched field processing.
Emadzadeh, Ehsan; Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela
2017-01-01
Social networks, such as Twitter, have become important sources for active monitoring of user-reported adverse drug reactions (ADRs). Automatic extraction of ADR information can be crucial for healthcare providers, drug manufacturers, and consumers. However, because of the non-standard nature of social media language, automatically extracted ADR mentions need to be mapped to standard forms before they can be used by operational pharmacovigilance systems. We propose a modular natural language processing pipeline for mapping (normalizing) colloquial mentions of ADRs to their corresponding standardized identifiers. We seek to accomplish this task and enable customization of the pipeline so that distinct unlabeled free text resources can be incorporated to use the system for other normalization tasks. Our approach, which we call Hybrid Semantic Analysis (HSA), sequentially employs rule-based and semantic matching algorithms for mapping user-generated mentions to concept IDs in the Unified Medical Language System vocabulary. The semantic matching component of HSA is adaptive in nature and uses a regression model to combine various measures of semantic relatedness and resources to optimize normalization performance on the selected data source. On a publicly available corpus, our normalization method achieves 0.502 recall and 0.823 precision (F-measure: 0.624). Our proposed method outperforms a baseline based on latent semantic analysis and another that uses MetaMap.
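A schematic sketch of the two-stage normalization idea: an exact-match rule stage backed by a character n-gram similarity fallback. The tiny lexicon, concept IDs and threshold below are illustrative stand-ins for the UMLS resources and the trained HSA regression model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative lexicon of standard ADR terms mapped to concept IDs.
LEXICON = {"headache": "C0018681", "nausea": "C0027497", "insomnia": "C0917801"}

vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
term_matrix = vec.fit_transform(list(LEXICON))

def normalize(mention, threshold=0.3):
    """Rule-based exact match first, then a semantic-similarity fallback."""
    m = mention.lower().strip()
    if m in LEXICON:                      # rule-based stage
        return LEXICON[m]
    sims = cosine_similarity(vec.transform([m]), term_matrix)[0]
    best = sims.argmax()                  # semantic-matching stage
    return list(LEXICON.values())[best] if sims[best] >= threshold else None

print(normalize("splitting headache"))    # should map to the headache concept
```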
FliPer: checking the reliability of global seismic parameters from automatic pipelines
NASA Astrophysics Data System (ADS)
Bugnet, L.; García, R. A.; Davies, G. R.; Mathur, S.; Corsaro, E.
2017-12-01
Our understanding of stars through asteroseismic data analysis is limited by our ability to take advantage of the huge number of stars observed by space missions such as CoRoT, Kepler, K2, and soon TESS and PLATO. Global seismic pipelines provide global stellar parameters such as mass and radius using the mean seismic parameters, as well as the effective temperature. These pipelines are commonly used automatically on thousands of stars observed by K2 for 3 months (and soon TESS for at least ~1 month). However, pipelines are not immune from misidentifying noise peaks as stellar oscillations. Therefore, new validation techniques are required to assess the quality of these results. We present a new metric called FliPer (Flicker in Power), which takes into account the average variability at all measured time scales. The proper calibration of FliPer enables us to obtain good estimations of global stellar parameters such as surface gravity that are robust against the influence of noise peaks and hence are an excellent way to find faults in asteroseismic pipelines.
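Schematically, FliPer is the average power of the spectrum corrected for the photon-noise level. A simplified sketch for an evenly sampled light curve; the PSD normalization and the noise argument are simplified assumptions relative to the published calibration.

```python
import numpy as np

def fliper(flux, dt, noise=0.0):
    """Average power of the spectrum minus the photon-noise level.

    flux: evenly sampled, detrended light curve; dt: cadence in seconds;
    noise: photon-noise power level, assumed known for this sketch.
    """
    flux = flux / np.mean(flux) - 1.0    # relative flux variations
    psd = np.abs(np.fft.rfft(flux)) ** 2
    psd *= 2.0 * dt / len(flux)          # one-sided periodogram normalization
    return np.mean(psd[1:]) - noise      # skip the zero-frequency bin
```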
CARS technique for geological exploration of hydrocarbons deposits
NASA Astrophysics Data System (ADS)
Zhevlakov, A. P.; Bespalov, Victor; Elizarov, V. V.; Grishkanich, A. S.; Kascheev, S. V.; Makarov, E. A.; Bogoslovsky, S. A.; Il'inskiy, A. A.
2014-10-01
We developed a Raman lidar with ultraspectral resolution for automatic airborne monitoring of pipeline leaks and for oil and gas exploration. Experiments were carried out using the CARS scheme. Minimal concentrations of 200 ppb of heavy hydrocarbon gases have been remotely measured in laboratory tests. Test flights indicate that a sensitivity of 6 ppm for methane and 2 ppm for hydrogen sulfide has been reached for leakage detection. Estimations show that the reliability of heavy hydrocarbon gas detection, when seismic prospecting is integrated with remote laser sensing in the CARS scheme, can exceed 80%.
Main Pipelines Corrosion Monitoring Device
NASA Astrophysics Data System (ADS)
Anatoliy, Bazhenov; Galina, Bondareva; Natalia, Grivennaya; Sergey, Malygin; Mikhail, Goryainov
2017-01-01
The aim of the article is to substantiate a technical solution to the problem of monitoring corrosion changes in oil and gas pipelines using an electromagnetic NDT method. Pipeline wall thinning under operating conditions can lead to perforations and leakage of the transported product outside the pipeline, in most cases endangering human life and the environment. Monitoring corrosion changes in the pipeline inner wall under operating conditions is complicated because pipelines are mainly made of structural steels whose conductive and magnetic properties impede the passage of the test signal through the entire thickness of the object under study. The technical solution to this problem lies in monitoring internal corrosion changes in pipes under operating conditions in order to increase the safety of pipelines by automated prediction of when threshold pre-crash values due to corrosion will be reached.
Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation
Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.
2012-01-01
Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts the current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. The advantages of both traditional voxel-based and deformable shape-based segmentation are embedded in the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphical user interface (GUI). Conclusion Validation results using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhanced GBM from multimodality MRI data. The proposed pipeline could be used as additional information to interpret MR brain images in neuroradiology. PMID:22591720
NASA Astrophysics Data System (ADS)
Yang, Xiaojun; Zhu, Xiaofei; Deng, Chi; Li, Junyi; Liu, Cheng; Yu, Wenpeng; Luo, Hui
2017-10-01
To improve the management and monitoring of leakage and abnormal disturbances of long-distance oil pipelines, a distributed optical fiber temperature and vibration sensing system was employed to test its feasibility for the health monitoring of a domestic oil pipeline. Simulated leakage and abnormal disturbance events on the oil pipeline were performed in the experiment. It is demonstrated that leakage and abnormal disturbance events can be monitored and located accurately with the distributed optical fiber sensing system, which exhibits good performance in sensitivity, reliability, operation and maintenance, and shows good prospects for market application.
Gap-free segmentation of vascular networks with automatic image processing pipeline.
Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas
2017-03-01
Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuities of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity, without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. Robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time-prohibitive, given that vascular trees have more than thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes.
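Multi-scale vesselness filtering of the kind such pipelines build on is available off the shelf. Here is a minimal sketch using scikit-image's Frangi filter; the published pipeline's filter chain, parameter automation and gap-free guarantees go well beyond this, and the threshold below is an arbitrary placeholder.

```python
import numpy as np
from skimage.filters import frangi

def enhance_vessels(volume, sigmas=(1, 2, 3, 4)):
    """Multi-scale tubular-structure enhancement of an angiographic volume.

    Returns the vesselness response and a crude binary mask; bright
    vessels on a dark background are assumed (black_ridges=False).
    """
    response = frangi(volume.astype(float), sigmas=sigmas, black_ridges=False)
    mask = response > 0.05 * response.max()   # placeholder threshold
    return response, mask

# e.g. a synthetic 3-D volume stands in for an MRA/CTA acquisition
response, mask = enhance_vessels(np.random.rand(32, 64, 64))
print(mask.sum(), "voxels flagged as vessel-like")
```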
Pipeline Reduction of Binary Light Curves from Large-Scale Surveys
NASA Astrophysics Data System (ADS)
Prša, Andrej; Zwitter, Tomaž
2007-08-01
One of the most important changes in observational astronomy of the 21st century is a rapid shift from classical object-by-object observations to extensive automatic surveys. As CCD detectors are getting better and their prices are getting lower, more and more small and medium-size observatories are refocusing their attention on the detection of stellar variability through systematic sky-scanning missions. This trend is additionally powered by the success of pioneering surveys such as ASAS, DENIS, OGLE, TASS, their space counterpart Hipparcos, and others. Such surveys produce massive amounts of data, and it is not at all clear how these data are to be reduced and analysed. This is especially striking in the eclipsing binary (EB) field, where the most frequently used tools are optimized for object-by-object analysis. A clear need for thorough, reliable and fully automated approaches to the modeling and analysis of EB data is thus obvious. This task is very difficult because of limited data quality, non-uniform phase coverage and parameter degeneracy. The talk will review recent advancements in putting together semi-automatic and fully automatic pipelines for EB data processing. Automatic procedures have already been used to process the Hipparcos data, LMC/SMC observations, OGLE and ASAS catalogs, etc. We shall discuss the advantages and shortcomings of these procedures and overview the current status of automatic EB modeling pipelines for upcoming missions such as CoRoT, Kepler, Gaia and others.
CFHT's SkyProbe: a real-time sky-transparency monitor
NASA Astrophysics Data System (ADS)
Cuillandre, Jean-Charles; Magnier, Eugene A.; Isani, Sidik; Sabin, Daniel; Knight, Wiley; Kras, Simon; Lai, Kamson
2002-12-01
We have developed a system at the Canada-France-Hawaii Telescope (CFHT), SkyProbe, which allows for the direct measurement of the true attenuation by clouds once per minute, to within a percent, directly on the field pointed at by the telescope. It has been possible to make this system relatively inexpensive thanks to low-cost CCD cameras from the amateur market. A crucial addition to this hardware is the quite recent availability of a full-sky photometry catalog at the appropriate depth: the Tycho catalog, from the Hipparcos mission. The central element is the automatic data analysis pipeline developed at CFHT, Elixir, for the improved operation of the CFHT wide-field imagers, CFH12K and MegaCam. SkyProbe's FITS images are processed in real time, and the pipeline output (a zero-point attenuation) provides the current sky transmission to the observers and helps immediate decision making. These measurements are also attached to the archived data, adding a key criterion for future use by other astronomers.
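The zero-point attenuation measurement can be illustrated in a few lines: match field stars to catalog magnitudes, form a robust zero point, and difference it against a clear-night baseline. A schematic sketch with the star matching assumed already done; not the Elixir implementation itself.

```python
import numpy as np

def attenuation_mag(instr_mag, catalog_mag, zp_clear):
    """Cloud attenuation, in magnitudes, from one monitoring frame.

    instr_mag/catalog_mag: matched instrumental and Tycho magnitudes;
    zp_clear: zero point established on a photometric night.
    """
    zp_now = np.median(catalog_mag - instr_mag)  # robust against outliers
    return zp_clear - zp_now                     # > 0 means cloud extinction

instr = np.array([12.41, 13.02, 11.87])          # illustrative measurements
catalog = np.array([9.95, 10.55, 9.43])
print(f"attenuation: {attenuation_mag(instr, catalog, zp_clear=-2.30):.2f} mag")
```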
49 CFR 195.583 - What must I do to monitor atmospheric corrosion control?
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false What must I do to monitor atmospheric corrosion... monitor atmospheric corrosion control? (a) You must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as follows: If the pipeline is located...
49 CFR 195.583 - What must I do to monitor atmospheric corrosion control?
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false What must I do to monitor atmospheric corrosion... monitor atmospheric corrosion control? (a) You must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as follows: If the pipeline is located...
49 CFR 195.583 - What must I do to monitor atmospheric corrosion control?
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false What must I do to monitor atmospheric corrosion... monitor atmospheric corrosion control? (a) You must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as follows: If the pipeline is located...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wynne, Adam S.
2011-05-05
In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, or interesting events or times, so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation from the other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high-volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high-volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities of service and the flexibility to evolve to handle new requirements. The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low-friction, robust, open-source middleware platform and extends it with component- and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
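To make the queued-pipeline idea concrete, here is a tiny threads-and-queues sketch mirroring the Web-crawler stages described above. MIF itself provides this wiring, buffering and multi-instance execution through its middleware run-time, so this is only the underlying concept, not the MIF API.

```python
import queue
import threading

def element(name, fn, q_in, q_out):
    """One pipeline element: consume data units from q_in, emit to q_out."""
    def run():
        while True:
            item = q_in.get()
            if item is None:        # poison pill: shut down and propagate
                q_out.put(None)
                break
            q_out.put(fn(item))
    t = threading.Thread(target=run, name=name, daemon=True)
    t.start()
    return t

q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
element("strip_tags", lambda doc: doc.replace("<p>", "").replace("</p>", ""), q1, q2)
element("tokenize", lambda doc: doc.split(), q2, q3)

q1.put("<p>pipeline elements linked by queues</p>")
q1.put(None)
print(q3.get())   # ['pipeline', 'elements', 'linked', 'by', 'queues']
```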
49 CFR 195.583 - What must I do to monitor atmospheric corrosion control?
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false What must I do to monitor atmospheric corrosion... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.583 What must I do to monitor atmospheric corrosion control? (a) You must inspect each pipeline or portion of pipeline that is...
1997 annual report : environmental monitoring program Louisiana offshore oil port pipeline.
DOT National Transportation Integrated Search
1998-06-01
The Louisiana Offshore Oil Port (LOOP) Environmental Monitoring Program includes an onshore pipeline vegetation and wildlife survey as a continuing study designed to measure the immediate and long-term impacts of LOOP-related pipeline construction an...
Lady Academe and Labor-Market Segmentation
ERIC Educational Resources Information Center
Bousquet, Marc
2012-01-01
The role of gender in the global economy is not represented particularly well by old-school "pipeline" theories of women entering particular industries, whether it is manufacturing, medicine, or college teaching. The pipeline analogy suggests that if women enter a field in equal or greater numbers to men, they will somehow automatically be "piped"…
NASA Astrophysics Data System (ADS)
Karpov, S.; Beskin, G.; Biryukov, A.; Bondar, S.; Ivanov, E.; Katkova, E.; Orekhova, N.; Perkov, A.; Sasyuk, V.
2017-07-01
Here we present a summary of the first years of operation and the first results of a novel 9-channel wide-field optical monitoring system with sub-second temporal resolution, Mini-MegaTORTORA (MMT-9), which is now in operation at the Special Astrophysical Observatory in the Russian Caucasus. The system is able to observe the sky simultaneously in either a wide (900 square degrees) or narrow (100 square degrees) field of view, either in clear light or with any combination of color (Johnson-Cousins B, V or R) and polarimetric filters installed, with exposure times ranging from 0.1 s to hundreds of seconds. The real-time data analysis pipeline performs automatic detection of rapid transient events, both near-Earth and extragalactic. The objects routinely detected by MMT also include faint meteors and artificial satellites.
NASA Astrophysics Data System (ADS)
Karpov, S.; Beskin, G.; Biryukov, A.; Bondar, S.; Ivanov, E.; Katkova, E.; Orekhova, N.; Perkov, A.; Sasyuk, V.
2017-06-01
Here we present a summary of the first years of operation and the first results of a novel 9-channel wide-field optical monitoring system with sub-second temporal resolution, Mini-MegaTORTORA (MMT-9), which is now in operation at the Special Astrophysical Observatory in the Russian Caucasus. The system is able to observe the sky simultaneously in either a wide (~900 square degrees) or narrow (~100 square degrees) field of view, either in clear light or with any combination of color (Johnson-Cousins B, V or R) and polarimetric filters installed, with exposure times ranging from 0.1 s to hundreds of seconds. The real-time data analysis pipeline performs automatic detection of rapid transient events, both near-Earth and extragalactic. The objects routinely detected by MMT include faint meteors and artificial satellites.
Perspectives of Cross-Correlation in Seismic Monitoring at the International Data Centre
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Zerbo, Lassina
2014-03-01
We demonstrate that several techniques based on waveform cross-correlation are able to significantly reduce the detection threshold of seismic sources worldwide and to improve the reliability of arrivals by a more accurate estimation of their defining parameters. A master event and the events it can find using waveform cross-correlation at array stations of the International Monitoring System (IMS) have to be close. For the purposes of the International Data Centre (IDC), one can use the spatial closeness of the master and slave events to construct a new automatic processing pipeline: all qualified arrivals detected using cross-correlation are associated with events matching the current IDC event definition criteria (EDC) in a local association procedure. Considering the repeating character of global seismicity, more than 90% of events in the reviewed event bulletin (REB) can be built in this automatic processing. Due to the reduced detection threshold, waveform cross-correlation may increase the number of valid REB events by a factor of 1.5-2.0. Therefore, the new pipeline may produce a more comprehensive bulletin than the current pipeline, which is the goal of seismic monitoring. The analysts' experience with the cross-correlation event list (XSEL) shows that the workload of interactive processing might be reduced by a factor of two or even more. Since cross-correlation produces a comprehensive list of detections for a given master event, no additional arrivals from primary stations are expected to be associated with the XSEL events. The number of false alarms, relative to the number of events rejected from the standard event list 3 (SEL3) in the current interactive processing, can also be reduced by the use of several powerful filters. The principal filter is the difference between the arrival times of the master and newly built events at three or more primary stations, which should lie in a narrow range of a few seconds. In this study, one event at a distance of about 2,000 km from the main shock was formed by three stations, with the stations and both events on the same great circle. Such spurious events are rejected by checking the consistency between detections at stations at different back azimuths from the source region. Two additional effective pre-filters are f-k analysis and F prob, both based on correlation traces instead of original waveforms. Overall, waveform cross-correlation is able to improve the REB completeness, to reduce the workload related to IDC interactive analysis, and to provide a precise tool for quality checks of both arrivals and events. Some major improvements in automatic and interactive processing achieved by cross-correlation are illustrated using an aftershock sequence from a large continental earthquake. Exploring this sequence, we describe schematically the next steps for the development of a processing pipeline parallel to the existing IDC one, in order to improve the quality of the REB together with a reduction of the magnitude threshold.
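As a rough illustration of the core operation, the sketch below slides a master waveform along a continuous trace and flags windows whose normalized cross-correlation exceeds a threshold. The 0.7 threshold and the plain-Python loop are illustrative, not IDC settings:

```python
# Master-event detection by normalized cross-correlation (numpy only).
import numpy as np

def xcorr_detect(trace, master, threshold=0.7):
    """Return (sample index, correlation) for windows matching the master."""
    n = len(master)
    m = (master - master.mean()) / (master.std() * n)   # pre-normalized template
    detections = []
    for i in range(len(trace) - n):
        w = trace[i:i + n]
        cc = np.sum(m * (w - w.mean()) / (w.std() + 1e-12))
        if cc > threshold:
            detections.append((i, cc))
    return detections
```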
Caboche, Ségolène; Even, Gaël; Loywick, Alexandre; Audebert, Christophe; Hot, David
2017-12-19
The increase in available sequence data has advanced the field of microbiology; however, making sense of these data without bioinformatics skills is still problematic. We describe MICRA, an automatic pipeline, available as a web interface, for microbial identification and characterization through read analysis. MICRA uses iterative mapping against reference genomes to identify genes and variations. Additional modules allow the prediction of antibiotic susceptibility and resistance and the comparison of results across several samples. MICRA is fast, producing few false-positive annotations and variant calls compared to current methods, making it a tool of great interest for fully exploiting sequencing data.
Pipeline monitoring with unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Kochetkova, L. I.
2018-05-01
Pipeline leakage during the transportation of combustible substances can lead to explosions and fires, causing loss of life and the destruction of production and residential facilities. Continuous pipeline monitoring allows leaks to be identified in due time and measures for their elimination to be taken quickly. The paper describes a solution for identifying pipeline leakage using unmanned aerial vehicles. Spectral analysis of the input RGB signal is recommended for identifying pipeline damage. The use of multi-zone digital images makes it possible to delineate potential spills of oil hydrocarbons as well as possible soil pollution. The method of multi-temporal digital images within the visible region makes it possible to detect changes in soil morphology for subsequent analysis. The given solution is cost-efficient and reliable, reducing the time and labor required in comparison with other methods of pipeline monitoring.
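The multi-temporal approach reduces to differencing co-registered frames of the same right-of-way taken at different epochs. A minimal sketch, assuming two aligned RGB images and an illustrative change threshold:

```python
# Flag pixels whose mean RGB change between two epochs exceeds a threshold.
import numpy as np

def change_mask(rgb_t0, rgb_t1, threshold=30.0):
    """Return a boolean mask of pixels that changed between two epochs."""
    diff = np.abs(rgb_t1.astype(float) - rgb_t0.astype(float))
    return diff.mean(axis=2) > threshold   # mean over the R, G, B channels

# A crude leak alert could then be the fraction of changed pixels, e.g.:
# alert = change_mask(frame_may, frame_june).mean() > 0.05
```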
Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report
DOT National Transportation Integrated Search
2008-11-26
The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...
Continuous Turbidity Monitoring in the Indian Creek Watershed, Tazewell County, Virginia, 2006-08
Moyer, Douglas; Hyer, Kenneth
2009-01-01
Thousands of miles of natural gas pipelines are installed annually in the United States. These pipelines commonly cross streams, rivers, and other water bodies during pipeline construction. A major concern associated with pipelines crossing water bodies is increased sediment loading and the subsequent impact on the ecology of the aquatic system. Several studies have investigated the techniques used to install pipelines across surface-water bodies and their effect on downstream suspended-sediment concentrations. These studies frequently employ the evaluation of suspended-sediment or turbidity data that were collected using discrete sample-collection methods. No studies, however, have evaluated the utility of continuous turbidity monitoring for identifying real-time sediment input and providing a robust dataset for the evaluation of long-term changes in suspended-sediment concentration as it relates to a pipeline crossing. In 2006, the U.S. Geological Survey, in cooperation with East Tennessee Natural Gas and the U.S. Fish and Wildlife Service, began a study to monitor the effects of construction of the Jewell Ridge Lateral natural gas pipeline on turbidity conditions below pipeline crossings of Indian Creek and an unnamed tributary to Indian Creek, in Tazewell County, Virginia. The potential for increased sediment loading to Indian Creek is of major concern for watershed managers because Indian Creek is listed as one of Virginia's Threatened and Endangered Species Waters and contains critical habitat for two freshwater mussel species, purple bean (Villosa perpurpurea) and rough rabbitsfoot (Quadrula cylindrica strigillata). Additionally, Indian Creek contains the last known reproducing population of the tan riffleshell (Epioblasma florentina walkeri). Therefore, the objectives of the U.S. Geological Survey monitoring effort were to (1) develop a continuous turbidity monitoring network that attempted to measure real-time changes in suspended sediment (using turbidity as a surrogate) downstream from the pipeline crossings, and (2) provide continuous turbidity data that enable the development of a real-time turbidity-input warning system and assessment of long-term changes in turbidity conditions. Water-quality conditions were assessed using continuous water-quality monitors deployed upstream and downstream from the pipeline crossings in Indian Creek and the unnamed tributary. These paired upstream and downstream monitors were outfitted with turbidity, pH (for Indian Creek only), specific-conductance, and water-temperature sensors. Water-quality data were collected continuously (every 15 minutes) during three phases of the pipeline construction: pre-construction, during construction, and post-construction. Continuous turbidity data were evaluated at various time steps to determine whether the construction of the pipeline crossings had an effect on downstream suspended-sediment conditions in Indian Creek and the unnamed tributary. These continuous turbidity data were analyzed in real time with the aid of a turbidity-input warning system. A warning occurred when turbidity values downstream from the pipeline were 6 Formazin Nephelometric Units or 15 percent (depending on the observed range) greater than turbidity upstream from the pipeline crossing. Statistical analyses also were performed on monthly and phase-of-construction turbidity data to determine if the pipeline crossing served as a long-term source of sediment.
Results of this intensive water-quality monitoring effort indicate that values of turbidity in Indian Creek increased significantly between the upstream and downstream water-quality monitors during the construction of the Jewell Ridge pipeline. The magnitude of the significant turbidity increase, however, was small (less than 2 Formazin Nephelometric Units). Patterns in the continuous turbidity data indicate that the actual pipeline crossing of Indian Creek had little influence on downstream water quality; co
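The warning rule quoted above (downstream exceeding upstream by 6 FNU or by 15 percent, depending on the observed range) reduces to a few lines of code. The cut-off separating the two regimes is an assumption here, since the report does not state it:

```python
# Turbidity-input warning: absolute criterion at low turbidity, relative at high.
def turbidity_warning(upstream_fnu, downstream_fnu, range_cutoff=40.0):
    if upstream_fnu < range_cutoff:                    # low-turbidity regime
        return downstream_fnu - upstream_fnu > 6.0     # absolute criterion, FNU
    return downstream_fnu > 1.15 * upstream_fnu        # 15 percent criterion

print(turbidity_warning(3.0, 10.0))   # True: +7 FNU exceeds the 6 FNU limit
```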
Early-type galaxies: Automated reduction and analysis of ROSAT PSPC data
NASA Technical Reports Server (NTRS)
Mackie, G.; Fabbiano, G.; Harnden, F. R., Jr.; Kim, D.-W.; Maggio, A.; Micela, G.; Sciortino, S.; Ciliegi, P.
1996-01-01
Preliminary results for early-type galaxies that will be part of a galaxy catalog to be derived from the complete ROSAT data base are presented. The stored data were reduced and analyzed by an automatic pipeline. This pipeline is based on a command language script. The important features of the pipeline include time screening of the data in order to maximize the signal-to-noise ratio of faint point-like sources, source detection via a wavelet algorithm, and the identification of sources with objects from existing catalogs. The pipeline outputs include reduced images, contour maps, surface brightness profiles, spectra, and color and hardness ratios.
49 CFR 192.477 - Internal corrosion control: Monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Internal corrosion control: Monitoring. 192.477 Section 192.477 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELIN...
49 CFR 192.477 - Internal corrosion control: Monitoring.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Internal corrosion control: Monitoring. 192.477 Section 192.477 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELIN...
49 CFR 192.477 - Internal corrosion control: Monitoring.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Internal corrosion control: Monitoring. 192.477 Section 192.477 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELIN...
Almazyad, Abdulaziz S.; Seddiq, Yasser M.; Alotaibi, Ahmed M.; Al-Nasheri, Ahmed Y.; BenSaleh, Mohammed S.; Obeid, Abdulfattah M.; Qasim, Syed Manzoor
2014-01-01
Anomalies such as leakage and bursts in water pipelines have severe consequences for the environment and the economy. To ensure the reliability of water pipelines, they must be monitored effectively. Wireless Sensor Networks (WSNs) have emerged as an effective technology for monitoring critical infrastructure such as water, oil and gas pipelines. In this paper, we present a scalable design and simulation of a water pipeline leakage monitoring system using Radio Frequency IDentification (RFID) and WSN technology. The proposed design targets long-distance aboveground water pipelines that have special considerations for maintenance, energy consumption and cost. The design is based on deploying a group of mobile wireless sensor nodes inside the pipeline and allowing them to work cooperatively according to a prescheduled order. Under this mechanism, only one node is active at a time, while the other nodes are sleeping. The node whose turn is next wakes up according to one of three wakeup techniques: location-based, time-based and interrupt-driven. In this paper, mathematical models are derived for each technique to estimate the corresponding energy consumption and memory size requirements. The proposed equations are analyzed and the results are validated using simulation. PMID:24561404
Almazyad, Abdulaziz S; Seddiq, Yasser M; Alotaibi, Ahmed M; Al-Nasheri, Ahmed Y; BenSaleh, Mohammed S; Obeid, Abdulfattah M; Qasim, Syed Manzoor
2014-02-20
Anomalies such as leakage and bursts in water pipelines have severe consequences for the environment and the economy. To ensure the reliability of water pipelines, they must be monitored effectively. Wireless Sensor Networks (WSNs) have emerged as an effective technology for monitoring critical infrastructure such as water, oil and gas pipelines. In this paper, we present a scalable design and simulation of a water pipeline leakage monitoring system using Radio Frequency IDentification (RFID) and WSN technology. The proposed design targets long-distance aboveground water pipelines that have special considerations for maintenance, energy consumption and cost. The design is based on deploying a group of mobile wireless sensor nodes inside the pipeline and allowing them to work cooperatively according to a prescheduled order. Under this mechanism, only one node is active at a time, while the other nodes are sleeping. The node whose turn is next wakes up according to one of three wakeup techniques: location-based, time-based and interrupt-driven. In this paper, mathematical models are derived for each technique to estimate the corresponding energy consumption and memory size requirements. The proposed equations are analyzed and the results are validated using simulation.
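The energy argument behind the one-node-awake schedule can be sketched directly. The current draws below are generic figures for a sensor node, not values from the paper:

```python
# With only one of n nodes active at a time, the average per-node power draw
# falls to roughly (active draw)/n plus the sleep draw.
def avg_node_power(n_nodes, p_active_mw=60.0, p_sleep_mw=0.05):
    """Mean power per node when exactly one node is awake at any moment."""
    duty = 1.0 / n_nodes
    return duty * p_active_mw + (1.0 - duty) * p_sleep_mw

for n in (1, 5, 20):
    print(n, round(avg_node_power(n), 2), "mW")   # lifetime scales roughly with n
```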
NASA Astrophysics Data System (ADS)
Zulfikar, Can; Pinar, Ali; Tunc, Suleyman; Erdik, Mustafa
2014-05-01
The Istanbul EEW network, consisting of 10 inland and 5 OBS strong motion stations located close to the Main Marmara Fault zone, is operated by KOERI. Data transmission between the remote stations and the base station at KOERI is provided with both satellite and fiber optic cable systems. The continuous on-line data from these stations are used to provide real-time warning for emerging potentially disastrous earthquakes. The data transmission time from the remote stations to the KOERI data center is a few milliseconds through fiber optic lines and less than a second via satellites. The early warning signal (consisting of three alarm levels) is communicated to the appropriate servo shut-down systems of the recipient facilities, which automatically decide on the proper action based on the alarm level. The Istanbul Gas Distribution Corporation (IGDAS) is one of the end users of the EEW signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 550 district regulators and 474,000 service boxes. State-of-the-art protection systems automatically cut natural gas flow when breaks in the pipelines are detected. Since 2005, buildings in Istanbul using natural gas have been required to install seismometers that automatically cut natural gas flow when certain thresholds are exceeded. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information about quantities related to pipeline monitoring, including input-output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at 581 district regulator sites. The SCADA system of IGDAS receives the EEW signal from KOERI and decides on the proper actions according to previously specified ground acceleration levels. Presently, KOERI sends the EEW signal to the SCADA system of the IGDAS natural gas network of Istanbul. The EEW signal of KOERI is also transmitted to the servo shut-down system of the Marmaray Rail Tube Tunnel and Commuter Rail Mass Transit System in Istanbul. The Marmaray system includes an undersea railway tunnel under the Bosphorus Strait. Several strong motion instruments are installed within the tunnel for taking measurements against strong ground shaking and for early warning purposes. This system is integrated with the KOERI EEW system. KOERI sends the EEW signal to the command center of Marmaray. Having received the signal, the command center puts the previously defined measures into action. For example, trains within the tunnel are stopped at the nearest station, trains approaching the tunnel are denied entry, and protective flood caps are closed to seal the connection between the onshore and offshore tunnels.
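On the receiving side, the three-alarm-level logic amounts to a threshold ladder. The sketch below is a plausible reconstruction; the PGA thresholds are assumptions, not IGDAS operational settings:

```python
# Map an estimated peak ground acceleration (in g) to one of three alarm
# levels and decide whether to trip the slam-shut valves.
def alarm_level(pga_g):
    if pga_g >= 0.20:
        return 3        # highest alarm: automatic shut-down
    if pga_g >= 0.10:
        return 2
    if pga_g >= 0.05:
        return 1
    return 0

def should_cut_gas(pga_g):
    return alarm_level(pga_g) == 3

print(should_cut_gas(0.25))   # True
```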
Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas
2013-01-01
Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often end in a local maximum.
Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas
2013-01-01
Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often end in a local maximum. PMID:23766941
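A simple way to escape the local maxima discussed above is to combine coordinate ascent with random restarts. The sketch below assumes a scalar `score` function measuring segmentation quality (e.g., Dice against labelled training images); parameter names, ranges and step sizes are placeholders:

```python
# Random-restart coordinate ascent over a pipeline's parameter space.
import random

def coordinate_ascent(score, params, steps, iters=50):
    """Greedily adjust one parameter at a time while the score improves."""
    best = dict(params)
    for _ in range(iters):
        for name, step in steps.items():
            for cand in (best[name] - step, best[name] + step):
                trial = dict(best, **{name: cand})
                if score(trial) > score(best):
                    best = trial
    return best

def fit_with_restarts(score, ranges, steps, restarts=10):
    """Restart from random points to avoid getting stuck in one local maximum."""
    best = None
    for _ in range(restarts):
        start = {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
        cand = coordinate_ascent(score, start, steps)
        if best is None or score(cand) > score(best):
            best = cand
    return best
```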
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
ATPP: A Pipeline for Automatic Tractography-Based Brain Parcellation
Li, Hai; Fan, Lingzhong; Zhuo, Junjie; Wang, Jiaojian; Zhang, Yu; Yang, Zhengyi; Jiang, Tianzi
2017-01-01
There is a longstanding effort to parcellate the brain into areas based on micro-structural, macro-structural, or connectional features, forming various brain atlases. Among them, connectivity-based parcellation has gained much emphasis, especially with the considerable progress of multimodal magnetic resonance imaging in the past two decades. The recently published Brainnetome Atlas is such an atlas, following the framework of connectivity-based parcellation. However, in the construction of the atlas, the deluge of high-resolution multimodal MRI data and the time-consuming computation pose challenges, and publicly available tools dedicated to parcellation are still in short supply. In this paper, we present an integrated open source pipeline (https://www.nitrc.org/projects/atpp), named the Automatic Tractography-based Parcellation Pipeline (ATPP), to realize the framework of parcellation with automatic processing and massive parallel computing. ATPP has been developed to have a powerful and flexible command line version, taking multiple regions of interest as input, as well as a user-friendly graphical user interface version for parcellating a single region of interest. We demonstrate the two versions by parcellating two brain regions, the left precentral gyrus and middle frontal gyrus, on two independent datasets. In addition, ATPP has been successfully utilized and fully validated in a variety of brain regions and in the human Brainnetome Atlas, showing its capacity to greatly facilitate brain parcellation. PMID:28611620
Scherer, Sebastian; Kowal, Julia; Chami, Mohamed; Dandey, Venkata; Arheit, Marcel; Ringler, Philippe; Stahlberg, Henning
2014-05-01
The introduction of direct electron detectors (DED) to cryo-electron microscopy has tremendously increased the signal-to-noise ratio (SNR) and quality of the recorded images. We discuss the optimal use of DEDs for cryo-electron crystallography, introduce a new automatic image processing pipeline, and demonstrate the vast improvement in the resolution achieved by the use of both together, especially for highly tilted samples. The new processing pipeline (now included in the software package 2dx) exploits the high SNR and frame readout frequency of DEDs to automatically correct for beam-induced sample movement, and reliably processes individual crystal images without human interaction as data are being acquired. A new graphical user interface (GUI) condenses all information required for quality assessment in one window, allowing the imaging conditions to be verified and adjusted during the data collection session. With this new pipeline an automatically generated unit cell projection map of each recorded 2D crystal is available less than 5 min after the image was recorded. The entire processing procedure yielded a three-dimensional reconstruction of the 2D-crystallized ion-channel membrane protein MloK1 with a much-improved resolution of 5Å in-plane and 7Å in the z-direction, within 2 days of data acquisition and simultaneous processing. The results obtained are superior to those delivered by conventional photographic film-based methodology of the same sample, and demonstrate the importance of drift-correction.
Efficient and automatic image reduction framework for space debris detection based on GPU technology
NASA Astrophysics Data System (ADS)
Diprima, Francesco; Santoni, Fabio; Piergentili, Fabrizio; Fortunato, Vito; Abbattista, Cristoforo; Amoruso, Leonardo
2018-04-01
In recent years, the increasing number of space debris objects has triggered the need for a distributed monitoring system for the prevention of possible space collisions. Space surveillance based on ground telescopes allows the monitoring of the traffic of Resident Space Objects (RSOs) in Earth orbit. This space debris surveillance has several applications, such as orbit prediction and conjunction assessment. In this paper, an optimized and performance-oriented pipeline for source extraction is proposed, intended for the automatic detection of space debris in optical data. The detection method is based on morphological operations and the Hough transform for lines. Near-real-time detection is obtained using General Purpose computing on Graphics Processing Units (GPGPU). The high degree of processing parallelism provided by GPGPU makes it possible to split the data analysis over thousands of threads in order to process big datasets in a limited computational time. The implementation has been tested on a large and heterogeneous image data set, containing images of satellites from different orbit ranges and multiple observation modes (i.e., sidereal and object tracking). These images were taken during an observation campaign performed from the EQUO (EQUatorial Observatory) observatory located at the Broglio Space Center (BSC) in Kenya, which is part of the ASI-Sapienza Agreement.
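The two-stage detection chain (morphology, then Hough transform for lines) might look as follows with OpenCV. All thresholds are illustrative, and a production pipeline would sweep several kernel orientations and run the heavy steps on the GPU:

```python
# Suppress point-like stars with a morphological opening, then pick up
# line-shaped debris streaks with a probabilistic Hough transform.
import cv2
import numpy as np

def detect_streaks(image_8bit):
    """Return line segments (x1, y1, x2, y2) found in an 8-bit grayscale frame."""
    # Single horizontal element for brevity; real streaks have arbitrary
    # orientation, so several rotated kernels would be applied in practice.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (9, 1))
    opened = cv2.morphologyEx(image_8bit, cv2.MORPH_OPEN, kernel)
    _, binary = cv2.threshold(opened, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    lines = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180,
                            threshold=50, minLineLength=30, maxLineGap=5)
    return [] if lines is None else [l[0] for l in lines]
```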
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 2005) Standard Practice for Automatic Sampling of Petroleum and Petroleum Products, IBR approved for... from Railroad Cars, Barges, Trucks, or Stockpiles, IBR approved for § 98.164(b). (35) ASTM D7430-08ae1... Liquids—Automatic pipeline sampling—Second Edition 1988-12-01, IBR approved for § 98.164(b). (3) [Reserved...
Song, Dandan; Li, Ning; Liao, Lejian
2015-01-01
Because enormous amounts of data can now be generated at lower cost and in shorter time, whole-exome sequencing technologies provide dramatic opportunities for identifying disease genes implicated in Mendelian disorders. Since thousands of genomic variants can be sequenced in each exome, it is challenging to filter for pathogenic variants in protein-coding regions while keeping the number of missed true variants low. Therefore, an automatic and efficient pipeline for finding disease variants in Mendelian disorders is designed, exploiting a combination of variant-filtering steps to analyze the family-based exome sequencing approach. Recent studies on Freeman-Sheldon disease are revisited, showing that the proposed method outperforms other existing candidate gene identification methods.
Long-Term Monitoring of Cased Pipelines Using Longrange Guided-Wave Technique
DOT National Transportation Integrated Search
2009-05-19
Integrity management programs for gas transmission pipelines are required by The Office of Pipeline Safety (OPS)/DOT. Direct Assessment (DA) and 'Other Technologies' have become the focus of assessment options for pipeline integrity on cased crossing...
49 CFR 192.481 - Atmospheric corrosion control: Monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Atmospheric corrosion control: Monitoring. 192.481... Control § 192.481 Atmospheric corrosion control: Monitoring. (a) Each operator must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as...
49 CFR 192.481 - Atmospheric corrosion control: Monitoring.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Atmospheric corrosion control: Monitoring. 192.481... Control § 192.481 Atmospheric corrosion control: Monitoring. (a) Each operator must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as...
49 CFR 192.481 - Atmospheric corrosion control: Monitoring.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Atmospheric corrosion control: Monitoring. 192.481... Control § 192.481 Atmospheric corrosion control: Monitoring. (a) Each operator must inspect each pipeline or portion of pipeline that is exposed to the atmosphere for evidence of atmospheric corrosion, as...
49 CFR 192.465 - External corrosion control: Monitoring.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false External corrosion control: Monitoring. 192.465... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.465 External corrosion control: Monitoring. (a) Each pipeline that is under cathodic...
49 CFR 192.481 - Atmospheric corrosion control: Monitoring.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Atmospheric corrosion control: Monitoring. 192.481... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.481 Atmospheric corrosion control: Monitoring. (a) Each operator must inspect each pipeline...
49 CFR 192.465 - External corrosion control: Monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false External corrosion control: Monitoring. 192.465... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.465 External corrosion control: Monitoring. (a) Each pipeline that is under cathodic...
49 CFR 192.465 - External corrosion control: Monitoring.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false External corrosion control: Monitoring. 192.465... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.465 External corrosion control: Monitoring. (a) Each pipeline that is under cathodic...
49 CFR 192.465 - External corrosion control: Monitoring.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false External corrosion control: Monitoring. 192.465... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.465 External corrosion control: Monitoring. (a) Each pipeline that is under cathodic...
Use of FBG sensors for health monitoring of pipelines
NASA Astrophysics Data System (ADS)
Felli, Ferdinando; Paolozzi, Antonio; Vendittozzi, Cristian; Paris, Claudio; Asanuma, Hiroshi
2016-04-01
The infrastructure for oil and gas production and distribution needs reliable monitoring systems. The risks for pipelines, in particular, are not limited to natural disasters (landslides, earthquakes, extreme environmental conditions) and accidents, but also involve damage related to criminal activities, such as oil theft. Existing monitoring systems are not adequate for detecting damage from oil theft, and on several occasions the illegal activities have resulted in leakage of oil and catastrophic environmental pollution. Systems based on fiber-optic FBG (Fiber Bragg Grating) sensors present a number of advantages for pipeline monitoring. FBG sensors can withstand harsh environments, are immune to interference, and can be used to develop a smart system for monitoring several physical quantities at the same time, such as strain, temperature, acceleration, pressure, and vibration. The monitoring station can be positioned tens of kilometers away from the measuring points, lowering the cost and complexity of the system. This paper describes tests on a sensor, based on FBG technology, developed specifically for detecting pipeline damage due to illegal activities (drilling of the pipes), which can be integrated into a smart monitoring chain.
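For reference, converting an FBG reading into strain follows the standard Bragg relation λ_B = 2 n_eff Λ; a relative wavelength shift decomposes into strain and temperature terms. The coefficients below are typical values for silica fibre near 1550 nm, not calibration data from the sensor described above:

```python
# Strain from a Bragg wavelength shift:
#   d_lambda / lambda0 = (1 - p_e) * strain + k_t * d_temp
def strain_from_shift(d_lambda_nm, lambda0_nm=1550.0, d_temp_c=0.0,
                      p_e=0.22, k_t_per_c=6.7e-6):
    """Return strain (dimensionless) from a Bragg wavelength shift in nm."""
    rel_shift = d_lambda_nm / lambda0_nm
    return (rel_shift - k_t_per_c * d_temp_c) / (1.0 - p_e)

print(strain_from_shift(0.12))   # ~1e-4 strain for a 0.12 nm shift at 1550 nm
```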
49 CFR 193.2635 - Monitoring corrosion control.
Code of Federal Regulations, 2010 CFR
2010-10-01
....2635 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY LIQUEFIED NATURAL GAS FACILITIES: FEDERAL SAFETY STANDARDS Maintenance § 193.2635 Monitoring corrosion control...
Automatic creation of three-dimensional avatars
NASA Astrophysics Data System (ADS)
Villa-Uriol, Maria-Cruz; Sainz, Miguel; Kuester, Falko; Bagherzadeh, Nader
2003-01-01
Highly accurate avatars of humans promise a new level of realism in engineering and entertainment applications, including areas such as computer-animated movies, computer game development, interactive virtual environments and tele-presence. In order to provide high-quality avatars, new techniques for their automatic acquisition and creation are required. A framework for the capture and construction of arbitrary avatars from image data is presented in this paper. Avatars are automatically reconstructed from multiple static images of a human subject by utilizing image information to reshape a synthetic three-dimensional articulated reference model. A pipeline is presented that combines a set of hardware-accelerated stages into one seamless system. Primary stages in this pipeline include pose estimation, skeleton fitting, body part segmentation, geometry construction and coloring, leading to avatars that can be animated and included in interactive environments. The presented system removes traditional constraints on the initial pose of the captured subject by using silhouette-based modification techniques in combination with a reference model. Results can be obtained in near-real time with very limited user intervention.
PyEmir: Data Reduction Pipeline for EMIR, the GTC Near-IR Multi-Object Spectrograph
NASA Astrophysics Data System (ADS)
Pascual, S.; Gallego, J.; Cardiel, N.; Eliche-Moral, M. C.
2010-12-01
EMIR is the near-infrared wide-field camera and multi-slit spectrograph being built for the Gran Telescopio Canarias. We present here the work being done on its data processing pipeline. PyEmir is based on Python and will automatically process data taken in both imaging and spectroscopy modes. PyEmir is being developed by the UCM Group of Extragalactic Astrophysics and Astronomical Instrumentation.
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbons, Steven J.; Kvaerna, Tormod; Harris, David B.
We report that aftershock sequences following very large earthquakes present enormous challenges to the near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams that are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located, using a separate specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Lastly, further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace detectors and/or empirical matched field processing.
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
Gibbons, Steven J.; Kvaerna, Tormod; Harris, David B.; ...
2016-06-08
We report that aftershock sequences following very large earthquakes present enormous challenges to the near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams that are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located, using a separate specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Lastly, further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace detectors and/or empirical matched field processing.
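The stripping step at the heart of the proposal can be sketched as follows; `travel_time` stands in for a travel-time lookup (for example, from a 1-D velocity model), and the 4 s tolerance is an assumption:

```python
# Remove detections already explained by well-located aftershocks before
# re-running the phase-association algorithm on what remains.
def strip_associated(detections, aftershocks, travel_time, tol_s=4.0):
    """detections: list of (station, time);
    aftershocks: list of (origin_time, lat, lon, depth)."""
    kept = []
    for station, t in detections:
        predicted = (origin_t + travel_time(station, lat, lon, depth)
                     for (origin_t, lat, lon, depth) in aftershocks)
        if not any(abs(t - p) <= tol_s for p in predicted):
            kept.append((station, t))   # unexplained: keep for association
    return kept
```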
NASA Astrophysics Data System (ADS)
Korotaev, Valery V.; Denisov, Victor M.; Rodrigues, Joel J. P. C.; Serikova, Mariya G.; Timofeev, Andrey V.
2015-05-01
The paper deals with the creation of integrated monitoring systems that combine fiber-optic classifiers and local sensor networks. These systems allow for the monitoring of complex industrial objects which, together with adjacent natural objects, form so-called geotechnical systems. An integrated monitoring system may include one or more spatially continuous fiber-optic classifiers and one or more arrays of discrete measurement sensors, which are usually combined into sensor networks. Fiber-optic classifiers are already widely used for the control of hazardous extended objects (oil and gas pipelines, railways, high-rise buildings, etc.). To monitor local objects, discrete measurement sensors are generally used (temperature, pressure, inclinometers, strain gauges, accelerometers, sensors measuring the composition of impurities in the air, and many others). However, monitoring complex geotechnical systems requires the simultaneous use of continuous spatially distributed sensors based on fiber-optic cable and connected local networks of discrete sensors. In fact, we are talking about the integration of the two monitoring methods. This combination provides an additional way to create intelligent monitoring systems whose modes of operation can automatically adapt to changing environmental conditions. For this purpose, context data received from one sensor (e.g., the optical channel) may be used to change the modes of operation of other sensors within the same monitoring system. This work also presents experimental results from a prototype of the integrated monitoring system.
A Semi-Automatic Image-Based Close Range 3D Modeling Pipeline Using a Multi-Camera Configuration
Rau, Jiann-Yeou; Yeh, Po-Chia
2012-01-01
The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters will remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically, once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. The objects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to a large quantity of antiques stored in a museum. PMID:23112656
A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.
Rau, Jiann-Yeou; Yeh, Po-Chia
2012-01-01
The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters will remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically, once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. The objects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to a large quantity of antiques stored in a museum.
Yoon, Jun-Hee; Kim, Thomas W; Mendez, Pedro; Jablons, David M; Kim, Il-Jin
2017-01-01
The development of next-generation sequencing (NGS) technology makes it possible to sequence whole exomes or genomes. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analyses, which translates into a delay and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for developing an automatic, easy-to-use NGS data analysis system. We developed a comprehensive, automatic genetic analysis controller named Mobile Genome Express (MGE) that works on smartphones or other mobile devices. MGE can handle all the steps for genetic analyses, such as sample information submission, sequencing run quality checks from the sequencer, secured data transfer and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel; the whole analysis was managed by MGE and its data review program, ELECTRO. All steps were processed automatically except for the final sequencing review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours. We confirmed that the mutations we identified were consistent with our previous results obtained using multi-step, manual pipelines.
Pipelining in structural health monitoring wireless sensor network
NASA Astrophysics Data System (ADS)
Li, Xu; Dorvash, Siavash; Cheng, Liang; Pakzad, Shamim
2010-04-01
Application of wireless sensor networks (WSNs) for structural health monitoring (SHM) is becoming widespread due to their ease of implementation and economic advantage over traditional sensor networks. Besides the advantages that make wireless networks preferable, there are some concerns regarding their performance in certain applications. In long-span bridge monitoring, the need to transfer data over long distances poses challenges in the design of WSN platforms. Due to the geometry of bridge structures, using multi-hop data transfer between remote nodes and the base station is essential. This paper focuses on the performance of pipelining algorithms. We summarize several prevalent pipelining approaches, discuss their performance, and propose a new pipelining algorithm that gives consideration both to boosting channel usage and to simplicity of deployment.
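The channel-reuse idea behind such pipelining can be shown on a linear chain: nodes far enough apart can transmit in the same slot without interfering. The spacing h = 3 below is an assumed interference range, and this sketch is an illustration of the general idea, not the algorithm proposed in the paper:

```python
# Slot schedule for a linear (bridge-like) chain: node i transmits in the
# slots where slot % h == i % h, so the channel is reused every h hops.
def pipeline_schedule(n_nodes, n_slots, h=3):
    """Return {slot: [nodes transmitting in that slot]}."""
    return {slot: [i for i in range(n_nodes) if i % h == slot % h]
            for slot in range(n_slots)}

for slot, nodes in pipeline_schedule(9, 3).items():
    print(slot, nodes)   # several nodes forward data concurrently per slot
```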
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
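The trajectory-regression idea can be illustrated with a least-squares fit per coordinate: accept a linear (constant proper motion) model when its residuals are small, otherwise fall back to a curved model. The threshold and units below are assumptions, not the STRIP algorithm itself:

```python
# Classify a jet component's trajectory as linear or non-linear.
import numpy as np

def fit_trajectory(epochs_yr, pos_mas, max_rms_mas=0.1):
    """Fit positions (mas) vs epochs (yr); return model type and coefficients."""
    lin = np.polyfit(epochs_yr, pos_mas, 1)
    rms = np.sqrt(np.mean((np.polyval(lin, epochs_yr) - pos_mas) ** 2))
    if rms <= max_rms_mas:
        return "linear", lin          # slope = proper motion in mas/yr
    return "non-linear", np.polyfit(epochs_yr, pos_mas, 2)
```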
BALBES: a molecular-replacement pipeline.
Long, Fei; Vagin, Alexei A; Young, Paul; Murshudov, Garib N
2008-01-01
The number of macromolecular structures solved and deposited in the Protein Data Bank (PDB) is higher than 40 000. Using this information in macromolecular crystallography (MX) should in principle increase the efficiency of MX structure solution. This paper describes a molecular-replacement pipeline, BALBES, that makes extensive use of this repository. It uses a reorganized database taken from the PDB with multimeric as well as domain organization. A system manager written in Python controls the workflow of the process. Testing the current version of the pipeline using entries from the PDB has shown that this approach has huge potential and that around 75% of structures can be solved automatically without user intervention.
Method for Stereo Mapping Based on ObjectARX and Pipeline Technology
NASA Astrophysics Data System (ADS)
Liu, F.; Chen, T.; Lin, Z.; Yang, Y.
2012-07-01
Stereo mapping is an important way to acquire 4D products. Based on the development of stereo mapping and the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme is offered that can realize interaction between AutoCAD and a digital photogrammetry system. An experiment was conducted to verify the scheme's feasibility using the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation); the experimental results show that the scheme is feasible and of great significance for the integration of data acquisition and editing.
NASA Astrophysics Data System (ADS)
Ryabkov, A. V.; Stafeeva, N. A.; Ivanov, V. A.; Zakuraev, A. F.
2018-05-01
A complex structure has been designed, consisting of a universal floating pontoon road for laying pipelines on its body in automatic mode, all year round and in any weather, for Siberia and the Far North. A new method is proposed for the construction of pipelines on pontoon modules made of composite materials. Pontoons made of composite materials are designed for bedding pipelines, with track-forming guides for automated wheeled transport (the pipelayer). The proposed system eliminates the need to build a road along the route and ensures the buoyancy and smooth movement of the self-propelled automated stacker, shaped like a "centipede", which offers a number of significant advantages in the construction and operation of the entire complex in swampy and waterlogged areas without overburden works.
Creating Data that Never Die: Building a Spectrograph Data Pipeline in the Virtual Observatory Era
NASA Astrophysics Data System (ADS)
Mink, D. J.; Wyatt, W. F.; Roll, J. B.; Tokarz, S. P.; Conroy, M. A.; Caldwell, N.; Kurtz, M.; Geller, M. J.
2005-12-01
Data pipelines for modern complex astronomical instruments do not begin when the data is taken and end when it is delivered to the user. Information must flow between the observatory and the observer from the time a project is conceived and between the observatory and the world well past the time when the original observers have extracted all the information they want from the data. For the 300-fiber Hectospec low dispersion spectrograph on the MMT, the SAO Telescope Data Center is constructing a data pipeline which provides assistance from preparing and submitting observing proposals through observation, reduction, and analysis to publication and an afterlife in the Virtual Observatory. We will describe our semi-automatic pipeline and how it has evolved over the first nine months of operation.
Engineering considerations for corrosion monitoring of gas gathering pipeline systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braga, T.G.; Asperger, R.G.
1987-01-01
Proper corrosion monitoring of gas gathering pipelines requires a system review to determine the appropriate monitoring locations and types of monitoring techniques. This paper develops and discusses a classification of conditions such as flow regime and gas composition. Also discussed are junction categories which, for corrosion monitoring, need to be considered from two points of view: the first is related to fluid flow in the line and the second to corrosion inhibitor movement along the pipeline. The appropriate application of the various monitoring techniques, such as coupons, hydrogen detectors, electrical resistance probes and linear polarization probes, is discussed in relation to flow regime and gas composition. Problems caused by semi-conduction from iron sulfide are considered. Advantages and disadvantages of fluid gathering methods such as pots and flow-through drips are discussed in relation to their reliability as on-line monitoring locations.
Automated Monitoring of Pipeline Rights-of-Way
NASA Technical Reports Server (NTRS)
Frost, Chard Ritchie
2010-01-01
NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.
Multimodal inspection in power engineering and building industries: new challenges and solutions
NASA Astrophysics Data System (ADS)
Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof
2013-09-01
Recently the demand for, and the number of applications of, full-field optical measurement methods based on noncoherent light sources have increased significantly. These include traditional image processing, thermovision, digital image correlation (DIC) and structured-light methods. However, there are still numerous challenges connected with applying these methods to in-situ, long-term monitoring in industrial, civil engineering and cultural heritage applications, with multimodal measurements of a variety of object features, or simply with adapting instruments to work in hard environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system, combined with an infrared camera system, is applied in several interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of various steel struts of buildings at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.
NASA Astrophysics Data System (ADS)
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is to monitor data in real-time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, yielding good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information for trigger post-processing and interferometer maintenance.
49 CFR 192.465 - External corrosion control: Monitoring.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Section 192.465 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion...
Real-time inspection by submarine images
NASA Astrophysics Data System (ADS)
Tascini, Guido; Zingaretti, Primo; Conte, Giuseppe
1996-10-01
A real-time application of computer vision concerning tracking and inspection of a submarine pipeline is described. The objective is to develop automatic procedures for supporting human operators in the real-time analysis of images acquired by cameras mounted on underwater remotely operated vehicles (ROVs). Implementation of such procedures gives rise to a human-machine system for underwater pipeline inspection that can automatically detect and signal the presence of the pipe, of its structural or accessory elements, and of dangerous or alien objects in its neighborhood. The possibility of modifying the image acquisition rate in simulations performed on video-recorded images is used to prove that the system performs all necessary processing with acceptable robustness, working in real-time up to a speed of about 2.5 kn, well above what current ROVs and safety constraints allow.
Automatic latency equalization in VHDL-implemented complex pipelined systems
NASA Astrophysics Data System (ADS)
Zabołotny, Wojciech M.
2016-09-01
In pipelined data processing systems it is very important to ensure that parallel paths delay data by the same number of clock cycles. If that condition is not met, the processing blocks receive data that are not properly aligned in time and produce incorrect results. Manual equalization of latencies is tedious and error-prone work. This paper presents an automatic method of latency equalization in systems described in VHDL. The proposed method uses simulation to measure latencies and verify the introduced correction. The solution is portable between different simulation and synthesis tools. The method does not increase the complexity of the synthesized design compared with a solution based on manual latency adjustment. An example implementation of the proposed methodology, together with a simple design demonstrating its use, is available as an open source project under the BSD license.
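The balancing step itself reduces to simple bookkeeping over path latencies. As a rough illustration of the idea (in Python rather than VHDL, with made-up path names and latencies), the correction each parallel path needs is the difference between its total latency and that of the slowest path:

    # Minimal sketch of latency balancing: given per-block latencies (in clock
    # cycles) along each parallel path, compute how many delay registers each
    # path must append so that all paths match the slowest one.
    def equalize_latencies(paths):
        totals = {name: sum(lats) for name, lats in paths.items()}
        target = max(totals.values())        # the slowest path sets the alignment
        return {name: target - t for name, t in totals.items()}

    pipeline = {                             # illustrative paths, not from the paper
        "path_a": [2, 3, 1],                 # e.g. filter -> multiplier -> adder
        "path_b": [4],                       # e.g. a single long processing block
        "path_c": [1, 1],                    # e.g. two register stages
    }
    print(equalize_latencies(pipeline))      # {'path_a': 0, 'path_b': 2, 'path_c': 4}

The paper's contribution is obtaining the per-path latencies automatically from simulation rather than from error-prone manual counts.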
Toward cognitive pipelines of medical assistance algorithms.
Philipp, Patrick; Maleshkova, Maria; Katic, Darko; Weber, Christian; Götz, Michael; Rettinger, Achim; Speidel, Stefanie; Kämpgen, Benedikt; Nolden, Marco; Wekerle, Anna-Laura; Dillmann, Rüdiger; Kenngott, Hannes; Müller, Beat; Studer, Rudi
2016-09-01
Assistance algorithms for medical tasks have great potential to support physicians with their daily work. However, medicine is also one of the most demanding domains for computer-based support systems, since medical assistance tasks are complex and the practical experience of the physician is crucial. Recent developments in the area of cognitive computing appear to be well suited to tackle medicine as an application domain. We propose a system based on the idea of cognitive computing, consisting of auto-configurable medical assistance algorithms and their self-adapting combination. The system enables automatic execution of new algorithms, given that they are made available as Medical Cognitive Apps and are registered in a central semantic repository. Learning components can be added to the system to optimize the results in cases where numerous Medical Cognitive Apps are available for the same task. Our prototypical implementation is applied to the areas of surgical phase recognition based on sensor data and image processing for tumor progression mappings. Our results suggest that such assistance algorithms can be automatically configured into execution pipelines, candidate results can be automatically scored and combined, and the system can learn from experience. Furthermore, our evaluation shows that the Medical Cognitive Apps provide the same correct results as they did for local execution and run in a reasonable amount of time. The proposed solution is applicable to a variety of medical use cases and effectively supports the automated and self-adaptive configuration of cognitive pipelines based on medical interpretation algorithms.
Mobile hybrid LiDAR & infrared sensing for natural gas pipeline monitoring compendium.
DOT National Transportation Integrated Search
2016-01-01
This item consists of several documents that were created throughout the Mobile Hybrid LiDAR & Infrared Sensing for Natural Gas Pipeline Monitoring project, No. RITARS-14-H-RUT, which was conducted from January 15, 2014 to June 30, 2016. Documents in...
The Environmental Technology Verification report discusses the technology and performance of the Parametric Emissions Monitoring System (PEMS) manufactured by ANR Pipeline Company, a subsidiary of Coastal Corporation, now El Paso Corporation. The PEMS predicts carbon dioxide (CO2...
Corral framework: Trustworthy and fully functional data intensive parallel astronomical pipelines
NASA Astrophysics Data System (ADS)
Cabral, J. B.; Sánchez, B.; Beroiz, M.; Domínguez, M.; Lares, M.; Gurovich, S.; Granitto, P.
2017-07-01
Data processing pipelines represent an important slice of the astronomical software library, comprising chains of processes that transform raw data into valuable information via data reduction and analysis. In this work we present Corral, a Python framework for astronomical pipeline generation. Corral features a Model-View-Controller design pattern on top of an SQL relational database capable of handling custom data models, processing stages, and communication alerts, and also provides automatic quality and structural metrics based on unit testing. The Model-View-Controller pattern provides separation between the user logic and the data models, while delivering multi-processing and distributed computing capabilities. Corral represents an improvement over commonly found data processing pipelines in astronomy, since the design pattern frees the programmer from dealing with processing flow and parallelization issues, allowing them to focus on the specific algorithms needed for the successive data transformations, while at the same time providing a broad measure of quality over the created pipeline. Corral and working examples of pipelines that use it are available to the community at https://github.com/toros-astro.
JGI Plant Genomics Gene Annotation Pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Shengqiang; Rokhsar, Dan; Goodstein, David
2014-07-14
Plant genomes vary in size and are highly complex, with a large amount of repeats, genome duplication and tandem duplication. Genes encode a wealth of information useful in studying organisms, and it is critical to have high-quality, stable gene annotation. Thanks to advances in sequencing technology, many plant species' genomes and transcriptomes have been sequenced. To use these vast amounts of sequence data for gene annotation or re-annotation in a timely fashion, an automatic pipeline is needed. The JGI plant genomics gene annotation pipeline, called integrated gene call (IGC), is our effort toward this aim, with the aid of an RNA-seq transcriptome assembly pipeline. It utilizes several gene predictors based on homolog peptides and transcript ORFs. See Methods for details. Here we present genome annotations of JGI flagship green plants produced by this pipeline, plus Arabidopsis and rice (excepting Chlamydomonas, which was annotated by a third party). The genome annotations of these and other species are used in our gene family build pipeline and are accessible via the JGI Phytozome portal.
The PREP pipeline: standardized preprocessing for large-scale EEG analysis.
Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A
2015-01-01
The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
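The core of the robust referencing idea can be sketched in a few lines of numpy. This is a simplified stand-in rather than PREP's actual implementation: it alternates between estimating an average reference from channels not yet flagged as noisy and re-flagging channels against that reference, using a single robust-amplitude criterion in place of PREP's multi-measure detector (deviation, correlation, RANSAC):

    import numpy as np

    def robust_reference(data, n_iter=4, z_thresh=5.0):
        # data: (channels, samples) EEG array
        noisy = np.zeros(data.shape[0], dtype=bool)
        for _ in range(n_iter):
            ref = data[~noisy].mean(axis=0)              # reference from clean channels
            dev = np.median(np.abs(data - ref), axis=1)  # robust per-channel amplitude
            mad = np.median(np.abs(dev - np.median(dev))) + 1e-12
            z = (dev - np.median(dev)) / (1.4826 * mad)
            noisy = z > z_thresh                         # re-flag high-amplitude channels
        return ref, noisy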
Maria L. Sonett
1999-01-01
Integrated surface management techniques for pipeline construction through arid and semi-arid rangeland ecosystems are presented in a case history of a 412-mile pipeline construction project in New Mexico. Planning, implementation and monitoring for restoration of surface hydrology, soil stabilization, soil cover, and plant species succession are discussed. Planning...
ORAC-DR -- integral field spectroscopy data reduction
NASA Astrophysics Data System (ADS)
Todd, Stephen
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce integral field unit (IFU) data collected at the United Kingdom Infrared Telescope (UKIRT) with the UIST instrument.
The HEASARC Swift Gamma-Ray Burst Archive: The Pipeline and the Catalog
NASA Technical Reports Server (NTRS)
Donato, Davide; Angelini, Lorella; Padgett, C.A.; Reichard, T.; Gehrels, Neil; Marshall, Francis E.; Sakamoto, Takanori
2012-01-01
Since its launch in late 2004, the Swift satellite has triggered on or observed an average of one gamma-ray burst (GRB) every 3 days, for a total of 771 GRBs by 2012 January. Here, we report the development of a pipeline that semi-automatically performs the data-reduction and data-analysis processes for the three instruments on board Swift (BAT, XRT, UVOT). The pipeline is written in Perl, uses only HEAsoft tools, and can be used to perform the analysis of the majority of the point-like objects (e.g., GRBs, active galactic nuclei, pulsars) observed by Swift. We ran the pipeline on the GRBs, and we present a database containing the screened data, the output products, and the results of our ongoing analysis. Furthermore, we created a catalog summarizing some GRB information, collected either by running the pipeline or from the literature. The Perl script, the database, and the catalog are available for downloading and querying at the HEASARC Web site.
Computerized image analysis for quantitative neuronal phenotyping in zebrafish.
Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C
2006-06-15
An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.
Study on dynamic response measurement of the submarine pipeline by full-term FBG sensors.
Zhou, Jinghai; Sun, Li; Li, Hongnan
2014-01-01
The field of structural health monitoring is concerned with accurately and reliably assessing the integrity of a given structure to reduce ownership costs, increase operational lifetime, and improve safety. In structural health monitoring systems, fiber Bragg grating (FBG) is a promising measurement technology owing to its explosion-proof operation, immunity to electromagnetic interference, and high accuracy. This paper is a study on the dynamic characteristics of FBG sensors applied to a submarine pipeline, as well as an experimental investigation on a laboratory model of the pipeline. The dynamic response of a submarine pipeline under seismic excitation is a coupled vibration of liquid and solid interaction. FBG sensors and strain gauges are used to monitor the dynamic response of a submarine pipeline model under a variety of dynamic loading conditions, and the maximum working frequency of an FBG strain sensor is calculated according to its dynamic strain responses. Based on the theoretical and experimental results, it can be concluded that the FBG sensor is superior to the strain gauge and satisfies the demands of dynamic strain measurement.
Understanding Magnetic Flux Leakage (MFL) Signals from Mechanical Damage in Pipelines - Phase I
DOT National Transportation Integrated Search
2007-09-18
Pipeline inspection tools based on Magnetic Flux Leakage (MFL) principles represent the most cost-effective method for in-line detection and monitoring of pipeline corrosion defects. Mechanical damage also produces MFL signals, but as yet these signa...
Alternating evolutionary pressure in a genetic algorithm facilitates protein model selection
Offman, Marc N; Tournier, Alexander L; Bates, Paul A
2008-01-01
Background: Automatic protein modelling pipelines are becoming ever more accurate; this has come hand in hand with an increasingly complicated interplay between all components involved. Nevertheless, there are still potential improvements to be made in template selection, refinement and protein model selection. Results: In the context of an automatic modelling pipeline, we analysed each step separately, revealing several non-intuitive trends, and explored a new strategy for protein conformation sampling using genetic algorithms (GA). We apply the concept of alternating evolutionary pressure (AEP), i.e. intermediate rounds within the GA runs where unrestrained, linear growth of the model populations is allowed. Conclusion: This approach improves the overall performance of the GA by allowing models to overcome local energy barriers. AEP enabled the selection of the best models in 40% of all targets, compared to 25% for a normal GA. PMID:18673557
A method for real-time implementation of HOG feature extraction
NASA Astrophysics Data System (ADS)
Luo, Hai-bo; Yu, Xin-rong; Liu, Hong-mei; Ding, Qing-hai
2011-08-01
Histogram of oriented gradients (HOG) is an efficient feature extraction scheme, and HOG descriptors are widely used in computer vision and image processing for biometrics, target tracking, automatic target detection (ATD), automatic target recognition (ATR), etc. However, the computation of HOG feature extraction is unsuitable for direct hardware implementation since it includes complicated operations. In this paper, an optimal design method and theoretical framework for real-time HOG feature extraction based on FPGA are proposed. The main principle is as follows: first, a parallel gradient computing unit circuit based on a parallel pipeline structure was designed. Second, the calculation of the arctangent and square root operations was simplified. Finally, a histogram generator based on a parallel pipeline structure was designed to calculate the histogram of each sub-region. Experimental results showed that the HOG extraction can be implemented in one pixel period by these computing units.
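The front end of the descriptor (gradients plus per-cell orientation histograms) is straightforward to express in software. A minimal numpy sketch, omitting the block normalization of the full descriptor, might look like this:

    import numpy as np

    def hog_cell_histograms(img, cell=8, bins=9):
        # Gradients via central differences, then a per-cell histogram of
        # unsigned orientations (0-180 degrees) weighted by gradient magnitude.
        gy, gx = np.gradient(img.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
        h, w = img.shape
        out = np.zeros((h // cell, w // cell, bins))
        for i in range(h // cell):
            for j in range(w // cell):
                m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
                a = ang[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
                out[i, j], _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
        return out

The arctangent and square root seen here are exactly the operations the paper simplifies for the FPGA implementation.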
NASA Astrophysics Data System (ADS)
Jenness, Tim; Currie, Malcolm J.; Tilanus, Remo P. J.; Cavanagh, Brad; Berry, David S.; Leech, Jamie; Rizzi, Luca
2015-10-01
With the advent of modern multidetector heterodyne instruments whose observations can generate thousands of spectra per minute, it is no longer feasible to reduce these data as individual spectra. We describe the automated data reduction procedure used to generate baselined data cubes from heterodyne data obtained at the James Clerk Maxwell Telescope (JCMT). The system can automatically detect baseline regions in spectra and automatically determine regridding parameters, all without input from a user. Additionally, it can detect and remove spectra suffering from transient interference effects or anomalous baselines. The pipeline is written as a set of recipes using the ORAC-DR pipeline environment, with the algorithmic code using Starlink software packages and infrastructure. The algorithms presented here can be applied to other heterodyne array instruments and have been applied to data from historical JCMT heterodyne instrumentation.
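A minimal sketch of automatic baseline-region detection, assuming the common approach of iteratively fitting a low-order polynomial and sigma-clipping line channels (the actual recipes use Starlink packages and more careful heuristics):

    import numpy as np

    def detect_baseline(spectrum, order=1, n_iter=5, clip=3.0):
        # Iteratively fit a low-order polynomial and clip channels that
        # deviate (candidate spectral lines); the surviving mask marks
        # baseline regions, and the fit is subtracted to baseline the data.
        x = np.arange(spectrum.size)
        mask = np.ones(spectrum.size, dtype=bool)
        for _ in range(n_iter):
            coeffs = np.polyfit(x[mask], spectrum[mask], order)
            resid = spectrum - np.polyval(coeffs, x)
            sigma = np.std(resid[mask])
            mask = np.abs(resid) < clip * sigma
        return mask, spectrum - np.polyval(coeffs, x)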
Structural health monitoring of pipelines rehabilitated with lining technology
NASA Astrophysics Data System (ADS)
Farhidzadeh, Alireza; Dehghan-Niri, Ehsan; Salamone, Salvatore
2014-03-01
Damage detection in pipeline systems is a tedious and time-consuming job because of excavation requirements, limited accessibility, interference with other facilities, and the sheer extent of pipeline networks in metropolitan areas. Therefore, a real-time, automated monitoring system can greatly reduce labor, time, and expenditure. This paper presents the results of an experimental study aimed at monitoring the performance of full-scale pipe lining systems, subjected to static and dynamic (seismic) loading, using the Acoustic Emission (AE) technique and Guided Ultrasonic Waves (GUWs). In particular, two damage mechanisms are investigated: 1) delamination between pipeline and liner, as an early indicator of damage, and 2) onset of nonlinearity and incipient failure of the liner, as a critical damage state.
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint, which from a user perspective makes running the pipelines as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
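A hedged sketch of the kind of "meta-script" described, using BioBlend's Galaxy client; the URL, API key, file name, and input mapping below are placeholders, and the real Bio-Docklets scripts may drive the API differently:

    from bioblend.galaxy import GalaxyInstance

    # Connect to the Galaxy instance running inside the container
    # (address and key are placeholders).
    gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")

    history = gi.histories.create_history(name="ngs-run")
    upload = gi.tools.upload_file("reads.fastq.gz", history["id"])
    dataset_id = upload["outputs"][0]["id"]

    # Assume a single preloaded workflow; map the dataset to its first input.
    workflow_id = gi.workflows.get_workflows()[0]["id"]
    inputs = {"0": {"src": "hda", "id": dataset_id}}
    invocation = gi.workflows.invoke_workflow(workflow_id, inputs=inputs,
                                              history_id=history["id"])
    print("started invocation", invocation["id"])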
49 CFR 195.444 - CPM leak detection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false CPM leak detection. 195.444 Section 195.444... PIPELINE Operation and Maintenance § 195.444 CPM leak detection. Each computational pipeline monitoring (CPM) leak detection system installed on a hazardous liquid pipeline transporting liquid in single...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daum, Christopher; Zane, Matthew; Han, James
2011-01-31
The U.S. Department of Energy (DOE) Joint Genome Institute's (JGI) Production Sequencing group is committed to the generation of high-quality genomic DNA sequence to support the mission areas of renewable energy generation, global carbon management, and environmental characterization and clean-up. Within the JGI's Production Sequencing group, a robust Illumina Genome Analyzer and HiSeq pipeline has been established. Optimization of the sequencer pipelines has been ongoing, with the aim of continual process improvement of the laboratory workflow, reducing operational costs and project cycle times, increasing sample throughput, and improving the overall quality of the sequence generated. A sequence QC analysis pipeline has been implemented to automatically generate read- and assembly-level quality metrics. The foremost of these optimization projects, along with sequencing and operational strategies, throughput numbers, and sequencing quality results, will be presented.
Thomsen, Martin Christen Frølund; Ahrenfeldt, Johanne; Cisneros, Jose Luis Bellod; Jurtz, Vanessa; Larsen, Mette Voldby; Hasman, Henrik; Aarestrup, Frank Møller; Lund, Ole
2016-01-01
Recent advances in whole genome sequencing have made the technology available for routine use in microbiological laboratories. However, a major obstacle for using this technology is the availability of simple and automatic bioinformatics tools. Based on previously published and already available web-based tools, we developed a single pipeline for batch uploading of whole genome sequencing data from multiple bacterial isolates. The pipeline automatically identifies the bacterial species and, if applicable, assembles the genome and identifies the multilocus sequence type, plasmids, virulence genes and antimicrobial resistance genes. A short printable report for each sample is provided, and an Excel spreadsheet containing all the metadata and a summary of the results for all submitted samples can be downloaded. The pipeline was benchmarked using datasets previously used to test the individual services. The reported results enable a rapid overview of the major findings, and comparison with the previously published results showed that the platform is reliable and able to correctly predict the species and find most of the expected genes automatically. In conclusion, a combined bioinformatics platform was developed and made publicly available, providing easy-to-use automated analysis of bacterial whole genome sequencing data. The platform may be of immediate relevance as a guide for investigators using whole genome sequencing for clinical diagnostics and surveillance. The platform is freely available at: https://cge.cbs.dtu.dk/services/CGEpipeline-1.1, and it is the intention that it will continue to be expanded with new features as these become available.
Grid-based International Network for Flu observation (g-INFO).
Doan, Trung-Tung; Bernard, Aurélien; Da-Costa, Ana Lucia; Bloch, Vincent; Le, Thanh-Hoa; Legre, Yannick; Maigne, Lydia; Salzemann, Jean; Sarramia, David; Nguyen, Hong-Quang; Breton, Vincent
2010-01-01
The 2009 H1N1 outbreak has demonstrated that continuing vigilance, planning, and strong public health research capability are essential defenses against emerging health threats. Molecular epidemiology of influenza virus strains provides scientists with clues about the temporal and geographic evolution of the virus. In the present paper, researchers from France and Vietnam propose a global surveillance network based on grid technology: the goal is to federate influenza data servers and automatically deploy molecular epidemiology studies. A first prototype based on AMGA and the WISDOM Production Environment extracts influenza H1N1 sequence data daily from NCBI, which are processed through a phylogenetic analysis pipeline deployed on the EGEE and AuverGrid e-infrastructures. The analysis results are displayed on a web portal (http://g-info.healthgrid.org) for epidemiologists to monitor H1N1 pandemics.
Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems
NASA Technical Reports Server (NTRS)
Majumdar, Alok K.; Ravindran, S. S.
2017-01-01
Fluid and thermal transients found in rocket propulsion systems, such as propellant feedline systems, are complex processes involving fast phases followed by slow phases. Their time-accurate computation therefore requires the use of a short time step initially, followed by a much larger time step. Yet there are instances that involve fast-slow-fast phases. In this paper, we present a feedback-control-based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. In order to demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chilldown in pipelines. Our comparison and validation demonstrate the accuracy and efficiency of this adaptive strategy.
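A minimal sketch of feedback-controlled time stepping, with illustrative controller constants rather than the paper's tuning: the step shrinks (and is retried) when the monitored variable changes too quickly, and grows during slow phases:

    def adaptive_march(step, y0, t_end, dt0, rel_target=1e-3,
                       grow=1.5, shrink=0.5, dt_min=1e-8, dt_max=1.0):
        # step(y, dt) advances the monitored variable by one time step.
        t, y, dt = 0.0, y0, dt0
        while t < t_end:
            y_new = step(y, dt)
            change = abs(y_new - y) / (abs(y) + 1e-30)
            if change > 4 * rel_target and dt > dt_min:
                dt = max(dt * shrink, dt_min)   # too fast: retry with a smaller step
                continue
            t, y = t + dt, y_new
            if change < rel_target:             # slow phase: stretch the step
                dt = min(dt * grow, dt_max)
        return y

    # e.g. explicit-Euler decay dy/dt = -5y, a fast phase followed by a slow one
    y_final = adaptive_march(lambda y, dt: y + dt * (-5.0 * y), 1.0, 2.0, 1e-3)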
Robotic Spectroscopy at the Dark Sky Observatory
NASA Astrophysics Data System (ADS)
Rosenberg, Daniel E.; Gray, Richard O.; Mashburn, Jonathan; Swenson, Aaron W.; McGahee, Courtney E.; Briley, Michael M.
2018-06-01
Spectroscopic observations using the classification-resolution Gray-Miller spectrograph attached to the Dark Sky Observatory 32 inch telescope (Appalachian State University, North Carolina) have been automated with a robotic script called the “Robotic Spectroscopist” (RS). RS runs autonomously during the night and controls all operations related to spectroscopic observing. At the heart of RS are a number of algorithms that first select and center the target star in the field of an imaging camera and then on the spectrograph slit. RS monitors the observatory weather station, and suspends operations and closes the dome when weather conditions warrant, and can reopen and resume observations when the weather improves. RS selects targets from a list using a queue-observing protocol based on observer-assigned priorities, but also uses target-selection criteria based on weather conditions, especially seeing. At the end of the night RS transfers the data files to the main campus, where they are reduced with an automatic pipeline. Our experience has shown that RS is more efficient and consistent than a human observer, and produces data sets that are ideal for automatic reduction. RS should be adaptable for use at other similar observatories, and so we are making the code freely available to the astronomical community.
Alizadeh, Mahdi; Conklin, Chris J; Middleton, Devon M; Shah, Pallav; Saksena, Sona; Krisa, Laura; Finsterbusch, Jürgen; Faro, Scott H; Mulcahey, M J; Mohamed, Feroze B
2018-04-01
Ghost artifacts are a major contributor to degradation of spinal cord diffusion tensor images. A multi-stage post-processing pipeline was designed, implemented and validated to automatically remove ghost artifacts arising from reduced field-of-view diffusion tensor imaging (DTI) of the pediatric spinal cord. A total of 12 pediatric subjects were studied, including 7 healthy subjects (mean age = 11.34 years) with no evidence of spinal cord injury or pathology and 5 patients (mean age = 10.96 years) with cervical spinal cord injury. Ghost/true cords, labeled as regions of interest (ROIs), in non-diffusion-weighted b0 images were segmented automatically using mathematical morphological processing. Initially, 21 texture features were extracted from each segmented ROI, including 5 first-order features based on the histogram of the image (mean, variance, skewness, kurtosis and entropy) and 16 second-order feature vector elements, incorporating four statistical measures (contrast, correlation, homogeneity and energy) calculated from co-occurrence matrices in directions of 0°, 45°, 90° and 135°. Next, ten features with a high value of mutual information (MI), both relative to the pre-defined target class and within the features, were selected as final features, which were input to a trained classifier (adaptive neuro-fuzzy inference system) to separate the true cord from the ghost cord. The implemented pipeline successfully separated the ghost artifacts from true cord structures. The results obtained from the classifier showed a sensitivity of 91%, specificity of 79%, and accuracy of 84% in separating the true cord from ghost artifacts. The results show that the proposed method is promising for the automatic detection of ghost cords present in DTI images of the spinal cord. This step is crucial towards the development of accurate, automatic DTI spinal cord post-processing pipelines. Copyright © 2017 Elsevier Inc. All rights reserved.
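A rough sketch of the first-order feature extraction and MI-based ranking, using scipy/scikit-learn as stand-ins (the second-order co-occurrence features and the ANFIS classifier of the paper are omitted):

    import numpy as np
    from scipy.stats import skew, kurtosis, entropy
    from sklearn.feature_selection import mutual_info_classif

    def first_order_features(roi):
        # The five histogram-based features named in the paper.
        hist, _ = np.histogram(roi, bins=64, density=True)
        return [roi.mean(), roi.var(), skew(roi.ravel()),
                kurtosis(roi.ravel()), entropy(hist + 1e-12)]

    def select_features(X, y, k=10):
        # X: one feature vector per segmented ROI; y: 1 = true cord, 0 = ghost.
        # Rank features by mutual information with the class label and keep
        # the top k for the downstream classifier.
        mi = mutual_info_classif(X, y, random_state=0)
        return np.argsort(mi)[::-1][:k]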
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
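The benchmarking loop at the heart of such a tool can be sketched generically (ToTem itself is Java/PHP; run_pipeline and f_measure below are hypothetical callables supplied by the user):

    from itertools import product
    from statistics import mean

    def benchmark(param_grid, folds, run_pipeline, f_measure):
        # param_grid: parameter name -> candidate values.
        # folds: validation folds with truth sets, so that scores are
        # cross-validated rather than fitted to a single dataset.
        # run_pipeline(params, fold) -> variant calls; f_measure(calls, fold) -> F1.
        results = []
        for combo in product(*param_grid.values()):
            params = dict(zip(param_grid, combo))
            scores = [f_measure(run_pipeline(params, fold), fold) for fold in folds]
            results.append((mean(scores), params))
        return sorted(results, key=lambda r: r[0], reverse=True)

Averaging the score across held-out folds is what penalizes settings that only look good on one dataset, which is the over-fitting guard described above.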
High speed quantitative digital microscopy
NASA Technical Reports Server (NTRS)
Castleman, K. R.; Price, K. H.; Eskenazi, R.; Ovadya, M. M.; Navon, M. A.
1984-01-01
Modern digital image processing hardware makes possible quantitative analysis of microscope images at high speed. This paper describes an application to automatic screening for cervical cancer. The system uses twelve MC6809 microprocessors arranged in a pipeline multiprocessor configuration. Each processor executes one part of the algorithm on each cell image as it passes through the pipeline. Each processor communicates with its upstream and downstream neighbors via shared two-port memory. Thus no time is devoted to input-output operations as such. This configuration is expected to be at least ten times faster than previous systems.
ORAC-DR: Overview and General Introduction
NASA Astrophysics Data System (ADS)
Economou, Frossie; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy; Allan, Alasdair; Cavanagh, Brad
ORAC-DR is a general purpose automatic data reduction pipeline environment. It currently supports data reduction for the United Kingdom Infrared Telescope (UKIRT) instruments UFTI, IRCAM, UIST and CGS4, for the James Clerk Maxwell Telescope (JCMT) instrument SCUBA, for the William Herschel Telescope (WHT) instrument INGRID, for the European Southern Observatory (ESO) instrument ISAAC and for the Anglo-Australian Telescope (AAT) instrument IRIS-2. This document describes the general pipeline environment. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.
Color correction pipeline optimization for digital cameras
NASA Astrophysics Data System (ADS)
Bianco, Simone; Bruna, Arcangelo R.; Naccari, Filippo; Schettini, Raimondo
2013-04-01
The processing pipeline of a digital camera converts the RAW image acquired by the sensor to a representation of the original scene that should be as faithful as possible. There are mainly two modules responsible for the color-rendering accuracy of a digital camera: the former is the illuminant estimation and correction module, and the latter is the color matrix transformation aimed at adapting the color response of the sensor to a standard color space. These two modules together form what may be called the color correction pipeline. We design and test new color correction pipelines that exploit different illuminant estimation and correction algorithms that are tuned and automatically selected on the basis of the image content. Since illuminant estimation is an ill-posed problem, illuminant correction is not error-free. An adaptive color matrix transformation module is optimized, taking into account the behavior of the first module, in order to alleviate the amplification of color errors. The proposed pipelines are tested on a publicly available dataset of RAW images. Experimental results show that exploiting the cross-talk between the modules of the pipeline can lead to higher color-rendition accuracy.
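As a minimal illustration of the two-module pipeline, the sketch below pairs a gray-world illuminant estimate with an illustrative 3x3 color matrix; the paper's pipelines tune and select among several such algorithms per image:

    import numpy as np

    def gray_world_gains(raw):
        # Illuminant estimation under the gray-world assumption: scale each
        # channel so the image mean is neutral.
        means = raw.reshape(-1, 3).mean(axis=0)
        return means.mean() / means

    def color_correct(raw, matrix):
        # Module 1: illuminant correction; module 2: 3x3 color matrix mapping
        # the sensor response toward a standard color space.
        balanced = np.clip(raw * gray_world_gains(raw), 0.0, 1.0)
        return np.clip(balanced @ matrix.T, 0.0, 1.0)

    M = np.array([[ 1.6, -0.4, -0.2],      # illustrative matrix; rows sum to 1
                  [-0.3,  1.5, -0.2],      # so that neutral colors are preserved
                  [-0.1, -0.5,  1.6]])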
Sensor Network Architectures for Monitoring Underwater Pipelines
Mohamed, Nader; Jawhar, Imad; Al-Jaroodi, Jameela; Zhang, Liren
2011-01-01
This paper develops and compares different sensor network architecture designs that can be used for monitoring underwater pipeline infrastructures. These architectures are underwater wired sensor networks, underwater acoustic wireless sensor networks, RF (Radio Frequency) wireless sensor networks, integrated wired/acoustic wireless sensor networks, and integrated wired/RF wireless sensor networks. The paper also discusses the reliability challenges and enhancement approaches for these network architectures. The reliability evaluation, characteristics, advantages, and disadvantages among these architectures are discussed and compared. Three reliability factors are used for the discussion and comparison: the network connectivity, the continuity of power supply for the network, and the physical network security. In addition, the paper also develops and evaluates a hierarchical sensor network framework for underwater pipeline monitoring. PMID:22346669
Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network
NASA Astrophysics Data System (ADS)
Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.
2011-12-01
The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the next decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by ~50 real-time strong motion stations. The strong motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and temporary sub-networks for local monitoring of microseismicity (e.g. at geothermal sites). The variety of seismic monitoring responsibilities, as well as the anticipated densification of our network, demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring software package developed by GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. It is at its core open source, and is becoming a community-standard software for earthquake detection and waveform processing for regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. In order to fulfill the requirements of a local network recording moderate seismicity, SED has tuned configurations and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. To be consistent with existing processing procedures, the NonLinLoc algorithm was implemented for manual and automatic locations using 1D and 3D velocity models; plugins for improved automatic phase picking and Ml computation were developed; and the graphical user interface for manual review was extended (including pick uncertainty definition, first-motion focal mechanisms, interactive review of station magnitude waveforms, and full inclusion of strong motion data). SC3 locations are fully compatible with those derived from the existing in-house processing tools and are stored in a database derived from the QuakeML data model. The database is shared with the SED alerting software, which merges origins from both SC3 and external sources in real time and handles the alerting procedure. With the monitoring software transitioned to SeisComP3, acquisition, archival and dissemination of SED waveform data now conform to the SeedLink and ArcLink protocols, and continuous archives can be accessed via the SED and all EIDA (European Integrated Data Archives) web sites. Further, an SC3 module for waveform parameterisation has been developed, allowing rapid computation of peak ground motion values and other engineering parameters within minutes of a new event. An output of this module is USGS ShakeMap XML.
Water Pipeline Monitoring and Leak Detection using Flow Liquid Meter Sensor
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Satria, I. S.; Siregar, B.; Budiarto, R.
2017-04-01
Water distribution networks are generally installed as underground pipes. Monitoring underground water pipelines is more difficult than monitoring pipelines located above ground in open space, and a disturbance such as leakage can cause permanent losses. Leaks in pipes can be caused by several factors, such as the pipe's age, improper installation, and natural disasters. Therefore, a solution is required to detect a leak and to determine its location. The detection of the leak location uses fluid mechanics and kinematics applied to water flow-rate data obtained from a flow liquid meter sensor, with an Arduino UNO as the microcontroller. The results show that the proposed method works stably and is able to determine the location of a leak up to a maximum distance of 2 metres, localizing the leak as closely as possible at a flow rate of about 10 litres per minute.
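The detection side of such a system can be illustrated with a simple mass balance between two flow meters; the paper's kinematics-based localization step is not reproduced here, and the threshold is an assumed sensor-noise margin:

    def detect_leak(q_in, q_out, threshold=0.5):
        # Flag a leak when inlet and outlet flow rates (L/min) disagree by
        # more than the noise threshold; return the estimated leak rate.
        loss = q_in - q_out
        return (loss > threshold), max(loss, 0.0)

    leak, rate = detect_leak(q_in=32.4, q_out=22.1)
    if leak:
        print(f"possible leak, about {rate:.1f} L/min")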
High-throughput neuroimaging-genetics computational infrastructure
Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D.; Franco, Joseph; Toga, Arthur W.
2014-01-01
Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Results interpretation includes scientific visualization, community validation of findings and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web services. These pipeline workflows are represented as portable XML objects, which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer's and Parkinson's data, we provide several examples of translational applications using this infrastructure. PMID:24795619
A low-latency pipeline for GRB light curve and spectrum using Fermi/GBM near real-time data
NASA Astrophysics Data System (ADS)
Zhao, Yi; Zhang, Bin-Bin; Xiong, Shao-Lin; Long, Xi; Zhang, Qiang; Song, Li-Ming; Sun, Jian-Chao; Wang, Yuan-Hao; Li, Han-Cheng; Bu, Qing-Cui; Feng, Min-Zi; Li, Zheng-Heng; Wen, Xing; Wu, Bo-Bing; Zhang, Lai-Yu; Zhang, Yong-Jie; Zhang, Shuang-Nan; Shao, Jian-Xiong
2018-05-01
Rapid response and short time latency are very important for time domain astronomy, such as the observations of gamma-ray bursts (GRBs) and electromagnetic (EM) counterparts of gravitational waves (GWs). Based on near real-time Fermi/GBM data, we developed a low-latency pipeline to automatically calculate the temporal and spectral properties of GRBs. With this pipeline, some important parameters, such as T90 and fluence, can be obtained within ∼20 min after the GRB trigger. For ∼90% of GRBs, T90 and fluence are consistent with the GBM catalog results within 2σ errors. This pipeline has been used by the Gamma-ray Bursts Polarimeter (POLAR) and the Insight Hard X-ray Modulation Telescope (Insight-HXMT) to follow up bursts of interest. For GRB 170817A, the first EM counterpart of a GW event, detected by Fermi/GBM and INTEGRAL/SPI-ACS, the pipeline provided T90 and spectral information 21 min after the GBM trigger, supplying important input for the POLAR and Insight-HXMT observations.
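The T90 computation itself is compact: it is the interval over which the background-subtracted cumulative counts climb from 5% to 95% of their total. A minimal sketch (the real pipeline also fits the background model and propagates uncertainties):

    import numpy as np

    def t90(times, counts, background):
        # times, counts, background: light-curve bins, observed and modeled
        # background counts per bin. Assumes the net cumulative curve rises
        # monotonically over the burst.
        net = np.asarray(counts, float) - np.asarray(background, float)
        cum = np.cumsum(net)
        total = cum[-1]
        t05 = times[np.searchsorted(cum, 0.05 * total)]
        t95 = times[np.searchsorted(cum, 0.95 * total)]
        return t95 - t05, t05   # duration and its start time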
ACOUSTIC LOCATION OF LEAKS IN PRESSURIZED UNDERGROUND PETROLEUM PIPELINES
Experiments were conducted at the UST Test Apparatus Pipeline in which three acoustic sensors separated by a maximum distance of 38 m (125-ft) were used to monitor signals produced by 3.0-, 1.5-, and 1.0-gal/h leaks in the wall of a 2-in.-diameter pressurized petroleum pipeline. ...
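With leak noise recorded at two sensors, the classic localization approach uses the cross-correlation peak to estimate the differential arrival time; a minimal sketch, assuming a known acoustic propagation speed in the pipe:

    import numpy as np

    def locate_leak(sig_a, sig_b, fs, spacing, wave_speed):
        # Cross-correlate the leak-noise signals from sensors A and B; the
        # arrival delay of B relative to A places the leak along the pipe
        # section between them. Returns distance from sensor A in meters.
        a = sig_a - np.mean(sig_a)
        b = sig_b - np.mean(sig_b)
        corr = np.correlate(a, b, mode="full")
        delay_b = ((len(b) - 1) - np.argmax(corr)) / fs   # seconds
        return 0.5 * (spacing - wave_speed * delay_b)

A leak midway between the sensors yields zero delay and a distance of spacing/2; a leak nearer sensor A arrives there first, giving a positive delay and a smaller distance.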
Automatic Tool for Local Assembly Structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whole community shotgun sequencing of total DNA (i.e. metagenomics) and total RNA (i.e. metatranscriptomics) has provided a wealth of information on microbial community structure, predicted functions, and metabolic networks, and can even reconstruct complete genomes directly. Here we present ATLAS (Automatic Tool for Local Assembly Structures), a comprehensive pipeline for assembly, annotation, and genomic binning of metagenomic and metatranscriptomic data, with an integrated framework for multi-omics. This provides an open source tool for the multi-omics community at large.
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing the corrosion rate on the external side of an underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system's performance algorithm and program are elaborated. A comparative analysis of methods for calculating corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce calculations while considering a wider range of corrosion factors.
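As an illustration of the fuzzy-logic calculation (with made-up membership breakpoints and a two-factor, three-rule base, far simpler than the paper's factor list):

    def ramp(x, lo, hi):
        # Clamped linear ramp: 0 at or below lo, 1 at or above hi.
        return max(0.0, min(1.0, (x - lo) / (hi - lo)))

    def corrosion_rate(resistivity_ohm_m, moisture_pct):
        # Low soil resistivity and high moisture imply a high external
        # corrosion rate; breakpoints and rule outputs (mm/year) are
        # illustrative only.
        low_res = 1.0 - ramp(resistivity_ohm_m, 10.0, 50.0)
        wet = ramp(moisture_pct, 20.0, 60.0)
        rules = [
            (min(low_res, wet),                                   0.8),  # high
            (max(min(low_res, 1 - wet), min(1 - low_res, wet)),   0.4),  # medium
            (min(1 - low_res, 1 - wet),                           0.1),  # low
        ]
        den = sum(w for w, _ in rules)
        return sum(w * r for w, r in rules) / den if den else 0.0

    print(corrosion_rate(5.0, 80.0))   # aggressive soil -> approaches 0.8 mm/year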
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrates YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applies an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
Research progress of on-line automatic monitoring of chemical oxygen demand (COD) of water
NASA Astrophysics Data System (ADS)
Cai, Youfa; Fu, Xing; Gao, Xiaolu; Li, Lianyin
2018-02-01
With the increasingly strict control of pollutant emissions in China, on-line automatic monitoring of water quality is particularly urgent. Chemical oxygen demand (COD) is a comprehensive index of the contamination caused by organic matter, and it is therefore taken as an important index of energy saving and emission reduction in China's Twelfth Five-Year Plan. So far, on-line automatic COD monitoring instruments have played an important role in the field of sewage monitoring. This paper reviews the existing methods for on-line automatic monitoring of COD and, on this basis, points out the future trends of such instruments.
Han, Shuting; Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael
2018-03-28
Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified six basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect a homeostatic neural control of "housekeeping" behaviors that may have been present already in the earliest nervous systems. © 2018, Han et al.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Richard A.; Brown, Joseph M.; Colby, Sean M.
ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multi-omics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomic and metatranscriptomic data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. It assigns robust taxonomy by majority voting of protein-coding open reading frames rolled up at the contig level using a modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
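The contig-level taxonomy rollup lends itself to a compact illustration. The Python sketch below performs majority voting over ORF lineages with LCA-style truncation; the strict-majority threshold and the toy lineages are assumptions for exposition, and ATLAS's actual implementation inside its Snakemake workflow may differ.

```python
# Sketch: contig taxonomy by majority voting over the lineages assigned
# to the contig's protein-coding ORFs, truncated LCA-style when no rank
# wins a strict majority. Threshold and lineages are illustrative.
from collections import Counter

def contig_taxonomy(orf_lineages):
    """orf_lineages: one lineage per ORF, e.g.
    ('Bacteria', 'Proteobacteria', 'Gammaproteobacteria')."""
    consensus, rank = [], 0
    while True:
        votes = Counter(lin[rank] for lin in orf_lineages if len(lin) > rank)
        if not votes:
            break
        taxon, count = votes.most_common(1)[0]
        if count <= len(orf_lineages) / 2:    # no strict majority: stop here
            break
        consensus.append(taxon)
        # Keep only ORFs consistent with the consensus so far.
        orf_lineages = [lin for lin in orf_lineages
                        if len(lin) > rank and lin[rank] == taxon]
        rank += 1
    return tuple(consensus)

orfs = [("Bacteria", "Proteobacteria", "Gammaproteobacteria"),
        ("Bacteria", "Proteobacteria", "Alphaproteobacteria"),
        ("Bacteria", "Firmicutes")]
print(contig_taxonomy(orfs))   # ('Bacteria', 'Proteobacteria')
```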
TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information
Struck, Torsten H
2014-01-01
Phylogenies of species or genes are commonplace nowadays in many areas of comparative biology. However, phylogenetic reconstructions can be misled by artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals may bias the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, no program has allowed the detection of such effects in combination with integration into automated process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests) that utilize tree-based information, such as nodal support or patristic distances (PDs), to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions and, being command-line driven, can be integrated into automated process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
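One of the tree-based statistics TreSpEx relies on, the patristic distance, can be sketched in a few lines of Python. The parent-pointer tree encoding below is an assumption chosen for brevity; TreSpEx itself is a Perl program operating on Newick trees.

```python
# Sketch: patristic distance = sum of branch lengths along the path
# between two leaves. Inflated patristic distances for a taxon are one
# symptom of long-branch attraction. Toy tree: ((A,B)n1,(C,D)n2)root.
parent = {"A": "n1", "B": "n1", "n1": "root", "C": "n2", "D": "n2", "n2": "root"}
brlen  = {"A": 0.10, "B": 0.20, "n1": 0.05, "C": 0.30, "D": 0.15, "n2": 0.40}

def path_to_root(node):
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def patristic(a, b):
    ancestors_b = set(path_to_root(b))
    lca = next(n for n in path_to_root(a) if n in ancestors_b)  # lowest common ancestor
    dist = 0.0
    for leaf in (a, b):
        node = leaf
        while node != lca:
            dist += brlen[node]
            node = parent[node]
    return dist

print(patristic("A", "D"))   # 0.10 + 0.05 + 0.40 + 0.15 = 0.70
```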
Food Recognition: A New Dataset, Experiments, and Results.
Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo
2017-05-01
We propose a new dataset for the evaluation of food recognition algorithms that can be used in dietary monitoring applications. Each image depicts a real canteen tray with dishes and foods arranged in different ways. Each tray contains multiple instances of food classes. The dataset contains 1027 canteen trays for a total of 3616 food instances belonging to 73 food classes. The food on the tray images has been manually segmented using carefully drawn polygonal boundaries. We have benchmarked the dataset by designing an automatic tray analysis pipeline that takes a tray image as input, finds the regions of interest, and predicts for each region the corresponding food class. We have experimented with three different classification strategies, also using several visual descriptors. We achieve about 79% food and tray recognition accuracy using convolutional-neural-network-based features. The dataset, as well as the benchmark framework, is available to the research community.
Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel
2016-05-01
The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either NIfTI or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.
VizieR Online Data Catalog: AMBRE project. FEROS archived spectra (Worley+, 2012)
NASA Astrophysics Data System (ADS)
Worley, C. C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Bijaoui, A.; Ordenovic, C.
2017-11-01
This first release concerns the FEROS data collected from October 2005 to December 2009. The spectra were reduced by ESO with the corresponding automatic pipeline and then sent to the Observatoire de la Côte d'Azur (OCA, Nice) for ingestion into a dedicated pipeline (see paper). All FEROS spectra cover the domain 350-920 nm at a resolution of about 48,000. Before their ingestion into MATISSE, these spectra were convolved to a lower resolution (Δλ = 0.33 Å), sliced, and resampled (total number of pixels = 11890). (1 data file).
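The resolution-degradation step translates to a few lines of NumPy/SciPy. In the sketch below, a spectrum on an assumed uniform wavelength grid is smoothed to a constant 0.33 Å FWHM with a Gaussian kernel; the grid spacing and the neglect of the instrument's intrinsic line profile are simplifications, not the AMBRE pipeline itself.

```python
# Sketch: convolving a high-resolution (R ~ 48000) spectrum down to a
# constant FWHM of 0.33 Angstrom with a Gaussian kernel.
import numpy as np
from scipy.ndimage import gaussian_filter1d

wave = np.arange(3500.0, 9200.0, 0.05)            # Angstrom, assumed uniform grid
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 6563.0) / 0.1) ** 2)  # toy absorption line

target_fwhm = 0.33                                # Angstrom, as in the release
step = wave[1] - wave[0]
sigma_pix = target_fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / step  # FWHM -> sigma

flux_lowres = gaussian_filter1d(flux, sigma_pix)
print(f"kernel sigma = {sigma_pix:.1f} pixels")
```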
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Nakamae, Kazuki; Nishimura, Yuki; Takenaga, Mitsumasa; Nakade, Shota; Sakamoto, Naoaki; Ide, Hiroshi; Sakuma, Tetsushi; Yamamoto, Takashi
2017-05-04
The emerging genome editing technology has enabled the creation of gene knock-in cells easily, efficiently, and rapidly, which has dramatically accelerated research in the field of mammalian functional genomics, including in humans. We recently developed a microhomology-mediated end-joining-based gene knock-in method, termed the PITCh system, and presented various examples of its application. Since the PITCh system only requires very short microhomologies (up to 40 bp) and single-guide RNA target sites on the donor vector, the targeting construct can be prepared rapidly compared with the conventional targeting vector for homologous recombination-based knock-in. Here, we established a streamlined pipeline to design and perform PITCh knock-in, further expanding the availability of this method, by creating web-based design software, PITCh designer (http://www.mls.sci.hiroshima-u.ac.jp/smg/PITChdesigner/index.html), as well as presenting an experimental example of versatile gene cassette knock-in. PITCh designer can automatically design not only the appropriate microhomologies but also the primers to construct locus-specific donor vectors for PITCh knock-in. Using our newly established pipeline, a reporter cell line for monitoring endogenous gene expression and transgenic (TG) or knock-in/knockout (KIKO) cell lines can be produced systematically. Using these new variations of PITCh, an exogenous promoter-driven gene cassette expressing a fluorescent protein gene and a drug resistance gene can be integrated into a safe harbor or a specific gene locus to create transgenic reporter cells (PITCh-TG) or knockout cells with reporter knock-in (PITCh-KIKO), respectively.
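The core design step that PITCh designer automates can be sketched in a few lines: extracting the microhomology arms flanking a cut position (here 20 bp each, within the up-to-40-bp range noted above). Primer design and sgRNA-site placement on the donor vector are omitted, and the function below is a hypothetical simplification, not the tool's algorithm.

```python
# Sketch: take the left and right microhomology arms around a CRISPR cut
# position in a genomic sequence, for placement on a PITCh donor vector.
def microhomology_arms(genomic_seq: str, cut_pos: int, arm_len: int = 20):
    """cut_pos indexes the base immediately 3' of the double-strand break."""
    left = genomic_seq[cut_pos - arm_len:cut_pos]
    right = genomic_seq[cut_pos:cut_pos + arm_len]
    return left, right

seq = "ACGT" * 30                     # toy 120-bp locus
left, right = microhomology_arms(seq, 60)
print(left, right)                    # arms to clone onto the donor vector
```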
Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning
2012-01-01
In light of the problems of low recognition efficiency, high false rates and poor localization accuracy in traditional pipeline security detection technology, this paper proposes a type of hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, original monitoring signals are dealt with by wavelet transform technology to extract the single mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate initial recognition results for final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point’s position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method could effectively improve the accuracy of the leak point localization and reduce the undetected rate as well as false alarm rate. PMID:22368464
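The localization step in the final phase reduces to simple geometry. The Python sketch below shows the classic two-sensor time-difference-of-arrival (TDOA) estimate and a weighted average across sensor pairs; the wave speed and the weights are illustrative assumptions, not the parameters of the paper's algorithm.

```python
# Sketch: TDOA leak localization between two sensors bracketing the leak,
# combined by a weighted average across sensor pairs.

def leak_position(L_m, v_mps, dt_s):
    """Sensors A and B are L_m apart; dt_s = t_A - t_B is the arrival-time
    difference of the leak's acoustic wave. Returns distance from sensor A."""
    return 0.5 * (L_m - v_mps * dt_s)

def weighted_leak_position(estimates):
    """estimates: list of (position_m, weight); weights might reflect
    signal quality or classifier confidence at each sensor pair."""
    total_w = sum(w for _, w in estimates)
    return sum(p * w for p, w in estimates) / total_w

# Two pairs, assumed acoustic speed 340 m/s, slightly different readings:
pairs = [(leak_position(1000.0, 340.0, 0.50), 0.7),
         (leak_position(1000.0, 340.0, 0.48), 0.3)]
print(f"leak at ~{weighted_leak_position(pairs):.0f} m from sensor A")
```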
Automatic orientation and 3D modelling from markerless rock art imagery
NASA Astrophysics Data System (ADS)
Lerma, J. L.; Navarro, S.; Cabrelles, M.; Seguí, A. E.; Hernández, D.
2013-02-01
This paper investigates the use of two detectors and descriptors on image pyramids for automatic image orientation and the generation of 3D models. The detectors and descriptors replace manual measurements and are used to detect, extract, and match features across multiple images. The Scale-Invariant Feature Transform (SIFT) and the Speeded Up Robust Features (SURF) are assessed based on speed, number of features, matched features, and precision in image and object space, depending on the adopted hierarchical matching scheme. The influence of additionally applying area-based matching (ABM) with normalised cross-correlation (NCC) and least-squares matching (LSM) is also investigated. The pipeline makes use of photogrammetric and computer vision algorithms, aiming at minimum interaction and maximum accuracy from a calibrated camera. Both the exterior orientation parameters and the 3D coordinates in object space are sequentially estimated, combining relative orientation, single space resection and bundle adjustment. The fully automatic image-based pipeline presented herein to automate the image orientation step for a sequence of terrestrial markerless images is compared with manual bundle block adjustment and terrestrial laser scanning (TLS), which serves as ground truth. The benefits of applying ABM after feature-based matching (FBM) are assessed both in image and object space for the 3D modelling of a complex rock art shelter.
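For readers unfamiliar with the feature-based matching (FBM) step, the OpenCV sketch below matches SIFT keypoints across an image pair with Lowe's ratio test. The file names are placeholders, and SURF is omitted because it lives in the non-free contrib module; this is a generic illustration, not the paper's exact pipeline.

```python
# Sketch: SIFT detection and brute-force matching with a ratio test,
# the kind of FBM used to seed relative orientation.
import cv2

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder paths
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# k=2 nearest neighbours enable Lowe's ratio test against ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_L2)
raw = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in raw if m.distance < 0.75 * n.distance]
print(f"{len(good)} putative correspondences for relative orientation")
```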
Oil pipeline geohazard monitoring using optical fiber FBG strain sensors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Salazar-Ferro, Andres; Mendez, Alexis
2016-04-01
Pipelines are naturally vulnerable to operational, environmental and man-made effects such as internal erosion and corrosion; mechanical deformation due to geophysical risks and ground movements; leaks from neglect and vandalism; as well as encroachments from nearby excavations or illegal intrusions. The actual detection and localization of incipient and advanced faults in pipelines is a very difficult, expensive and inexact task. Anything that operators can do to mitigate the effects of these faults will provide increased reliability, reduced downtime and maintenance costs, as well as increased revenues. This talk will review the on-line monitoring of an extensive network of oil pipelines in service in Colombia using optical fiber Bragg grating (FBG) strain sensors for the measurement of strains and bending caused by geohazard risks such as soil movements, landslides, settlements, flooding and seismic activity. The FBG sensors were mounted on the outside of the pipelines at discrete locations where geohazard risk was expected. The system has been in service for the past 3 years with over 1,000 strain sensors mounted. The technique has been reliable and effective in giving advanced warning of accumulated pipeline strains as well as possible ruptures.
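The quantitative core of such a system is the standard Bragg-wavelength-to-strain relation, sketched below in Python. The photo-elastic coefficient is a typical silica-fibre value and temperature compensation is omitted, so this is a generic illustration rather than the calibration used in the Colombian deployment.

```python
# Sketch: convert an FBG Bragg-wavelength shift to axial strain via
# dlambda/lambda = (1 - p_e) * strain, temperature effects neglected.
P_E = 0.22  # effective photo-elastic coefficient of silica (typical value)

def fbg_strain_ue(lambda_b_nm, dlambda_nm):
    """Return strain in microstrain from the Bragg wavelength and its shift."""
    return dlambda_nm / (lambda_b_nm * (1.0 - P_E)) * 1e6

# A 1550 nm grating shifted by 0.12 nm reads roughly 99 microstrain.
print(f"{fbg_strain_ue(1550.0, 0.12):.0f} microstrain")
```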
Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.
2016-12-01
Integrating semantic information into legacy metadata catalogs is a challenging issue that so far has been addressed mostly on a limited scale. We present the experience of CINERGI (Community Inventory of EarthCube Resources for Geoscience Interoperability), an NSF EarthCube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels, including surveys and domain resource inventories. The pipeline examines available metadata descriptions using the text parsing, vocabulary management, semantic annotation, and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies, including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword-generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via the CINERGI Annotator user interface. We present lessons learned from applying the CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality and completeness. The inventory is accessible at http://cinergi.sdsc.edu, and the CINERGI project web page is http://earthcube.org/group/cinergi.
49 CFR 193.2507 - Monitoring operations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY LIQUEFIED NATURAL GAS FACILITIES... watching or listening from an attended control center for warning alarms, such as gas, temperature...
Steam flooding from mine workings, a viable alternative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayler, M.F.; Brechtel, C.
1987-05-01
The advent of steam flooding has given new life to several fields in California, substantially increasing the recoverable reserve. This process can be combined with a newly developed concept combining petroleum and mining technology. By placing mine workings roughly 100 ft below the bottom of the reservoir, it is possible to safely drill wells upward through the reservoir and complete them in such a way that all produced cuttings and fluids are contained within closed pipelines. Each completed well could serve as a steam injection well with continuous gravity-produced oil from the same well. As all fluids would flow by gravity to a collection pipeline, the only needed pumps would be at the discharge within the mine shaft. Mine shafts serving the oil field could be placed in environmentally optimum sites roughly one mile apart, eliminating many of the visually objectionable disturbances. Production wells could be placed on one-acre or even closer spacing, whatever good engineering dictates. Automatic controls can continuously monitor and control production from each well. Assuming one-acre well spacing, continuous steam flooding, and production from each well, a detailed analysis of anticipated mining costs indicates that oil production costs under $5/bbl are possible. Even at $10/BO, a positive cash flow within two years after the start of shaft sinking is expected.
Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie
2017-01-27
Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed with various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, that uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using GlimmerHMM, SNAP, and AUGUSTUS, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's Plantae dataset. Our evaluation shows that Seqping generated better gene predictions than three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that Seqping's predictions are more accurate than gene predictions using the other three approaches with their default or available HMMs.
Historical analysis of US pipeline accidents triggered by natural hazards
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2015-04-01
Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
Kepler Science Operations Center Pipeline Framework
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Middour, Christopher; Caldwell, Douglas A.; Jenkins, Jon M.
2010-01-01
The Kepler mission is designed to continuously monitor up to 170,000 stars at a 30 minute cadence for 3.5 years searching for Earth-size planets. The data are processed at the Science Operations Center (SOC) at NASA Ames Research Center. Because of the large volume of data and the memory and CPU-intensive nature of the analysis, significant computing hardware is required. We have developed generic pipeline framework software that is used to distribute and synchronize the processing across a cluster of CPUs and to manage the resulting products. The framework is written in Java and is therefore platform-independent, and scales from a single, standalone workstation (for development and research on small data sets) to a full cluster of homogeneous or heterogeneous hardware with minimal configuration changes. A plug-in architecture provides customized control of the unit of work without the need to modify the framework itself. Distributed transaction services provide for atomic storage of pipeline products for a unit of work across a relational database and the custom Kepler DB. Generic parameter management and data accountability services are provided to record the parameter values, software versions, and other meta-data used for each pipeline execution. A graphical console allows for the configuration, execution, and monitoring of pipelines. An alert and metrics subsystem is used to monitor the health and performance of the pipeline. The framework was developed for the Kepler project based on Kepler requirements, but the framework itself is generic and could be used for a variety of applications where these features are needed.
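The plug-in idea described above is a generic pattern worth a small illustration. The Kepler framework is Java; the Python stand-in below only mirrors the architecture (a registry of modules driven by a framework that knows nothing about each unit of work), with module names and the unit-of-work dictionary invented for the example.

```python
# Sketch of a plug-in pipeline framework: modules register under a name
# and a generic driver runs them over a unit of work.
from typing import Callable, Dict, List

MODULES: Dict[str, Callable[[dict], dict]] = {}

def module(name: str):
    """Decorator registering a pipeline module under a name."""
    def register(fn):
        MODULES[name] = fn
        return fn
    return register

@module("calibrate")
def calibrate(unit: dict) -> dict:
    unit["calibrated"] = True            # stand-in for pixel calibration
    return unit

@module("detrend")
def detrend(unit: dict) -> dict:
    unit["detrended"] = True             # stand-in for systematic removal
    return unit

def run_pipeline(stages: List[str], unit: dict) -> dict:
    # A real framework would also record parameters and software versions
    # here for data accountability.
    for stage in stages:
        unit = MODULES[stage](unit)
    return unit

print(run_pipeline(["calibrate", "detrend"], {"star_id": 12345}))
```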
Development of Protective Coatings for Co-Sequestration Processes and Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwagen, Gordon; Huang, Yaping
2011-11-30
The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted regarding pipeline corrosion sensors to monitor pipes used in handling co-sequestration fluids. The research aimed to ensure the safety and reliability of pipelines transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for internal pipeline coatings to transport SCCO2.
Corrosion monitoring along infrastructures using distributed fiber optic sensing
NASA Astrophysics Data System (ADS)
Alhandawi, Khalil B.; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia
2016-04-01
Pipeline Inspection Gauges (PIGs) are used for internal corrosion inspection of oil pipelines every 3-5 years. However, between inspection intervals, rapid corrosion may occur, potentially resulting in major accidents. The motivation behind this research project was to develop a safe distributed corrosion sensor placed inside oil pipelines that continuously monitors corrosion. The intrinsically safe nature of light motivated the investigation of fiber optic sensors as a solution. The sensing fiber's cladding features a polymer that is chemically sensitive to hydrocarbons within crude oil mixtures. A layer of metal of the kind used in the oil pipeline's construction is deposited on the polymer cladding; upon corrosion, the cladding is exposed to the surrounding hydrocarbons. The hydrocarbons' interaction with the cladding locally increases the cladding's refractive index in the radial direction. The light intensity of a traveling pulse is reduced by the local reduction in modal capacity, which is interrogated by optical time-domain reflectometry (OTDR). Backscattered light is captured in real time, with the time delay used to resolve location, allowing real-time spatial monitoring of internal corrosion within pipelines spanning large distances. Step-index theoretical solutions were used to calculate the power loss due to changes in the intensity profile. The power loss is translated into an attenuation coefficient characterizing the expected OTDR trace, which was verified against similar experimental results from the literature. A laboratory-scale experiment is being developed to assess the validity of the model and the practicality of the solution.
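The way OTDR resolves event location is simple enough to show directly: the round-trip delay of backscattered light maps to distance along the fibre via z = c·t/(2n). The group index below is a typical silica value, an assumption rather than a parameter from the paper.

```python
# Sketch: OTDR event location from the round-trip delay of backscatter.
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # assumed group index of silica fibre

def otdr_distance_m(round_trip_s):
    """Distance along the fibre to an event seen round_trip_s after the pulse."""
    return C * round_trip_s / (2.0 * N_GROUP)

# A loss event seen 98 microseconds after the pulse sits about 10 km out.
print(f"{otdr_distance_m(98e-6) / 1000:.1f} km")
```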
An integrated pipeline to create and experience compelling scenarios in virtual reality
NASA Astrophysics Data System (ADS)
Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina
2011-03-01
One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort of modeling, element integration, and the software development needed to properly display and interact with the content on the available systems. Still today, most virtual reality applications are tedious to create, and they are hard-wired to the specific display and interaction system available to the developers when the application was created. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modifying information from the virtual environment. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on the fly as they are being presented in the virtual-reality systems. Users can quickly prototype the basic scene using the designer and the editor on a control workstation. More elements can then be introduced into the scene from both the editor and the virtual-reality display. In this manner, users are able to gradually increase the complexity of the scenario with immediate feedback. The main use of this pipeline is the rapid development of scenarios for human-factors studies. However, it is applicable in a much more general context.
Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.
Becker, Matthias; Böckmann, Britta
2016-01-01
Automatic extraction of medical concepts from medical reports and their classification with semantic standards are useful for standardization and for clinical research. This paper presents an approach to UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objective is to test whether the natural language processing tool is suitable for identifying UMLS concepts in German text and mapping them to SNOMED-CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so that the pipeline can normalize to domain ontologies such as SNOMED-CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested on a set of 199 German reports, obtaining an average F1 measure of 0.36 without German stemming or pre- and post-processing of the reports.
Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai
2017-11-23
The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing can provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short deep-sequencing reads corresponding to selected highly variable viral sequences that are drug targets. The suggested bioinformatic pipeline combines available programs with ad hoc scripts based on an original algorithm for searching for conserved targets in the deep sequencing data. We also present statistical criteria for the threshold of reliable mutation detection and for the assessment of variation between corresponding datasets. These criteria are robust against possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
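A detection threshold of the kind described above can be sketched with a binomial null model: find the smallest mutant-read count whose probability under sequencing error alone stays below a significance level. The error rate and alpha below are illustrative, and the paper's exact criterion may differ.

```python
# Sketch: smallest mutant-read count k at depth N that cannot plausibly
# be explained by per-base sequencing error alone.
from scipy.stats import binom

def min_reliable_count(depth, error_rate=0.001, alpha=1e-6):
    k = 0
    # binom.sf(k - 1, N, e) = P(X >= k) under the error-only null model.
    while binom.sf(k - 1, depth, error_rate) >= alpha:
        k += 1
    return k

# At 10,000x coverage and 0.1% error, a call needs at least this many reads:
print(min_reliable_count(10_000))
```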
Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori
2015-01-01
Based on theoretically calculated comprehensive lipid libraries, in lipidomics as many as 1000 multiple reaction monitoring (MRM) transitions can be monitored for each single run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual efforts to identify and quantify lipid species. Isotopic peaks differing by up to a few atomic masses further complicate analysis. To accelerate the identification and quantification process we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, example data set and the tutorials are downloadable at the “Standalone software” section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website. PMID:25688256
PMAnalyzer: a new web interface for bacterial growth curve analysis.
Cuevas, Daniel A; Edwards, Robert A
2017-06-15
Bacterial growth curves are essential representations for characterizing bacterial metabolism within a variety of media compositions. Using high-throughput spectrophotometers capable of processing tens of 96-well plates, quantitative phenotypic information can be easily integrated into the current data structures that describe a bacterial organism. The PMAnalyzer pipeline performs growth curve analysis to parameterize the unique features occurring within microtiter wells containing specific growth media sources. We have expanded the pipeline's capabilities and provide a user-friendly online implementation of this automated pipeline. PMAnalyzer version 2.0 provides fast, automatic growth curve parameter analysis, growth identification, high-resolution figures of sample-replicate growth curves, and several statistical analyses. PMAnalyzer v2.0 can be found at https://edwards.sdsu.edu/pmanalyzer/. Source code for the pipeline can be found on GitHub at https://github.com/dacuevas/PMAnalyzer. Source code for the online implementation can be found on GitHub at https://github.com/dacuevas/PMAnalyzerWeb. Contact: dcuevas08@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
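Growth-curve parameterization of this kind is commonly done by fitting a parametric model to the optical-density readings. The SciPy sketch below fits a logistic model to synthetic OD data; the logistic form is a common illustrative choice, not necessarily the model PMAnalyzer itself uses.

```python
# Sketch: fit a logistic growth model to OD readings to recover summary
# parameters (asymptote A, maximum rate mu, midpoint t0).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, mu, t0):
    return A / (1.0 + np.exp(-mu * (t - t0)))

t = np.linspace(0, 24, 49)                                    # hours
od = logistic(t, 1.2, 0.6, 8.0) \
     + np.random.default_rng(1).normal(0, 0.02, t.size)       # noisy toy data

(A, mu, t0), _ = curve_fit(logistic, t, od, p0=[1.0, 0.5, 10.0])
print(f"A={A:.2f}, mu={mu:.2f}/h, t0={t0:.1f} h")
```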
Applicability of interferometric SAR technology to ground movement and pipeline monitoring
NASA Astrophysics Data System (ADS)
Grivas, Dimitri A.; Bhagvati, Chakravarthy; Schultz, B. C.; Trigg, Alan; Rizkalla, Moness
1998-03-01
This paper summarizes the findings of a cooperative effort between NOVA Gas Transmission Ltd. (NGTL), the Italian Natural Gas Transmission Company (SNAM), and Arista International, Inc., to determine whether current remote sensing technologies can be utilized to monitor small-scale ground movements over vast geographical areas. This topic is of interest due to the potential for small ground movements to cause strain accumulation in buried pipeline facilities. Ground movements are difficult to monitor continuously, but their cumulative effect over time can have a significant impact on the safety of buried pipelines. Interferometric synthetic aperture radar (InSAR or SARI) is identified as the most promising technique of those considered. InSAR analysis involves combining multiple images from consecutive passes of a radar imaging platform. The resulting composite image can detect changes as small as 2.5 to 5.0 centimeters (based on current analysis methods and radar satellite data of 5 centimeter wavelength). Research currently in progress shows potential for measuring ground movements as small as a few millimeters. Data needed for InSAR analysis is currently commercially available from four satellites, and additional satellites are planned for launch in the near future. A major conclusion of the present study is that InSAR technology is potentially useful for pipeline integrity monitoring. A pilot project is planned to test operational issues.
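The quantitative core of InSAR change detection is the mapping from interferometric phase to line-of-sight displacement, d = φλ/(4π), which is easy to verify numerically. The 5.6 cm wavelength below is an assumed C-band value consistent with the ~5 cm figure cited above.

```python
# Sketch: interferometric phase difference -> line-of-sight displacement.
import math

WAVELENGTH_M = 0.056  # assumed C-band radar wavelength, ~5 cm

def los_displacement_mm(phase_rad):
    return phase_rad * WAVELENGTH_M / (4.0 * math.pi) * 1000.0

# One full fringe (2*pi of phase) corresponds to half a wavelength of motion:
print(f"{los_displacement_mm(2.0 * math.pi):.1f} mm per fringe")
```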
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie; Cavanagh, Brad
ORAC-DR is a general purpose automatic data reduction pipeline environment. This document describes how to modify data reduction recipes and how to add new instruments. For a general overview of ORAC-DR see SUN/230. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.
76 FR 30322 - Notice of Availability of Government-Owned Inventions; Available for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
... below are assigned to the United States Government as represented by the Secretary of the Navy. U.S... ``Automatic Clock Synchronization and Distribution Circuit for Counter Clock Flow Pipelined Systems'' issued... Flow and Metallic Conformal Coating of Conductive Templates'' issued on October 12, 2010; U.S. Patent...
Microcomputers, software combine to provide daily product, movement inventory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cable, T.
1985-06-01
This paper describes the efforts of Santa Fe Pipelines Inc. in keeping track of product inventory on the 810-mile, 12-in. Chaparral Pipeline and the 1,913-mile, 8- and 10-in. Gulf Central Pipeline. The decision to use a PC for monitoring the inventory was significant. The application was completed by TRON, Inc. The system actually comprises two major subsystems. The pipeline system accounts for injections into the pipeline and deliveries of product. This feeds the storage and terminal inventory system, where inventories are maintained at storage locations by shipper and supplier account. The paper further explains the inventory monitoring process in detail. Communications software is described as well.
genipe: an automated genome-wide imputation pipeline with automatic reporting and statistical tools.
Lemieux Perreault, Louis-Philippe; Legault, Marc-André; Asselin, Géraldine; Dubé, Marie-Pierre
2016-12-01
Genotype imputation is now commonly performed following genome-wide genotyping experiments. Imputation increases the density of analyzed genotypes in the dataset, enabling fine-mapping across the genome. However, the process of imputation using the most recent publicly available reference datasets can require considerable computation power and the management of hundreds of large intermediate files. We have developed genipe, a complete genome-wide imputation pipeline which includes automatic reporting, imputed data indexing and management, and a suite of statistical tests for imputed data commonly used in genetic epidemiology (Sequence Kernel Association Test, Cox proportional hazards for survival analysis, and linear mixed models for repeated measurements in longitudinal studies). The genipe package is an open source Python software and is freely available for non-commercial use (CC BY-NC 4.0) at https://github.com/pgxcentre/genipe. Documentation and tutorials are available at http://pgxcentre.github.io/genipe. Contact: louis-philippe.lemieux.perreault@statgen.org or marie-pierre.dube@statgen.org. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
bcgTree: automatized phylogenetic tree building from bacterial core genomes.
Ankenbrand, Markus J; Keller, Alexander
2016-10-01
The need for multi-gene analyses in scientific fields such as phylogenetics and DNA barcoding has increased in recent years. In particular, these approaches are increasingly important for differentiating bacterial species, where reliance on the standard 16S rDNA marker can result in poor resolution. Additionally, the assembly of bacterial genomes has become a standard task due to advances in next-generation sequencing technologies. We created a bioinformatic pipeline, bcgTree, which uses assembled bacterial genomes, either from databases or from the user's own sequencing results, to reconstruct their phylogenetic history. The pipeline automatically extracts 107 essential single-copy core genes, found in a majority of bacteria, using hidden Markov models and performs a partitioned maximum-likelihood analysis. Here, we describe the workflow of bcgTree and, as a proof of concept, its usefulness in resolving the phylogeny of 293 publicly available bacterial strains of the genus Lactobacillus. We also evaluate its performance in both low- and high-level taxonomy test sets. The tool is freely available at GitHub (https://github.com/iimog/bcgTree) and our institutional homepage (http://www.dna-analytics.biozentrum.uni-wuerzburg.de).
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
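The segmentation stage described above (superpixels classified by a Random Forest) can be sketched with standard libraries. Feature choice (mean colour per superpixel), label coding, and parameters below are illustrative assumptions, not the authors' trained classifier.

```python
# Sketch: SLIC superpixels summarized by mean colour and classified as
# plant versus background with a Random Forest.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(img, segments):
    """Mean RGB per superpixel as a minimal feature vector."""
    return np.array([img[segments == s].mean(axis=0)
                     for s in np.unique(segments)])

def segment_plant(img, clf, n_segments=400):
    segments = slic(img, n_segments=n_segments, compactness=10)
    labels = clf.predict(superpixel_features(img, segments))
    mask = np.zeros(segments.shape, dtype=bool)
    for s, lab in zip(np.unique(segments), labels):
        if lab == 1:                      # hypothetical coding: 1 = plant
            mask[segments == s] = True
    return mask

# Training would use superpixel features from hand-labelled images:
clf = RandomForestClassifier(n_estimators=100)
# clf.fit(train_features, train_labels)   # labels: 1 = plant, 0 = background
```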
Channel Measurements for Automatic Vehicle Monitoring Systems
DOT National Transportation Integrated Search
1974-03-01
Co-channel and adjacent channel electromagnetic interference measurements were conducted on the Sierra Research Corp. and the Chicago Transit Authority automatic vehicle monitoring systems. These measurements were made to determine if the automatic v...
2016-09-01
assigned a classification. MLST analysis MLST was determined using an in-house automated pipeline that first searches for homologs of each gene of...and virulence mechanism contributing to their success as pathogens in the wound environment. A novel bioinformatics pipeline was used to incorporate...monitored in two ways: read-based genome QC and assembly based metrics. The JCVI Genome QC pipeline samples sequence reads and performs BLAST
Amar, David; Frades, Itziar; Danek, Agnieszka; Goldberg, Tatyana; Sharma, Sanjeev K; Hedley, Pete E; Proux-Wera, Estelle; Andreasson, Erik; Shamir, Ron; Tzfadia, Oren; Alexandersson, Erik
2014-12-05
For most organisms, even if their genome sequence is available, little functional information exists about individual genes or proteins. Several annotation pipelines have been developed for functional analysis based on sequence, 'omics', and literature data. However, researchers receive little guidance on how well these pipelines perform. Here, we used the recently sequenced potato genome as a case study. The potato genome was selected because it is newly sequenced and potato is a non-model plant, even though relatively ample information exists on individual potato genes and multiple gene expression profiles are available. We show that the automatic gene annotations of potato have low accuracy when compared to a "gold standard" based on experimentally validated potato genes. Furthermore, we evaluate six state-of-the-art annotation pipelines and show that their predictions are markedly dissimilar (Jaccard similarity coefficient of 0.27 between pipelines on average). To overcome this discrepancy, we introduce a simple GO structure-based algorithm that reconciles the predictions of the different pipelines. We show that the integrated annotation covers more genes, increases the number of highly co-expressed GO processes by over 50%, and agrees much more closely with the gold standard. We find that different annotation pipelines produce different results, and we show how to integrate them into a unified annotation of higher quality than any single pipeline. We offer an improved functional annotation of both PGSC and ITAG potato gene models, as well as tools that can be applied to additional pipelines and improve annotation in other organisms. This will greatly aid future functional analysis of '-omics' datasets from potato and other organisms with newly sequenced genomes. The new potato annotations are available with this paper.
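The agreement measure quoted above is worth spelling out: the Jaccard coefficient between the GO-term sets two pipelines assign to a gene, averaged over genes, gives the kind of pipeline-level score reported (0.27 on average). The GO identifiers below are arbitrary examples.

```python
# Sketch: Jaccard similarity between the GO-term sets that two annotation
# pipelines assign to the same gene.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a or b) else 1.0

pipeline1 = {"GO:0006355", "GO:0003700", "GO:0005634"}   # toy annotations
pipeline2 = {"GO:0006355", "GO:0005634", "GO:0046872"}
print(f"{jaccard(pipeline1, pipeline2):.2f}")            # 0.50 for this gene
```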
Lay-Ekuakille, Aimé; Fabbiano, Laura; Vacca, Gaetano; Kitoko, Joël Kidiamboko; Kulapa, Patrice Bibala; Telesca, Vito
2018-06-04
Pipelines conveying fluids are considered strategic infrastructures to be protected and maintained. They generally serve for the transportation of important fluids such as drinking water, wastewater, oil, gas, and chemicals. Monitoring and continuous testing, especially on-line, are necessary to assess the condition of pipelines. The paper presents findings from a comparison between two spectral-response algorithms, based on the decimated signal diagonalization (DSD) and decimated Padé approximant (DPA) techniques, that allow one to process signals delivered by pressure sensors mounted on an experimental pipeline.
Ali, Salman; Qaisar, Saad Bin; Saeed, Husnain; Khan, Muhammad Farhan; Naeem, Muhammad; Anpalagan, Alagan
2015-03-25
The synergy of computational and physical network components leading to the Internet of Things, Data and Services has been made feasible by the use of Cyber Physical Systems (CPSs). CPS engineering promises to impact system condition monitoring for a diverse range of fields from healthcare, manufacturing, and transportation to aerospace and warfare. CPS for environment monitoring applications completely transforms human-to-human, human-to-machine and machine-to-machine interactions with the use of Internet Cloud. A recent trend is to gain assistance from mergers between virtual networking and physical actuation to reliably perform all conventional and complex sensing and communication tasks. Oil and gas pipeline monitoring provides a novel example of the benefits of CPS, providing a reliable remote monitoring platform to leverage environment, strategic and economic benefits. In this paper, we evaluate the applications and technical requirements for seamlessly integrating CPS with sensor network plane from a reliability perspective and review the strategies for communicating information between remote monitoring sites and the widely deployed sensor nodes. Related challenges and issues in network architecture design and relevant protocols are also provided with classification. This is supported by a case study on implementing reliable monitoring of oil and gas pipeline installations. Network parameters like node-discovery, node-mobility, data security, link connectivity, data aggregation, information knowledge discovery and quality of service provisioning have been reviewed.
Monitoring winter flow conditions on the Ivishak River, Alaska : final report.
DOT National Transportation Integrated Search
2017-09-01
The Sagavanirktok River, a braided river on the Alaska North Slope, flows adjacent to the trans-Alaska pipeline for approximately 100 miles south of Prudhoe Bay. During an unprecedented flooding event in mid-May 2015, the pipeline was exposed in an a...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... 4th Avenue, Suite 2, Anchorage, Alaska, is relocating to 188 West Northern Lights Boulevard, Suite 500... Pipeline Monitoring office at 411 West 4th Avenue, Suite 2, Anchorage, Alaska, will remain open during the...
Automatically processed alpha-track radon monitor
Langner, Jr., G. Harold
1993-01-01
An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.
Automatically processed alpha-track radon monitor
Langner, G.H. Jr.
1993-01-12
An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.
Evaluation of the Monitor-CTA Automatic Vehicle Monitoring System
DOT National Transportation Integrated Search
1974-03-01
In June 1972 the Urban Mass Transportation Administration requested that the Transportation System Center of DOT perform an evaluation of the CTA (Chicago Transit Authority) Monitor-Automatic Vehicle Monitor (AVM) system. TSC planned the overall eval...
Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard
NASA Astrophysics Data System (ADS)
Voronin, K. S.
2016-10-01
During operation, main gas pipelines are under the influence of permanent pressure drops, which lead to lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can be the basis for developing a method of monitoring the technical condition of the gas pipeline, while forecasting possible emergency situations allows reconstruction works to be planned and carried out in due time on sections of the gas pipeline that may deviate from the design position.
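To make the frequency-identification idea concrete, the following minimal Python sketch (not from the article; the signal, sampling rate, and 12 Hz component are hypothetical) extracts the dominant vibration frequency of a sampled pipeline signal from the peak of its FFT amplitude spectrum.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the dominant oscillation frequency (Hz) of a 1-D signal
    sampled at fs Hz, via the peak of its amplitude spectrum."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))   # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Hypothetical usage: 10 s of strain-gauge data sampled at 200 Hz with a
# 12 Hz oscillation buried in noise.
fs = 200.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(t.size)
print(dominant_frequency(x, fs))  # ~12.0
```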
Drive Control System for Pipeline Crawl Robot Based on CAN Bus
NASA Astrophysics Data System (ADS)
Chen, H. J.; Gao, B. T.; Zhang, X. H.; Deng, Z. Q.
2006-10-01
The drive control system plays an important role in a pipeline robot. In order to inspect flaws and corrosion in seabed crude oil pipelines, an original mobile pipeline robot with a crawler drive unit, power and monitor unit, central control unit, and ultrasonic wave inspection device is developed. The CAN bus connects these different function units and provides a reliable information channel. Considering the limited space, a compact hardware system is designed based on an ARM processor with two CAN controllers. With a made-to-order CAN protocol for the crawl robot, an intelligent drive control system is developed. The implementation of the crawl robot demonstrates that the presented drive control scheme can meet the motion control requirements of the underwater pipeline crawl robot.
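As an illustration of the kind of CAN traffic such a drive control unit might exchange, here is a hedged Python sketch using the python-can package over a Linux SocketCAN interface. The channel name, arbitration ID, and payload layout are assumptions for illustration, not the made-to-order protocol described in the paper.

```python
import can  # python-can package

# Open a Linux SocketCAN interface; the channel name is an assumption.
bus = can.Bus(interface="socketcan", channel="can0")

# Hypothetical drive command: arbitration ID and payload layout are
# illustrative only, not the robot's actual protocol.
DRIVE_CMD_ID = 0x101
left_speed, right_speed = 120, 115  # e.g. crawler track speeds, one byte each

msg = can.Message(
    arbitration_id=DRIVE_CMD_ID,
    data=[left_speed, right_speed, 0x00, 0x00],
    is_extended_id=False,
)
bus.send(msg)

# Wait up to 1 s for a status frame from another unit on the bus.
reply = bus.recv(timeout=1.0)
if reply is not None:
    print(f"ID=0x{reply.arbitration_id:X} data={bytes(reply.data).hex()}")
bus.shutdown()
```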
The VIRUS data reduction pipeline
NASA Astrophysics Data System (ADS)
Goessl, Claus A.; Drory, Niv; Relke, Helena; Gebhardt, Karl; Grupp, Frank; Hill, Gary; Hopp, Ulrich; Köhler, Ralf; MacQueen, Phillip
2006-06-01
The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) will measure baryonic acoustic oscillations, first discovered in the Cosmic Microwave Background (CMB), to constrain the nature of dark energy by performing a blind search for Ly-α emitting galaxies within a 200 deg2 field and a redshift bin of 1.8 < z < 3.7. This will be achieved by VIRUS, a wide-field, low-resolution, 145-IFU spectrograph. The data reduction pipeline will have to extract ~35,000 spectra per exposure (~5 million per night, i.e. 500 million in total), perform astrometric, photometric, and wavelength calibration, and find and classify objects in the spectra fully automatically. We describe our ideas on how to achieve this goal.
COSMOS: Carnegie Observatories System for MultiObject Spectroscopy
NASA Astrophysics Data System (ADS)
Oemler, A.; Clardy, K.; Kelson, D.; Walth, G.; Villanueva, E.
2017-05-01
COSMOS (Carnegie Observatories System for MultiObject Spectroscopy) reduces multislit spectra obtained with the IMACS and LDSS3 spectrographs on the Magellan Telescopes. It can be used for quick-look analysis of data at the telescope as well as for pipeline reduction of large data sets. COSMOS is based on a precise optical model of the spectrographs, which allows (after alignment and calibration) an accurate prediction of the location of spectral features. This eliminates the line search procedure that is fundamental to many spectral reduction programs, and allows a robust data pipeline to be run in an almost fully automatic mode, allowing large amounts of data to be reduced with minimal intervention.
cisTEM, user-friendly software for single-particle image processing.
Grant, Timothy; Rohou, Alexis; Grigorieff, Nikolaus
2018-03-07
We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k–300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. © 2018, Grant et al.
cisTEM, user-friendly software for single-particle image processing
2018-01-01
We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200 k – 300 k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. PMID:29513216
DOT National Transportation Integrated Search
2012-08-30
Preventing unauthorized intrusions on pipeline Right of Ways (ROWs) and mechanical damage due to third party strikes by machinery is a constant challenge for the pipeline industry. Equally important for safety and environmental protection is the dete...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-28
... sharp increase in demand for natural gas outside of the traditional winter months. Withdrawals and... activities and unbundled sales activities of interstate natural gas pipelines and blanket marketing... and to monitor and evaluate transactions and operations of interstate pipelines and blanket marketing...
76 FR 46783 - Commission Information Collection Activities (FERC-549); Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... 1992. There has been a sharp increase in demand for natural gas outside of the traditional winter... activities and unbundled sales activities of interstate natural gas pipelines and blanket marketing... and to monitor and evaluate transactions and operations of interstate pipelines and blanket marketing...
EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.
Robbins, Kay A
2012-01-01
Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.
Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.
Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel
2017-01-01
OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A demonstration of the flexibility of our solution showcases three neuroimaging pipelines harnessing distributed computing environments as heterogeneous as local clusters or the European Grid Infrastructure (EGI).
Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System
Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C.; Parisot, Sarah; Rueckert, Daniel
2017-01-01
OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A demonstration of the flexibility of our solution showcases three neuroimaging pipelines harnessing distributed computing environments as heterogeneous as local clusters or the European Grid Infrastructure (EGI). PMID:28381997
NASA Astrophysics Data System (ADS)
Goldoni, P.
2011-03-01
The X-shooter data reduction pipeline is an integral part of the X-shooter project; it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium, with contributions from France, The Netherlands and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline has been developed for two main functions. The first function is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for quick-look control, and in Garching, for a more thorough evaluation. The second function is to allow optimized data reduction for a scientific user. In the following I first outline the main steps of data reduction with the pipeline, then briefly show two examples of optimization of the results for science reduction.
Pipeline welding goes mechanized
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beeson, R.
1999-11-01
Spread four has bugs in the cornfield--but not to worry. The bug referred to here is a mechanized welding bug, specifically a single welding head, computer-aided gas metal arc (GMAW) system from CRC-Evans Automatic Welding powered by a Miller Electric XMT® 304 inverter-based welding machine. The bug operator and owner of 32 inverters is Welded Construction, L.P., of Perrysburgh, Ohio. Spread four is a 147-mile stretch of the Alliance Pipeline system (Alliance) cutting through the cornfields of northeast Iowa. While used successfully in Canada and Europe for onshore and offshore pipeline construction for 30 years, this is the first large-scale use of mechanized welding in the US on a cross-country pipeline. On longer, larger-diameter and thicker-wall pipe projects--the Alliance mainline has 1,844 miles of pipe, most of it 36-in. diameter with a 0.622-in. wall thickness--mechanized GMAW offers better productivity than manual shielded metal arc welding (SMAW). In addition, high-strength steels, such as the API 5L Grade X70 pipe used on the Alliance, benefit from the low-hydrogen content of certain solid and tubular wire electrodes.
NASA Astrophysics Data System (ADS)
Ge, Yaomou
Oil and gas pipelines play a critical role in delivering energy resources from producing fields to power communities around the world. However, there are many threats to pipeline integrity, which may lead to significant incidents, causing safety, environmental and economic problems. Corrosion has long been a major threat to oil and gas pipelines, accounting for approximately 18% of the significant incidents in oil and gas pipelines. In addition, external corrosion of pipelines accounts for a significant portion (more than 25%) of pipeline failures. External corrosion detection is the research area of this thesis. In this thesis, a review of existing corrosion detection and monitoring methods is presented, and optical fiber sensors show great promise for corrosion detection in oil and gas pipelines. Several scenarios for optical fiber corrosion sensors are discussed, and two of them are selected for future research. A new corrosion and leakage detection sensor, consisting of a custom designed trigger and an FBG optical fiber, is presented. This new device has been experimentally tested and shows great promise.
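For background on the FBG sensing principle the thesis builds on (standard fiber-optic theory, not a result of the thesis), the Bragg wavelength and its response to strain and temperature are:

```latex
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda ,
\qquad
\frac{\Delta\lambda_B}{\lambda_B} \;\approx\; (1 - p_e)\,\varepsilon \;+\; (\alpha_\Lambda + \alpha_n)\,\Delta T
```

Here n_eff is the effective refractive index of the fiber core, Λ the grating period, p_e ≈ 0.22 the effective photo-elastic coefficient of silica fiber, ε the axial strain, and α_Λ and α_n the thermal-expansion and thermo-optic coefficients. Strain transferred from a corrosion-sensitive trigger to the grating thus appears as a measurable shift in the reflected Bragg wavelength.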
González, Jorge Ernesto; Radl, Analía; Romero, Ivonne; Barquinero, Joan Francesc; García, Omar; Di Giorgio, Marina
2016-12-01
Mitotic index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality control endpoint. To this end, MI is applied to check the lot of media and reagents to be used throughout the assay and also to check cellular viability after blood sample shipping, indicating satisfactory/unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the CellProfiler open-source software for automatic detection of mitotic figures and nuclei from digitized images of cultured human lymphocytes for MI assessment, and to compare its performance to that of semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from cultures were analyzed visually and findings were compared with those obtained using the CellProfiler software. The CellProfiler pipeline includes the detection of nuclei and mitoses with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images, saves considerable time compared to manual operation and reduces the variability derived from the scoring criteria of different scorers. The CellProfiler automated pipeline achieves good agreement with the visual counting workflow, i.e. it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
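A minimal sketch of the counting arithmetic behind MI assessment, assuming object counts have already been obtained from the nuclei and mitosis channels; the scikit-image thresholding shown here is a generic stand-in for the CellProfiler modules, and the counts in the usage line are hypothetical.

```python
import numpy as np
from skimage import filters, measure, morphology

def count_objects(gray, min_size=25):
    """Count connected objects above an Otsu threshold, discarding specks."""
    mask = gray > filters.threshold_otsu(gray)
    mask = morphology.remove_small_objects(mask, min_size=min_size)
    return int(measure.label(mask).max())

def mitotic_index(n_mitoses, n_total_cells):
    """MI as the percentage of scored cells that are in mitosis."""
    return 100.0 * n_mitoses / n_total_cells

# Hypothetical counts from the mitosis and nuclei channels of one image set:
print(mitotic_index(n_mitoses=42, n_total_cells=1000))  # 4.2
```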
Computer systems for automatic earthquake detection
Stewart, S.W.
1974-01-01
U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
The Environmental Technology Verification report discusses the technology and performance of a gaseous-emissions monitoring system for large, natural-gas-fired internal combustion engines. The device tested is the Parametric Emissions Monitoring System (PEMS) manufactured by ANR ...
Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines
NASA Astrophysics Data System (ADS)
Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.
2017-01-01
Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU ’circular memory’ data buffers that enable ready introduction of arbitrary functions into the processing path for ’streams’ of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
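The following toy Python sketch illustrates the "black box processes in a directed graph" idea with thread-backed stages connected by queues. It is not Bifrost's actual API; the stage names and functions are hypothetical stand-ins for real processing steps such as unpacking and dedispersion.

```python
import threading
import queue

class Block(threading.Thread):
    """A 'black box' processing stage: reads items from an input queue,
    applies a function, and writes results downstream. None signals stop."""
    def __init__(self, func, inq, outq):
        super().__init__(daemon=True)
        self.func, self.inq, self.outq = func, inq, outq

    def run(self):
        while True:
            item = self.inq.get()
            if item is None:                 # propagate shutdown downstream
                if self.outq is not None:
                    self.outq.put(None)
                return
            out = self.func(item)
            if self.outq is not None:
                self.outq.put(out)

# Hypothetical two-stage chain: "unpack" then "dedisperse".
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    Block(lambda x: x * 2, q1, q2),  # stand-in for "unpack"
    Block(lambda x: x + 1, q2, q3),  # stand-in for "dedisperse"
]
for s in stages:
    s.start()
for chunk in range(5):
    q1.put(chunk)
q1.put(None)
for _ in range(5):
    print(q3.get())
```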
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
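The paper's auto-QC algorithm is more elaborate, but a standard interplate correction in the same spirit is Tukey's median polish, which strips row and column effects from plate readings (the basis of B-scoring). A minimal sketch with hypothetical plate data:

```python
import numpy as np

def median_polish(plate, n_iter=10):
    """Tukey's median polish: iteratively removes row and column medians
    from a plate of HTS readings, returning residuals with the
    systematic row/column trends stripped out."""
    r = plate.astype(float)
    for _ in range(n_iter):
        r -= np.median(r, axis=1, keepdims=True)  # row effects
        r -= np.median(r, axis=0, keepdims=True)  # column effects
    return r

# Hypothetical 8x12 plate with a systematic first-row artifact.
rng = np.random.default_rng(0)
plate = rng.normal(100, 5, size=(8, 12))
plate[0, :] += 30                 # simulated edge-row drift
residuals = median_polish(plate)
print(residuals[0, :].mean())     # close to 0 after correction
```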
Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun
2012-01-01
Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
Wei, Wei; Ji, Zhanglong; He, Yupeng; Zhang, Kai; Ha, Yuanchi; Li, Qi; Ohno-Machado, Lucila
2018-01-01
The number and diversity of biomedical datasets grew rapidly in the last decade. A large number of datasets are stored in various repositories, with different formats. Existing dataset retrieval systems lack the capability of cross-repository search. As a result, users spend time searching datasets in known repositories, and they typically do not find new repositories. The biomedical and healthcare data discovery index ecosystem (bioCADDIE) team organized a challenge to solicit new indexing and searching strategies for retrieving biomedical datasets across repositories. We describe the work of one team that built a retrieval pipeline and examined its performance. The pipeline used online resources to supplement dataset metadata, automatically generated queries from users' free-text questions, produced high-quality retrieval results and achieved the highest inferred Normalized Discounted Cumulative Gain among competitors. The results showed that it is a promising solution for cross-database, cross-domain and cross-repository biomedical dataset retrieval. Database URL: https://github.com/w2wei/dataset_retrieval_pipeline PMID:29688374
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
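As a sketch of the kind of ensemble statistics such a tool automates, the following framework-independent Python snippet computes mean, standard deviation, and quantile bands across replicate simulation runs; the ensemble of noisy decay trajectories is hypothetical.

```python
import numpy as np

def ensemble_stats(trajectories):
    """Summarise an ensemble of stochastic simulation runs.
    trajectories: array of shape (n_runs, n_timepoints) for one species,
    with all runs sampled on a common time grid."""
    traj = np.asarray(trajectories, dtype=float)
    return {
        "mean": traj.mean(axis=0),
        "std": traj.std(axis=0, ddof=1),
        "q05": np.quantile(traj, 0.05, axis=0),
        "q95": np.quantile(traj, 0.95, axis=0),
    }

# Hypothetical ensemble: 100 noisy exponential-decay runs on 50 timepoints.
rng = np.random.default_rng(1)
t = np.linspace(0, 5, 50)
runs = 100 * np.exp(-t)[None, :] + rng.normal(0, 3, size=(100, 50))
stats = ensemble_stats(runs)
print(stats["mean"][0], stats["std"][0])
```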
Sinha, S K; Karray, F
2002-01-01
Pipeline surface defects such as holes and cracks cause major problems for utility managers, particularly when the pipeline is buried under the ground. Manual inspection for surface defects in the pipeline has a number of drawbacks, including subjectivity, varying standards, and high costs. An automatic inspection system using image processing and artificial intelligence techniques can overcome many of these disadvantages and offer utility managers an opportunity to significantly improve quality and reduce costs. A method for recognition and classification of pipe cracks using image analysis and a neuro-fuzzy algorithm is proposed. In the preprocessing step the scanned images of the pipe are analyzed and crack features are extracted. In the classification step a neuro-fuzzy algorithm is developed that employs a fuzzy membership function and the error backpropagation algorithm. The idea behind the proposed approach is that the fuzzy membership function will absorb variation in feature values and the backpropagation network, with its learning ability, will show good classification efficiency.
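A hedged sketch of the fuzzification idea: each raw crack feature is encoded by Gaussian membership degrees before classification, with scikit-learn's backpropagation-trained MLP standing in for the paper's network. The feature data, fuzzy class centers, and widths are hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def fuzzify(X, centers, widths):
    """Gaussian membership encoding: each raw feature is replaced by its
    degrees of membership in k fuzzy classes, absorbing feature-value
    variation before classification."""
    X = np.asarray(X, dtype=float)[:, :, None]   # (n_samples, n_features, 1)
    c = np.asarray(centers)[None, None, :]       # (1, 1, k) class centers
    w = np.asarray(widths)[None, None, :]        # (1, 1, k) class widths
    memb = np.exp(-((X - c) ** 2) / (2 * w ** 2))
    return memb.reshape(len(memb), -1)           # (n_samples, n_features * k)

# Hypothetical crack features (e.g. length, width, orientation) and labels.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

centers, widths = np.array([-1.0, 0.0, 1.0]), np.array([0.7, 0.7, 0.7])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(fuzzify(X, centers, widths), y)
print(clf.score(fuzzify(X, centers, widths), y))
```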
An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.
Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong
2014-08-01
Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key bio-marker in the diagnosis of muscular dystrophy. One primary challenge in nuclei segmentation is to correctly separate clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic images of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from the background by using a local Otsu threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to distinguish isolated nuclei from clustered nuclei and artifacts in all the images. Then a two-step refined watershed algorithm is applied to segment clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmented results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented image processing pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images. © 2014 Wiley Periodicals, Inc.
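A minimal scikit-image sketch of the core two steps, Otsu thresholding followed by a distance-transform watershed to split touching nuclei; the paper's refined pipeline adds Bayesian identification of clusters, which is omitted here, and the min_distance value is an assumption.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_nuclei(gray):
    """Foreground by Otsu threshold, then split touching nuclei with a
    distance-transform watershed (a standard two-step scheme)."""
    mask = gray > threshold_otsu(gray)
    distance = ndi.distance_transform_edt(mask)
    # One marker per local distance maximum, i.e. per presumed nucleus.
    coords = peak_local_max(distance, min_distance=7, labels=mask)
    markers = np.zeros(gray.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=mask)

# The labeled output can then be quantified, e.g.:
# from skimage.measure import regionprops
# areas = [r.area for r in regionprops(segment_nuclei(gray))]
```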
Makropoulos, Antonios; Robinson, Emma C; Schuh, Andreas; Wright, Robert; Fitzgibbon, Sean; Bozek, Jelena; Counsell, Serena J; Steinweg, Johannes; Vecchiato, Katy; Passerat-Palmbach, Jonathan; Lenz, Gregor; Mortari, Filippo; Tenev, Tencho; Duff, Eugene P; Bastiani, Matteo; Cordero-Grande, Lucilio; Hughes, Emer; Tusor, Nora; Tournier, Jacques-Donald; Hutter, Jana; Price, Anthony N; Teixeira, Rui Pedro A G; Murgasova, Maria; Victor, Suresh; Kelly, Christopher; Rutherford, Mary A; Smith, Stephen M; Edwards, A David; Hajnal, Joseph V; Jenkinson, Mark; Rueckert, Daniel
2018-06-01
The Developing Human Connectome Project (dHCP) seeks to create the first 4-dimensional connectome of early life. Understanding this connectome in detail may provide insights into normal as well as abnormal patterns of brain development. Following established best practices adopted by the WU-MINN Human Connectome Project (HCP), and pioneered by FreeSurfer, the project utilises cortical surface-based processing pipelines. In this paper, we propose a fully automated processing pipeline for the structural Magnetic Resonance Imaging (MRI) of the developing neonatal brain. This proposed pipeline consists of a refined framework for cortical and sub-cortical volume segmentation, cortical surface extraction, and cortical surface inflation, which has been specifically designed to address considerable differences between adult and neonatal brains, as imaged using MRI. Using the proposed pipeline our results demonstrate that images collected from 465 subjects ranging from 28 to 45 weeks post-menstrual age (PMA) can be processed fully automatically; generating cortical surface models that are topologically correct, and correspond well with manual evaluations of tissue boundaries in 85% of cases. Results improve on state-of-the-art neonatal tissue segmentation models and significant errors were found in only 2% of cases, where these corresponded to subjects with high motion. Downstream, these surfaces will enhance comparisons of functional and diffusion MRI datasets, supporting the modelling of emerging patterns of brain connectivity. Copyright © 2018 Elsevier Inc. All rights reserved.
40 CFR 63.11092 - What testing and monitoring requirements must I meet?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 15 2013-07-01 2013-07-01 false What testing and monitoring... Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities Testing and Monitoring Requirements § 63.11092 What testing and monitoring requirements must I meet? (a) Each owner or operator of a bulk...
40 CFR 63.11092 - What testing and monitoring requirements must I meet?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 15 2012-07-01 2012-07-01 false What testing and monitoring... Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities Testing and Monitoring Requirements § 63.11092 What testing and monitoring requirements must I meet? (a) Each owner or operator of a bulk...
40 CFR 63.11092 - What testing and monitoring requirements must I meet?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 15 2014-07-01 2014-07-01 false What testing and monitoring... Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities Testing and Monitoring Requirements § 63.11092 What testing and monitoring requirements must I meet? (a) Each owner or operator of a bulk...
40 CFR 63.11092 - What testing and monitoring requirements must I meet?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 14 2010-07-01 2010-07-01 false What testing and monitoring... Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities Testing and Monitoring Requirements § 63.11092 What testing and monitoring requirements must I meet? (a) Each owner or operator subject to the...
40 CFR 63.11092 - What testing and monitoring requirements must I meet?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 14 2011-07-01 2011-07-01 false What testing and monitoring... Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities Testing and Monitoring Requirements § 63.11092 What testing and monitoring requirements must I meet? (a) Each owner or operator of a bulk...
An integrated GPS-FID system for airborne gas detection of pipeline right-of-ways
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gehue, H.L.; Sommer, P.
1996-12-31
Pipeline integrity, safety and environmental concerns are of prime importance in the Canadian natural gas industry. Terramatic Technology Inc. (TTI) has developed an integrated GPS/FID gas detection system known as TTI-AirTrac™ for use in airborne gas detection (AGD) along pipeline right-of-ways. The Flame Ionization Detector (FID), which has traditionally been used to monitor air quality for gas plants and refineries, has been integrated with the Global Positioning System (GPS) via a 486 DX2-50 computer and specialized open architecture data acquisition software. The purpose of this technology marriage is to be able to continuously monitor air quality during airborne pipeline inspection. Event tagging from visual surveillance is used to determine an explanation of any delta line deviations (DLD). These deviations are an indication of hydrocarbon gases present in the plume that the aircraft has passed through. The role of the GPS system is to provide mapping information and coordinate data for ground inspections. The ground based inspection using a handheld multi gas detector will confirm whether or not a leak exists.
Automatic pelvis segmentation from x-ray images of a mouse model
NASA Astrophysics Data System (ADS)
Al Okashi, Omar M.; Du, Hongbo; Al-Assam, Hisham
2017-05-01
The automatic detection and quantification of skeletal structures has a variety of applications for biological research. Accurate segmentation of the pelvis from X-ray images of mice in a high-throughput project such as the Mouse Genomes Project not only saves time and cost but also helps achieve an unbiased quantitative analysis within the phenotyping pipeline. This paper proposes an automatic solution for pelvis segmentation based on structural and orientation properties of the pelvis in X-ray images. The solution consists of three stages: pre-processing the image to extract the pelvis area, initial pelvis mask preparation, and final pelvis segmentation. Experimental results on a set of 100 X-ray images showed consistent performance of the algorithm. The automated solution overcomes the weaknesses of a manual annotation procedure, where intra- and inter-observer variations cannot be avoided.
Segmentation of facial bone surfaces by patch growing from cone beam CT volumes
Lilja, Mikko; Kalke, Martti
2016-01-01
Objectives: The motivation behind this work was to design an automatic algorithm capable of segmenting the exterior of the dental and facial bones including the mandible, teeth, maxilla and zygomatic bone with an open surface (a surface with a boundary) from CBCT images for the anatomy-based reconstruction of radiographs. Such an algorithm would provide speed, consistency and improved image quality for clinical workflows, for example, in planning of implants. Methods: We used CBCT images from two studies: first to develop (n = 19) and then to test (n = 30) a segmentation pipeline. The pipeline operates by parameterizing the topology and shape of the target, searching for potential points on the facial bone–soft tissue edge, reconstructing a triangular mesh by growing patches from the edge points with good contrast and regularizing the result with a surface polynomial. This process is repeated for convergence. Results: The output of the algorithm was benchmarked against a hand-drawn reference and reached a 0.50 ± 1.0-mm average error and a 1.1-mm root-mean-square error in Euclidean distance from the reference to our automatically segmented surface. These results were achieved with images affected by inhomogeneity, noise and metal artefacts that are typical for dental CBCT. Conclusions: Previously, this level of accuracy and precision in dental CBCT has been reported in segmenting only the mandible, a much easier target. The segmentation results were consistent throughout the data set and the pipeline was found fast enough (<1-min average computation time) to be considered for clinical use. PMID:27482878
ERIC Educational Resources Information Center
Steffens, Melanie C.; Jelenec, Petra; Noack, Peter
2010-01-01
Many models assume that habitual human behavior is guided by spontaneous, automatic, or implicit processes rather than by deliberate, rule-based, or explicit processes. Thus, math-ability self-concepts and math performance could be related to implicit math-gender stereotypes in addition to explicit stereotypes. Two studies assessed at what age…
Satellite Radar Interferometry For Risk Management Of Gas Pipeline Networks
NASA Astrophysics Data System (ADS)
Ianoschi, Raluca; Schouten, Mathijs; Bas Leezenberg, Pieter; Dheenathayalan, Prabu; Hanssen, Ramon
2013-12-01
InSAR time series analyses can be fine-tuned for specific applications, yielding a potential increase in benchmark density, precision and reliability. Here we demonstrate the algorithms developed for gas pipeline monitoring, enabling operators to precisely pinpoint unstable locations. This helps asset management in planning, prioritizing and focusing in-situ inspections, thus reducing maintenance costs. In unconsolidated Quaternary soils, ground settlement contributes to possible failure of brittle cast iron gas pipes and their connections to houses. Other risk factors include the age and material of the pipe. The soil dynamics have led to a catastrophic explosion in the city of Amsterdam, which triggered an increased awareness for the significance of this problem. As the extent of the networks can be very wide, InSAR is shown to be a valuable source of information for identifying the hazard regions. We monitor subsidence affecting an urban gas transportation network in the Netherlands using both medium and high resolution SAR data. Results for the 2003-2010 period provide clear insights on the differential subsidence rates in the area. This enables characterization of underground motion that affects the integrity of the pipeline. High resolution SAR data add extra detail of door-to-door pipeline connections, which are vulnerable due to different settlements between house connections and main pipelines. The rates which we measure represent important input in planning of maintenance works. Managers can decide the priority and timing for inspecting the pipelines. The service helps manage the risk and reduce operational cost in gas transportation networks.
NASA Astrophysics Data System (ADS)
Raza, Shan-e.-Ahmed; Marjan, M. Q.; Arif, Muhammad; Butt, Farhana; Sultan, Faisal; Rajpoot, Nasir M.
2015-03-01
One of the main factors behind the high workload in pulmonary pathology in developing countries is the relatively large proportion of tuberculosis (TB) cases, which can be detected with high throughput using automated approaches. TB is caused by Mycobacterium tuberculosis, which appears as thin, rod-shaped acid-fast bacilli (AFB) in Ziehl-Neelsen (ZN) stained sputum smear samples. In this paper, we present an algorithm for automatic detection of AFB in digitized images of ZN stained sputum smear samples under a light microscope. A key component of the proposed algorithm is the enhancement of the raw input image using a novel anisotropic tubular filter (ATF), which suppresses background noise while simultaneously enhancing the strong anisotropic features of AFBs present in the image. The resulting image is then segmented using color features and candidate AFBs are identified. Finally, a support vector machine classifier using morphological features from candidate AFBs decides whether a given image is AFB positive or not. We demonstrate the effectiveness of the proposed ATF method with two different feature sets by showing that the proposed image analysis pipeline results in higher accuracy and F1-score than the same pipeline with standard median filtering for image enhancement.
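The ATF itself is a custom filter introduced in the paper; as a rough, generic stand-in, the standard Frangi vesselness filter in scikit-image also enhances thin rod-like (tubular) structures while suppressing blob-like background. The input file name below is hypothetical.

```python
from skimage import io, img_as_float
from skimage.filters import frangi

# Hypothetical grayscale field from a ZN-stained smear.
img = img_as_float(io.imread("zn_smear_field.png", as_gray=True))

# Enhance thin bright rod-like structures over a small range of scales.
enhanced = frangi(img, sigmas=range(1, 4), black_ridges=False)

# 'enhanced' could then be thresholded and candidate bacilli classified
# by morphology, mirroring the segmentation/classification stages above.
```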
ISEScan: automated identification of insertion sequence elements in prokaryotic genomes.
Xie, Zhiqun; Tang, Haixu
2017-11-01
The insertion sequence (IS) elements are the smallest but most abundant autonomous transposable elements in prokaryotic genomes, and they play a key role in prokaryotic genome organization and evolution. With fast-growing genomic data, it is becoming increasingly critical for biology researchers to be able to accurately and automatically annotate ISs in prokaryotic genome sequences. The available automatic IS annotation systems either provide only incomplete IS annotation or rely on the availability of existing genome annotations. Here, we present a new IS element annotation pipeline to address these issues. ISEScan is a highly sensitive software pipeline based on profile hidden Markov models constructed from manually curated IS elements. ISEScan performs better than existing IS annotation systems when tested on prokaryotic genomes with curated annotations of IS elements. Applying it to 2784 prokaryotic genomes, we report the global distribution of IS families across taxonomic clades in Archaea and Bacteria. ISEScan is implemented in Python and released as open source software at https://github.com/xiezhq/ISEScan. hatang@indiana.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle
2009-10-19
Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
Geolocation Support for Water Supply and Sewerage Projects in Azerbaijan
NASA Astrophysics Data System (ADS)
Qocamanov, M. H.; Gurbanov, Ch. Z.
2016-10-01
Drinking water supply and sewerage system design and reconstruction projects are being conducted extensively in the Republic of Azerbaijan. Implementation of such projects requires collecting a large amount of information about the area and carrying out detailed investigations. Joint use of aerospace monitoring and GIS plays an essential role in studying the impact of environmental factors and in developing analytical information systems, while ensuring the reliable performance of existing and designed major water supply pipelines as well as the construction and exploitation of technical installations. With our participation, a GIS has been created in "Azersu" OJSC that includes a systematic database of the drinking water supply, sewerage and rain water networks for carrying out the necessary geoinformation analysis. The GIS was created based on the "Microstation" platform and aerospace data. It should be mentioned that in the country, particularly in large cities (Baku, Ganja, Sumqait, etc.), drinking water supply pipelines cross regions with different physico-geographical conditions, geo-morphological compositions and seismotectonics. Many accidents occur on mains water supply lines during operation, which also creates problems for drinking water consumers; in some cases large-scale accidents cause substantial damage. Long-term experience shows that eliminating the consequences of accidents is a major cost, so avoiding such events requires improved rules for pipeline exploitation and for the geodetic monitoring system. Constant control of the plan-height position, geodetic measurements for detailed examination of the dynamics, and repetition of geodetic measurements at certain time intervals (in other words, regular monitoring) are therefore very important. The use of GIS during geodetic monitoring is of special significance. Collecting geodetic monitoring measurements of the main pipelines in the same coordinate system and processing these data in a single GIS allows an overall assessment of the plan-height state of major water supply pipeline network facilities, as well as the study of the impact of the water supply network on the environment and, conversely, the impact of natural processes on the major pipeline.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Continuous methane monitoring device... Installations § 77.211-1 Continuous methane monitoring device; installation and operation; automatic deenergization of electric equipment. Continuous methane monitoring devices shall be set to deenergize...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Continuous methane monitoring device... Installations § 77.211-1 Continuous methane monitoring device; installation and operation; automatic deenergization of electric equipment. Continuous methane monitoring devices shall be set to deenergize...
Socioeconomic Impact Assessment of the Los Angeles Automatic Vehicle Monitoring (AVM) Demonstration
DOT National Transportation Integrated Search
1982-09-01
This report presents a socioeconomic impact assessment of the Automatic Vehicle Monitoring (AVM) Demonstration in Los Angeles. An AVM system uses location, communication, and data processing subsystems to monitor the locations of appropriately equipp...
Cohen Freue, Gabriela V.; Meredith, Anna; Smith, Derek; Bergman, Axel; Sasaki, Mayu; Lam, Karen K. Y.; Hollander, Zsuzsanna; Opushneva, Nina; Takhar, Mandeep; Lin, David; Wilson-McManus, Janet; Balshaw, Robert; Keown, Paul A.; Borchers, Christoph H.; McManus, Bruce; Ng, Raymond T.; McMaster, W. Robert
2013-01-01
Recent technical advances in the field of quantitative proteomics have stimulated a large number of biomarker discovery studies of various diseases, providing avenues for new treatments and diagnostics. However, inherent challenges have limited the successful translation of candidate biomarkers into clinical use, thus highlighting the need for a robust analytical methodology to transition from biomarker discovery to clinical implementation. We have developed an end-to-end computational proteomic pipeline for biomarker studies. At the discovery stage, the pipeline emphasizes different aspects of experimental design, appropriate statistical methodologies, and quality assessment of results. At the validation stage, the pipeline focuses on the migration of the results to a platform appropriate for external validation, and the development of a classifier score based on corroborated protein biomarkers. At the last stage towards clinical implementation, the main aims are to develop and validate an assay suitable for clinical deployment, and to calibrate the biomarker classifier using the developed assay. The proposed pipeline was applied to a biomarker study in cardiac transplantation aimed at developing a minimally invasive clinical test to monitor acute rejection. Starting with an untargeted screening of the human plasma proteome, five candidate biomarker proteins were identified. Rejection-regulated proteins reflect cellular and humoral immune responses, acute phase inflammatory pathways, and lipid metabolism biological processes. A multiplex multiple reaction monitoring mass spectrometry (MRM-MS) assay was developed for the five candidate biomarkers and validated by enzyme-linked immunosorbent assay (ELISA) and immunonephelometric assay (INA). A classifier score based on corroborated proteins demonstrated that the developed MRM-MS assay provides an appropriate methodology for an external validation, which is still in progress. Plasma proteomic biomarkers of acute cardiac rejection may offer a relevant post-transplant monitoring tool to effectively guide clinical care. The proposed computational pipeline is highly applicable to a wide range of biomarker proteomic studies. PMID:23592955
Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments
Chung, Lisa M.; Colangelo, Christopher M.; Zhao, Hongyu
2014-01-01
Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre-processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets. PMID:24905083
Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments.
Chung, Lisa M; Colangelo, Christopher M; Zhao, Hongyu
2014-06-05
Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre-processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.
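Two of the pre-processing steps discussed, normalization and flagging of inaccurate transitions, can be sketched as follows. This is a generic illustration (log2 median normalization and a robust z-score rule), not the authors' exact procedure.

```python
import numpy as np

def median_normalize(intensities):
    """Equalise per-sample medians across replicate runs on the log2 scale.
    intensities: (n_transitions, n_samples) array of raw peak areas."""
    log_x = np.log2(np.asarray(intensities, dtype=float))
    sample_med = np.nanmedian(log_x, axis=0)
    return log_x - (sample_med - sample_med.mean())

def flag_outlier_transitions(log_x, z_cut=3.0):
    """Flag transitions whose cross-sample variability is extreme relative
    to the rest of the assay, using a MAD-based robust z-score."""
    spread = np.nanstd(log_x, axis=1)
    mad = np.median(np.abs(spread - np.median(spread)))
    z = (spread - np.median(spread)) / (1.4826 * mad)
    return np.where(z > z_cut)[0]   # indices of suspect transitions
```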
Nayor, Jennifer; Borges, Lawrence F; Goryachev, Sergey; Gainer, Vivian S; Saltzman, John R
2018-07-01
ADR is a widely used colonoscopy quality indicator. Calculation of ADR is labor-intensive and cumbersome using current electronic medical databases. Natural language processing (NLP) is a method used to extract meaning from unstructured or free-text data. Our aims were to develop and validate an accurate automated process for calculation of adenoma detection rate (ADR) and serrated polyp detection rate (SDR) on data stored in widely used electronic health record systems, specifically the Epic electronic health record system, Provation® endoscopy reporting system, and Sunquest PowerPath pathology reporting system. Screening colonoscopies performed between June 2010 and August 2015 were identified using the Provation® reporting tool. An NLP pipeline was developed to identify adenomas and sessile serrated polyps (SSPs) in pathology reports corresponding to these colonoscopy reports. The pipeline was validated using a manual search. Precision, recall, and effectiveness of the NLP pipeline were calculated. ADR and SDR were then calculated. We identified 8032 screening colonoscopies that were linked to 3821 pathology reports (47.6%). The NLP pipeline had an accuracy of 100% for adenomas and 100% for SSPs. Mean total ADR was 29.3% (range 14.7-53.3%); mean male ADR was 35.7% (range 19.7-62.9%); and mean female ADR was 24.9% (range 9.1-51.0%). Mean total SDR was 4.0% (0-9.6%). We developed and validated an NLP pipeline that accurately and automatically calculates ADRs and SDRs using data stored in Epic, Provation® and Sunquest PowerPath. This NLP pipeline can be used to evaluate colonoscopy quality parameters at both individual and practice levels.
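The detection-rate arithmetic is straightforward once each pathology report is classified. The toy keyword matcher below is a deliberately simplified stand-in for the validated NLP pipeline; the pattern lists and example reports are hypothetical.

```python
import re

# Simplified patterns; a real pipeline handles negation, histology
# subtypes, and report-to-procedure linkage.
ADENOMA_PAT = re.compile(r"tubular adenoma|tubulovillous adenoma|villous adenoma", re.I)
SSP_PAT = re.compile(r"sessile serrated (polyp|adenoma|lesion)", re.I)

def detection_rates(reports):
    """reports: one free-text pathology result per screening colonoscopy
    (exams with no specimen count as negatives). Returns (ADR%, SDR%)."""
    n = len(reports)
    adr = sum(bool(ADENOMA_PAT.search(r)) for r in reports) / n
    sdr = sum(bool(SSP_PAT.search(r)) for r in reports) / n
    return 100 * adr, 100 * sdr

reports = [
    "Polyp, ascending colon: tubular adenoma.",
    "Polyp, sigmoid colon: hyperplastic polyp.",
    "",  # screening exam with no specimen
]
print(detection_rates(reports))  # (33.3..., 0.0)
```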
Detection of leaks in buried rural water pipelines using thermal infrared images
Eidenshink, Jeffery C.
1985-01-01
Leakage is a major problem in many pipelines. Minor leaks, called 'seeper leaks', generally range from 2 to 10 m3 per day; they are common and difficult to detect using conventional ground surveys. The objective of this research was to determine whether airborne thermal-infrared remote sensing could be used to detect leaks and monitor rural water pipelines. This study indicates that such leaks can be detected using low-altitude 8.7- to 11.5-micrometer wavelength thermal infrared images collected under proper conditions.
Cathodic protection of a remote river pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, B.A.
1994-03-01
The 261-km-long, 500-mm-diameter Kutubu pipeline, which runs through dense jungle swamps in Papua New Guinea, was built for Chevron Niugini to transport oil from the remote Kutubu oil production facility in the Southern Highlands to an offshore loading facility. The pipeline was laid with a section in the bed of a wide, fast-flowing river. This section was subject to substantial telluric effects and current density variations from changing water resistivities. The cathodic protection system's effectiveness was monitored by coupon 'off' potentials and required an innovative approach.
ACOUSTIC LOCATION OF LEAKS IN PRESSURIZED UNDERGROUND PETROLEUM PIPELINES
Experiments were conducted at the Underground Storage Tank (UST) Test Apparatus Pipeline in which three acoustic sensors separated by a maximum distance of 38.1 m (125 ft) were used to monitor signals produced by 11.4-, 5.7-, and 3.8-L/h (3.0-, 1.5-, and 1.0-gal/h) leaks in th...
Techniques for Minimizing and Monitoring the Impact of Pipeline Construction on Coastal Streams
Thomas W. Mulroy; John R. Storrer; Vincent J. Semonsen; Michael L. Dungan
1989-01-01
This paper describes specific measures recently employed for protection of riparian resources during construction of an oil and gas pipeline that crossed coastal reaches of 23 perennial and intermittent streams between Point Conception and Gaviota in Santa Barbara County, California. Flumes were constructed to maintain stream flow; anchored straw bales and silt fences...
Chen, Shou-Qiang; Xing, Shan-Shan; Gao, Hai-Qing
2014-01-01
Objective: In addition to ambulatory Holter electrocardiographic recording and transtelephonic electrocardiographic monitoring (TTM), a cardiac remote monitoring system can provide an automatic warning function through the general packet radio service (GPRS) network, enabling earlier diagnosis and treatment and improved outcomes of cardiac diseases. The purpose of this study was to estimate its clinical significance in preventing acute cardiac episodes. Methods: Using two leads (V1 and V5) and the automatic warning mode, 7160 patients were tested with a cardiac remote monitoring system from October 2004 to September 2007. If malignant arrhythmias or obvious ST-T changes appeared, the electrocardiogram records were automatically transferred to the monitoring center, the patient and his or her family members were informed, and the corresponding precautionary or therapeutic measures were implemented immediately. Results: In our study, 274 cases of malignant arrhythmia, including sinus standstill and ventricular tachycardia, and 43 cases of obvious ST-segment elevation were detected and treated. Because of early detection, there was no death or deformity. Conclusions: A cardiac remote monitoring system providing an automatic warning function can play an important role in preventing acute cardiac episodes. PMID:25674124
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-08
... the CBOE Stock Exchange, LLC (``CBSX'') to delete references to the automatic quote regeneration and....24(b) (Automatic Quote Regeneration) and Rule 53.24(c) (Quote Risk Monitor Function) from CBOE Stock... the automatic quote regeneration nor the quote risk monitor function has been made available or been...
NASA Astrophysics Data System (ADS)
Bedi, Amna; Kothari, Vaishali; Kumar, Santosh
2018-02-01
Gas and oil pipelines laid on the seafloor are prone to various disturbances, such as seismic movements of the seabed, oceanic currents, and tsunamis. These factors tend to damage pipelines connecting different locations of the world that depend on them for their day-to-day supply of oil and natural gas. If a pipeline is damaged, the oil spilled into the water causes grave losses to marine life along with serious economic consequences. Monitoring undersea pipelines manually is not feasible because of the great seafloor depth. For timely detection of such damage, a new technique using optical fiber Bragg grating (FBG) sensors, together with its installation scheme, is presented in this work. The idea of an FBG sensor for detecting damage in a pipeline structure based on acoustic emission has been worked out. The numerical calculation is based on the fundamentals of strain measurement, and the output has been simulated using MATLAB.
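The strain-to-wavelength mapping such a sensor relies on is the standard FBG relation; the short sketch below (Python here, though the paper's simulation used MATLAB) uses illustrative parameter values to show the size of the shift a strain event produces.

```python
# Standard FBG relations: lambda_B = 2 * n_eff * Lambda, and an axial strain
# eps shifts it by d_lambda = lambda_B * (1 - p_e) * eps. The values below
# (n_eff, grating period, p_e ~ 0.22 for silica fiber) are illustrative.

def bragg_wavelength(n_eff: float, period_nm: float) -> float:
    """Bragg wavelength (nm) from effective index and grating period (nm)."""
    return 2.0 * n_eff * period_nm

def strain_shift(lambda_b_nm: float, strain: float, p_e: float = 0.22) -> float:
    """Wavelength shift (nm) caused by a dimensionless axial strain."""
    return lambda_b_nm * (1.0 - p_e) * strain

lam = bragg_wavelength(n_eff=1.447, period_nm=535.5)        # ~1550 nm
shift_pm = strain_shift(lam, strain=100e-6) * 1e3           # 100 microstrain
print(f"lambda_B = {lam:.1f} nm, shift = {shift_pm:.1f} pm")  # ~121 pm
```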
The FieldTrip-SimBio pipeline for EEG forward solutions.
Vorwerk, Johannes; Oostenveld, Robert; Piastra, Maria Carla; Magyari, Lilla; Wolters, Carsten H
2018-03-27
Accurately solving the electroencephalography (EEG) forward problem is crucial for precise EEG source analysis. Previous studies have shown that the use of multicompartment head models in combination with the finite element method (FEM) can yield high accuracies both numerically and with regard to the geometrical approximation of the human head. However, the workload for the generation of multicompartment head models has often been too high and the use of publicly available FEM implementations too complicated for a wider application of FEM in research studies. In this paper, we present a MATLAB-based pipeline that aims to resolve this lack of easy-to-use integrated software solutions. The presented pipeline allows for the easy application of five-compartment head models with the FEM within the FieldTrip toolbox for EEG source analysis. The FEM from the SimBio toolbox, more specifically the St. Venant approach, was integrated into the FieldTrip toolbox. We give a short sketch of the implementation and its application, and we perform a source localization of somatosensory evoked potentials (SEPs) using this pipeline. We then evaluate the accuracy that can be achieved using the automatically generated five-compartment hexahedral head model [skin, skull, cerebrospinal fluid (CSF), gray matter, white matter] in comparison to a highly accurate tetrahedral head model that was generated on the basis of a semiautomatic segmentation with very careful and time-consuming manual corrections. The source analysis of the SEP data correctly localizes the P20 component and achieves a high goodness of fit. The subsequent comparison to the highly detailed tetrahedral head model shows that the automatically generated five-compartment head model performs about as well as a highly detailed four-compartment head model (skin, skull, CSF, brain). This is a significant improvement in comparison to a three-compartment head model, which is frequently used in practice, since the importance of modeling the CSF compartment has been shown in a variety of studies. The presented pipeline facilitates the use of five-compartment head models with the FEM for EEG source analysis. The accuracy with which the EEG forward problem can thereby be solved is increased compared to the commonly used three-compartment head models, and more reliable EEG source reconstruction results can be obtained.
49 CFR 192.477 - Internal corrosion control: Monitoring.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Internal corrosion control: Monitoring. 192.477... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.477 Internal corrosion control: Monitoring. If corrosive gas is being transported, coupons...
49 CFR 192.477 - Internal corrosion control: Monitoring.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Internal corrosion control: Monitoring. 192.477... TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Requirements for Corrosion Control § 192.477 Internal corrosion control: Monitoring. If corrosive gas is being transported, coupons...
Redefining the Data Pipeline Using GPUs
NASA Astrophysics Data System (ADS)
Warner, C.; Eikenberry, S. S.; Gonzalez, A. H.; Packham, C.
2013-10-01
There are two major challenges facing the next generation of data processing pipelines: 1) handling an ever-increasing volume of data as array sizes continue to increase and 2) the desire to process data in near real-time to maximize observing efficiency by providing rapid feedback on data quality. Combining the power of modern graphics processing units (GPUs), relational database management systems (RDBMSs), and extensible markup language (XML) to re-imagine traditional data pipelines will allow us to meet these challenges. Modern GPUs contain hundreds of processing cores, each of which can process hundreds of threads concurrently. Technologies such as Nvidia's Compute Unified Device Architecture (CUDA) platform and the PyCUDA (http://mathema.tician.de/software/pycuda) module for Python allow us to write parallel algorithms and easily link GPU-optimized code into existing data pipeline frameworks. This approach has produced speed gains of over a factor of 100 compared to CPU implementations for individual algorithms and overall pipeline speed gains of a factor of 10-25 compared to traditionally built data pipelines for both imaging and spectroscopy (Warner et al., 2011). However, there are still many bottlenecks inherent in the design of traditional data pipelines. For instance, file input/output of intermediate steps is now a significant portion of the overall processing time. In addition, most traditional pipelines are not designed to be able to process data on-the-fly in real time. We present a model for a next-generation data pipeline that has the flexibility to process data in near real-time at the observatory as well as to automatically process huge archives of past data by using a simple XML configuration file. XML is ideal for describing both the dataset and the processes that will be applied to the data. Meta-data for the datasets would be stored using an RDBMS (such as MySQL or PostgreSQL) which could be easily and rapidly queried and file I/O would be kept at a minimum. We believe this redefined data pipeline will be able to process data at the telescope, concurrent with continuing observations, thus maximizing precious observing time and optimizing the observational process in general. We also believe that using this design, it is possible to obtain a speed gain of a factor of 30-40 over traditional data pipelines when processing large archives of data.
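To make the approach concrete, here is a minimal PyCUDA sketch of one elementwise calibration step of the kind an imaging pipeline performs; the dark-subtraction recipe and array sizes are illustrative, not code from the pipeline described above.

```python
import numpy as np
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import pycuda.gpuarray as gpuarray
from pycuda.elementwise import ElementwiseKernel

# One calibration step (dark subtraction plus gain) as a GPU kernel.
calibrate = ElementwiseKernel(
    "float *out, float *raw, float *dark, float gain",
    "out[i] = (raw[i] - dark[i]) * gain",
    "calibrate")

raw = gpuarray.to_gpu(np.random.rand(4096, 4096).astype(np.float32))
dark = gpuarray.to_gpu(np.random.rand(4096, 4096).astype(np.float32))
out = gpuarray.empty_like(raw)
calibrate(out, raw, dark, np.float32(1.25))
frame = out.get()  # copy back only when a CPU-side stage needs the result
```

Keeping intermediate frames on the GPU between such steps is exactly what avoids the file I/O bottleneck the authors identify.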
Automatic quality assessment of planetary images
NASA Astrophysics Data System (ADS)
Sidiropoulos, P.; Muller, J.-P.
2015-10-01
A significant fraction of planetary images are corrupted beyond the point that much scientific meaning can be extracted. For example, transmission errors result in missing data which is unrecoverable. The available planetary image datasets include many such "bad data", which both occupy valuable scientific storage resources and create false impressions about planetary image availability for specific planetary objects or target areas. In this work, we demonstrate a pipeline that we have developed to automatically assess the quality of planetary images. Additionally, this method discriminates between different types of image degradation, such as degradation originating from camera flaws versus degradation triggered by atmospheric conditions. Examples of quality assessment results for Viking Orbiter imagery will also be presented.
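One simple check such a pipeline might run, detecting transmission gaps as fully fill-valued image rows, can be sketched as follows; the fill value and threshold are hypothetical, not the authors' criteria.

```python
import numpy as np

def missing_row_fraction(img: np.ndarray, fill_value: int = 0) -> float:
    """Fraction of image rows consisting entirely of the fill value,
    a crude proxy for unrecoverable transmission gaps."""
    return float(np.all(img == fill_value, axis=1).mean())

def quality_flag(img: np.ndarray, max_missing: float = 0.05) -> str:
    """Flag an image as 'bad' if too many rows are missing."""
    return "bad" if missing_row_fraction(img) > max_missing else "ok"
```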
Automatic Building Abstraction from Aerial Photogrammetry
NASA Astrophysics Data System (ADS)
Ley, A.; Hänsch, R.; Hellwich, O.
2017-09-01
Multi-view stereo has been shown to be a viable tool for the creation of realistic 3D city models. Nevertheless, it still poses significant challenges, since it results in dense but noisy and incomplete point clouds when applied to aerial images. 3D city modelling usually requires a different representation of the 3D scene than these point clouds. This paper applies a fully automatic pipeline to generate a simplified mesh from a given dense point cloud. The mesh provides a certain level of abstraction, as it consists only of relatively large planar and textured surfaces. Thus, it is possible to remove noise, outliers, and clutter while maintaining a high level of accuracy.
Tonti, Simone; Di Cataldo, Santa; Bottino, Andrea; Ficarra, Elisa
2015-03-01
The automation of the analysis of Indirect Immunofluorescence (IIF) images is of paramount importance for the diagnosis of autoimmune diseases. This paper proposes a solution to one of the most challenging steps of this process, the segmentation of HEp-2 cells, through an adaptive marker-controlled watershed approach. Our algorithm automatically conforms the marker selection pipeline to the peculiar characteristics of the input image; hence it is able to cope with different fluorescent intensities and staining patterns without any a priori knowledge. Furthermore, it shows a reduced sensitivity to over-segmentation errors and uneven illumination, which are typical issues of IIF imaging.
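A minimal marker-controlled watershed in the spirit of the method can be assembled from standard scikit-image components; note that the adaptive marker selection that is the paper's actual contribution is replaced here by a fixed distance-transform heuristic.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed

def segment_cells(image: np.ndarray) -> np.ndarray:
    """Marker-controlled watershed with fixed (non-adaptive) markers."""
    mask = image > threshold_otsu(image)            # foreground mask
    distance = ndi.distance_transform_edt(mask)     # distance to background
    coords = peak_local_max(distance, min_distance=10, labels=mask)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    # Flood the inverted distance map from the markers, within the mask
    return watershed(-distance, markers, mask=mask)
```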
A distributed pipeline for DIDSON data processing
Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas
2018-01-01
Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.
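As a toy stand-in for the motion-extraction stage, simple frame differencing over a chunk of frames shows why the task parallelizes so cleanly: chunks are independent and map directly onto a map-style distributed job. The function name and threshold below are illustrative.

```python
import numpy as np

def motion_masks(frames: np.ndarray, threshold: float = 20.0) -> np.ndarray:
    """Frame-difference motion extraction over a (T, H, W) grayscale stack;
    returns boolean motion masks for frames 1..T-1."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    return diffs > threshold

# Each chunk of frames can be processed independently, so a Hadoop-style
# framework can run motion extraction as a map task over chunks.
```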
Automatic computational labeling of glomerular textural boundaries
NASA Astrophysics Data System (ADS)
Ginley, Brandon; Tomaszewski, John E.; Sarder, Pinaki
2017-03-01
The glomerulus, a specialized bundle of capillaries, is the blood filtering unit of the kidney. Each human kidney contains about 1 million glomeruli. Structural damage in the glomerular micro-compartments gives rise to several renal conditions, the most severe of which is proteinuria, where excessive blood proteins flow freely into the urine. The sole way to confirm glomerular structural damage in renal pathology is by examining histopathological or immunofluorescence stained needle biopsies under a light microscope. However, this method is extremely tedious and time-consuming, and requires manual scoring of the number and volume of structures. Computational quantification of equivalent features promises to greatly ease this manual burden. The largest obstacle to computational quantification of renal tissue is the ability to recognize complex glomerular textural boundaries automatically. Here we present a computational pipeline to accurately identify glomerular boundaries with high precision and accuracy. The computational pipeline employs an integrated approach composed of Gabor filtering, Gaussian blurring, statistical F-testing, and distance transform, and performs significantly better than the standard Gabor-based textural segmentation method. Our integrated approach provides a mean accuracy/precision of 0.89/0.97 on n = 200 Hematoxylin and Eosin (HE) glomerulus images, and a mean accuracy/precision of 0.88/0.94 on n = 200 Periodic Acid-Schiff (PAS) glomerulus images. The respective accuracy/precision of the Gabor filter bank based method is 0.83/0.84 for HE and 0.78/0.80 for PAS. Our method will simplify computational partitioning of glomerular micro-compartments hidden within dense textural boundaries. Automatic quantification of glomeruli will streamline structural analysis in the clinic, and can help realize real-time diagnoses and interventions.
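The Gabor filtering stage can be sketched with scikit-image as below; the filter-bank parameters are illustrative, and the statistical F-testing and distance-transform stages of the integrated approach are omitted.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.filters import gabor

def gabor_energy_features(image, frequencies=(0.1, 0.2, 0.4), n_angles=4):
    """Stack smoothed Gabor magnitude responses as per-pixel texture
    features (an (H, W, n_filters) array)."""
    features = []
    for freq in frequencies:
        for theta in np.linspace(0, np.pi, n_angles, endpoint=False):
            real, imag = gabor(image, frequency=freq, theta=theta)
            features.append(gaussian_filter(np.hypot(real, imag), sigma=3))
    return np.stack(features, axis=-1)
```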
Maser: one-stop platform for NGS big data from analysis to visualization
Kinjo, Sonoko; Monma, Norikazu; Misu, Sadahiko; Kitamura, Norikazu; Imoto, Junichi; Yoshitake, Kazutoshi; Gojobori, Takashi; Ikeo, Kazuho
2018-01-01
A major challenge in analyzing data from high-throughput next-generation sequencing (NGS) is how to handle the huge amounts of data and the variety of NGS tools and to visualize the resulting outputs. To address these issues, we developed a cloud-based data analysis platform, Maser (Management and Analysis System for Enormous Reads), and an original genome browser, Genome Explorer (GE). Maser enables users to manage up to 2 terabytes of data, to conduct analyses with easy graphical user interface operations, and to use analysis pipelines in which several individual tools are combined into a single pipeline for very common and standard analyses. GE automatically visualizes genome assembly and mapping results output from Maser pipelines, without requiring additional data upload. With this function, the Maser pipelines can graphically display the results output from all the embedded tools and the mapping results in a web browser. Maser therefore provides a more user-friendly analysis platform, especially for beginners, by improving the graphical display and offering selected standard pipelines that work with the built-in genome browser. In addition, all analyses executed on Maser are recorded in the analysis history, helping users to trace and repeat analyses. The entire process of analysis and its history can be shared with collaborators or opened to the public. In conclusion, our system is useful for managing, analyzing, and visualizing NGS data and achieves traceability, reproducibility, and transparency of NGS analysis. Database URL: http://cell-innovation.nig.ac.jp/maser/ PMID:29688385
Arctic Undersea Inspection of Pipelines and Structures.
1983-06-01
...approaches. Inspection Requirements: Underwater inspection requirements for Arctic structures and pipelines can be met by present techniques, with two...look for surface evidence of leakage. An ice cover negates this approach. The second exception is the requirement to regularly monitor the cathodic...training, payload, transiting capability, depth capability, and an unknown degree of judgement degradation brought about by the psychological aspects of
LOCATION OF LEAKS IN PRESSURIZED PETROLEUM PIPELINES BY MEANS OF PASSIVE-ACOUSTIC METHODS
Experiments were conducted on the underground pipeline at the EPA's UST Test Apparatus in which three acoustic sensors separated by a maximum distance of 38 m (125 ft) were used to monitor signals produced by 11.4-, 5.7-, and 3.8-L/h (3.0-, 1.5-, and 1.0-gal/h) leaks in the wall of...
Underwater Adhesives Retrofit Pipelines with Advanced Sensors
NASA Technical Reports Server (NTRS)
2015-01-01
Houston-based Astro Technology Inc. used a partnership with Johnson Space Center to pioneer an advanced fiber-optic monitoring system for offshore oil pipelines. The company's underwater adhesives allow it to retrofit older deepwater systems in order to measure pressure, temperature, strain, and flow properties, giving energy companies crucial data in real time and significantly decreasing the risk of a catastrophe.
The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track
Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane
2016-01-01
Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboard, that incorporates text mining techniques to support the biocurator in the generation of BEL networks. The underlying UIMA-based text mining pipeline (BELIEF Pipeline) uses several named entity recognition processes and relationship extraction methods to detect concepts and BEL relationships in literature. The BELIEF Dashboard allows easy curation of the automatically generated BEL statements and their context annotations. Resulting BEL statements and their context annotations can be syntactically and semantically verified to ensure consistency in the BEL network. In summary, the workflow supports experts in different stages of systems biology network building. Based on the BioCreative V BEL track evaluation, we show that the BELIEF Pipeline automatically extracts relationships with an F-score of 36.4% and fully correct statements can be obtained with an F-score of 30.8%. Participation in the BioCreative V Interactive task (IAT) track with BELIEF revealed a systems usability scale (SUS) of 67. Considering the complexity of the task for new users—learning BEL, working with a completely new interface, and performing complex curation—a score so close to the overall SUS average highlights the usability of BELIEF. Database URL: BELIEF is available at http://www.scaiview.com/belief/ PMID:27694210
Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E
2014-01-01
Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
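The core of such a processing engine, running modules in dependency order and skipping work already done, can be sketched in a few lines; the module format here is hypothetical and this is not the aa API.

```python
from graphlib import TopologicalSorter

def run_pipeline(modules: dict, done: set = None) -> set:
    """modules: {name: (callable, [dependency names])}.
    Runs each module after its dependencies, skipping completed ones."""
    done = set() if done is None else done
    graph = {name: set(deps) for name, (_, deps) in modules.items()}
    for name in TopologicalSorter(graph).static_order():
        if name in done:
            continue  # output already exists; no need to recompute
        func, _ = modules[name]
        func()
        done.add(name)
    return done
```

Independent branches of such a dependency graph are what a parallel engine can farm out to cluster or cloud nodes.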
Iwasaki, Wataru; Fukunaga, Tsukasa; Isagozawa, Ryota; Yamada, Koichiro; Maeda, Yasunobu; Satoh, Takashi P.; Sado, Tetsuya; Mabuchi, Kohji; Takeshima, Hirohiko; Miya, Masaki; Nishida, Mutsumi
2013-01-01
Mitofish is a database of fish mitochondrial genomes (mitogenomes) that includes powerful and precise de novo annotations for mitogenome sequences. Fish occupy an important position in the evolution of vertebrates and the ecology of the hydrosphere, and mitogenomic sequence data have served as a rich source of information for resolving fish phylogenies and identifying new fish species. The importance of a mitogenomic database continues to grow at a rapid pace as massive amounts of mitogenomic data are generated with the advent of new sequencing technologies. A severe bottleneck seems likely to occur with regard to mitogenome annotation because of the overwhelming pace of data accumulation and the intrinsic difficulties in annotating sequences with degenerating transfer RNA structures, divergent start/stop codons of the coding elements, and the overlapping of adjacent elements. To ease this data backlog, we developed an annotation pipeline named MitoAnnotator. MitoAnnotator automatically annotates a fish mitogenome with a high degree of accuracy in approximately 5 min; thus, it is readily applicable to data sets of dozens of sequences. MitoFish also contains re-annotations of previously sequenced fish mitogenomes, enabling researchers to refer to them when they find annotations that are likely to be erroneous or while conducting comparative mitogenomic analyses. For users who need more information on the taxonomy, habitats, phenotypes, or life cycles of fish, MitoFish provides links to related databases. MitoFish and MitoAnnotator are freely available at http://mitofish.aori.u-tokyo.ac.jp/ (last accessed August 28, 2013); all of the data can be batch downloaded, and the annotation pipeline can be used via a web interface. PMID:23955518
Visual analysis of trash bin processing on garbage trucks in low resolution video
NASA Astrophysics Data System (ADS)
Sidla, Oliver; Loibner, Gernot
2015-03-01
We present a system for trash can detection and counting from a camera mounted on a garbage collection truck. A working prototype has been successfully implemented and tested with several hours of real-world video. The detection pipeline consists of HOG detectors for two trash can sizes, plus meanshift tracking and low-level image processing for the analysis of the garbage disposal process. Considering the harsh environment and unfavorable imaging conditions, the process already works well enough that very useful measurements can be extracted from the video data. The false-positive/false-negative rate of the full processing pipeline is about 5-6% in fully automatic operation. Video data of a full day (about 8 hrs) can be processed in about 30 minutes on a standard PC.
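A sketch of the HOG-plus-linear-SVM pattern such detectors follow is below, with hypothetical window sizes and training data; the two-scale detector bank and the meanshift tracking stage are omitted.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

def hog_features(crop: np.ndarray) -> np.ndarray:
    """HOG descriptor of a grayscale crop, resized to a fixed window."""
    return hog(resize(crop, (128, 64)), orientations=9,
               pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def train_detector(crops, labels) -> LinearSVC:
    """crops: positive/negative image crops; labels: 1 = trash can."""
    X = np.stack([hog_features(c) for c in crops])
    return LinearSVC(C=0.01).fit(X, labels)

def score_window(detector: LinearSVC, window: np.ndarray) -> float:
    """Detection score for one sliding-window position."""
    return float(detector.decision_function(hog_features(window)[None, :])[0])
```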
MARZ: Manual and automatic redshifting software
NASA Astrophysics Data System (ADS)
Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.
2016-04-01
The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application, MARZ, with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based JavaScript web application that provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph, producing high-quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines that conform to the current FITS file standard. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling automatic results, manual template comparison, or marking spectral features.
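At its core, cross-correlation redshifting exploits the fact that redshift is a rigid shift in log-wavelength. The stripped-down sketch below conveys the idea; it is not the modified AUTOZ algorithm itself, and it assumes the spectrum and rest-frame template are already resampled onto the same uniform log10-wavelength grid.

```python
import numpy as np

def redshift_by_xcorr(flux: np.ndarray, template: np.ndarray,
                      dloglam: float) -> float:
    """Best-fit redshift from the peak cross-correlation lag: a shift of
    k pixels on a uniform log10-lambda grid means 1 + z = 10**(k*dloglam)."""
    f = flux - flux.mean()
    t = template - template.mean()
    xcorr = np.correlate(f, t, mode="full")
    lag = int(xcorr.argmax()) - (len(t) - 1)   # pixels of shift
    return 10.0 ** (lag * dloglam) - 1.0
```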
Olaguer, Eduardo P; Erickson, Matthew H; Wijesinghe, Asanga; Neish, Bradley S
2016-02-01
A mobile laboratory equipped with a proton transfer reaction mass spectrometer (PTR-MS) was operated in Galena Park, Texas, near the Houston Ship Channel during the Benzene and other Toxics Exposure Study (BEE-TEX). The mobile laboratory measured transient peaks of benzene of up to 37 ppbv in the afternoon and evening of February 19, 2015. Plume reconstruction and source attribution were performed using the four-dimensional (4D) variational data assimilation technique and a three-dimensional (3D) micro-scale forward and adjoint air quality model based on mobile PTR-MS data and nearby stationary wind measurements at the Galena Park Continuous Air Monitoring Station (CAMS). The results of inverse modeling indicate that significant pipeline emissions of benzene may at least partly explain the ambient concentration peaks observed in Galena Park during BEE-TEX. Total pipeline emissions of benzene inferred within the 16-km² model domain exceeded point source emissions by roughly a factor of 2 during the observational episode. Besides pipeline leaks, the model also inferred significant benzene emissions from marine, railcar, and tank truck loading/unloading facilities, consistent with the presence of a tanker and barges in the Kinder Morgan port terminal during the afternoon and evening of February 19. Total domain emissions of benzene exceeded corresponding 2011 National Emissions Inventory (NEI) estimates by a factor of 2-6. Port operations involving petrochemicals may significantly increase emissions of air toxics from the transfer and storage of materials. Pipeline leaks, in particular, can lead to sporadic emissions greater than in emission inventories, resulting in higher ambient concentrations than are sampled by the existing monitoring network. The use of updated methods for ambient monitoring and source attribution in real time should be encouraged as an alternative to expanding the conventional monitoring network.
preAssemble: a tool for automatic sequencer trace data processing.
Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V
2006-01-17
Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands. Sequencing quality assessment against various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small- to large-scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and LINUX operating systems. It is available for download and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. It is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently, preAssemble can be used as efficiently for just several trace files as for large-scale sequence processing.
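A typical pre-assembly step downstream of Phred base calling is end trimming on windowed quality; the heuristic below illustrates the idea and is not preAssemble's exact rule.

```python
import numpy as np

def trim_by_quality(quals, window: int = 10, min_mean_q: float = 20.0):
    """Return (start, end) slice indices spanning the first through last
    length-`window` windows whose mean Phred quality clears the threshold."""
    q = np.asarray(quals, dtype=float)
    if q.size < window:
        return 0, 0
    means = np.convolve(q, np.ones(window) / window, mode="valid")
    good = np.flatnonzero(means >= min_mean_q)
    if good.size == 0:
        return 0, 0
    return int(good[0]), int(good[-1]) + window

# Usage: start, end = trim_by_quality(read_quals); read_seq[start:end]
```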
NASA Astrophysics Data System (ADS)
Golik, V. V.; Zemenkova, M. Yu; Shipovalov, A. N.; Akulov, K. A.
2018-05-01
The paper presents calculations and an example of the energy-efficiency justification of operating regimes for the equipment used. The engineering design of a gas pipeline is considered with respect to monitoring the energy efficiency of a gas compressor unit (GCU). The results of evaluating the GCU characteristics and its components are described, along with the evaluated energy-efficiency indicators of the gas pipeline. As an example of the outcome of the analysis, the use of gas compressor unit GCU-32 "Ladoga" is proposed because of its efficiency and cost-effectiveness in comparison with analogues.
NASA Astrophysics Data System (ADS)
Yang, J.; Lee, H.; Sohn, H.
2012-05-01
This study presents an embedded laser ultrasonic system for pipeline monitoring in high-temperature environments. Laser ultrasonics has recently become popular because of advantageous characteristics such as (a) noncontact inspection, (b) immunity against electromagnetic interference (EMI), and (c) applicability at high temperature. However, the performance of conventional laser ultrasonic techniques for pipeline monitoring has been limited, because many pipelines are covered by insulating materials and the target surfaces are inaccessible. To overcome this problem, this study designs embeddable optical fibers and fixing devices that deliver laser beams from laser sources to a target pipe through the embedded fibers. For guided wave generation, an optical fiber is furnished with a beam collimator for irradiating a laser beam onto the target structure. The corresponding response is measured based on the principle of laser interferometry: light from a monochromatic source is collimated and delivered to the target surface by another optical fiber with a focusing module, and the reflected light is transmitted back to the interferometer through the same fiber. The feasibility of the proposed system for embedded ultrasonic measurement has been experimentally verified using a pipe specimen at high temperature.
The optimization of design parameters for surge relief valve for pipeline systems
NASA Astrophysics Data System (ADS)
Kim, Hyunjun; Hur, Jisung; Kim, Sanghyun
2017-06-01
Surge is an abnormal pressure induced by rapid changes of flow rate in pipeline systems. In order to protect pipeline systems from surge pressure, various hydraulic devices have been developed. The surge relief valve (SRV) is one of the most widely applied devices for controlling surge, owing to its feasibility in application, efficiency, and cost-effectiveness. An SRV is designed to open automatically under abnormal pressure and discharge flow, making the pressure of the system drop to an allowable level. The performance of the SRV is influenced by hydraulics. According to previous studies, several factors determine the performance of the SRV: design parameters (e.g. size of the valve), system parameters (e.g. number and location of the valves), and operation parameters (e.g. set point and operation time). Therefore, systematic consideration of the factors affecting SRV performance is required for proper installation of an SRV in the system. In this study, a methodology for finding the optimum parameters of the SRV is explored through the integration of a genetic algorithm (GA) into surge analysis.
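In outline, the GA couples candidate valve parameters to a transient (water hammer) simulation that scores surge severity. The toy sketch below substitutes a made-up quadratic objective for that simulation; the parameter ranges are likewise illustrative.

```python
import random

def surge_severity(set_point_bar: float, t_open_s: float) -> float:
    """Stand-in objective; in practice this would run a surge simulation."""
    return (set_point_bar - 6.0) ** 2 + 4.0 * (t_open_s - 0.8) ** 2

def evolve(pop_size: int = 30, generations: int = 50, sigma: float = 0.1):
    """Tiny GA: truncation selection, blend crossover, Gaussian mutation."""
    pop = [(random.uniform(2, 12), random.uniform(0.1, 3.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: surge_severity(*ind))
        parents = pop[:pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            (a1, a2), (b1, b2) = random.sample(parents, 2)
            children.append(((a1 + b1) / 2 + random.gauss(0, sigma),
                             (a2 + b2) / 2 + random.gauss(0, sigma)))
        pop = parents + children
    return min(pop, key=lambda ind: surge_severity(*ind))
```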
A Performance Evaluation of a Lean Reparable Pipeline in Various Demand Environments
2004-03-23
of defects (Dennis, 2002:90). Shingo espoused that the true goal should be zero defects and, to this end, invented the poka-yoke, or a simple, inexpensive...92). Despite the inability to eliminate human errors, poka-yoke devices can still enable the elimination of production defects (Dennis, 2002:91... Poka-yoke devices are essentially foolproofing mechanisms which incorporate automatic inspection into the production process. Despite the fact
Unlocking Short Read Sequencing for Metagenomics
Rodrigue, Sébastien; Materna, Arne C.; Timberlake, Sonia C.; ...
2010-07-28
We describe an experimental and computational pipeline yielding millions of reads that can exceed 200 bp with quality scores approaching that of traditional Sanger sequencing. The method combines an automatable gel-less library construction step with paired-end sequencing on a short-read instrument. With appropriately sized library inserts, mate-pair sequences can overlap, and we describe the SHERA software package that joins them to form a longer composite read.
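The joining step can be illustrated with a toy exact-overlap merger; real joiners such as SHERA score candidate overlaps using base qualities rather than requiring exact matches.

```python
def merge_pair(r1: str, r2_rc: str, min_overlap: int = 10):
    """Join an overlapping read pair (mate already reverse-complemented)
    into one composite read, preferring the longest exact overlap."""
    for k in range(min(len(r1), len(r2_rc)), min_overlap - 1, -1):
        if r1[-k:] == r2_rc[:k]:
            return r1 + r2_rc[k:]
    return None  # no acceptable overlap; keep the reads unmerged

print(merge_pair("ACGTACGTACGT", "ACGTACGTTTTT", min_overlap=8))
# -> ACGTACGTACGTTTTT
```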
NASA Technical Reports Server (NTRS)
1976-01-01
McDonnell Douglas Corporation is using a heat-pipe device, developed through the space program, to transport oil from Alaska's rich North Slope fields. It is being used to keep the ground frozen along the 798-mile pipeline, saving hundreds of millions of dollars and protecting the tundra environment. Heat pipes are totally automatic: they sense and respond to climatic conditions with no moving parts, require no external power, and never need adjustment or servicing.
Analysis and Comparison of Some Automatic Vehicle Monitoring Systems
DOT National Transportation Integrated Search
1973-07-01
In 1970 UMTA solicited proposals and selected four companies to develop systems to demonstrate the feasibility of different automatic vehicle monitoring techniques. The demonstrations culminated in experiments in Philadelphia to assess the performanc...
Travis, Fred; Shear, Jonathan
2010-12-01
This paper proposes a third meditation category, automatic self-transcending, to extend the dichotomy of focused attention and open monitoring proposed by Lutz. Automatic self-transcending includes techniques designed to transcend their own activity. This contrasts with focused attention, which keeps attention focused on an object, and open monitoring, which keeps attention involved in the monitoring process. Each category was assigned EEG bands, based on reported brain patterns during mental tasks, and meditations were categorized based on their reported EEG. Focused attention, characterized by beta/gamma activity, included meditations from Tibetan Buddhist, Buddhist, and Chinese traditions. Open monitoring, characterized by theta activity, included meditations from Buddhist, Chinese, and Vedic traditions. Automatic self-transcending, characterized by alpha1 activity, included meditations from Vedic and Chinese traditions. Between categories, the included meditations differed in focus, subject/object relation, and procedures. These findings shed light on the common mistake of averaging meditations together to determine mechanisms or clinical effects.
Strategies for automatic processing of large aftershock sequences
NASA Astrophysics Data System (ADS)
Kvaerna, T.; Gibbons, S. J.
2017-12-01
Aftershock sequences following major earthquakes present great challenges to seismic bulletin generation. The analyst resources needed to locate events increase with increased event numbers as the quality of underlying, fully automatic, event lists deteriorates. While current pipelines, designed a generation ago, are usually limited to single passes over the raw data, modern systems also allow multiple passes. Processing the raw data from each station currently generates parametric data streams that are later subject to phase-association algorithms which form event hypotheses. We consider a major earthquake scenario and propose to define a region of likely aftershock activity in which we will detect and accurately locate events using a separate, specially targeted, semi-automatic process. This effort may use either pattern detectors or more general algorithms that cover wider source regions without requiring waveform similarity. An iterative procedure to generate automatic bulletins would incorporate all the aftershock event hypotheses generated by the auxiliary process, and filter all phases from these events from the original detection lists prior to a new iteration of the global phase-association algorithm.
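One way to realize the filtering step is to drop, before re-running association, every detection already claimed by an aftershock event hypothesis; the record format below is hypothetical, and matching by station and coarse time bucket is a crude stand-in for proper phase matching.

```python
def filter_explained_phases(detections, aftershock_events, tol_s=2.0):
    """Remove detections already explained by the targeted aftershock
    process. detections: [{'station': str, 'time': float}, ...];
    aftershock_events: [{'arrivals': [same format]}, ...]."""
    explained = {(a["station"], round(a["time"] / tol_s))
                 for ev in aftershock_events for a in ev["arrivals"]}
    return [d for d in detections
            if (d["station"], round(d["time"] / tol_s)) not in explained]
```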
Extending the Fermi-LAT data processing pipeline to the grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmer, S.; Arrabito, L.; Glanzman, T.
2015-05-12
The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite, and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. Additionally, it receives heavy use in performing production Monte Carlo tasks.
Sahin, Sükran; Kurum, Ekrem
2009-09-01
Ecological monitoring is a complementary component of the overall environmental management and monitoring program of any Environmental Impact Assessment (EIA) report. The monitoring method should be developed for each project phase and allow for periodic reporting and assessment of compliance with the environmental conditions and requirements of the EIA. Also, this method should incorporate a variance request program since site-specific conditions can affect construction on a daily basis and require time-critical application of alternative construction scenarios or environmental management methods integrated with alternative mitigation measures. Finally, taking full advantage of the latest information and communication technologies can enhance the quality of, and public involvement in, the environmental management program. In this paper, a landscape-scale ecological monitoring method for major construction projects is described using, as a basis, 20 months of experience on the Baku-Tbilisi-Ceyhan (BTC) Crude Oil Pipeline Project, covering Turkish Sections Lot B and Lot C. This analysis presents suggestions for improving ecological monitoring for major construction activities.
Automatic detection of surface changes on Mars - a status report
NASA Astrophysics Data System (ADS)
Sidiropoulos, Panagiotis; Muller, Jan-Peter
2016-10-01
Orbiter missions have acquired approximately 500,000 high-resolution visible images of the Martian surface, covering an area approximately 6 times larger than the overall area of Mars. This data abundance allows the scientific community to examine the Martian surface thoroughly and potentially make exciting new discoveries. However, the increased data volume, as well as its complexity, generates problems at the data processing stages, mainly related to a number of unresolved issues that batch-mode planetary data processing presents. As a matter of fact, the scientific community is currently struggling to scale the common paradigm ("one-at-a-time" processing of incoming products by expert scientists) to tackle the large volumes of input data. Moreover, expert scientists are more or less forced to use complex software in order to extract input information for their research from raw data, even though they are not data scientists themselves. Our work within the STFC and EU FP7 i-Mars projects aims at developing automated software that will process all of the acquired data, leaving domain expert planetary scientists to focus on their final analysis and interpretation. Moreover, after completing the development of a fully automated pipeline that co-registers high-resolution NASA images to the ESA/DLR HRSC baseline, our main goal has shifted to the automated detection of surface changes on Mars. In particular, we are developing a pipeline that takes as input multi-instrument image pairs, which are processed automatically in order to identify changes that are correlated with dynamic phenomena on the Martian surface. The pipeline has currently been tested in anger on 8,000 co-registered images, and by the time of DPS/EPSC we expect to have processed many tens of thousands of image pairs, producing a set of change detection results, a subset of which will be shown in the presentation. The research leading to these results has received funding from the STFC "MSSL Consolidated Grant" under "Planetary Surface Data Mining" ST/K000977/1 and partial support from the European Union's Seventh Framework Programme (FP7/2007-2013) under iMars grant agreement number 607379.
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1992-12-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
GIS least-cost analysis approach for siting gas pipeline ROWs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1994-09-01
Geographic-information-system applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation corridors, endangered species habitats, wetlands, and public line surveys. A geographic information system was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas-pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.
1993-10-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for land use/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P.J.; Wilkey, P.L.
1992-01-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWS) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features, such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWS; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWS; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
Resources monitoring and automatic management system for multi-VO distributed computing system
NASA Astrophysics Data System (ADS)
Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.
2017-10-01
Multi-VO support based on DIRAC has been set up to provide workload and data management for several high energy experiments at IHEP. To monitor and manage in a uniform way the heterogeneous resources that belong to different Virtual Organizations, a resource monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. Information collection includes active and passive ways of gathering status from different sources, storing it in databases. Status decision and automatic control are used to evaluate resource status and take control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal, from which both real-time and historical information can be obtained. All implementations are based on the DIRAC framework. The information and control, including sites, policies, and web portals for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
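The policy-driven status decision can be pictured as a small rule loop like the one below; the policy rules and resource record are illustrative and do not reflect the DIRAC RSS API.

```python
# Each policy inspects a resource record and either returns a verdict or
# abstains (None). The first verdict wins; otherwise the resource stays
# active. Field names are illustrative.
POLICIES = [
    lambda r: "banned" if r["job_failure_rate"] > 0.5 else None,
    lambda r: "degraded" if r["free_disk_gb"] < 100 else None,
]

def decide_status(resource: dict) -> str:
    for policy in POLICIES:
        verdict = policy(resource)
        if verdict is not None:
            return verdict
    return "active"

print(decide_status({"job_failure_rate": 0.7, "free_disk_gb": 500}))  # banned
```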
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained by an in-line inspection tool called a "smart pig" in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and its performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
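The training and validation stages map naturally onto scikit-learn, as in the hedged sketch below; the feature matrix is assumed to come from the denoised, attribute-selected signal, and for brevity the ROC score is computed on the training data rather than on held-out folds.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def evaluate_weld_classifier(X: np.ndarray, y: np.ndarray):
    """X: per-sample features from the pre-processed in-line inspection
    signal; y: 1 where the record is a weld, 0 otherwise."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    cv_accuracy = cross_val_score(model, X, y, cv=5).mean()   # cross validation
    scores = model.fit(X, y).predict_proba(X)[:, 1]
    return cv_accuracy, roc_auc_score(y, scores)              # ROC analysis
```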
Documentation and Detection of Colour Changes of Bas Reliefs Using Close Range Photogrammetry
NASA Astrophysics Data System (ADS)
Malinverni, E. S.; Pierdicca, R.; Sturari, M.; Colosi, F.; Orazi, R.
2017-05-01
The digitization of complex buildings, findings, or bas-reliefs can strongly facilitate the work of archaeologists, mainly for in-depth analysis tasks. However, even though new visualization techniques ease the study phase, a classical naked-eye approach to determining changes or surface alterations has several drawbacks. The research work described in these pages aims at providing experts with a workflow for the evaluation of alterations (e.g. colour decay or surface alterations), allowing a more rapid and objective monitoring of monuments. More specifically, a processing pipeline has been tested to evaluate the colour variation between surfaces acquired at different epochs. The introduction of reliable change detection tools in the archaeological domain is needed; in fact, the most widespread practice among archaeologists and practitioners is to perform a traditional monitoring of surfaces in three main steps: production of a hand-made map based on a subjective analysis, selection of a sub-set of regions of interest, and removal of small portions of surface for in-depth analysis conducted in the laboratory. To overcome this risky and time-consuming process, a digital, automatic change detection procedure represents a turning point. To this end, automatic classification has been carried out according to two approaches: a pixel-based and an object-based method. Pixel-based classification identifies the classes by means of the spectral information provided by each pixel of the original bands. The object-based approach operates on sets of pixels (objects/regions) grouped together by means of an image segmentation technique. The methodology was tested by studying the bas-reliefs of a temple located in Peru, named Huaca de la Luna. Although the data sources were collected in unplanned surveys, the workflow proved to be a valuable solution for understanding the main changes over time.
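In its simplest form, the pixel-based branch reduces to thresholding a per-pixel colour distance between two co-registered acquisitions; the sketch below uses a plain Euclidean RGB distance, whereas a production workflow would more likely classify in a perceptual colour space.

```python
import numpy as np

def pixel_change_map(img_t0: np.ndarray, img_t1: np.ndarray,
                     threshold: float = 30.0) -> np.ndarray:
    """Boolean change mask from the Euclidean colour distance between two
    co-registered (H, W, 3) RGB images taken at different epochs."""
    d0 = img_t0.astype(np.float32)
    d1 = img_t1.astype(np.float32)
    dist = np.linalg.norm(d0 - d1, axis=-1)
    return dist > threshold
```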
Testing & Evaluation of Close-Range SAR for Monitoring & Automatically Detecting Pavement Conditions
DOT National Transportation Integrated Search
2012-01-01
This report summarizes activities in support of the DOT contract on Testing & Evaluating Close-Range SAR for Monitoring & Automatically Detecting Pavement Conditions & Improve Visual Inspection Procedures. The work of this project was performed by Dr...
Numerical and Experimental Case Study of Blasting Works Effect
NASA Astrophysics Data System (ADS)
Papán, Daniel; Valašková, Veronika; Drusa, Marian
2016-10-01
This article introduces a theoretical and experimental case study of dynamic monitoring of the geological environment above a highway tunnel under construction. The monitored structure is a critical water supply pipeline that crosses the tunnel, made from steel tubes with a diameter of 800 mm. The basic dynamic parameters were monitored during blasting works, compared with FEM (Finite Element Method) calculations, and checked against the Slovak standard limits. A FEM model calibrated on the experimental measurement data was created and used to obtain more realistic results in further predictions and extrapolations in time and space. This case study was requested by the general contractor and by the owner of the water pipeline as part of the public safety evaluation of risks during tunnel construction.
NASA Technical Reports Server (NTRS)
Tang, Henry H.; Le, Suy Q.; Orndoff, Evelyne S.; Smith, Frederick D.; Tapia, Alma S.; Brower, David V.
2012-01-01
Integrity and performance monitoring of subsea pipelines and structures provides critical information for managing offshore oil and gas production operations and preventing environmentally damaging and costly catastrophic failures. Currently, pipeline monitoring devices require assembly and installation on the ground prior to the underwater deployment of the pipeline. A monitoring device that could be installed in situ on operating underwater structures could enhance the productivity and improve the safety of current offshore operations. Through a Space Act Agreement (SAA) between the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) and Astro Technology, Inc. (ATI), JSC provides technical expertise and testing facilities to support the development of fiber optic sensor technologies by ATI. This paper details the first collaboration between NASA JSC and ATI in evaluating underwater-applicable adhesives and friction coatings for attaching a fiber optic sensor system to subsea pipelines. A market survey was conducted to examine different commercial-off-the-shelf (COTS) underwater adhesive systems and to select adhesive candidates for testing and evaluation. Four COTS epoxy-based underwater adhesives were selected and evaluated. The adhesives were applied and cured in simulated seawater conditions and then evaluated for application characteristics and adhesive strength. The adhesive that demonstrated the best underwater application characteristics and highest adhesive strength was identified for further evaluation in developing an attachment system that could be deployed in the harsh subsea environment. Various friction coatings were also tested in this study to measure their shear strengths for a mechanical clamping design concept for attaching the fiber optic sensor system. A COTS carbide alloy coating was found to increase the shear strength of a metal-to-metal clamping interface by up to 46 percent. This study provides valuable data for assessing the feasibility of developing the next-generation fiber optic sensor system that could be retrofitted onto existing subsea pipeline structures.
APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data
NASA Astrophysics Data System (ADS)
Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García, T.
2018-04-01
APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best values for the parameters of the photometry, such as the mask size for the aperture, the size of the window for extraction of a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
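A minimal sketch of the aperture photometry step that APPHi automates, assuming the photutils package and a synthetic in-memory image; the aperture radius plays the role of the "mask size" parameter the pipeline tunes automatically:

```python
# Aperture photometry on a synthetic star field; photutils is assumed.
import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

rng = np.random.default_rng(1)
image = rng.normal(100.0, 5.0, size=(64, 64))      # sky background + noise
yy, xx = np.mgrid[:64, :64]
image += 500.0 * np.exp(-((xx - 30) ** 2 + (yy - 40) ** 2) / 4.0)  # one star

positions = [(30.0, 40.0)]                         # (x, y) of the star
aperture = CircularAperture(positions, r=4.0)      # the "mask size" parameter
table = aperture_photometry(image - np.median(image), aperture)
print(table["aperture_sum"][0])                    # background-subtracted flux
```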
Robust Mosaicking of Stereo Digital Elevation Models from the Ames Stereo Pipeline
NASA Technical Reports Server (NTRS)
Kim, Tae Min; Moratto, Zachary M.; Nefian, Ara Victor
2010-01-01
A robust estimation method is proposed to combine multiple observations and create consistent, accurate, dense Digital Elevation Models (DEMs) from lunar orbital imagery. The NASA Ames Intelligent Robotics Group (IRG) aims to produce higher-quality terrain reconstructions of the Moon from Apollo Metric Camera (AMC) data than is currently possible. In particular, IRG makes use of a stereo vision process, the Ames Stereo Pipeline (ASP), to automatically generate DEMs from consecutive AMC image pairs. However, the DEMs currently produced by the ASP often contain errors and inconsistencies due to image noise, shadows, etc. The proposed method addresses this problem by making use of multiple observations and by considering their goodness of fit to improve both the accuracy and robustness of the estimate. A stepwise regression method is applied to estimate the relaxed weight of each observation.
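The idea of weighting each overlapping observation by its goodness of fit can be sketched as an iteratively re-weighted mean; this illustrates the principle only, not the ASP implementation (which, per the abstract, uses stepwise regression):

```python
# Robust combination of overlapping DEM posts; a sketch of the idea, not
# the Ames Stereo Pipeline code. Posts far from the current estimate get
# small weights, suppressing outliers caused by image noise or shadows.
import numpy as np

def robust_mosaic(dems, n_iter=3):
    """Iteratively re-weighted mean of stacked DEMs (shape: n_obs x H x W)."""
    est = np.median(dems, axis=0)                 # robust initial estimate
    for _ in range(n_iter):
        resid = dems - est
        scale = np.median(np.abs(resid)) + 1e-6
        w = 1.0 / (1.0 + (resid / (3.0 * scale)) ** 2)  # Cauchy-like weights
        est = (w * dems).sum(axis=0) / w.sum(axis=0)
    return est

stack = np.random.default_rng(2).normal(1735.0, 0.5, size=(4, 32, 32))
stack[0, 10, 10] = 2500.0                         # one gross outlier
print(robust_mosaic(stack)[10, 10])               # stays near 1735
```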
Diagnostic layer integration in FPGA-based pipeline measurement systems for HEP experiments
NASA Astrophysics Data System (ADS)
Pozniak, Krzysztof T.
2007-08-01
Integrated triggering and data acquisition systems for high energy physics experiments may be considered as fast, multichannel, synchronous, distributed, pipeline measurement systems. A considerable extension of the functional, technological and monitoring demands recently imposed on them has forced the common use of large field-programmable gate array (FPGA) matrices enhanced with digital signal processing, and of fast optical transmission, in their realization. This paper discusses the modelling, design, realization and testing of pipeline measurement systems. The distribution of synchronous data stream flows in the network is considered, and a general functional structure of a single network node is presented. The suggested, novel block structure of the node model facilitates full implementation in an FPGA chip, circuit standardization and parametrization, as well as integration of the functional and diagnostic layers. A general method for pipeline system design is derived, based on a unified model of the synchronous data network node. A few examples of practically realized, FPGA-based pipeline measurement systems are presented. The described systems were applied in ZEUS and CMS.
Reclamation of the Wahsatch gathering system pipeline in southwestern Wyoming and northeastern Utah
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strickland, D.; Dern, G.; Johnson, G.
1996-12-31
The Union Pacific Resources Company (UPRC) constructed a 40.4-mile pipeline in 1993 in Summit and Rich Counties, Utah, and Uinta County, Wyoming. The pipeline collects and delivers natural gas from six existing wells to the Whitney Canyon Processing Plant north of Evanston, Wyoming. We describe reclamation of the pipeline, the cooperation received from landowners along the right-of-way, and mitigation measures implemented by UPRC to minimize impacts to wildlife. The reclamation procedure combines a two-step topsoil separation, mulching with natural vegetation, native seed mixes, and measures designed to reduce the visual impacts of the pipeline. Topsoil is separated into the top 4 inches of soil material, when present. The resulting top dressing is rich in native seed and rhizomes, allowing a reduced seeding rate. The borders of the right-of-way are mowed in a curvilinear pattern to reduce the straight-line effects of the pipeline, and we discuss the effects of landowner cooperation on revegetation. Specifically, following 2 years of monitoring, significant differences in plant cover (0.01...
Mobile GPU-based implementation of automatic analysis method for long-term ECG.
Fan, Xiaomao; Yao, Qihang; Li, Ye; Chen, Runge; Cai, Yunpeng
2018-05-03
Long-term electrocardiogram (ECG) monitoring is one of the important diagnostic aids for capturing intermittent cardiac arrhythmias. The combination of miniaturized wearable holters and healthcare platforms enables people to have their cardiac condition monitored at home. The high computational burden created by concurrent processing of numerous holter recordings poses a serious challenge to the healthcare platform. An alternative solution is to shift the analysis tasks from healthcare platforms to mobile computing devices. However, long-term ECG data processing is quite time-consuming due to the limited computing power of the mobile central processing unit (CPU). This paper proposes a novel parallel automatic ECG analysis algorithm which exploits the mobile graphics processing unit (GPU) to reduce the response time for processing long-term ECG data. By studying the architecture of the sequential automatic ECG analysis algorithm, we parallelized the time-consuming parts and reorganized the entire pipeline in the parallel algorithm to fully utilize the heterogeneous computing resources of CPU and GPU. The experimental results showed that the average execution time of the proposed algorithm on a clinical long-term ECG dataset (duration 23.0 ± 1.0 h per signal) is 1.215 ± 0.140 s, an average speedup of 5.81 ± 0.39× over the sequential algorithm without compromising analysis accuracy. Meanwhile, the battery energy consumption of the automatic ECG analysis algorithm was reduced by 64.16%. Excluding the energy consumed by data loading, 79.44% of the energy consumption could be saved, which alleviates the problem of limited battery life for mobile devices. The reduction in response time and battery energy consumption not only brings a better quality of experience to holter users, but also makes it possible to use mobile devices as ECG terminals for healthcare professionals such as physicians and health advisers, enabling them to inspect patient ECG recordings on site efficiently without the need for a high-quality wide-area network environment.
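The per-sample independence that makes the filtering stages GPU-friendly can be illustrated with a vectorized NumPy sketch (a stand-in for the GPU kernels, not the authors' code; the sampling rate and filter width are assumed):

```python
# The filtering stages of ECG analysis are per-sample independent, which is
# what maps well onto GPU threads; NumPy vectorization stands in for the GPU.
import numpy as np

fs = 360                                      # sampling rate in Hz (assumed)
rng = np.random.default_rng(3)
ecg = np.sin(2 * np.pi * 1.2 * np.arange(fs * 10) / fs) + 0.05 * rng.normal(size=fs * 10)

def moving_average(x, width=15):
    """Same-length moving average via a cumulative sum. Every output sample
    depends only on a fixed input window, so all samples can be computed
    independently -- the property a GPU kernel exploits."""
    c = np.cumsum(np.concatenate(([0.0], x)))
    out = (c[width:] - c[:-width]) / width
    return np.concatenate((np.full(width - 1, out[0]), out))

smoothed = moving_average(ecg)
print(ecg.shape, smoothed.shape)              # identical shapes, (3600,)
```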
Automatic recognition of vector and parallel operations in a higher level language
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1971-01-01
A compiler is described that recognizes statements of a FORTRAN program suited for fast execution on a parallel or pipeline machine such as the Illiac-4, Star or ASC. The technique employs interval analysis to provide flow information to the vector/parallel recognizer. Where profitable, the compiler changes scalar variables to subscripted variables. The output of the compiler is an extension of FORTRAN which shows parallel and vector operations explicitly.
STARL -- a Program to Correct CCD Image Defects
NASA Astrophysics Data System (ADS)
Narbutis, D.; Vanagas, R.; Vansevičius, V.
We present a program tool, STARL, designed for the automatic detection and correction of various defects in CCD images. It uses a genetic algorithm for deblending and restoring overlapping saturated stars in crowded stellar fields. Using Subaru Telescope Suprime-Cam images, we demonstrate that the program can be used in wide-field survey data processing pipelines for the production of high-quality color mosaics. The source code and examples are available at the STARL website.
ORAC-DR -- spectroscopy data reduction
NASA Astrophysics Data System (ADS)
Hirst, Paul; Cavanagh, Brad
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use in reducing spectroscopy data collected at the United Kingdom Infrared Telescope (UKIRT) with the CGS4, UIST and Michelle instruments, at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument, and at the Very Large Telescope with ISAAC. It outlines the algorithms used, how to make minor modifications to them, and how to correct for errors made at the telescope.
Loran Automatic Vehicle Monitoring System, Phase I : Volume 2. Appendices.
DOT National Transportation Integrated Search
1977-08-01
Presents results of the evaluation phase of a two phase program to develop an Automatic Vehicle Monitoring (AVM) system for the Southern California Rapid Transit District in Los Angeles, California. Tests were previously conducted on a Loran based lo...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawn Lenz; Raymond T. Lines; Darryl Murdock
ITT Industries Space Systems Division (Space Systems) has developed an airborne natural gas leak detection system designed to detect, image, quantify, and precisely locate leaks from natural gas transmission pipelines. This system is called the Airborne Natural Gas Emission Lidar (ANGEL) system. The ANGEL system uses highly sensitive differential absorption lidar technology to remotely detect pipeline leaks. The ANGEL system is operated from a fixed-wing aircraft and includes automatic scanning, a pointing system, and pilot guidance systems. During a pipeline inspection, the ANGEL system aircraft flies at an elevation of 1000 feet above the ground at speeds of between 100 and 150 mph. Under this contract with DOE/NETL, Space Systems was funded to integrate the ANGEL sensor into a test aircraft and conduct a series of flight tests over a variety of test targets, including simulated natural gas pipeline leaks. Following early tests in upstate New York in the summer of 2004, the ANGEL system was deployed to Casper, Wyoming to participate in a set of DOE-sponsored field tests at the Rocky Mountain Oilfield Testing Center (RMOTC). At RMOTC the Space Systems team completed integration of the system and flew an operational system for the first time. The ANGEL system flew two missions per day for the duration of the 5-day test. Over the course of the week the ANGEL system detected leaks ranging from 100 to 5,000 scfh.
Transiting exoplanet candidates from K2 Campaigns 5 and 6
NASA Astrophysics Data System (ADS)
Pope, Benjamin J. S.; Parviainen, Hannu; Aigrain, Suzanne
2016-10-01
We introduce a new transit search and vetting pipeline for observations from the K2 mission, and present the candidate transiting planets identified by this pipeline out of the targets in Campaigns 5 and 6. Our pipeline uses the Gaussian process-based K2SC code to correct for the K2 pointing systematics and simultaneously model stellar variability. The systematics-corrected, variability-detrended light curves are searched for transits with the box-least-squares method, and a period-dependent detection threshold is used to generate a preliminary candidate list. Two or three individuals vet each candidate manually to produce the final candidate list, using a set of automatically generated transit fits and assorted diagnostic tests to inform the vetting. We detect 145 single-planet system candidates and 5 multi-planet systems, independently recovering the previously published hot Jupiters EPIC 212110888b, WASP-55b (EPIC 212300977b) and Qatar-2b (EPIC 212756297b). We also report the outcome of reconnaissance spectroscopy carried out for all candidates with Kepler magnitude Kp ≤ 13, identifying 12 targets as likely false positives. We compare our results to those of other K2 transit search pipelines, noting that ours performs particularly well for variable and/or active stars, but that the results are very similar overall. All the light curves and code used in the transit search and vetting process are publicly available, as are the follow-up spectra.
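The box-least-squares search step can be sketched with astropy's BoxLeastSquares on a toy detrended light curve (a stand-in for the K2SC-corrected data); the period-dependent detection threshold and the manual vetting are not reproduced here:

```python
# BLS periodogram search on a synthetic detrended light curve; a sketch of
# the search step only, using astropy's BoxLeastSquares.
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(4)
t = np.arange(0, 80, 0.0204)                 # ~80 d at K2 long cadence, days
flux = 1.0 + 1e-4 * rng.normal(size=t.size)
in_transit = (t % 7.3) < 0.12                # P = 7.3 d, ~3 h transits
flux[in_transit] -= 0.001                    # ~1 mmag transit depth

bls = BoxLeastSquares(t, flux)
result = bls.autopower(0.12)                 # trial transit duration in days
best = np.argmax(result.power)
print("best period (d):", result.period[best])   # recovers ~7.3 d
```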
NASA Astrophysics Data System (ADS)
Lee, Joohwi; Kim, Sun Hyung; Oguz, Ipek; Styner, Martin
2016-03-01
The cortical thickness of the mammalian brain is an important morphological characteristic that can be used to investigate and observe the brain's developmental changes that might be caused by biologically toxic substances such as ethanol or cocaine. Although various cortical thickness analysis methods have been proposed that are applicable to the human brain and have developed into well-validated open-source software packages, cortical thickness analysis methods for rodent brains have not yet become as robust and accurate as those designed for human brains. Based on a previously proposed cortical thickness measurement pipeline for rodent brain analysis [1], we present an enhanced cortical thickness pipeline with improved accuracy and anatomical consistency. First, we propose a Lagrangian-based computational approach in the thickness measurement step in order to minimize local truncation error using the fourth-order Runge-Kutta method. Second, by constructing a line object for each streamline of the thickness measurement, we can visualize the way the thickness is measured and achieve sub-voxel accuracy by performing geometric post-processing. Lastly, with emphasis on the importance of an anatomically consistent partial differential equation (PDE) boundary map, we propose an automatic PDE boundary map generation algorithm that is specific to rodent brain anatomy and does not require manual labeling. The results show that the proposed cortical thickness pipeline can produce statistically significant regions that are not observed in the previous cortical thickness analysis pipeline.
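The Lagrangian step the authors describe integrates gradient streamlines with classical fourth-order Runge-Kutta; a generic sketch follows (the Laplace PDE setup and boundary maps are not reproduced, and the toy vector field below is invented):

```python
# Classical RK4 streamline tracing; f(x) would be the normalized gradient of
# the Laplace solution between the inner and outer cortical surfaces.
import numpy as np

def rk4_step(f, x, h):
    """One RK4 step of x' = f(x); local truncation error is O(h^5)."""
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trace(f, x0, h=0.1, n=100):
    """Integrate a streamline and return its arc length (the thickness)."""
    x, length = np.asarray(x0, float), 0.0
    for _ in range(n):
        x_next = rk4_step(f, x, h)
        length += np.linalg.norm(x_next - x)
        x = x_next
    return length

# Toy field: unit vectors pointing radially outward from the origin.
field = lambda x: x / (np.linalg.norm(x) + 1e-9)
print(trace(field, [1.0, 0.0], h=0.05, n=40))    # ~2.0 units of path length
```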
Remote Sensing Application in Oil and Gas Industry
NASA Astrophysics Data System (ADS)
Sizov, Oleg; Aloltsov, Alexander; Rubtsova, Natalia
2014-05-01
The main environmental problems of the Khanty-Mansi Autonomous Okrug (a federal subject of Russia) are related to the activities of the oil and gas industry (82 active companies operating 77,000 oil wells). As of 1 January 2013, the region produced more than 50% of all oil in Russia. The principle of environmental responsibility makes it necessary to minimize human and ecological impact. One of the most effective tools for environmental monitoring is remote sensing. The main advantages of this approach are wide coverage of areas of interest, high temporal resolution, precise location, automatic processing, a large set of extracted parameters, etc. The authorities of KhMAO are interested in regular detection of environmental impact by processing satellite data and plan to increase the coverage from 434.9 to 659.9 square kilometers at a resolution of no less than 10 m/pixel. Our company's years of experience show significant potential to expand the use of such remote sensing data in solving environmental problems. The main directions are: monitoring of the rational use of associated petroleum gas (detection of all gas flares and volumes of burned gas), monitoring of soil pollution (detection of areas of oil pollution, assessment of the extent of pollution, planning of reclamation activities and assessment of their efficiency, detection of potential areas of pipeline corrosion), monitoring of the status of sludge pits (inventory of all sludge pits, assessment of their liquidation), monitoring of technogenic impact (change detection), and upgrading of a geospatial database (topographic maps at a scale of no less than 1:50,000). Implementation of modeling, extrapolation and remote analysis techniques based on satellite images will help reduce unnecessary costs for instrumental methods. Thus, the introduction of effective remote monitoring technology into the activities of oil and gas companies promotes the environmental responsibility of these companies.
Design of a real-time tax-data monitoring intelligent card system
NASA Astrophysics Data System (ADS)
Gu, Yajun; Bi, Guotang; Chen, Liwei; Wang, Zhiyuan
2009-07-01
To solve the current problem of low efficiency in domestic oil station information management, a real-time tax-data monitoring system has been developed that automatically accesses the tax data of oil pumping machines, realizing real-time automatic data collection, display and storage. The monitoring system uses contactless smart cards or the network to directly collect data that cannot be manually altered, thereby closing loopholes and raising the automation level of tax collection. It can perform real-time collection and management of oil station information, find problems promptly, and achieve automatic management of the entire process covering oil sales accounting and reporting. It can also perform remote queries of an oil station's operating data. This system has broad application prospects and economic value.
Applications of optical measurement technology in pollution gas monitoring at thermal power plants
NASA Astrophysics Data System (ADS)
Wang, Jian; Yu, Dahai; Ye, Huajun; Yang, Jianhu; Ke, Liang; Han, Shuanglai; Gu, Haitao; Chen, Yingbin
2011-11-01
This paper presents work on using advanced optical measurement techniques to implement stack gas emission monitoring and process control. A system is designed to conduct online measurement of SO2/NOx and mercury emissions from stacks and of slipping NH3 from the de-nitrification process. The system consists of an SO2/NOx monitoring subsystem, a mercury monitoring subsystem, and an NH3 monitoring subsystem. The SO2/NOx monitoring subsystem is based on the ultraviolet differential optical absorption spectroscopy (UV-DOAS) technique. With this technique, a linearity error of less than +/-1% F.S. is achieved, and the measurement errors resulting from optical path contamination and light fluctuation are removed. Moreover, this subsystem employs in-situ extraction and a hot-wet sampling line to significantly reduce SO2 loss due to condensation and to protect the gas pipeline from corrosion. The mercury monitoring subsystem measures the concentrations of elemental mercury (Hg0), oxidized mercury (Hg2+), and total gaseous mercury (HgT) in the flue gas exhaust. Measurement of Hg with a low detection limit (0.1 μg/m3) and high sensitivity is realized using the cold vapor atomic fluorescence spectroscopy (CVAFS) technique. This subsystem is also equipped with an inertial-separation sampling technique to prevent the gas pipeline from clogging and to reduce speciated mercury measurement error. The NH3 monitoring subsystem measures the concentration of slipping NH3 and thereby helps improve the efficiency of de-nitrification. NH3 concentrations as low as 0.1 ppm can be measured using off-axis integrated cavity output spectroscopy (ICOS) and tunable diode laser absorption spectroscopy (TDLAS). The problem of trace NH3 loss during sampling is solved by heating the gas pipelines while the measurement is running.
ESAP plus: a web-based server for EST-SSR marker development.
Ponyared, Piyarat; Ponsawat, Jiradej; Tongsima, Sissades; Seresangtakul, Pusadee; Akkasaeng, Chutipong; Tantisuwichwong, Nathpapat
2016-12-22
Simple sequence repeats (SSRs) have become widely used as molecular markers in plant genetic studies due to their abundance, high allelic variation at each locus, and the simplicity of analyzing them using conventional PCR amplification. To study plants with unknown genome sequences, SSR markers must be derived from Expressed Sequence Tags (ESTs), which can be obtained from plant mRNA (converted to cDNA). With the advent of high-throughput sequencing technology, huge volumes of EST sequence data have been generated and are now accessible from many public databases. However, SSR marker identification from a large in-house or public EST collection requires a computational pipeline that makes use of several standard bioinformatic tools to design high-quality EST-SSR primers. Some of these computational tools are not user-friendly and must be tightly integrated with reference genomic databases. A web-based bioinformatic pipeline, called EST Analysis Pipeline Plus (ESAP Plus), was constructed to assist researchers in developing SSR markers from large EST collections. ESAP Plus incorporates several bioinformatic scripts and useful standard software tools covering the four main procedures of EST-SSR marker development, namely 1) pre-processing, 2) clustering and assembly, 3) SSR mining and 4) SSR primer design. The proposed pipeline also provides two alternative steps for reducing EST redundancy and identifying SSR loci. Using public sugarcane ESTs, ESAP Plus automatically executed the aforementioned computational pipeline via a simple web user interface, implemented using standard PHP, HTML, CSS and JavaScript. With ESAP Plus, users can upload raw EST data and choose various filtering options and parameters for each of the four main procedures through the web interface. All input EST data and the predicted SSR results are stored in the ESAP Plus MySQL database. Users are notified via e-mail when the automatic process is completed, and they can download all the results through the web interface. ESAP Plus is a comprehensive and convenient web-based bioinformatic tool for SSR marker development, offering all the necessary EST-SSR development processes with various adjustable options that users can easily apply to identify SSR markers from a large EST collection. With its familiar web interface, users can upload raw ESTs via the data submission page and visualize/download the corresponding EST-SSR information from within ESAP Plus. ESAP Plus can handle considerably large EST datasets. This EST-SSR discovery tool can be accessed directly at: http://gbp.kku.ac.th/esap_plus/.
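The SSR mining step reduces to scanning assembled sequences for short tandem repeats; a minimal regex-based sketch follows, with repeat-count thresholds that are illustrative rather than ESAP Plus defaults:

```python
# Minimal SSR (microsatellite) miner; thresholds are illustrative only.
import re

MIN_REPEATS = {1: 10, 2: 6, 3: 5}    # minimum repeats per motif length

def find_ssrs(seq):
    """Yield (motif, repeat_count, start) for mono- to tri-nucleotide SSRs."""
    seq = seq.upper()
    for unit, min_rep in MIN_REPEATS.items():
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (unit, min_rep - 1))
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if unit > 1 and len(set(motif)) == 1:
                continue             # homopolymer runs are handled at unit 1
            yield motif, len(m.group(0)) // unit, m.start()

est = "GCGTA" + "AG" * 8 + "TTACG" + "ATC" * 6 + "GA"
for motif, count, pos in find_ssrs(est):
    print(f"({motif}){count} at position {pos}")   # (AG)8 at 5, (ATC)6 at 26
```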
Loran Automatic Vehicle Monitoring System, Phase I : Volume 1. Test Results.
DOT National Transportation Integrated Search
1977-08-01
Presents results of the evaluation phase of a two phase program to develop an Automatic Vehicle Monitoring (AVM) system for the Southern California Rapid Transit District in Los Angeles, California. Tests were previously conducted on a Loran based lo...
Ultrasonic wave based pressure measurement in small diameter pipeline.
Wang, Dan; Song, Zhengxiang; Wu, Yuan; Jiang, Yuan
2015-12-01
An effective non-intrusive, ultrasound-based method for monitoring liquid pressure in small-diameter pipelines (less than 10 mm) is presented in this paper. Ultrasonic waves penetrate the medium, and properties of the medium can be inferred by acquiring representative information from the echoes. This pressure measurement is difficult because echo information is not easy to obtain in a small-diameter pipeline. The proposed method, studied on a pipeline carrying a Kneser liquid, is based on the principle that the transmission speed of an ultrasonic wave in the pipeline liquid correlates with the liquid pressure, and that this speed is reflected in the ultrasonic propagation time provided the acoustic path length is fixed. Therefore, variation in ultrasonic propagation time reflects variation in pipeline pressure. The ultrasonic propagation time is obtained by an electronic processing approach and is measured to nanosecond accuracy with a high-resolution time measurement module. We use the ultrasonic propagation time difference to reflect the actual pressure, which reduces environmental influences. The corresponding pressure values are finally obtained from the relationship between the variation of the ultrasonic propagation time difference and pressure, using a neural network analysis method. The results show that this method is accurate and can be used in practice.
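Because the acoustic path is fixed, the method reduces to a calibration curve from propagation-time difference to pressure; a sketch with invented calibration points (the paper fits this relationship with a neural network; a low-order polynomial stands in here):

```python
# Pressure from ultrasonic propagation-time difference over a fixed path;
# the calibration points below are invented for illustration.
import numpy as np

# Calibration: propagation-time difference (ns) vs known pressure (MPa).
dt_ns = np.array([0.0, 14.0, 29.0, 45.0, 62.0])
p_mpa = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

coeffs = np.polyfit(dt_ns, p_mpa, deg=2)     # low-order fit of the curve

def pressure(dt):
    """Map a measured time difference (ns) to pressure (MPa)."""
    return np.polyval(coeffs, dt)

print(round(float(pressure(37.0)), 3))       # interpolated pressure reading
```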
Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support
NASA Astrophysics Data System (ADS)
Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar
This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic-block-level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are the application's C processes and their mapping to processors in the platform. A processor data model, including the pipelined datapath, memory hierarchy and branch delay model, is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using an MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real time and showed less than 10% timing error compared to board measurements.
OKCARS : Oklahoma Collision Analysis and Response System.
DOT National Transportation Integrated Search
2012-10-01
By continuously monitoring traffic intersections to automatically detect that a collision or near-collision has occurred, automatically call for assistance, and automatically forewarn oncoming traffic, our OKCARS has the capability to effectively ...
Got, Jeanne; Cortés, María Paz; Maass, Alejandro
2018-01-01
Genome-scale metabolic models have become the tool of choice for the global analysis of microorganism metabolism, and their reconstruction has attained high standards of quality and reliability. Improvements in this area have been accompanied by the development of some major platforms and databases, and an explosion of individual bioinformatics methods. Consequently, many recent models result from "à la carte" pipelines, combining the use of platforms, individual tools and biological expertise to enhance the quality of the reconstruction. Although very useful, introducing heterogeneous tools that hardly interact with each other causes a loss of traceability and reproducibility in the reconstruction process. This represents a real obstacle, especially when considering less studied species whose metabolic reconstruction can greatly benefit from comparison with good-quality models of related organisms. This work proposes an adaptable workspace, AuReMe, for sustainable reconstruction or improvement of genome-scale metabolic models involving personalized pipelines. At each step, relevant information related to the modifications brought to the model by a method is stored. This ensures that the process is reproducible and documented regardless of the combination of tools used. Additionally, the workspace establishes a way to browse metabolic models and their metadata through the automatic generation of ad-hoc local wikis dedicated to monitoring and facilitating the process of reconstruction. AuReMe supports exploration and semantic queries based on RDF databases. We illustrate how this workspace allowed handling, in an integrated way, the metabolic reconstructions of non-model organisms such as an extremophile bacterium or eukaryotic algae. Among relevant applications, the latter reconstruction led to putative evolutionary insights into a metabolic pathway. PMID:29791443
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jutras, Jean-David
MRI-only Radiation Treatment Planning (RTP) is becoming increasingly popular because of a simplified work-flow and less inconvenience to the patient, who avoids multiple scans. The advantages of MRI-based RTP over traditional CT-based RTP lie in its superior soft-tissue contrast and absence of ionizing radiation dose. The lack of electron-density information in MRI can be addressed by automatic tissue classification. To distinguish bone from air, which both appear dark in MRI, an ultra-short echo time (UTE) pulse sequence may be used. Quantitative MRI parametric maps can provide improved tissue segmentation/classification and better sensitivity in monitoring disease progression and treatment outcome than standard weighted images. Superior tumor contrast can be achieved on pure T1 images compared to conventional T1-weighted images acquired in the same scan duration and voxel resolution. In this study, we have developed a robust and fast quantitative MRI acquisition and post-processing work-flow that integrates these latest advances into the MRI-based RTP of brain lesions. Using 3D multi-echo FLASH images at two different optimized flip angles (both acquired in under 9 min at 1 mm isotropic resolution), parametric maps of T1, proton density (M0), and T2* are obtained with high contrast-to-noise ratio and negligible geometrical distortions, water-fat shifts and susceptibility effects. An additional 3D UTE MRI dataset is acquired (in under 4 min) and post-processed to classify tissues for dose simulation. The pipeline was tested on four healthy volunteers and a clinical trial on brain cancer patients is underway.
Mary Beth Adams; Pamela J. Edwards; W. Mark Ford; Joshua B. Johnson; Thomas M. Schuler; Melissa Thomas-Van Gundy; Frederica Wood
2011-01-01
Development of a natural gas well and pipeline on the Fernow Experimental Forest, WV, raised concerns about the effects on the natural and scientific resources of the Fernow, set aside in 1934 for long-term research. A case study approach was used to evaluate effects of the development. This report includes results of monitoring projects as well as observations...
Microfabricated fuel heating value monitoring device
Robinson, Alex L [Albuquerque, NM; Manginell, Ronald P [Albuquerque, NM; Moorman, Matthew W [Albuquerque, NM
2010-05-04
A microfabricated fuel heating value monitoring device comprises a microfabricated gas chromatography column in combination with a catalytic microcalorimeter. The microcalorimeter can comprise a reference thermal conductivity sensor to provide diagnostics and surety. Using microfabrication techniques, the device can be manufactured in production quantities at a low per-unit cost. The microfabricated fuel heating value monitoring device enables continuous calorimetric determination of the heating value of natural gas with a 1 minute analysis time and 1.5 minute cycle time using air as a carrier gas. This device has applications in remote natural gas mining stations, pipeline switching and metering stations, turbine generators, and other industrial user sites. For gas pipelines, the device can improve gas quality during transfer and blending, and provide accurate financial accounting. For industrial end users, the device can provide continuous feedback of physical gas properties to improve combustion efficiency during use.
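Given mole fractions from the chromatography column, the heating value of the blend follows as a composition-weighted sum; a sketch with approximate textbook component values and an invented composition:

```python
# Heating value of a gas blend as a mole-fraction-weighted sum. Per-component
# values are approximate volumetric HHVs (MJ/m3 at standard conditions) and
# the pipeline composition is invented for illustration.
HHV = {"CH4": 37.7, "C2H6": 66.0, "C3H8": 93.9, "N2": 0.0, "CO2": 0.0}

def blend_hhv(mole_fractions):
    """Composition-weighted higher heating value of the blend (MJ/m3)."""
    assert abs(sum(mole_fractions.values()) - 1.0) < 1e-6
    return sum(x * HHV[c] for c, x in mole_fractions.items())

pipeline_gas = {"CH4": 0.94, "C2H6": 0.03, "C3H8": 0.01, "N2": 0.015, "CO2": 0.005}
print(f"{blend_hhv(pipeline_gas):.1f} MJ/m3")    # ~38.4 MJ/m3
```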
49 CFR 195.402 - Procedural manual for operations, maintenance, and emergencies.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., monitoring from an attended location pipeline pressure during startup until steady state pressure and flow... operating conditions by monitoring pressure, temperature, flow or other appropriate operational data and...) Increase or decrease in pressure or flow rate outside normal operating limits; (iii) Loss of communications...
49 CFR 195.402 - Procedural manual for operations, maintenance, and emergencies.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., monitoring from an attended location pipeline pressure during startup until steady state pressure and flow... operating conditions by monitoring pressure, temperature, flow or other appropriate operational data and...) Increase or decrease in pressure or flow rate outside normal operating limits; (iii) Loss of communications...
Concept of an advanced hyperspectral remote sensing system for pipeline monitoring
NASA Astrophysics Data System (ADS)
Keskin, Göksu; Teutsch, Caroline D.; Lenz, Andreas; Middelmann, Wolfgang
2015-10-01
Areas occupied by oil pipelines and storage facilities are prone to severe contamination due to leaks caused by natural forces, poor maintenance or third parties. These threats have to be detected as quickly as possible in order to prevent serious environmental damage. Periodic and emergency monitoring activities need to be carried out for successful disaster management and pollution minimization. Airborne remote sensing stands out as an appropriate choice for operating either in an emergency or periodically. The Hydrocarbon Index (HI) and Hydrocarbon Detection Index (HDI) utilize the unique absorption features of hydrocarbon-based materials in the SWIR spectral region. These band-ratio-based methods require no a priori knowledge of the reference spectrum and can be calculated in real time. This work introduces a flexible airborne pipeline monitoring system based on the online quasi-operational hyperspectral remote sensing system developed at Fraunhofer IOSB, utilizing HI and HDI for oil leak detection on data acquired by a SWIR imaging sensor. The robustness of HI and HDI compared to state-of-the-art detection algorithms is evaluated in an experimental setup using a synthetic dataset, prepared in a systematic way to simulate linear mixtures of selected background and oil spectra with gradually decreasing percentages of oil content. Real airborne measurements in Ettlingen, Germany were used to gather background data, while the crude oil spectrum was measured with a field spectrometer. The results indicate that the system can be used for online and offline monitoring activities.
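The Hydrocarbon Index compares the reflectance at the ~1.73 µm C-H absorption band against a line interpolated between its two shoulder bands; a sketch following the band-interpolation formulation common in the literature, with band centers that should be treated as assumptions:

```python
# Hydrocarbon Index as a band-interpolation measure: the reflectance at the
# absorption band B is compared against a straight line between the shoulder
# bands A and C. Band centers follow the formulation commonly cited in the
# literature (Kuehn et al.) and are assumptions here, not system parameters.
import numpy as np

LAM_A, LAM_B, LAM_C = 1705.0, 1729.0, 1741.0   # nm

def hydrocarbon_index(r_a, r_b, r_c):
    """HI > 0 suggests the 1.73 um C-H absorption feature is present."""
    interpolated = r_a + (LAM_B - LAM_A) * (r_c - r_a) / (LAM_C - LAM_A)
    return interpolated - r_b

pixel = np.array([0.42, 0.36, 0.43])           # reflectances at A, B, C
print(hydrocarbon_index(*pixel))               # positive -> oil candidate
```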
Aerial surveillance for gas and liquid hydrocarbon pipelines using a flame ionization detector (FID)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riquetti, P.V.; Fletcher, J.I.; Minty, C.D.
1996-12-31
A novel application for the detection of airborne hydrocarbons has been successfully developed by means of a highly sensitive, fast-responding Flame Ionization Detector (FID). The traditional way to monitor pipeline leaks has been by ground crews using specific sensors, or by airborne crews highly trained to observe anomalies associated with leaks during periodic surveys of the pipeline right-of-way. The goal has been to detect leaks in a fast and cost-effective way before the associated spill becomes a costly and hazardous problem. This paper describes a leak detection system combined with a global positioning system (GPS) and a computerized data output designed to pinpoint the presence of hydrocarbons in the air space of the pipeline's right-of-way. Fixed-wing aircraft as well as helicopters have been successfully used as airborne platforms. Natural gas, crude oil and finished-product pipelines in Canada and the US have been surveyed using this technology, with excellent correlation between the aircraft detection and in-situ ground detection. The information obtained is processed with proprietary software and reduced to simple coordinates. Results are transferred to ground crews to effect the necessary repairs.
Pathfinder-Plus aircraft in flight
NASA Technical Reports Server (NTRS)
1998-01-01
The Pathfinder-Plus solar-powered aircraft is shown taking off from a runway, then flying at low altitude over the ocean. The vehicle, which looks like a flying ruler, operates at low airspeed. Among the missions proposed for a solar-powered aircraft are communications relay, atmospheric studies, pipeline monitoring and gas leak detection, environmental monitoring using thermal and radar images, and disaster relief and monitoring.
GEONEX: Land Monitoring From a New Generation of Geostationary Satellite Sensors
NASA Technical Reports Server (NTRS)
Nemani, Ramakrishna; Lyapustin, Alexei; Wang, Weile; Wang, Yujie; Hashimoto, Hirofumi; Li, Shuang; Ganguly, Sangram; Michaelis, Andrew; Higuchi, Atsushi; Takenaka, Hideaki;
2017-01-01
The latest generation of geostationary satellites carries sensors such as ABI (Advanced Baseline Imager on GOES-16) and the AHI (Advanced Himawari Imager on Himawari) that closely mimic the spatial and spectral characteristics of Earth Observing System flagship MODIS for monitoring land surface conditions. More importantly, they provide observations at 5-15 minute intervals. Such high frequency data offer exciting possibilities for producing robust estimates of land surface conditions by overcoming cloud cover, enabling studies of diurnally varying local-to-regional biosphere-atmosphere interactions, and operational decision-making in agriculture, forestry and disaster management. But the data come with challenges that need special attention. For instance, geostationary data feature changing sun angle at constant view for each pixel, which is reciprocal to sun-synchronous observations, and thus require careful adaptation of EOS algorithms. Our goal is to produce a set of land surface products from geostationary sensors by leveraging NASA's investments in EOS algorithms and in the data/compute facility NEX. The land surface variables of interest include atmospherically corrected surface reflectances, snow cover, vegetation indices and leaf area index (LAI)/fraction of photosynthetically absorbed radiation (FPAR), as well as land surface temperature and fires. In order to get ready to produce operational products over the US from GOES-16 starting in 2018, we have utilized 18 months of data from Himawari AHI over Australia to test the production pipeline and the performance of various algorithms. The end-to-end processing pipeline consists of a suite of modules to (a) perform calibration and automatic georeference correction of the AHI L1b data, (b) adopt the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm to produce surface spectral reflectances along with compositing schemes and QA, and (c) modify relevant EOS retrieval algorithms (e.g., LAI and FPAR, GPP, etc.) for subsequent science product generation. Initial evaluation of Himawari AHI products against standard MODIS products indicates general agreement, suggesting that data from geostationary sensors can augment low earth orbit (LEO) satellite observations.
GEONEX: Land monitoring from a new generation of geostationary satellite sensors
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Lyapustin, A.; Wang, W.; Ganguly, S.; Wang, Y.; Michaelis, A.; Hashimoto, H.; Li, S.; Higuchi, A.; Huete, A. R.; Yeom, J. M.; camacho De Coca, F.; Lee, T. J.; Takenaka, H.
2017-12-01
The latest generation of geostationary satellites carries sensors such as ABI (Advanced Baseline Imager on GOES-16) and the AHI (Advanced Himawari Imager on Himawari) that closely mimic the spatial and spectral characteristics of Earth Observing System flagship MODIS for monitoring land surface conditions. More importantly, they provide observations at 5-15 minute intervals. Such high frequency data offer exciting possibilities for producing robust estimates of land surface conditions by overcoming cloud cover, enabling studies of diurnally varying local-to-regional biosphere-atmosphere interactions, and operational decision-making in agriculture, forestry and disaster management. But the data come with challenges that need special attention. For instance, geostationary data feature changing sun angle at constant view for each pixel, which is reciprocal to sun-synchronous observations, and thus require careful adaptation of EOS algorithms. Our goal is to produce a set of land surface products from geostationary sensors by leveraging NASA's investments in EOS algorithms and in the data/compute facility NEX. The land surface variables of interest include atmospherically corrected surface reflectances, snow cover, vegetation indices and leaf area index (LAI)/fraction of photosynthetically absorbed radiation (FPAR), as well as land surface temperature and fires. In order to get ready to produce operational products over the US from GOES-16 starting in 2018, we have utilized 18 months of data from Himawari AHI over Australia to test the production pipeline and the performance of various algorithms. The end-to-end processing pipeline consists of a suite of modules to (a) perform calibration and automatic georeference correction of the AHI L1b data, (b) adopt the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm to produce surface spectral reflectances along with compositing schemes and QA, and (c) modify relevant EOS retrieval algorithms (e.g., LAI and FPAR, GPP, etc.) for subsequent science product generation. Initial evaluation of Himawari AHI products against standard MODIS products indicates general agreement, suggesting that data from geostationary sensors can augment low earth orbit (LEO) satellite observations.
A Fuzzy Control System for Reducing Urban Runoff by a Stormwater Storage Tank
NASA Astrophysics Data System (ADS)
Zhang, P.; Cai, Y.; Wang, J.
2017-12-01
Stormwater storage tanks (SSTs) are a popular low-impact development technology for reducing stormwater runoff in the construction of sponge cities. Most research on SSTs has addressed their design, pollutant removal, and operation assessment, while there has been little research on the automatic control of SSTs for reducing peak flow. In this paper, fuzzy control is introduced into the peak control of SSTs to improve the efficiency of reducing stormwater runoff. Firstly, the design of the SST was investigated: a catchment area and return period were assumed, an SST model was manufactured, and the storage capacity of the SST was verified. Secondly, the control parameters of the SST for reducing stormwater runoff were analyzed, and a schematic diagram of a real-time control (RTC) system based on a peak-control SST was established. Finally, a fuzzy control system with two inputs (flow and water level) and two outputs (inlet and outlet valves) was designed. The results showed that 1) under different return periods (one year, three years, five years), the SST delayed and reduced the peak by increasing the detention time; 2) rainfall, pipeline flow, influent time and the water level in the SST could be used as RTC parameters; and 3) the response curves of flow velocity and water level fluctuated very little and reached equilibrium in a short time. The combination of online monitoring and fuzzy control is feasible for controlling the SST automatically. This paper provides a theoretical reference for reducing stormwater runoff and improving the operational efficiency of SSTs.
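A toy Mamdani-style rule evaluation in the spirit of the two-input/two-output controller described; the membership functions, ranges and rules below are invented for illustration:

```python
# Toy fuzzy step for the SST controller: inputs are inflow and tank water
# level, output is an outlet-valve opening in [0, 1]. Memberships, ranges
# and rules are invented, not taken from the paper.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def valve_opening(inflow, level):
    # Fuzzify (assumed ranges: inflow 0-100 L/s, level 0-3 m).
    inflow_high = tri(inflow, 40, 100, 160)
    level_high = tri(level, 1.5, 3.0, 4.5)
    level_low = tri(level, -1.5, 0.0, 1.5)
    # Rules: open when the tank is filling up, close when it is nearly empty.
    open_deg = min(inflow_high, level_high)
    close_deg = level_low
    # Defuzzify with a weighted average of the rule outputs (0 = shut, 1 = open).
    total = open_deg + close_deg
    return 0.5 if total == 0 else (1.0 * open_deg + 0.0 * close_deg) / total

print(valve_opening(inflow=70, level=2.4))   # fully open during a peak
```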
Piersma, Sjouke; Denham, Emma L; Drulhe, Samuel; Tonk, Rudi H J; Schwikowski, Benno; van Dijl, Jan Maarten
2013-01-01
Gene expression heterogeneity is a key driver for microbial adaptation to fluctuating environmental conditions, cell differentiation and the evolution of species. This phenomenon therefore has enormous implications, not only for life in general, but also for biotechnological applications where unwanted subpopulations of non-producing cells can emerge in large-scale fermentations. Only time-lapse fluorescence microscopy allows real-time measurements of gene expression heterogeneity. A major limitation in the analysis of time-lapse microscopy data is the lack of fast, cost-effective, open, simple and adaptable protocols. Here we describe TLM-Quant, a semi-automatic pipeline for the analysis of time-lapse fluorescence microscopy data that enables the user to visualize and quantify gene expression heterogeneity. Importantly, our pipeline builds on the open-source packages ImageJ and R. To validate TLM-Quant, we selected three possible scenarios, namely homogeneous expression, highly 'noisy' heterogeneous expression, and bistable heterogeneous expression in the Gram-positive bacterium Bacillus subtilis. This bacterium is both a paradigm for systems-level studies on gene expression and a highly appreciated biotechnological 'cell factory'. We conclude that the temporal resolution of such analyses with TLM-Quant is limited only by the number of recorded images.
Second Iteration of Photogrammetric Pipeline to Enhance the Accuracy of Image Pose Estimation
NASA Astrophysics Data System (ADS)
Nguyen, T. G.; Pierrot-Deseilligny, M.; Muller, J.-M.; Thom, C.
2017-05-01
In the classical photogrammetric processing pipeline, automatic tie point extraction plays a key role in the quality of the achieved results. Image tie points are crucial to pose estimation and have a significant influence on the precision of the calculated orientation parameters. Therefore, both the relative and the absolute orientation of the 3D model can be affected. By improving the precision of image tie point measurement, one can enhance the quality of image orientation. The quality of image tie points depends on several factors, such as their multiplicity, measurement precision, and distribution in the 2D images as well as in the 3D scene. In complex acquisition scenarios such as indoor applications and oblique aerial images, tie point extraction is limited when only image information can be exploited. Hence, we propose a method which improves the precision of pose estimation in complex scenarios by adding a second iteration to the classical processing pipeline. The result of a first iteration is used as a priori information to guide the extraction of new tie points of better quality. Evaluated in multiple case studies, the proposed method shows its validity and its high potential for precision improvement.
Species Distribution Modelling: Contrasting presence-only models with plot abundance data.
Gomes, Vitor H F; IJff, Stéphanie D; Raes, Niels; Amaral, Iêda Leão; Salomão, Rafael P; de Souza Coelho, Luiz; de Almeida Matos, Francisca Dionízia; Castilho, Carolina V; de Andrade Lima Filho, Diogenes; López, Dairon Cárdenas; Guevara, Juan Ernesto; Magnusson, William E; Phillips, Oliver L; Wittmann, Florian; de Jesus Veiga Carim, Marcelo; Martins, Maria Pires; Irume, Mariana Victória; Sabatier, Daniel; Molino, Jean-François; Bánki, Olaf S; da Silva Guimarães, José Renan; Pitman, Nigel C A; Piedade, Maria Teresa Fernandez; Mendoza, Abel Monteagudo; Luize, Bruno Garcia; Venticinque, Eduardo Martins; de Leão Novo, Evlyn Márcia Moraes; Vargas, Percy Núñez; Silva, Thiago Sanna Freire; Manzatto, Angelo Gilberto; Terborgh, John; Reis, Neidiane Farias Costa; Montero, Juan Carlos; Casula, Katia Regina; Marimon, Beatriz S; Marimon, Ben-Hur; Coronado, Euridice N Honorio; Feldpausch, Ted R; Duque, Alvaro; Zartman, Charles Eugene; Arboleda, Nicolás Castaño; Killeen, Timothy J; Mostacedo, Bonifacio; Vasquez, Rodolfo; Schöngart, Jochen; Assis, Rafael L; Medeiros, Marcelo Brilhante; Simon, Marcelo Fragomeni; Andrade, Ana; Laurance, William F; Camargo, José Luís; Demarchi, Layon O; Laurance, Susan G W; de Sousa Farias, Emanuelle; Nascimento, Henrique Eduardo Mendonça; Revilla, Juan David Cardenas; Quaresma, Adriano; Costa, Flavia R C; Vieira, Ima Célia Guimarães; Cintra, Bruno Barçante Ladvocat; Castellanos, Hernán; Brienen, Roel; Stevenson, Pablo R; Feitosa, Yuri; Duivenvoorden, Joost F; Aymard C, Gerardo A; Mogollón, Hugo F; Targhetta, Natalia; Comiskey, James A; Vicentini, Alberto; Lopes, Aline; Damasco, Gabriel; Dávila, Nállarett; García-Villacorta, Roosevelt; Levis, Carolina; Schietti, Juliana; Souza, Priscila; Emilio, Thaise; Alonso, Alfonso; Neill, David; Dallmeier, Francisco; Ferreira, Leandro Valle; Araujo-Murakami, Alejandro; Praia, Daniel; do Amaral, Dário Dantas; Carvalho, Fernanda Antunes; de Souza, Fernanda Coelho; Feeley, Kenneth; Arroyo, Luzmila; Pansonato, Marcelo Petratti; Gribel, Rogerio; Villa, Boris; Licona, Juan Carlos; Fine, Paul V A; Cerón, Carlos; Baraloto, Chris; Jimenez, Eliana M; Stropp, Juliana; Engel, Julien; Silveira, Marcos; Mora, Maria Cristina Peñuela; Petronelli, Pascal; Maas, Paul; Thomas-Caesar, Raquel; Henkel, Terry W; Daly, Doug; Paredes, Marcos Ríos; Baker, Tim R; Fuentes, Alfredo; Peres, Carlos A; Chave, Jerome; Pena, Jose Luis Marcelo; Dexter, Kyle G; Silman, Miles R; Jørgensen, Peter Møller; Pennington, Toby; Di Fiore, Anthony; Valverde, Fernando Cornejo; Phillips, Juan Fernando; Rivas-Torres, Gonzalo; von Hildebrand, Patricio; van Andel, Tinde R; Ruschel, Ademir R; Prieto, Adriana; Rudas, Agustín; Hoffman, Bruce; Vela, César I A; Barbosa, Edelcilio Marques; Zent, Egleé L; Gonzales, George Pepe Gallardo; Doza, Hilda Paulette Dávila; de Andrade Miranda, Ires Paula; Guillaumet, Jean-Louis; Pinto, Linder Felipe Mozombite; de Matos Bonates, Luiz Carlos; Silva, Natalino; Gómez, Ricardo Zárate; Zent, Stanford; Gonzales, Therany; Vos, Vincent A; Malhi, Yadvinder; Oliveira, Alexandre A; Cano, Angela; Albuquerque, Bianca Weiss; Vriesendorp, Corine; Correa, Diego Felipe; Torre, Emilio Vilanova; van der Heijden, Geertje; Ramirez-Angulo, Hirma; Ramos, José Ferreira; Young, Kenneth R; Rocha, Maira; Nascimento, Marcelo Trindade; Medina, Maria Natalia Umaña; Tirado, Milton; Wang, Ophelia; Sierra, Rodrigo; Torres-Lezama, Armando; Mendoza, Casimiro; Ferreira, Cid; Baider, Cláudia; Villarroel, Daniel; Balslev, Henrik; Mesones, Italo; Giraldo, Ligia Estela Urrego; Casas, Luisa 
Fernanda; Reategui, Manuel Augusto Ahuite; Linares-Palomino, Reynaldo; Zagt, Roderick; Cárdenas, Sasha; Farfan-Rios, William; Sampaio, Adeilza Felipe; Pauletto, Daniela; Sandoval, Elvis H Valderrama; Arevalo, Freddy Ramirez; Huamantupa-Chuquimaco, Isau; Garcia-Cabrera, Karina; Hernandez, Lionel; Gamarra, Luis Valenzuela; Alexiades, Miguel N; Pansini, Susamar; Cuenca, Walter Palacios; Milliken, William; Ricardo, Joana; Lopez-Gonzalez, Gabriela; Pos, Edwin; Ter Steege, Hans
2018-01-17
Species distribution models (SDMs) are widely used in ecology and conservation. Presence-only SDMs such as MaxEnt frequently use natural history collections (NHCs) as occurrence data, given their huge numbers and accessibility. NHCs are often spatially biased, which may generate inaccuracies in SDMs. Here, we test how the distribution of NHCs and MaxEnt predictions relate to a spatial abundance model based on a large plot dataset for Amazonian tree species, using inverse distance weighting (IDW). We also propose a new pipeline to deal with inconsistencies in NHCs and to limit the area of occupancy of the species. We found a significant but weak positive relationship between the distribution of NHCs and IDW for 66% of the species. The relationship between SDMs and IDW was also significant but weakly positive for 95% of the species, and sensitivity for both analyses was high. Furthermore, the pipeline removed half of the NHC records. Presence-only SDM applications should consider this limitation, especially for large biodiversity assessment projects, when models are automatically generated without subsequent checking. Our pipeline provides a conservative estimate of a species' area of occupancy, within an area slightly larger than its extent of occurrence, compatible with e.g. IUCN Red List assessments.
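Inverse distance weighting estimates abundance at an unsampled location as a distance-weighted mean of plot values; a minimal sketch with invented plot data:

```python
# Inverse distance weighting (IDW): abundance at x is a weighted mean of
# plot abundances, with weights 1/d^p. Plot data are invented.
import numpy as np

def idw(x, plots, values, p=2.0, eps=1e-12):
    d = np.linalg.norm(plots - x, axis=1)
    if np.any(d < eps):                      # query lands exactly on a plot
        return float(values[np.argmin(d)])
    w = 1.0 / d ** p
    return float(np.sum(w * values) / np.sum(w))

plots = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
abund = np.array([12.0, 8.0, 10.0, 0.0])     # stems per plot (illustrative)
print(idw(np.array([0.5, 0.5]), plots, abund))
```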
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kern, J.J.
1978-01-01
The recently completed 800-mile trans-Alaska pipeline is reviewed from the perspective of its first six months of successful operation. Because of the many environmental and political constraints, the $7.7 billion project is viewed as a triumph of both engineering and capitalism. Design problems were imposed by the harsh climate and terrain and by the constant public and bureaucratic monitoring. Specifications are reviewed for the pipes, valves, river crossings, pump stations, control stations, and the terminal at Valdez, where special ballast treatment and a vapor-recovery system were required to protect the harbor's water and air quality. The article outlines operating procedures and contingency planning for the pipeline and terminal. (DCK)
Automatic, nondestructive test monitors in-process weld quality
NASA Technical Reports Server (NTRS)
Deal, F. C.
1968-01-01
Instrument automatically and nondestructively monitors the quality of welds produced in microresistance welding. It measures the infrared energy generated in the weld as the weld is made and compares this energy with maximum and minimum limits of infrared energy values previously correlated with acceptable weld-strength tolerances.
14 CFR 171.263 - Localizer automatic monitor system.
Code of Federal Regulations, 2012 CFR
2012-01-01
... (CONTINUED) NAVIGATIONAL FACILITIES NON-FEDERAL NAVIGATION FACILITIES Interim Standard Microwave Landing... provide an automatic monitor system that transmits a warning to designated local and remote control points... centerline equivalent to more than 0.015 DDM at the ISMLS reference datum. (2) For localizers in which the...
14 CFR 171.263 - Localizer automatic monitor system.
Code of Federal Regulations, 2014 CFR
2014-01-01
... (CONTINUED) NAVIGATIONAL FACILITIES NON-FEDERAL NAVIGATION FACILITIES Interim Standard Microwave Landing... provide an automatic monitor system that transmits a warning to designated local and remote control points... centerline equivalent to more than 0.015 DDM at the ISMLS reference datum. (2) For localizers in which the...
14 CFR 171.263 - Localizer automatic monitor system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (CONTINUED) NAVIGATIONAL FACILITIES NON-FEDERAL NAVIGATION FACILITIES Interim Standard Microwave Landing... provide an automatic monitor system that transmits a warning to designated local and remote control points... centerline equivalent to more than 0.015 DDM at the ISMLS reference datum. (2) For localizers in which the...
14 CFR 171.263 - Localizer automatic monitor system.
Code of Federal Regulations, 2011 CFR
2011-01-01
... (CONTINUED) NAVIGATIONAL FACILITIES NON-FEDERAL NAVIGATION FACILITIES Interim Standard Microwave Landing... provide an automatic monitor system that transmits a warning to designated local and remote control points... centerline equivalent to more than 0.015 DDM at the ISMLS reference datum. (2) For localizers in which the...
14 CFR 171.263 - Localizer automatic monitor system.
Code of Federal Regulations, 2013 CFR
2013-01-01
... (CONTINUED) NAVIGATIONAL FACILITIES NON-FEDERAL NAVIGATION FACILITIES Interim Standard Microwave Landing... provide an automatic monitor system that transmits a warning to designated local and remote control points... centerline equivalent to more than 0.015 DDM at the ISMLS reference datum. (2) For localizers in which the...
Merlet, Benjamin; Paulhe, Nils; Vinson, Florence; Frainay, Clément; Chazalviel, Maxime; Poupin, Nathalie; Gloaguen, Yoann; Giacomoni, Franck; Jourdan, Fabien
2016-01-01
This article describes a generic programmatic method for mapping chemical compound libraries onto organism-specific metabolic networks from various databases (KEGG, BioCyc) and flat file formats (SBML and Matlab files). We show how this pipeline was successfully applied to decipher the coverage of chemical libraries set up by two metabolomics facilities, MetaboHub (French National infrastructure for metabolomics and fluxomics) and Glasgow Polyomics (GP), on the metabolic networks available in the MetExplore web server. The present generic protocol is designed to formalize and reduce the volume of information transfer between the library and the network database. Matching of metabolites between libraries and metabolic networks is based on InChIs or InChIKeys and therefore requires that these identifiers are specified in both libraries and networks. In addition to providing coverage statistics, this pipeline also allows the visualization of mapping results in the context of metabolic networks. In order to achieve this goal, we tackled issues of programmatic interaction between two servers, improvement of metabolite annotation in metabolic networks, and automatic loading of a mapping into the genome-scale metabolic network analysis tool MetExplore. It is important to note that this mapping can also be performed on a single organism or a selection of organisms of interest and is thus not limited to large facilities.
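As an illustration of the identifier-based matching step, a minimal sketch in Python (hypothetical library and network entries; the published pipeline talks to MetExplore's web services rather than local dictionaries):

    # Match a chemical library against a metabolic network by InChIKey.
    library = {
        "glucose":  "WQZGKKKJIJFFOK-GASJEMHNSA-N",
        "pyruvate": "LCTONWCANYUPML-UHFFFAOYSA-M",
        "mystery":  "XXXXXXXXXXXXXX-XXXXXXXXXX-X",   # not in the network
    }
    network = {  # metabolite id -> InChIKey, as annotated in the network
        "M_glc__D": "WQZGKKKJIJFFOK-GASJEMHNSA-N",
        "M_pyr":    "LCTONWCANYUPML-UHFFFAOYSA-M",
    }
    by_key = {v: k for k, v in network.items()}
    mapping = {name: by_key.get(key) for name, key in library.items()}
    covered = sum(m is not None for m in mapping.values())
    print(mapping, f"coverage: {covered}/{len(library)}")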
A De-Identification Pipeline for Ultrasound Medical Images in DICOM Format.
Monteiro, Eriksson; Costa, Carlos; Oliveira, José Luís
2017-05-01
Clinical data sharing between healthcare institutions, and between practitioners, is often hindered by privacy protection requirements. This problem is critical in collaborative scenarios where data sharing is fundamental for establishing a workflow among parties. The anonymization of patient information burned into DICOM images requires elaborate processes somewhat more complex than simple de-identification of textual information. Usually, before sharing, specific areas of the images containing sensitive information must be removed manually. In this paper, we present a pipeline for ultrasound medical image de-identification, provided as a free anonymization REST service for medical image applications, and as a Software-as-a-Service to streamline automatic de-identification of medical images, which is freely available for end-users. The proposed approach applies image processing functions and machine-learning models to build an automatic system to anonymize medical images. To perform character recognition, we evaluated several machine-learning models, with Convolutional Neural Networks (CNNs) selected as the best approach. To assess the system quality, 500 processed images were manually inspected, showing an anonymization rate of 89.2%. The tool can be accessed at https://bioinformatics.ua.pt/dicom/anonymizer and works with the most recent versions of Google Chrome, Mozilla Firefox and Safari. A Docker image containing the proposed service is also publicly available for the community.
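A minimal sketch of the two-part task with pydicom (the region coordinates are a hypothetical header bar and a single-frame grayscale image is assumed; the published service locates burned-in text automatically with a CNN):

    import pydicom

    def deidentify(path_in, path_out, box=(0, 0, 64, 512)):
        """Blank a rectangular region of burned-in text and scrub basic tags."""
        ds = pydicom.dcmread(path_in)
        y0, x0, y1, x1 = box                 # hypothetical header-bar region
        px = ds.pixel_array                  # assumes single-frame grayscale
        px[y0:y1, x0:x1] = 0                 # remove burned-in annotations
        ds.PixelData = px.tobytes()
        for tag in ("PatientName", "PatientID", "PatientBirthDate"):
            if tag in ds:
                setattr(ds, tag, "ANONYMIZED")
        ds.save_as(path_out)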
Rapid, Vehicle-Based Identification of Location and Magnitude of Urban Natural Gas Pipeline Leaks.
von Fischer, Joseph C; Cooley, Daniel; Chamberlain, Sam; Gaylord, Adam; Griebenow, Claire J; Hamburg, Steven P; Salo, Jessica; Schumacher, Russ; Theobald, David; Ham, Jay
2017-04-04
Information about the location and magnitudes of natural gas (NG) leaks from urban distribution pipelines is important for minimizing greenhouse gas emissions and optimizing investment in pipeline management. To enable rapid collection of such data, we developed a relatively simple method using high-precision methane analyzers in Google Street View cars. Our data indicate that this automated leak survey system can document patterns in leak location and magnitude within and among cities, even without wind data. We found that urban areas with prevalent corrosion-prone distribution lines (Boston, MA; Staten Island, NY; and Syracuse, NY) leaked approximately 25-fold more methane than cities with more modern pipeline materials (Burlington, VT, and Indianapolis, IN). Although this mobile monitoring method produces conservative estimates of leak rates and leak counts, it can still help prioritize both leak repairs and replacement of leak-prone sections of distribution lines, thus minimizing methane emissions over short and long terms.
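A minimal sketch of how drive-by leak indications might be flagged from a vehicle's methane time series (assumed background level and threshold; the published method uses more careful statistics and spatial aggregation):

    import numpy as np

    def leak_indications(ch4_ppm, background=2.0, excess=0.10):
        """Flag samples where CH4 exceeds background by a fractional excess,
        then group consecutive flagged samples into distinct indications."""
        above = ch4_ppm > background * (1.0 + excess)
        edges = np.flatnonzero(np.diff(above.astype(int)) == 1)
        return above.sum(), len(edges) + int(above[0])

    ch4 = np.array([1.9, 2.0, 2.5, 3.1, 2.0, 1.95, 2.4, 2.0])
    print(leak_indications(ch4))   # (flagged samples, distinct indications)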
Campbell, W.H.
1986-01-01
Electric currents in long pipelines can contribute to corrosion effects that limit the pipe's lifetime. One cause of such electric currents is the geomagnetic field variations that have sources in the Earth's upper atmosphere. Knowledge of the general behavior of the sources allows a prediction of the occurrence times, favorable locations for the pipeline effects, and long-term projections of corrosion contributions. The source spectral characteristics, the Earth's conductivity profile, and a corrosion-frequency dependence limit the period range of the natural field changes that affect the pipe. The corrosion contribution by induced currents from geomagnetic sources should be evaluated for pipelines that are located at high and at equatorial latitudes. At midlatitude locations, the times of these natural current maxima should be avoided for the necessary accurate monitoring of the pipe-to-soil potential. © 1986 D. Reidel Publishing Company.
The Kepler Science Operations Center Pipeline Framework Extensions
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.;
2010-01-01
The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
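The unit-of-work idea generalizes readily; a minimal sketch of a hypothetical generator (not the Kepler SOC code) that partitions a target list into self-contained units for distribution across a cluster:

    def units_of_work(target_ids, max_per_unit=4):
        """Partition a target list into self-contained units of work, each of
        which would be packaged with everything its algorithm needs."""
        for i in range(0, len(target_ids), max_per_unit):
            yield {"unit": i // max_per_unit,
                   "targets": target_ids[i:i + max_per_unit]}

    for unit in units_of_work(list(range(10)), max_per_unit=4):
        print(unit)   # dispatch each unit to a worker node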
Research on airborne infrared leakage detection of natural gas pipeline
NASA Astrophysics Data System (ADS)
Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie
2011-12-01
An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can sense traces of methane on the ground at high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system, mounted on a helicopter, consists of an optical unit containing the laser, a camera, a helicopter mount, an electronic unit with DGPS antenna, a notebook computer and a pilot monitor. The principle and the architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting gas leaks from pipelines on plains, deserts and hills, but unfit for areas with large altitude variation.
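TDLAS retrievals rest on the Beer-Lambert law, I/I0 = exp(-sigma * N * L); a minimal sketch (illustrative cross-section, air density and path length, not the instrument's calibration) of converting a measured transmittance into a path-averaged mole fraction:

    import math

    def ppm_from_transmittance(t, sigma=1.0e-20, n_air=2.5e25, path_m=100.0):
        """Invert Beer-Lambert I/I0 = exp(-sigma * N * L) for mole fraction.

        sigma: absorption cross-section (m^2, illustrative value),
        n_air: air number density (molecules per m^3), path_m: path (m).
        """
        column = -math.log(t) / (sigma * path_m)   # absorber density, m^-3
        return column / n_air * 1.0e6              # mole fraction in ppm

    print(ppm_from_transmittance(0.995))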
Chamberlain, Samuel D; Ingraffea, Anthony R; Sparks, Jed P
2016-11-01
Natural gas leakage and combustion are major sources of methane (CH4) and carbon dioxide (CO2), respectively; however, our understanding of emissions from cities is limited. We mapped distribution pipeline leakage using a mobile CH4 detection system, and continuously monitored atmospheric CO2 and CH4 concentrations and carbon isotopes (δ13C-CO2 and δ13C-CH4) for one year above Ithaca, New York. Pipeline leakage rates were low (<0.39 leaks per mile), likely due to the small extent of cast iron and bare steel within the distribution pipeline system (2.6%). Our atmospheric monitoring demonstrated that the isotopic composition of locally emitted CO2 approached the δ13C range of natural gas combustion in winter, correlating with natural gas power generation patterns at Cornell's Combined Heat and Power Plant located 600 m southeast of the monitoring site. Atmospheric CH4 plumes were primarily of natural gas origin, were observed intermittently throughout the year, and were most frequent in winter and spring. No correlations between the timing of atmospheric natural gas CH4 plumes and Cornell Plant gas use patterns could be drawn. However, elevated CH4 and CO2 concentrations were observed coincident with high winds from the southeast, and the plant is the only major emission source in that wind sector. Our results demonstrate pipeline leakage rates are low in cities with a low extent of leak-prone pipe, and natural gas power facilities may be an important source of urban and suburban emissions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Acoustic power delivery to pipeline monitoring wireless sensors.
Kiziroglou, M E; Boyle, D E; Wright, S W; Yeatman, E M
2017-05-01
The use of energy harvesting for powering wireless sensors is made more challenging in most applications by the requirement for customization to each specific application environment, because of specificities of the available energy form, such as precise location, direction and motion frequency, as well as the temporal variation and unpredictability of the energy source. Wireless power transfer from dedicated sources can overcome these difficulties, and in this work, the use of targeted ultrasonic power transfer as a possible method for remote powering of sensor nodes is investigated. A powering system for pipeline monitoring sensors is described and studied experimentally, with a pair of identical, non-inertial piezoelectric transducers used at the transmitter and receiver. Power transmission of 18 mW (root mean square) through 1 m of a 118 mm diameter cast iron pipe with 8 mm wall thickness is demonstrated. By analysis of the delay between transmission and reception, including reflections from the pipeline edges, a transmission speed of 1000 m/s is observed, corresponding to the phase velocity of the L(0,1) axial and F(1,1) radial modes of the pipe structure. A reduction of power delivery with water filling is observed, yet over 4 mW of delivered power through a fully filled pipe is demonstrated. The transmitted power and voltage levels exceed the requirements for efficient power management, including rectification at cold-starting conditions, and for the operation of low-power sensor nodes. The proposed powering technique may allow the implementation of energy-autonomous wireless sensor systems for monitoring industrial and network pipeline infrastructure. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Automatically pairing measured findings across narrative abdomen CT reports.
Sevenster, Merlijn; Bozeman, Jeffrey; Cowhy, Andrea; Trost, William
2013-01-01
Radiological measurements are one of the key variables in widely adopted guidelines (WHO, RECIST) that standardize and objectivize response assessment in oncology care. Measurements are typically described in free-text, narrative radiology reports. We present a natural language processing pipeline that extracts measurements from radiology reports and pairs them with extracted measurements from prior reports of the same clinical finding, e.g., lymph node or mass. A ground truth was created by manually pairing measurements in the abdomen CT reports of 50 patients. A Random Forest classifier trained on 15 features achieved superior results in an end-to-end evaluation of the pipeline on the extraction and pairing task: precision 0.910, recall 0.878, F-measure 0.894, AUC 0.988. Representing the narrative content in terms of UMLS concepts did not improve results. Applications of the proposed technology include data mining, advanced search and workflow support for healthcare professionals managing radiological measurements.
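A minimal sketch of the pairing step with scikit-learn (two invented features stand in for the paper's fifteen; the labels are toy data):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical candidate pairs of (current, prior) measurements.
    # Features: |size difference in mm|, token distance between descriptions.
    X = np.array([[1.0, 2], [15.0, 40], [0.5, 1],
                  [20.0, 35], [2.0, 3], [12.0, 50]])
    y = np.array([1, 0, 1, 0, 1, 0])     # 1 = same finding, 0 = different

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(clf.predict_proba([[1.5, 2]]))  # probability the pair matches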
A Web-Based Framework For a Time-Domain Warehouse
NASA Astrophysics Data System (ADS)
Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.
2009-09-01
The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.
The Chandra Source Catalog 2.0: Building The Catalog
NASA Astrophysics Data System (ADS)
Grier, John D.; Plummer, David A.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
To build release 2.0 of the Chandra Source Catalog (CSC2), we require scientific software tools and processing pipelines to evaluate and analyze the data. Additionally, software and hardware infrastructure is needed to coordinate and distribute pipeline execution, manage data I/O, and handle data for Quality Assurance (QA) intervention. We also provide data product staging for archive ingestion. Release 2 utilizes a database-driven system for integration and production. Included are four distinct instances of the Automatic Processing (AP) system (Source Detection, Master Match, Source Properties and Convex Hulls) and a high performance computing (HPC) cluster that is managed to provide efficient catalog processing. In this poster we highlight the internal systems developed to meet the CSC2 challenge. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
GenePRIMP: A Gene Prediction Improvement Pipeline For Prokaryotic Genomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyrpides, Nikos C.; Ivanova, Natalia N.; Pati, Amrita
2010-07-08
GenePRIMP (Gene Prediction Improvement Pipeline, http://geneprimp.jgi-psf.org) is a computational process that performs evidence-based evaluation of gene models in prokaryotic genomes and reports anomalies including inconsistent start sites, missing genes, and split genes. We show that manual curation of gene models using the anomaly reports generated by GenePRIMP improves their quality, and demonstrate the applicability of GenePRIMP in improving finishing quality and comparing different genome sequencing and annotation technologies. Keywords in context: gene model, quality control, translation start sites, automatic correction. Hardware requirements: PC, Mac. Operating system: UNIX/Linux. Compiler/version: Perl 5.8.5 or higher. Special requirements: NCBI BLAST and nr installation. File types: source code, executable module(s), sample problem input data, installation instructions, programmer documentation. Location/transmission: http://geneprimp.jgi-psf.org/gp.tar.gz
Use of geographic information systems for applications on gas pipeline rights-of-way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, P.J.
1991-12-01
Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features such as landcover, elevation, aspect, slope, soils, hydrography, transportation, endangered species, wetlands, and public line surveys. A GIS was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for landuse/landcover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.
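A minimal sketch of the overlay logic with shapely (invented coordinates; a production workflow would use full projected GIS layers):

    from shapely.geometry import LineString, Polygon

    pipeline = LineString([(0, 0), (100, 0)])     # hypothetical ROW centerline
    row = pipeline.buffer(30)                     # 30 m right-of-way strip
    wetland = Polygon([(40, 10), (60, 10), (60, 50), (40, 50)])

    conflict = row.intersection(wetland)          # sensitive-area overlap
    print(conflict.area > 0, round(conflict.area, 1))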
Lévesque, Lucie M; Dubé, Monique G
2007-09-01
Pipeline crossing construction alters river and stream channels and hence may have detrimental effects on aquatic ecosystems. This review examines the effects of crossing construction on fish and fish habitat in rivers and streams, and recommends an approach to monitoring and assessment of impacts associated with these activities. Pipeline crossing construction is shown not only to compromise the integrity of the physical and chemical nature of fish habitat, but also to affect biological habitat (e.g., benthic invertebrates and invertebrate drift), and fish behavior and physiology. Indicators of effect include: water quality (total suspended solids, TSS), physical habitat (substrate particle size, channel morphology), benthic invertebrate community structure and drift (abundance, species composition, diversity, standing crop), and fish behavior and physiology (hierarchy, feeding, respiration rate, loss of equilibrium, blood hematocrit and leukocrit levels, heart rate and stroke volume). The Before-After-Control-Impact (BACI) approach, which is often applied in Environmental Effects Monitoring (EEM), is recommended as a basis for impact assessment, as is consideration of site-specific sensitivities, assessment of significance, and cumulative effects.
Detection of Two Buried Cross Pipelines by Observation of the Scattered Electromagnetic Field
NASA Astrophysics Data System (ADS)
Mangini, Fabio; Di Gregorio, Pietro Paolo; Frezza, Fabrizio; Muzi, Marco; Tedeschi, Nicola
2015-04-01
In this work we present a numerical study on the effects that can be observed in the electromagnetic scattering of a plane wave due to the presence of two crossed pipelines buried in a half-space occupied by cement. The pipelines, supposed to be used for water conveyance, are modeled as cylindrical shells made of metallic or poly-vinyl chloride (PVC) material. In order to make the model simpler, the pipelines are supposed to run parallel to the air-cement interface on two different parallel planes; moreover, initially we suppose that the two tubes make an angle of 90 degrees. We consider a circularly-polarized plane wave impinging normally on the interface between air and the previously-mentioned medium, which excites the structure, in order to determine the most useful configuration in terms of scattered-field sensitivity. To perform the study, a commercially available simulator which implements the Finite Element Method was adopted. A preliminary frequency sweep allows us to choose the most suitable operating frequency depending on the dimensions of the commercial pipeline cross-section. We monitor the three components of the scattered electric field along a line just above the interface between the two media. The electromagnetic properties of the materials employed in this study are taken from the literature and, since a frequency-domain technique is adopted, no further approximation is needed. Once the ideal problem has been studied, i.e. having considered the orthogonal and tangential scenarios, we further complicate the model by considering different crossing angles and distances between the tubes, in the two cases of PVC and metallic material. The results obtained in these cases are compared with those of the initial problem, with the goal of determining the dependence of the scattered field on the geometrical characteristics of the crossing between the two pipelines. One of the practical applications of this study in the field of Civil Engineering may be the use of ground penetrating radar (GPR) techniques to monitor the fouling conditions of water pipelines without the need to intervene destructively on the structure. Acknowledgements: This work is a contribution to COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar".
NASA Astrophysics Data System (ADS)
Razak, K. Abdul; Othman, M. I. H.; Mat Yusuf, S.; Fuad, M. F. I. Ahmad; yahaya, Effah
2018-05-01
Oil and gas developments today span water depths characterized as shallow, deep and ultra-deep. Among the major components installed offshore are pipelines, which transport material through a pipe. In the oil and gas industry, a pipeline is assembled from many line pipes welded together into one long line, and can be divided into two kinds: gas pipelines and oil pipelines. Pipeline installation requires a pipe-laying barge or vessel, of which there are two types: S-lay and J-lay vessels. A pipe-lay vessel does not only install pipelines; it also installs umbilicals and electrical cables; in short, it installs all the connecting subsea infrastructure. The installation process requires special attention to succeed; for instance, heavy pipelines may exceed the lay vessel's tension capacity at certain water depths. Pipelines are characterized and differentiated by parameters such as material grade, material type, diameter, wall thickness and strength. Wall-thickness parameter studies indicate that using a higher steel grade contributes significantly to reducing the required pipeline wall thickness. During pipe laying, water depth is the most critical factor to monitor: the water depth cannot be controlled, but pipe characteristics such as wall thickness can be chosen to suit the current water depth and so avoid failure during installation. This research analyses whether the pipeline parameters meet the requirement limits and the minimum yield stress. It simulates pipe of grade API 5L X60 with wall thicknesses from 8 to 20 mm at water depths of 50 to 300 m. Results show that the pipeline installation will fail from a wall thickness of 18 mm onwards, since the critical yield percentage is exceeded.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, S; Lo, P; Hoffman, J
Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT-wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range of acquisition and reconstruction parameters present in the clinical environment. Funding support: NIH U01 CA181156. Disclosures (McNitt-Gray): Institutional research agreement, Siemens Healthcare; Past recipient, research grant support, Siemens Healthcare; Consultant, Toshiba America Medical Systems; Consultant, Samsung Electronics.
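A minimal sketch of how such a batch queue of dose/reconstruction combinations might be enumerated (hypothetical parameter names and labels, not the group's actual configuration format):

    import itertools, json

    dose_fractions = [1.0, 0.5, 0.25, 0.1]          # fraction of full dose
    recons = ["wFBP_smooth", "wFBP_sharp", "MBIR"]  # hypothetical labels

    jobs = [
        {"case": "lung_screen_001", "dose": d, "recon": r}
        for d, r in itertools.product(dose_fractions, recons)
    ]
    print(len(jobs), "jobs")                        # 12 combinations
    print(json.dumps(jobs[0]))                      # one config per job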
Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline
NASA Astrophysics Data System (ADS)
Ruiz, M.; Mujica, L. E.; Quintero, M.; Florez, J.; Quintero, S.
2015-07-01
Safety and reliability of hydrocarbon transportation pipelines represent a critical aspect for the Oil an Gas industry. Pipeline failures caused by corrosion, external agents, among others, can develop leaks or even rupture, which can negatively impact on population, natural environment, infrastructure and economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose the integrity. In this way, over the last few years, different techniques under the concept of structural health monitoring (SHM) have continuously been in development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Components Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls. The data are recorded by sensors measuring leakage magnetic field in segments with loss of metal, such as cracking, corrosion, among others. The data provide information of a pipeline with 15 years of operation approximately, which transports gas, has a diameter of 20 inches and a total length of 110 km (with several changes in the topography). On the other hand, PCA is a well-known technique that compresses the information and extracts the most relevant information facilitating the detection of damage in several structures. At this point, the goal of this work is to detect and localize critical loss of metal of a pipeline that are currently working.
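A minimal sketch of the PCA stage with scikit-learn on synthetic MFL-like segments: fit principal components on baseline data, then score new segments by reconstruction error, which should rise where the leakage field is distorted by metal loss.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.0, 1.0, size=(200, 64))  # healthy MFL segments
    test = baseline[:5].copy()
    test[2, 20:30] += 6.0                            # synthetic defect signature

    pca = PCA(n_components=10).fit(baseline)
    recon = pca.inverse_transform(pca.transform(test))
    error = np.linalg.norm(test - recon, axis=1)     # high error => defect
    print(np.argmax(error), error.round(2))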
Improving fMRI reliability in presurgical mapping for brain tumours.
Stevens, M Tynan R; Clarke, David B; Stroink, Gerhard; Beyea, Steven D; D'Arcy, Ryan Cn
2016-03-01
Functional MRI (fMRI) is becoming increasingly integrated into clinical practice for presurgical mapping. Current efforts are focused on validating data quality, with reliability being a major factor. In this paper, we demonstrate the utility of a recently developed approach that uses receiver operating characteristic-reliability (ROC-r) to: (1) identify reliable versus unreliable data sets; (2) automatically select processing options to enhance data quality; and (3) automatically select individualised thresholds for activation maps. Presurgical fMRI was conducted in 16 patients undergoing surgical treatment for brain tumours. Within-session test-retest fMRI was conducted, and ROC-reliability of the patient group was compared to a previous healthy control cohort. Individually optimised preprocessing pipelines were determined to improve reliability. Spatial correspondence was assessed by comparing the fMRI results to intraoperative cortical stimulation mapping, in terms of the distance to the nearest active fMRI voxel. The average ROC-r reliability for the patients was 0.58±0.03, as compared to 0.72±0.02 in healthy controls. For the patient group, this increased significantly to 0.65±0.02 by adopting optimised preprocessing pipelines. Co-localisation of the fMRI maps with cortical stimulation was significantly better for more reliable versus less reliable data sets (8.3±0.9 vs 29±3 mm, respectively). We demonstrated ROC-r analysis for identifying reliable fMRI data sets, choosing optimal postprocessing pipelines, and selecting patient-specific thresholds. Data sets with higher reliability also showed closer spatial correspondence to cortical stimulation. ROC-r can thus identify poor fMRI data at time of scanning, allowing for repeat scans when necessary. ROC-r analysis provides optimised and automated fMRI processing for improved presurgical mapping. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Text Mining the History of Medicine.
Thompson, Paul; Batista-Navarro, Riza Theresa; Kontonatsios, Georgios; Carter, Jacob; Toon, Elizabeth; McNaught, John; Timmermann, Carsten; Worboys, Michael; Ananiadou, Sophia
2016-01-01
Historical text archives constitute a rich and diverse source of information, which is becoming increasingly readily accessible, due to large-scale digitisation efforts. However, it can be difficult for researchers to explore and search such large volumes of data in an efficient manner. Text mining (TM) methods can help, through their ability to recognise various types of semantic information automatically, e.g., instances of concepts (places, medical conditions, drugs, etc.), synonyms/variant forms of concepts, and relationships holding between concepts (which drugs are used to treat which medical conditions, etc.). TM analysis allows search systems to incorporate functionality such as automatic suggestions of synonyms of user-entered query terms, exploration of different concepts mentioned within search results or isolation of documents in which concepts are related in specific ways. However, applying TM methods to historical text can be challenging, according to differences and evolutions in vocabulary, terminology, language structure and style, compared to more modern text. In this article, we present our efforts to overcome the various challenges faced in the semantic analysis of published historical medical text dating back to the mid 19th century. Firstly, we used evidence from diverse historical medical documents from different periods to develop new resources that provide accounts of the multiple, evolving ways in which concepts, their variants and relationships amongst them may be expressed. These resources were employed to support the development of a modular processing pipeline of TM tools for the robust detection of semantic information in historical medical documents with varying characteristics. We applied the pipeline to two large-scale medical document archives covering wide temporal ranges as the basis for the development of a publicly accessible semantically-oriented search system. The novel resources are available for research purposes, while the processing pipeline and its modules may be used and configured within the Argo TM platform.
Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas
2016-01-01
Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations. Thus, an integrated approach would be extremely useful to quantify the DNA methylation status successfully with great sensitivity and specificity. Designing specific and optimized primers for target regions is the most critical and challenging step in obtaining adequate DNA methylation results using PCR-based methods. Currently, no integrated, optimized, and high-throughput methylation-specific primer design software methods are available for both BS- and MSRE-based methods. Therefore, an integrated, powerful, and easy-to-use methylation-specific primer design pipeline with great accuracy and success rate would be very useful. We have developed a new web-based pipeline, called MSP-HTPrimer, to design primer pairs for MSP, BSP, pyrosequencing, COBRA, and MSRE assays on both genomic strands. First, our pipeline converts all target sequences into bisulfite-treated templates for both forward and reverse strands and designs all possible primer pairs, followed by filtering for single nucleotide polymorphisms (SNPs) and known repeat regions. Next, each primer pair is annotated with the upstream and downstream RefSeq genes, CpG island, and cut sites (for COBRA and MSRE). Finally, MSP-HTPrimer selects specific primers from both strands based on custom and user-defined hierarchical selection criteria. MSP-HTPrimer produces a primer pair summary output table in TXT and HTML format for display, and UCSC custom tracks for the resulting primer pairs in GTF format. MSP-HTPrimer is an integrated, web-based, and high-throughput pipeline with no limitation on the number and size of target sequences, and designs MSP, BSP, pyrosequencing, COBRA, and MSRE assays. It is the only pipeline that automatically designs primers on both genomic strands to increase the success rate. It is a standalone web-based pipeline, fully configured within a virtual machine, and thus can be readily used without any further setup. We have experimentally validated primer pairs designed by our pipeline and shown a very high success rate: out of 66 BSP primer pairs, 63 were successfully validated without any further optimization step, using the same qPCR conditions. The MSP-HTPrimer pipeline is freely available from http://sourceforge.net/p/msp-htprimer.
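The first step, bisulfite conversion of the template, is easy to illustrate; a minimal sketch under the simplifying assumption that every CpG cytosine is methylated (and hence protected from conversion):

    def bisulfite_convert(seq):
        """Bisulfite-convert the forward strand of a template, assuming all
        CpG cytosines are methylated and therefore protected."""
        protected = seq.replace("CG", "cG")   # mark methylated CpG cytosines
        return protected.replace("C", "T").replace("c", "C")

    print(bisulfite_convert("ACGTCCGAT"))     # -> ACGTTCGAT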
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
Elixir - how to handle 2 trillion pixels
NASA Astrophysics Data System (ADS)
Magnier, Eugene A.; Cuillandre, Jean-Charles
2002-12-01
The Elixir system at CFHT provides automatic data quality assurance and calibration for the wide-field mosaic imager camera CFH12K. Elixir consists of a variety of tools, including: a real-time analysis suite which runs at the telescope to provide quick feedback to the observers; a detailed analysis of the calibration data; and an automated pipeline for processing data to be distributed to observers. To date, 2.4 × 10^12 night-time sky pixels from CFH12K have been processed by the Elixir system.
49 CFR 192.16 - Customer notification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS... serve yard lanterns, pool heaters, or other types of secondary equipment. Also, “maintain” means monitor...
Soil-pipe interaction modeling for pipe behavior prediction with super learning based methods
NASA Astrophysics Data System (ADS)
Shi, Fang; Peng, Xiang; Liu, Huan; Hu, Yafei; Liu, Zheng; Li, Eric
2018-03-01
Underground pipelines are subject to severe distress from the surrounding expansive soil. To investigate the structural response of water mains to varying soil movements, field data, including pipe wall strains, in situ soil water content, soil pressure and temperature, was collected. Research on monitoring data analysis has been reported, but the relationship between soil properties and pipe deformation has not been well interpreted. To characterize the relationship between soil properties and pipe deformation, this paper presents a super learning based approach, combined with feature selection algorithms, to predict the structural behavior of water mains in different soil environments. Furthermore, an automatic variable selection method, i.e. the recursive feature elimination algorithm, was used to identify the critical predictors contributing to the pipe deformations. To investigate the adaptability of super learning to different predictive models, this research applied super learning based methods to three different datasets. The predictive performance was evaluated by R-squared, root-mean-square error and mean absolute error. Based on the prediction performance evaluation, the superiority of super learning was validated and demonstrated by predicting three types of pipe deformations accurately. In addition, a comprehensive understanding of the water mains' working environments becomes possible.
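Super learning is essentially cross-validated stacking; a minimal sketch with scikit-learn (synthetic data and an assumed pair of base learners, not the authors' exact model library):

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor, StackingRegressor
    from sklearn.feature_selection import RFE
    from sklearn.linear_model import Ridge

    X, y = make_regression(n_samples=300, n_features=12, noise=5.0,
                           random_state=0)

    # Recursive feature elimination picks the critical soil predictors.
    selector = RFE(Ridge(), n_features_to_select=5).fit(X, y)

    # Super learner: cross-validated combination of heterogeneous models.
    stack = StackingRegressor(
        estimators=[("rf", RandomForestRegressor(random_state=0)),
                    ("ridge", Ridge())],
        final_estimator=Ridge(), cv=5,
    ).fit(selector.transform(X), y)
    print(stack.score(selector.transform(X), y))   # R-squared (training)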
Hardware Neural Network for a Visual Inspection System
NASA Astrophysics Data System (ADS)
Chun, Seungwoo; Hayakawa, Yoshihiro; Nakajima, Koji
The visual inspection of defects in products is heavily dependent on human experience and instinct. In this situation, it is difficult to reduce production costs and to shorten the inspection time, and hence the total process time. Consequently, people involved in this area desire an automatic inspection system. In this paper, we propose a hardware neural network, which is expected to provide high-speed operation for automatic inspection of products. Since neural networks can learn, this is a suitable method for self-adjustment of classification criteria. To achieve high-speed operation, we use parallel and pipelining techniques. Furthermore, we use a piecewise linear function instead of a conventional activation function in order to save hardware resources. Consequently, our proposed hardware neural network achieved 6 GCPS and 2 GCUPS, which in our test sample proved to be sufficiently fast.
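A minimal sketch of the activation trick (written in Python for clarity; in hardware this would be fixed-point logic): a piecewise linear stand-in for the sigmoid that avoids costly exponentials.

    def pl_sigmoid(x):
        """Piecewise linear stand-in for the sigmoid: exact at 0, clipped
        outside [-4, 4]; avoids exponentials that are costly in hardware."""
        if x <= -4.0:
            return 0.0
        if x >= 4.0:
            return 1.0
        return 0.5 + x / 8.0     # straight segment through (0, 0.5)

    for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
        print(x, pl_sigmoid(x))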
3D Lunar Terrain Reconstruction from Apollo Images
NASA Technical Reports Server (NTRS)
Broxton, Michael J.; Nefian, Ara V.; Moratto, Zachary; Kim, Taemin; Lundy, Michael; Segal, Alkeksandr V.
2009-01-01
Generating accurate three-dimensional planetary models is becoming increasingly important as NASA plans manned missions to return to the Moon in the next decade. This paper describes a 3D surface reconstruction system called the Ames Stereo Pipeline that is designed to produce such models automatically by processing orbital stereo imagery. We discuss two important core aspects of this system: (1) refinement of satellite station positions and pose estimates through least squares bundle adjustment; and (2) a stochastic plane fitting algorithm that generalizes the Lucas-Kanade method for optimal matching between stereo pair images. These techniques allow us to automatically produce seamless, highly accurate digital elevation models from multiple stereo image pairs while significantly reducing the influence of image noise. Our technique is demonstrated on a set of 71 high-resolution scanned images from the Apollo 15 mission.
Distributed acoustic fibre optic sensors for condition monitoring of pipelines
NASA Astrophysics Data System (ADS)
Hussels, Maria-Teresa; Chruscicki, Sebastian; Habib, Abdelkarim; Krebber, Katerina
2016-05-01
Industrial piping systems are particularly relevant to public safety and the continuous availability of infrastructure. However, condition monitoring systems based on many discrete sensors are generally not well suited for widespread piping systems due to considerable installation effort, while use of distributed fibre-optic sensors would reduce this effort to a minimum. Specifically, distributed acoustic sensing (DAS) has been employed in recent years for detection of third-party threats and leaks in oil and gas pipelines, and can in principle also be applied to industrial plants. Further possible detection routes amenable to DAS that could identify damage prior to emission of the medium are the subject of a current project at BAM, which aims at qualifying distributed fibre optic methods such as DAS as a means for spatially continuous monitoring of industrial piping systems. Here, first tests on a short pipe are presented, where optical fibres were applied directly to the surface. An artificial signal was used to define suitable parameters of the measurement system and compare different ways of applying the sensor.
NASA Astrophysics Data System (ADS)
Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian
2018-02-01
This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic and parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of Martian static and dynamic features. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.
Processing of Crawled Urban Imagery for Building Use Classification
NASA Astrophysics Data System (ADS)
Tutzauer, P.; Haala, N.
2017-05-01
Recent years have shown a shift from pure geometric 3D city models to data with semantics. This is induced by new applications (e.g. Virtual/Augmented Reality) and also a requirement for concepts like Smart Cities. However, essential urban semantic data like building use categories is often not available. We present a first step in bridging this gap by proposing a pipeline that uses crawled urban imagery and links it with ground truth cadastral data as an input for automatic building use classification. We aim to extract this city-relevant semantic information automatically from Street View (SV) imagery. Convolutional Neural Networks (CNNs) have proved extremely successful for image interpretation; however, they require a huge amount of training data. The main contribution of the paper is the automatic provision of such training datasets by linking semantic information, as already available from databases provided by national mapping agencies or city administrations, to the corresponding façade images extracted from SV. Finally, we present first investigations with a CNN and an alternative classifier as a proof of concept.
NASA Astrophysics Data System (ADS)
Ogungbuyi, M. G.; Eckardt, F. D.; Martinez, P.
2016-12-01
Nigeria, the largest producer of crude oil in Africa, occupies sixth position in the world. Despite such huge oil revenue potential, its pipeline network system is consistently susceptible to leaks causing oil spills. We investigate ground-based spill events caused by operational error, equipment failure and, most importantly, deliberate attacks along the major pipeline transport system. Sometimes these spills are accompanied by fire explosions caused by accidental discharge, natural causes, illegal refineries in the creeks, etc. MODIS satellite fire data corresponding to the times and locations of spill events (i.e. ground-based data) in the Area of Interest (AOI) show significant correlation. The open source Quantum Geographical Information System (QGIS) was used to validate the dataset, and the spatiotemporal analyses of the oil spill fires were expressed. We demonstrate that through QGIS and Google Earth (using the time sliders), we can identify and monitor oil spills when they are attended by fire events along the pipeline transport system. This is shown through the spatiotemporal images of the fires. Evidence of such fire cases resulting from burnt vegetation, as distinct from industrial and domestic fires, is also presented. Detecting oil spill fires in the study location may not require an enormous terabyte of image processing: we can instead rely on near-real-time (NRT) MODIS data that is readily available twice daily to detect oil spill fires as an early warning signal for those hotspot areas where oil seepage is significant in Nigeria.
An implementation of a data-transmission pipelining algorithm on Imote2 platforms
NASA Astrophysics Data System (ADS)
Li, Xu; Dorvash, Siavash; Cheng, Liang; Pakzad, Shamim
2011-04-01
Over the past several years, wireless network systems and sensing technologies have developed significantly. This has resulted in the broad application of wireless sensor networks (WSNs) in many engineering fields, and in particular structural health monitoring (SHM). The movement of traditional SHM toward the new generation of SHM, which utilizes WSNs, relies on the advantages of this new approach, such as relatively low costs, ease of implementation and the capability of onboard data processing and management. In the particular case of long span bridge monitoring, a WSN should be capable of transmitting commands and measurement data over a long network geometry in a reliable manner. While using single-hop data transmission in such geometry requires a long radio range and consequently a high level of power supply, multi-hop communication may offer an effective and reliable way for data transmission across the network. Using a multi-hop communication protocol, the network relays data from a remote node to the base station via intermediary nodes. We have proposed a data-transmission pipelining algorithm to enable effective use of the available bandwidth and to minimize the energy consumption and delay of the multi-hop communication protocol. This paper focuses on the implementation aspect of the pipelining algorithm on Imote2 platforms for SHM applications, describes its interaction with underlying routing protocols, and presents the solutions to various implementation issues of the proposed pipelining algorithm. Finally, the performance of the algorithm is evaluated based on the results of an experimental implementation.
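The benefit of pipelining multi-hop transfers can be seen from a back-of-the-envelope timing model (a sketch under simplified assumptions: uniform slot time, no retransmissions, interference spacing ignored; not the authors' protocol):

    def transfer_slots(n_packets, n_hops, pipelined=True):
        """Slots to move n_packets across n_hops, one hop per slot.

        Store-and-forward sends a packet end-to-end before the next starts;
        pipelining lets packets advance through different hops concurrently.
        """
        if pipelined:
            return n_hops + (n_packets - 1)
        return n_hops * n_packets

    print(transfer_slots(100, 10, pipelined=False))  # 1000 slots
    print(transfer_slots(100, 10, pipelined=True))   # 109 slots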
Pipeline oil fire detection with MODIS active fire products
NASA Astrophysics Data System (ADS)
Ogungbuyi, M. G.; Martinez, P.; Eckardt, F. D.
2017-12-01
We investigate 85 129 MODIS satellite active fire events from 2007 to 2015 in the Niger Delta of Nigeria. The region is the oil base of the Nigerian economy and the hub of oil exploration, where oil facilities (i.e. flowlines, flow stations, trunklines, oil wells and oil fields) are domiciled, and from where crude oil and refined products are transported to different Nigerian locations through a network of pipeline systems. Pipelines and other oil facilities are consistently susceptible to oil leaks due to operational or maintenance error, and to acts of deliberate sabotage of the pipeline equipment, which often result in explosions and fire outbreaks. We used ground oil spill reports obtained from the National Oil Spill Detection and Response Agency (NOSDRA) database (see www.oilspillmonitor.ng) to validate the MODIS satellite data. The NOSDRA database shows an estimate of 10 000 spill events from 2007 - 2015. The spill events were filtered to include the largest spills by volume and events occurring only in the Niger Delta (i.e. 386 spills). By projecting both the MODIS fires and the spills as 'input vector' layers with 'Points' geometry, and the Nigerian pipeline networks as 'from vector' layers with 'LineString' geometry in a geographical information system, we extracted the MODIS events nearest to the pipelines (i.e. 2192) within a 1000 m distance in a spatial vector analysis. The extraction process that defined the nearest distance to the pipelines is based on the global practice of the Right of Way (ROW) in pipeline management, which earmarks a 30 m strip of land for the pipeline. The KML files of the extracted fires in Google Maps validated their origin as oil facilities. Land cover mapping confirmed the fire anomalies. The aim of the study is to propose near-real-time monitoring of spill events along pipeline routes using the 250 m spatial resolution of the MODIS active fire detection sensor when such spills are accompanied by fire events in the study location.
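A minimal sketch of the proximity filter with shapely (invented coordinates; real layers would be in a projected CRS so that distances are in metres):

    from shapely.geometry import LineString, Point

    pipeline = LineString([(0, 0), (5000, 0)])          # hypothetical trunkline
    fires = [Point(100, 40), Point(2500, 1500), Point(4800, 600)]

    near = [p for p in fires if p.distance(pipeline) <= 1000.0]
    print(len(near), "fire detections within 1000 m of the pipeline")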
A Computational Pipeline to Improve Clinical Alarms Using a Parallel Computing Infrastructure
ERIC Educational Resources Information Center
Nguyen, Andrew V.
2013-01-01
Physicians, nurses, and other clinical staff rely on alarms from various bedside monitors and sensors to alert them when there is a change in the patient's clinical status, typically when urgent intervention is necessary. These alarms are usually embedded directly within the sensor or monitor and lack the context of the patient's medical history and…
MarsSI: Martian surface data processing information system
NASA Astrophysics Data System (ADS)
Quantin-Nataf, C.; Lozac'h, L.; Thollot, P.; Loizeau, D.; Bultel, B.; Fernando, J.; Allemand, P.; Dubuffet, F.; Poulet, F.; Ody, A.; Clenet, H.; Leyrat, C.; Harrisson, S.
2018-01-01
MarsSI (an acronym for Mars System of Information, https://emars.univ-lyon1.fr/MarsSI/) is a web Geographic Information System application that helps manage and process Martian orbital data. The MarsSI facility is part of the web portal called PSUP (Planetary SUrface Portal) developed by the Observatories of Paris Sud (OSUPS) and Lyon (OSUL) to provide users with efficient and easy access to data products dedicated to the Martian surface. The portal offers (1) the management and processing of data through MarsSI and (2) the visualization and merging of high-level (imagery, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu). The PSUP portal and the MarsVisu facility are detailed in a companion paper (Poulet et al., 2018). The purpose of this paper is to describe the MarsSI facility. From this application, users can easily and rapidly select observations, process raw data via automatic pipelines, and retrieve final products that can be visualized in Geographic Information Systems. Moreover, MarsSI contains an automatic stereo-restitution pipeline to produce Digital Terrain Models (DTM) on demand from HiRISE (High Resolution Imaging Science Experiment) or CTX (Context Camera) image pairs. This application is funded by the European Union's Seventh Framework Programme (FP7/2007-2013) (ERC project eMars, No. 280168) and has been developed in the scope of Mars, but the design is applicable to any other planetary body of the solar system.
Clustering-based Feature Learning on Variable Stars
NASA Astrophysics Data System (ADS)
Mackenzie, Cristóbal; Pichara, Karim; Protopapas, Pavlos
2016-04-01
The success of automatic classification of variable stars depends strongly on the lightcurve representation. Usually, lightcurves are represented as a vector of many descriptors, designed by astronomers, called features. These descriptors are computationally expensive, require substantial research effort to develop, and do not guarantee good classification. Today, lightcurve representation is not entirely automatic; algorithms must be designed and manually tuned for every survey. The amounts of data that will be generated in the future mean astronomers must develop scalable and automated analysis pipelines. In this work we present a feature learning algorithm designed for variable objects. Our method works by extracting a large number of lightcurve subsequences from a given set, which are then clustered to find common local patterns in the time series. Representatives of these common patterns are then used to transform the lightcurves of a labeled set into a new representation that can be used to train a classifier. The proposed algorithm learns the features from both labeled and unlabeled lightcurves, overcoming the bias introduced by using labeled data alone. We test our method on data sets from the Massive Compact Halo Object survey and the Optical Gravitational Lensing Experiment; the results show that our classification performance is as good as, and in some cases better than, the performance achieved using traditional statistical features, while the computational cost is significantly lower. With these promising results, we believe that our method constitutes a significant step toward the automation of the lightcurve classification pipeline.
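The core idea — cluster lightcurve subsequences, then describe each lightcurve by how often it uses each learned pattern — can be sketched as follows; the window width, step and cluster count are illustrative assumptions, and real lightcurves would need normalization and resampling first.

```python
# Minimal sketch of subsequence-based feature learning, assuming evenly
# sampled lightcurves stored as 1-D numpy arrays.
import numpy as np
from sklearn.cluster import KMeans

def subsequences(lc: np.ndarray, width: int = 25, step: int = 5) -> np.ndarray:
    # Slide a fixed-length window along the lightcurve.
    return np.array([lc[i:i + width]
                     for i in range(0, len(lc) - width + 1, step)])

def learn_dictionary(lightcurves, n_patterns: int = 50) -> KMeans:
    # Pool subsequences from all (labeled and unlabeled) lightcurves and
    # cluster them to find common local patterns.
    pool = np.vstack([subsequences(lc) for lc in lightcurves])
    return KMeans(n_clusters=n_patterns, n_init=10).fit(pool)

def featurize(lc: np.ndarray, km: KMeans) -> np.ndarray:
    # Represent a lightcurve as a normalized histogram of which learned
    # pattern each of its subsequences is closest to.
    labels = km.predict(subsequences(lc))
    hist = np.bincount(labels, minlength=km.n_clusters)
    return hist / hist.sum()
```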
High-throughput protein analysis integrating bioinformatics and experimental assays
del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan
2004-01-01
The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis, such as similarity searches, protein domain architecture determination, and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISS-PROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administering relevant biological data from high-throughput investigations of cDNAs, in order to systematically identify and characterize novel genes as well as to comprehensively describe the function of the encoded proteins. PMID:14762202
Piersma, Sjouke; Denham, Emma L.; Drulhe, Samuel; Tonk, Rudi H. J.; Schwikowski, Benno; van Dijl, Jan Maarten
2013-01-01
Gene expression heterogeneity is a key driver of microbial adaptation to fluctuating environmental conditions, cell differentiation and the evolution of species. This phenomenon therefore has enormous implications, not only for life in general, but also for biotechnological applications where unwanted subpopulations of non-producing cells can emerge in large-scale fermentations. Only time-lapse fluorescence microscopy allows real-time measurements of gene expression heterogeneity. A major limitation in the analysis of time-lapse microscopy data is the lack of fast, cost-effective, open, simple and adaptable protocols. Here we describe TLM-Quant, a semi-automatic pipeline for the analysis of time-lapse fluorescence microscopy data that enables the user to visualize and quantify gene expression heterogeneity. Importantly, our pipeline builds on the open-source packages ImageJ and R. To validate TLM-Quant, we selected three possible scenarios, namely homogeneous expression, highly ‘noisy’ heterogeneous expression, and bistable heterogeneous expression, in the Gram-positive bacterium Bacillus subtilis. This bacterium is both a paradigm for systems-level studies on gene expression and a highly appreciated biotechnological ‘cell factory’. We conclude that the temporal resolution of such analyses with TLM-Quant is limited only by the number of recorded images. PMID:23874729
Li, Weifeng; Ling, Wencui; Liu, Suoxiang; Zhao, Jing; Liu, Ruiping; Chen, Qiuwen; Qiang, Zhimin; Qu, Jiuhui
2011-01-01
Water leakage in drinking water distribution systems is a serious problem for many cities and a huge challenge for water utilities. An integrated system for the detection, early warning, and control of pipeline leakage has been developed and successfully used to manage the pipeline networks in selected areas of Beijing. A method based on the geographic information system has been proposed to quickly and automatically optimize the layout of the instruments which detect leaks. Methods are also proposed to estimate the probability of each pipe segment leaking (on the basis of historic leakage data), and to assist in locating the leakage points (based on leakage signals). The district metering area (DMA) strategy is used. Guidelines and a flowchart for establishing a DMA to manage the large-scale looped networks in Beijing are proposed. These different functions have been implemented into a central software system to simplify the day-to-day use of the system. In 2007 the system detected 102 non-obvious leakages (i.e., 14.2% of the total detected in Beijing) in the selected areas, which was estimated to save a total volume of 2,385,000 m3 of water. These results indicate the feasibility, efficiency and wider applicability of this system.
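One simple way to realize the per-segment leak-probability estimate mentioned above is a Poisson failure model fitted to historic leak counts; the sketch below is an assumption-laden illustration, not the system's actual method.

```python
# Hedged sketch: estimate the probability that a pipe segment leaks at
# least once within a planning horizon, assuming leaks follow a Poisson
# process whose rate is the historic leak count per km-year.
import math

def leak_probability(n_leaks: int, length_km: float, record_years: float,
                     horizon_years: float = 1.0) -> float:
    rate = n_leaks / (length_km * record_years)   # leaks per km-year
    expected = rate * length_km * horizon_years   # expected leaks in horizon
    return 1.0 - math.exp(-expected)              # P(at least one leak)

# Example: a 2.5 km segment with 3 recorded leaks over 10 years.
print(round(leak_probability(3, 2.5, 10.0), 3))   # ~0.259
```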
Investigation of the motion processes of wastewater in sewerage of high-rise buildings
NASA Astrophysics Data System (ADS)
Pomogaeva, Valentina; Metechko, Lyudmila; Prokofiev, Dmitry; Narezhnaya, Tamara
2018-03-01
When designing, constructing and operating sewage pipelines in high-rise buildings, questions of failure-free network operation arise. Investigating the processes of wastewater movement makes it possible to identify problem areas during operation and to assess the likelihood of obstructions and of plumbing-trap breakdowns on the gravity drainage sections of the pipeline. The article presents schemes of water outflow from the floor sewer into the riser, including places where the riser is bent, and of air delivery to the working riser under a change in the direction of drain movement: with a dropout line installed, with an automatic anti-vacuum valve, and with a ventilation pipeline. Investigations of sewage flow in a sewage riser were carried out in order to select an appropriate structure. The authors consider the structural features of some sections of sewerage in high-rise buildings. The suction in the riser is determined from the rarefaction that occurs below the compressed cross-section of the riser and from the pressure loss of the air flowing from the atmosphere into the riser during liquid discharge. Preventing the formation of obstructions and breakdowns of plumbing traps is an integral part of sewage network operation.
NASA Astrophysics Data System (ADS)
Jensen-Clem, Rebecca; Duev, Dmitry A.; Riddle, Reed; Salama, Maïssa; Baranec, Christoph; Law, Nicholas M.; Kulkarni, S. R.; Ramprakash, A. N.
2018-01-01
Robo-AO is an autonomous laser guide star adaptive optics (AO) system recently commissioned at the Kitt Peak 2.1 m telescope. With the ability to observe every clear night, Robo-AO at the 2.1 m telescope is the first dedicated AO observatory. This paper presents the imaging performance of the AO system in its first 18 months of operations. For a median seeing value of 1.″44, the average Strehl ratio is 4% in the i′ band. After post-processing, the contrast ratio under sub-arcsecond seeing for a 2 ≤ i′ ≤ 16 primary star is five and seven magnitudes at radial offsets of 0.″5 and 1.″0, respectively. The data processing and archiving pipelines run automatically at the end of each night. The first stage of the processing pipeline shifts and adds the rapid frame rate data using techniques optimized for different signal-to-noise ratios. The second “high-contrast” stage of the pipeline is eponymously well suited to finding faint stellar companions. Currently, a range of scientific programs, including the synthetic tracking of near-Earth asteroids, the binarity of stars in young clusters, and weather on solar system planets are being undertaken with Robo-AO.
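The first pipeline stage (shift-and-add) can be illustrated with a minimal numpy sketch; registering on the brightest pixel of each frame is a simplification of the pipeline's optimized techniques, and the frames are assumed to be dark-subtracted, flat-fielded 2-D arrays.

```python
# Illustrative shift-and-add stacking of rapid frame-rate data.
import numpy as np

def shift_and_add(frames: np.ndarray) -> np.ndarray:
    # Use the brightest pixel of the first frame as the reference position.
    ref_y, ref_x = np.unravel_index(np.argmax(frames[0]), frames[0].shape)
    stack = np.zeros_like(frames[0], dtype=float)
    for frame in frames:
        y, x = np.unravel_index(np.argmax(frame), frame.shape)
        # Integer-pixel registration; real pipelines interpolate sub-pixel.
        stack += np.roll(np.roll(frame, ref_y - y, axis=0), ref_x - x, axis=1)
    return stack / len(frames)
```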
Designing integrated computational biology pipelines visually.
Jamil, Hasan M
2013-01-01
The long-term cost of developing and maintaining a computational pipeline that depends upon data integration and sophisticated workflow logic is too high to even contemplate "what if" or ad hoc queries. In this paper, we introduce a novel application-building interface for computational biology research, called VizBuilder, by leveraging a recent query language called BioFlow for life sciences databases. Using VizBuilder, it is now possible to develop complex ad hoc computational biology applications at throwaway cost. The underlying query language supports data integration and workflow construction almost transparently and fully automatically, using a best-effort approach. Users express their application by drawing it with VizBuilder icons and connecting them in a meaningful way. Completed applications are compiled and translated into BioFlow queries for execution by the data management system LifeDB, for which VizBuilder serves as a front end. We discuss VizBuilder features and functionalities in the context of a real-life application after briefly introducing BioFlow. The architecture and design principles of VizBuilder are also discussed. Finally, we outline future extensions of VizBuilder. To our knowledge, VizBuilder is unique in allowing users to visually design computational biology pipelines involving distributed and heterogeneous resources in an ad hoc manner.
Computer vision and machine learning for robust phenotyping in genome-wide studies
Zhang, Jiaoping; Naik, Hsiang Sing; Assefa, Teshale; Sarkar, Soumik; Reddy, R. V. Chowda; Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh K.
2017-01-01
Traditional evaluation of crop biotic and abiotic stresses is time-consuming and labor-intensive, limiting the ability to dissect the genetic basis of quantitative traits. A machine learning (ML)-enabled image-phenotyping pipeline for genetic studies of the abiotic stress iron deficiency chlorosis (IDC) in soybean is reported. IDC classification and severity for an association panel of 461 diverse plant-introduction accessions were evaluated using an end-to-end phenotyping workflow. The workflow consisted of a multi-stage procedure including: (1) optimized protocols for consistent image capture across plant canopies, (2) canopy identification and registration from cluttered backgrounds, (3) extraction of domain-expert-informed features from the processed images to accurately represent IDC expression, and (4) supervised ML-based classifiers that linked the automatically extracted features with expert-rating-equivalent IDC scores. The ML-generated phenotypic data were subsequently utilized for a genome-wide association study and genomic prediction. The results illustrate the reliability and advantage of the ML-enabled image-phenotyping pipeline by identifying a previously reported locus and a novel locus harboring a gene homolog involved in iron acquisition. This study demonstrates a promising path for integrating the phenotyping pipeline into genomic prediction, and provides a systematic framework enabling robust and quicker phenotyping through ground-based systems. PMID:28272456
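A hedged miniature of workflow steps (3) and (4) — expert-motivated colour features followed by a supervised classifier — might look like this; the feature definitions, array shapes and random demo data are illustrative assumptions, not the study's values.

```python
# Sketch: colour features from a segmented canopy image + a classifier
# trained against expert IDC ratings (1-5). Entirely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def canopy_features(rgb: np.ndarray, mask: np.ndarray) -> np.ndarray:
    pix = rgb[mask].astype(float)              # canopy pixels only, (n, 3)
    r, g, b = pix[:, 0], pix[:, 1], pix[:, 2]
    return np.array([g.mean(),                 # greenness
                     (r + g - 2 * b).mean(),   # yellowness (chlorosis cue)
                     pix.std()])               # colour heterogeneity

# Demo with synthetic plots: 40 random "canopy" images and random scores.
rng = np.random.default_rng(0)
X = np.stack([canopy_features(rng.integers(0, 256, (64, 64, 3)),
                              rng.random((64, 64)) > 0.5)
              for _ in range(40)])
y = rng.integers(1, 6, 40)
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```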
[Development of automatic urine monitoring system].
Wei, Liang; Li, Yongqin; Chen, Bihua
2014-03-01
An automatic urine monitoring system is presented to replace manual operation. The system is composed of a flow sensor, an MSP430F149 single-chip microcomputer, a human-computer interaction module, an LCD module, a clock module and a memory module. The urine volume signal is captured when urine flows through the flow sensor and is displayed on the LCD after data processing. The experimental results suggest that the design of the monitor provides high stability, accurate measurement and good real-time performance, and meets the demands of clinical application.
Experimental and analytical study of water pipe's rupture for damage identification purposes
NASA Astrophysics Data System (ADS)
Papakonstantinou, Konstantinos G.; Shinozuka, Masanobu; Beikae, Mohsen
2011-04-01
A malfunction, local damage or sudden pipe break of a pipeline system can trigger significant flow variations. As shown in the paper, pressure variations and pipe vibrations are two strongly correlated parameters. A sudden change in the flow velocity and pressure of a pipeline system can induce pipe vibrations. Thus, based on acceleration data, a rapid detection and localization of a possible damage may be carried out by inexpensive, nonintrusive monitoring techniques. To illustrate this approach, an experiment on a single pipe was conducted in the laboratory. Pressure gauges and accelerometers were installed and their correlation was checked during an artificially created transient flow. The experimental findings validated the correlation between the parameters. The interaction between pressure variations and pipe vibrations was also theoretically justified. The developed analytical model explains the connection among flow pressure, velocity, pressure wave propagation and pipe vibration. The proposed method provides a rapid, efficient and practical way to identify and locate sudden failures of a pipeline system and sets firm foundations for the development and implementation of an advanced, new generation Supervisory Control and Data Acquisition (SCADA) system for continuous health monitoring of pipe networks.
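The claimed pressure-vibration correlation can be checked with a plain cross-correlation; the following sketch, with synthetic signals and an assumed 1 kHz sampling rate, shows the kind of computation involved rather than the paper's analysis.

```python
# Hedged sketch: normalized cross-correlation between synchronously
# sampled pressure and pipe-acceleration records; the peak value and lag
# indicate how strongly, and with what delay, the two signals co-vary.
import numpy as np

def xcorr_peak(pressure: np.ndarray, accel: np.ndarray, fs: float):
    p = (pressure - pressure.mean()) / pressure.std()
    a = (accel - accel.mean()) / accel.std()
    corr = np.correlate(p, a, mode="full") / len(p)
    k = int(np.argmax(np.abs(corr)))
    lag = k - (len(p) - 1)
    return corr[k], lag / fs          # peak correlation, lag in seconds

fs = 1000.0                           # assumed 1 kHz sampling
t = np.arange(0, 1, 1 / fs)
pressure = np.exp(-5 * t) * np.sin(2 * np.pi * 40 * t)  # synthetic transient
accel = np.roll(pressure, 10) + 0.1 * np.random.randn(len(t))
print(xcorr_peak(pressure, accel, fs))
```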
NASA Astrophysics Data System (ADS)
Longmore, S. N.; Collins, R. P.; Pfeifer, S.; Fox, S. E.; Mulero-Pazmany, M.; Bezombes, F.; Goodwind, A.; de Juan Ovelar, M.; Knapen, J. H.; Wich, S. A.
2017-02-01
In this paper we describe an unmanned aerial system equipped with a thermal-infrared camera and software pipeline that we have developed to monitor animal populations for conservation purposes. Taking a multi-disciplinary approach to tackle this problem, we use freely available astronomical source detection software and the associated expertise of astronomers, to efficiently and reliably detect humans and animals in aerial thermal-infrared footage. Combining this astronomical detection software with existing machine learning algorithms into a single, automated, end-to-end pipeline, we test the software using aerial video footage taken in a controlled, field-like environment. We demonstrate that the pipeline works reliably and describe how it can be used to estimate the completeness of different observational datasets to objects of a given type as a function of height, observing conditions etc. - a crucial step in converting video footage to scientifically useful information such as the spatial distribution and density of different animal species. Finally, having demonstrated the potential utility of the system, we describe the steps we are taking to adapt the system for work in the field, in particular systematic monitoring of endangered species at National Parks around the world.
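As a hedged illustration of the detection step, an astronomical point-source finder such as photutils' DAOStarFinder can be run directly on a thermal-infrared frame; the FWHM and threshold values here are assumptions, and the study's actual tool chain may differ.

```python
# Sketch: detect warm, point-like sources (animals, humans) in a 2-D
# thermal-infrared frame using astronomical source-detection software.
import numpy as np
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder

def detect_warm_sources(frame: np.ndarray):
    # Robust background statistics via sigma clipping.
    mean, median, std = sigma_clipped_stats(frame, sigma=3.0)
    # Assumed PSF width and 5-sigma detection threshold.
    finder = DAOStarFinder(fwhm=4.0, threshold=5.0 * std)
    return finder(frame - median)     # table of x/y centroids and fluxes
```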
Pipelines subject to slow landslide movements: Structural modeling vs field measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruschi, R.; Glavina, S.; Spinazze, M.
1996-12-01
In recent years finite element techniques have been increasingly used to investigate the behavior of buried pipelines subject to soil movements. The use of these tools provides a rational basis for the definition of minimum wall thickness requirements in landslide crossings. Furthermore, the design of mitigation measures or monitoring systems which control the development of undesirable strains in the pipe wall over time requires detailed structural modeling. The scope of this paper is to discuss the use of dedicated structural modeling with relevant calibration to field measurements. The strain measurements used were regularly gathered from pipe sections in two different sites over a period of time long enough to record changes of axial strain due to soil movement. Detailed structural modeling of the pipeline layout in both sites and for operating conditions is applied. Numerical simulations show the influence of the distribution of soil movement acting on the pipeline with regard to the state of strain which can develop in certain locations. The role of soil nature and the direction of relative movements in the definition of loads transferred to the pipeline is also discussed.
AMBULATORY BLOOD PRESSURE MONITORING: THE NEED OF 7-DAY RECORD
HALBERG, F.; KATINAS, G.; CORNÉLISSEN, G.; SCHWARTZKOPFF, O.; FIŠER, B.; SIEGELOVÁ, J.; DUŠEK, J.; JANČÍK, J.
2008-01-01
The need for systematic around-the-clock self-measurements of blood pressure (BP) and heart rate (HR), or preferably for automatic monitoring as the need arises and can be met by inexpensive tools, is illustrated in two case reports. Miniaturized, unobtrusive, as-yet-unavailable instrumentation for the automatic measurement of BP and HR should be a high priority for both government and industry. Automatic, ambulatorily functioning monitors already represent great progress, enabling us to introduce the concept of eventually continuous or, as yet, intermittent home ABPM. On BP and HR records, gliding spectra aligned with global spectra visualize the changing dynamics involved in health and disease, and can be part of an eventually automated system of therapy adjusted to the ever-present variability of BP. In the interim, with tools already available, chronomics on self- or automatic measurements can be considered, with analyses provided by the Halberg Chronobiology Center, as an alternative to “flying blind”, as an editor put it. Chronomic assessment of variability has to be considered. PMID:19018289
López-Linares, Karen; Aranjuelo, Nerea; Kabongo, Luis; Maclair, Gregory; Lete, Nerea; Ceresa, Mario; García-Familiar, Ainhoa; Macía, Iván; González Ballester, Miguel A
2018-05-01
Computed Tomography Angiography (CTA)-based follow-up of Abdominal Aortic Aneurysms (AAA) treated with Endovascular Aneurysm Repair (EVAR) is essential to evaluate the progress of the patient and to detect complications. In this context, accurate quantification of post-operative thrombus volume is required. However, proper evaluation is hindered by the lack of automatic, robust and reproducible thrombus segmentation algorithms. We propose a new fully automatic approach based on Deep Convolutional Neural Networks (DCNN) for robust and reproducible thrombus region-of-interest detection and subsequent fine thrombus segmentation. The DetecNet detection network is adapted to perform region-of-interest extraction from a complete CTA, and a new segmentation network architecture, based on Fully Convolutional Networks and a Holistically-Nested Edge Detection network, is presented. These networks are trained, validated and tested on 13 post-operative CTA volumes of different patients using 4-fold cross-validation to provide more robust results. Our pipeline achieves a Dice score of more than 82% for post-operative thrombus segmentation and yields a mean relative volume difference between ground truth and automatic segmentation that lies within the experienced human observer variance, without the need for human intervention in most common cases.
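For reference, the reported evaluation metric (the Dice score) is straightforward to compute; a minimal sketch, assuming binary numpy masks for ground truth and prediction:

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(gt: np.ndarray, pred: np.ndarray) -> float:
    gt, pred = gt.astype(bool), pred.astype(bool)
    inter = np.logical_and(gt, pred).sum()
    return 2.0 * inter / (gt.sum() + pred.sum())
```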
NASA Astrophysics Data System (ADS)
Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.
2015-12-01
The quality of automatic detections from seismic sensor networks depends on a large number of data-processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings, yet achieving superior automatic detection of seismic events depends closely on these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historic data. AST learns to test the raw signal against all event settings and automatically self-tunes to an emerging event in real time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections; reducing false alarms early in the seismic pipeline processing will have a significant impact on this goal. Applicable both to boosting the performance of existing sensors and to new sensor deployments, this system provides an important new method for automatically tuning complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible by manual tuning, with much less time and effort devoted to the tuning process. With ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
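A toy epsilon-greedy loop conveys the flavour of learning detector settings from historic, ground-truthed data; this is an illustrative stand-in, not the AST implementation, and `score_fn` (detections minus false alarms on replayed waveforms) is an assumed callback.

```python
# Illustrative epsilon-greedy tuning over candidate detector settings.
import random

def tune(settings, score_fn, episodes=1000, eps=0.1):
    value = {s: 0.0 for s in settings}    # running value estimate
    count = {s: 0 for s in settings}
    for _ in range(episodes):
        # Explore a random setting with probability eps, else exploit.
        s = (random.choice(settings) if random.random() < eps
             else max(settings, key=lambda k: value[k]))
        reward = score_fn(s)              # e.g. detections - false alarms
        count[s] += 1
        value[s] += (reward - value[s]) / count[s]   # incremental mean
    return max(settings, key=lambda k: value[k])
```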
Contract Monitoring in Agent-Based Systems: Case Study
NASA Astrophysics Data System (ADS)
Hodík, Jiří; Vokřínek, Jiří; Jakob, Michal
Monitoring of fulfilment of obligations defined by electronic contracts in distributed domains is presented in this paper. A two-level model of contract-based systems and the types of observations needed for contract monitoring are introduced. The observations (inter-agent communication and agents’ actions) are collected and processed by the contract observation and analysis pipeline. The presented approach has been utilized in a multi-agent system for electronic contracting in a modular certification testing domain.
SU-E-I-97: Smart Auto-Planning Framework in An EMR Environment (SAFEE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B; Chen, S; Mutaf, Y
2014-06-01
Purpose: Our Radiation Oncology Department uses clinical practice guidelines for patient treatment, including normal tissue sparing and other dosimetric constraints. These practice guidelines were adapted from national guidelines, clinical trials, literature reviews, and practitioners' own experience. Modern treatment planning systems (TPS) are capable of incorporating these practice guidelines to automatically create radiation therapy treatment plans with little human intervention. We are developing a software infrastructure to integrate clinical practice guidelines and the radiation oncology electronic medical record (EMR) system into the treatment planning system for auto-planning. Methods: Our Smart Auto-Planning Framework in an EMR Environment (SAFEE) uses a software pipeline framework to integrate practice guidelines, EMR, and TPS. The SAFEE system starts by retrieving diagnosis information and the physician's prescription from the EMR system. After approval of contouring, SAFEE automatically creates plans according to our guidelines. Based on clinical objectives, SAFEE automatically selects treatment delivery techniques (such as 3DRT/IMRT/VMAT) and optimizes plans. When necessary, SAFEE creates multiple treatment plans with different combinations of parameters. SAFEE's pipeline structure makes it very flexible in integrating various techniques, such as Model-Based Segmentation (MBS) and plan optimization algorithms, e.g., Multi-Criteria Optimization (MCO). In addition, SAFEE uses machine learning, data mining techniques, and an integrated database to create a clinical knowledge base and then answer clinical questions, such as how to score plan quality or how volume overlap affects physicians' decisions in beam and treatment technique selection. Results: In our institution, we use the Varian Aria EMR system and the RayStation TPS from RaySearch, whose ScriptService API allows control by external programs. These applications are the building blocks of our SAFEE system. Conclusion: SAFEE is a feasible method of integrating clinical information to develop an auto-planning paradigm that improves clinical workflow in cancer patient care.
Use of Automatic Interaction Detector in Monitoring Faculty Salaries. AIR 1983 Annual Forum Paper.
ERIC Educational Resources Information Center
Cohen, Margaret E.
A university's use of the Automatic Interaction Detector (AID) to monitor faculty salary data is described. The first step consists of examining a tree diagram and summary table produced by AID. The tree is used to identify the characteristics of faculty at different salary levels. The table is used to determine the explanatory power of the…
Demonstration of subsidence monitoring system
NASA Astrophysics Data System (ADS)
Conroy, P. J.; Gyarmaty, J. H.; Pearson, M. L.
1981-06-01
Data on coal mine subsidence were studied as a basis for the development of subsidence control technology. Installation, monitoring, and evaluation of three subsidence monitoring instrument systems were examined: structure performance, performance of supported systems, and performance of caving systems. Objectives of the instrument program were: (1) to select, test, assemble, install, monitor, and maintain all instrumentation required for implementing the three subsidence monitoring systems; and (2) to evaluate performance of each instrument individually and as part of the appropriate monitoring system or systems. The use of an automatic level and a rod extensometer for measuring structure performance, and the automatic level, steel tape extensometer, FPBX, FPBI, USBM borehole deformation gauge, and vibrating wire stressmeters for measuring the performance of caving systems are recommended.
Automatic start control for a three-phase electric motor using infrared sensors
NASA Astrophysics Data System (ADS)
Echenique Lima, Mario; Ramírez Arenas, Francisco; Rodríguez Pedroza, Griselda
2006-02-01
We introduce equipment for the automatic activation of a three-phase electric motor (1 hp, 3 A, 240 V AC) using two infrared sensors monitored by a Microchip PIC16F62x microcontroller running at 4 MHz, for the control of a filling system. This project was carried out for Fábrica de Chocolates y Dulces Costanzo, where automation of the cacao grain supply was required for a machine that cleans the cacao beans of their husks. The process demanded monitoring of the filling level to avoid spillage of the toasted cacao.
NASA Astrophysics Data System (ADS)
Saqib, Najam us; Faizan Mysorewala, Muhammad; Cheded, Lahouari
2017-12-01
In this paper, we propose a novel monitoring strategy for a wireless sensor network (WSN)-based water pipeline network. Our strategy uses a multi-pronged approach to reduce energy consumption, based on two types of vibration sensors and pressure sensors, all with different energy levels, and a hierarchical adaptive sampling mechanism to determine the sampling frequency. The sampling rate of the sensors is adjusted according to the bandwidth of the vibration signal being monitored, using a wavelet-based adaptive thresholding scheme that calculates the new sampling frequency for the following cycle. In this multimodal sensing scheme, duty cycling is applied to all sensors to reduce the number of sampling instances, such that the high-energy, high-precision (HE-HP) vibration sensors have low duty cycles and the low-energy, low-precision (LE-LP) vibration sensors have high duty cycles. The low-duty-cycle (HE-HP) vibration sensor adjusts the sampling frequency of the high-duty-cycle (LE-LP) vibration sensor. The simulated test bed consists of a water pipeline network with pressure and vibration sensors of different energy consumptions and precision levels at various locations in the network, which makes the approach particularly useful for conserving energy during extended monitoring. It is shown that the novel features of the proposed scheme achieve a significant reduction in energy consumption and that a leak is effectively detected by the sensor node closest to it. Finally, both the total energy consumed by monitoring and the time for a WSN node to detect a leak are computed, demonstrating the superiority of our proposed hierarchical adaptive sampling algorithm over a non-adaptive sampling approach.
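The wavelet-based rate adaptation can be sketched as follows: estimate the fraction of signal energy in the finest wavelet bands and scale the next cycle's sampling frequency accordingly. The wavelet choice, decomposition level and rate bounds are illustrative assumptions rather than the paper's values.

```python
# Hedged sketch of adaptive sampling-rate selection from wavelet energies.
import numpy as np
import pywt

def next_sampling_rate(signal: np.ndarray, fs: float,
                       f_min: float = 50.0, f_max: float = 2000.0) -> float:
    # wavedec returns [cA4, cD4, cD3, cD2, cD1] (coarse to fine).
    coeffs = pywt.wavedec(signal, "db4", level=4)
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    # Fraction of energy in the two finest detail bands (cD2, cD1).
    high_frac = sum(energies[-2:]) / sum(energies)
    # More high-band energy -> sample faster in the next cycle.
    return float(np.clip(fs * (0.5 + 2.0 * high_frac), f_min, f_max))
```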
NASA Astrophysics Data System (ADS)
Baskoro, Ario Sunar; Kabutomori, Masashi; Suga, Yasuo
An automatic welding system using Tungsten Inert Gas (TIG) welding with a vision sensor for the welding of aluminum pipe was constructed. This research studies the intelligent welding of aluminum alloy pipe 6063S-T5, with the pipe in a fixed position and a moving welding torch on an AC welding machine. The monitoring system consists of a vision sensor using a charge-coupled device (CCD) camera to monitor the backside image of the molten pool. The captured image is processed by an image processing algorithm to recognize the edge of the molten pool. A neural network model for welding speed control was constructed to perform the process automatically. The experimental results show the effectiveness of the control system, confirmed by good detection of the molten pool and sound welds.
Update on the SDSS-III MARVELS data pipeline development
NASA Astrophysics Data System (ADS)
Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.
2014-01-01
MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), one of the four surveys in the SDSS-III program, monitored over 3,300 stars during 2008-2012, with each star visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates were reliably identified due to its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts and dispersion) and by observing-condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al., this meeting). The team is currently working to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5-fold improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar-type stars. With this fine-tuned pipeline and the soon-to-be-processed plates of data, we expect to discover many more giant planet candidates and to make a large statistical impact on exoplanet studies.
Aerial image databases for pipeline rights-of-way management
NASA Astrophysics Data System (ADS)
Jadkowski, Mark A.
1996-03-01
Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with less people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.
Real-time electronic monitoring of a pitted and leaking gas gathering pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asperger, R.G.; Hewitt, P.G.
1986-08-01
Hydrogen patch, flush electrical resistance, and flush linear polarization probes were used with flush coupons to monitor corrosion rates in a pitted and leaking sour gas gathering line. Four inhibitors were evaluated for stopping the leaks. Inhibitor residuals and the amount and ratio of water and condensate in the lines were measured at five locations along the line. The best inhibitor reduced the pit-leak frequency by over a factor of 10. Inhibitor usage rate was optimized using the hydrogen patch current as a measure of the instantaneous corrosion rate. Improper pigging was identified as a cause of corrosion transients. This problem is discussed in relation to the pigging of pipelines in stratified flow, where moving fluids are the carriers for continuously injected corrosion inhibitors.
NASA Astrophysics Data System (ADS)
Wu, Huijuan; Qian, Ya; Zhang, Wei; Tang, Chenghao
2017-12-01
The high sensitivity of a distributed optical-fiber vibration sensing (DOVS) system based on phase-sensitive optical time-domain reflectometry (Φ-OTDR) also brings high nuisance alarm rates (NARs) in real applications. In this paper, feature extraction methods based on wavelet decomposition (WD) and wavelet packet decomposition (WPD) are comparatively studied for three typical field-testing signals, and an artificial neural network (ANN) is built for event identification. The comparison results show that WPD performs slightly better than WD for DOVS signal analysis and identification in oil pipeline safety monitoring. The identification rate can be improved up to 94.4%, and the nuisance alarm rate can be kept as low as 5.6% for the identification network using wavelet packet energy distribution features.
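A hedged sketch of the feature/classifier pairing: normalized wavelet-packet energy distributions feeding a small neural network. The wavelet, decomposition level, network size and event labels are assumptions for illustration.

```python
# Sketch: WPD energy-distribution features + an MLP event classifier.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wpd_energy_features(sig: np.ndarray, wavelet="db4", level=3) -> np.ndarray:
    wp = pywt.WaveletPacket(data=sig, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")     # 2**level frequency bands
    energy = np.array([np.sum(n.data ** 2) for n in nodes])
    return energy / energy.sum()                  # normalized distribution

# X: one feature row per vibration segment; y: assumed event labels such
# as {"walking", "digging", "vehicle"} from annotated field recordings.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
# clf.fit(X_train, y_train); print(clf.score(X_test, y_test))
```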
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 3 2010-04-01 2010-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 3 2014-04-01 2014-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 3 2013-04-01 2013-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 3 2012-04-01 2012-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 3 2011-04-01 2011-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
30 CFR 27.23 - Automatic warning device.
Code of Federal Regulations, 2010 CFR
2010-07-01
... APPROVAL OF MINING PRODUCTS METHANE-MONITORING SYSTEMS Construction and Design Requirements § 27.23... function automatically at a methane content of the mine atmosphere between 1.0 to 1.5 volume percent and at all higher concentrations of methane. (c) It is recommended that the automatic warning device be...
Dynamic Human Body Modeling Using a Single RGB Camera.
Zhu, Haiyu; Yu, Yao; Zhou, Yu; Du, Sidan
2016-03-18
In this paper, we present a novel automatic pipeline to build personalized parametric models of dynamic people using a single RGB camera. Compared to previous approaches that use monocular RGB images, our system can model a 3D human body automatically and incrementally, taking advantage of human motion. Based on coarse 2D and 3D poses estimated from image sequences, we first perform a kinematic classification of human body parts to refine the poses and obtain reconstructed body parts. Next, a personalized parametric human model is generated by driving a general template to fit the body parts and calculating the non-rigid deformation. Experimental results show that our shape estimation method achieves comparable accuracy with reconstructed models using depth cameras, yet requires neither user interaction nor any dedicated devices, leading to the feasibility of using this method on widely available smart phones.
volBrain: An Online MRI Brain Volumetry System
Manjón, José V.; Coupé, Pierrick
2016-01-01
The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, allow researchers to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372
Tang, Wei; Peled, Noam; Vallejo, Deborah I.; Borzello, Mia; Dougherty, Darin D.; Eskandar, Emad N.; Widge, Alik S.; Cash, Sydney S.; Stufflebeam, Steven M.
2018-01-01
Purpose: Existing methods for sorting, labeling, registering, and across-subject localization of electrodes in intracranial electroencephalography (iEEG) may involve laborious work requiring manual inspection of radiological images. Methods: We describe a new open-source software package, the interactive electrode localization utility, which presents a full pipeline for the registration, localization, and labeling of iEEG electrodes from CT and MR images. In addition, we describe a method to automatically sort and label electrodes from subdural grids of known geometry. Results: We validated our software against manual inspection methods in twelve subjects undergoing iEEG for medically intractable epilepsy. Our algorithm correctly sorted and labeled 96% of the electrodes. Conclusions: The sorting and labeling methods we describe offer nearly perfect performance, and the software package we have distributed may simplify the process of registering, sorting, labeling, and localizing subdural iEEG grid electrodes that previously required manual inspection. PMID:27915398
Vlaic, Sebastian; Hoffmann, Bianca; Kupfer, Peter; Weber, Michael; Dräger, Andreas
2013-09-01
GRN2SBML automatically encodes gene regulatory networks derived from several inference tools in the Systems Biology Markup Language (SBML). Through a graphical user interface, the networks can be annotated via the simple object access protocol (SOAP)-based application programming interface of BioMart Central Portal and the Minimum Information Required In the Annotation of Models (MIRIAM) registry. Additionally, we provide an R package which processes the output of supported inference algorithms and automatically passes all required parameters to GRN2SBML. GRN2SBML therefore closes a gap in the processing pipeline between the inference of gene regulatory networks and their subsequent analysis, visualization and storage. GRN2SBML is freely available under the GNU Public License version 3 and can be downloaded from http://www.hki-jena.de/index.php/0/2/490. General information on GRN2SBML, examples and tutorials are available at the tool's web page.
Automatic finger joint synovitis localization in ultrasound images
NASA Astrophysics Data System (ADS)
Nurzynska, Karolina; Smolka, Bogdan
2016-04-01
Long-lasting inflammation of the joints results, among other conditions, in many arthritic diseases. If not treated, it may affect other organs and the patient's general health. Early detection and proper medical treatment are therefore of great value. The patient's organs are scanned with high-frequency acoustic waves, which enable visualization of interior body structures in an ultrasound sonography (USG) image. Although the procedure is standardized, different projections result in a wide variety of possible data, which must be analyzed in a short period of time by a physician using medical atlases as guidance. This work introduces an efficient framework, based on a statistical approach to the finger joint USG image, that enables automatic localization of skin and bone regions, which are then used to localize the finger joint synovitis area. The processing pipeline performs the task in real time and achieves high accuracy when compared to annotations prepared by an expert.
Towards photometry pipeline of the Indonesian space surveillance system
NASA Astrophysics Data System (ADS)
Priyatikanto, Rhorom; Religia, Bahar; Rachman, Abdul; Dani, Tiar
2015-09-01
Optical observation with a sub-meter telescope equipped with a CCD camera has become an alternative method for increasing orbital debris detection and surveillance capability. This observational mode is expected to cover medium-sized objects in higher orbits (e.g. MEO, GTO, GSO and GEO), beyond the reach of the usual radar systems. However, such observation of fast-moving objects demands special treatment and analysis techniques. In this study, we performed photometric analysis of satellite track images photographed using the rehabilitated Schmidt Bima Sakti telescope at Bosscha Observatory. The Hough transform was implemented to automatically detect linear streaks in the images. From this analysis and a comparison with the USSPACECOM catalog, two satellites were identified and associated with the inactive Thuraya-3 satellite and Satcom-3 debris, both located in geostationary orbit. Further aperture photometry revealed the periodicity of the tumbling Satcom-3 debris. In the near future, a similar scheme could be applied to establish an analysis pipeline for an optical space surveillance system hosted in Indonesia.
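The streak-detection step can be illustrated with OpenCV's probabilistic Hough transform; the thresholds and minimum segment lengths below are illustrative assumptions, and a real pipeline would first calibrate and background-subtract the frames.

```python
# Sketch: find candidate satellite trails (line segments) in an 8-bit
# grayscale CCD frame; point-like stars do not form long segments.
import cv2
import numpy as np

def detect_streaks(frame: np.ndarray):
    # Otsu thresholding separates bright pixels from the sky background.
    _, binary = cv2.threshold(frame, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Probabilistic Hough transform returns (x1, y1, x2, y2) segments.
    lines = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=5)
    return [] if lines is None else [l[0] for l in lines]
```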
Lee, Hyung-Chul; Jung, Chul-Woo
2018-01-24
The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment consisted of a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively, and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.
ORAC-DR -- imaging data reduction
NASA Astrophysics Data System (ADS)
Currie, Malcolm J.; Cavanagh, Brad
ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce imaging data collected at the United Kingdom Infrared Telescope (UKIRT) with the UFTI, UIST, IRCAM, and Michelle instruments; at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument; at the Very Large Telescope with ISAAC and NACO; from Magellan's Classic Cam, at Gemini with NIRI, and from the Isaac Newton Group using INGRID. It outlines the algorithms used and how to make minor modifications to them, and how to correct for errors made at the telescope.
Automatic aortic root segmentation in CTA whole-body dataset
NASA Astrophysics Data System (ADS)
Gao, Xinpei; Kitslaar, Pieter H.; Scholte, Arthur J. H. A.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke; Reiber, Johan H. C.
2016-03-01
Trans-catheter aortic valve replacement (TAVR) is an evolving technique for patients with serious aortic stenosis disease. Typically, in this application a CTA data set is obtained of the patient's arterial system from the subclavian artery to the femoral arteries, to evaluate the quality of the vascular access route and analyze the aortic root to determine if and which prosthesis should be used. In this paper, we concentrate on the automated segmentation of the aortic root. The purpose of this study was to automatically segment the aortic root in computed tomography angiography (CTA) datasets to support TAVR procedures. The method in this study includes 4 major steps. First, the patient's cardiac CTA image was resampled to reduce the computation time. Next, the cardiac CTA image was segmented using an atlas-based approach. The most similar atlas was selected from a total of 8 atlases based on its image similarity to the input CTA image. Third, the aortic root segmentation from the previous step was transferred to the patient's whole-body CTA image by affine registration and refined in the fourth step using a deformable subdivision surface model fitting procedure based on image intensity. The pipeline was applied to 20 patients. The ground truth was created by an analyst who semi-automatically corrected the contours of the automatic method, where necessary. The average Dice similarity index between the segmentations of the automatic method and the ground truth was found to be 0.965±0.024. In conclusion, the current results are very promising.
Information processing requirements for on-board monitoring of automatic landing
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Karmarkar, J. S.
1977-01-01
A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.
Sarker, Abeed; O'Connor, Karen; Ginn, Rachel; Scotch, Matthew; Smith, Karen; Malone, Dan; Gonzalez, Graciela
2016-03-01
Prescription medication overdose is the fastest growing drug-related problem in the USA. The growing nature of this problem necessitates the implementation of improved monitoring strategies for investigating the prevalence and patterns of abuse of specific medications. Our primary aims were to assess the possibility of utilizing social media as a resource for automatic monitoring of prescription medication abuse and to devise an automatic classification technique that can identify potentially abuse-indicating user posts. We collected Twitter user posts (tweets) associated with three commonly abused medications (Adderall(®), oxycodone, and quetiapine). We manually annotated 6,400 tweets mentioning these three medications and a control medication (metformin) that is not the subject of abuse due to its mechanism of action. We performed quantitative and qualitative analyses of the annotated data to determine whether posts on Twitter contain signals of prescription medication abuse. Finally, we designed an automatic supervised classification technique to distinguish posts containing signals of medication abuse from those that do not, and assessed the utility of Twitter in investigating patterns of abuse over time. Our analyses show that clear signals of medication abuse can be drawn from Twitter posts and that the percentage of tweets containing abuse signals is significantly higher for the three case medications (Adderall(®): 23%, quetiapine: 5.0%, oxycodone: 12%) than for the control medication (metformin: 0.3%). Our automatic classification approach achieves 82% accuracy overall (medication abuse class recall: 0.51, precision: 0.41, F-measure: 0.46). To illustrate the utility of automatic classification, we show how the classification data can be used to analyze abuse patterns over time. Our study indicates that social media can be a crucial resource for obtaining abuse-related information about medications, and that automatic approaches involving supervised classification and natural language processing hold promise for essential future monitoring and intervention tasks.
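A comparable (though not identical) supervised classifier can be assembled from standard components; the sketch below uses TF-IDF n-grams with logistic regression, and the toy tweets and labels are fabricated placeholders for illustration only.

```python
# Illustrative text classifier separating abuse-indicating posts from
# other medication mentions. Training data here is a toy placeholder.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = ["took two adderall to cram all night",
               "picked up my metformin refill today"]
train_labels = [1, 0]                    # 1 = abuse signal, 0 = no signal

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression())
clf.fit(train_texts, train_labels)
print(clf.predict(["anyone selling oxycodone?"]))
```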
ERIC Educational Resources Information Center
Chounta, Irene-Angelica; Avouris, Nikolaos
2016-01-01
This paper presents the integration of a real-time evaluation method of collaboration quality in a monitoring application that supports teachers in class orchestration. The method is implemented as an automatic rater of collaboration quality and studied in a real-time scenario of use. We argue that automatic and semi-automatic methods which…
Technical-Environmental Permafrost Observatories (TEPO) of northern West Siberia
NASA Astrophysics Data System (ADS)
Kurchatova, A. N.; Griva, G. I.; Osokin, A. B.; Smolov, G. K.
2005-12-01
During the last decade one of the most developed topics in environmental studies has been the effect of global climate change. This effect has been shown to be especially pronounced in northern regions, having an important influence on the subsequent transformation of frozen soil distribution and potential permafrost degradation. In West Siberia such studies are especially important given the plans for development of oil-gas fields (Yamal, Gydan and Kara Sea shelf). Presently the enterprises independently determine the research necessary for ecological control of the territory. Therefore, the Tyumen State Oil and Gas University (TSOGU), together with one of the leading gas enterprises, "Nadymgasprom", started to create an observational network along the meridian transect of northern West Siberia (Yamal-Nenets administrative district). The observational network consists of a number of monitoring sites - Technical-Environmental Permafrost Observatories (TEPO). The research complex includes temperature observations in boreholes (depths of 30 m) equipped with automatic systems for registration and data collection, and seasonal field investigations of the spatial distribution and temporal variability of the snow cover, vegetation and soil. TSOGU and "Nadymgasprom" plan to realize long-term monitoring to obtain representative results on permafrost-climate interaction. At present there are three monitoring observatories located in the main landscape types and gas fields, in use since 1972 (Medvezhye), 1992 (Yubileynoe) and in development (Harasavey). The next contribution to the International Polar Year (2007-2008) will be the renewal of one of the former monitoring sites (established in 1972) with a long-term period of observation, and the creation of a new site on the Yamal peninsula (Arctic tundra zone). At the latter site the installation of an automatic Climate-Soil Station is being planned in the framework of the INTAS Infrastructure Action project, in cooperation with the Alfred Wegener Institute for Polar and Marine Research and the University of Hamburg, Germany. One of the responsibilities of TEPO is to provide assistance to students taking part in scientific research (undergraduate and post-graduate practical work and organization of summer schools and seminars). In 2005 a joint summer student field excursion with the Moscow State University Department of Cryolithology and Glaciology took place at TEPO headquarters. The teaching courses cover the following main topics: 1. Environment and Permafrost of northern West Siberia; 2. Paleocryogenic Formation of Alluvial Terraces; 3. Hydrology and Hydrogeological Conditions of the Territory; 4. Geotechnical Monitoring of Gas Fields; 5. Geotechnical Dangers in the Cryolithozone. The workshop "Stability of Pipelines in the Cryolithozone", held in Nadym on August 29-31 with participation of "Nadymgasprom", TSOGU and the Hokkaido University Graduate School of Engineering (Japan), included a field excursion. TEPO is expected to be the basis for scientific and educational exchange with national and foreign universities and research institutes, and part of the global international monitoring effort in the northern regions.
Chmielewski, Witold X; Beste, Christian
2017-02-01
In everyday life, successful action often requires inhibiting automatic responses that might not be appropriate in the current situation. These response inhibition processes have been shown to become aggravated with increasing automaticity of pre-potent response tendencies. Likewise, it has been shown that inhibitory processes are complicated by concurrent engagement in additional cognitive control processes (e.g. conflict monitoring). Therefore, opposing processes (i.e. automaticity and cognitive control) seem to strongly impact response inhibition. However, possible interactive effects of automaticity and cognitive control on the modulation of response inhibition processes have not yet been examined. In the current study we examine this question using a novel experimental paradigm combining a Go/NoGo task with a Simon task, in a systems neurophysiological approach combining EEG recordings with source localization analyses. The results show that response inhibition is less accurate in non-conflicting than in conflicting stimulus-response mappings. Thus it seems that conflicts, and the resulting engagement in conflict monitoring processes as reflected in the N2 amplitude, may foster response inhibition. This engagement in conflict monitoring leads to an increase in cognitive control, as reflected by increased activity in the anterior and posterior cingulate areas, while simultaneously the automaticity of response tendencies is decreased. Most importantly, this study suggests that the quality of conflict processing in anterior cingulate areas, and especially the resulting interaction of cognitive control and the automaticity of pre-potent response tendencies, are important factors to consider when it comes to the modulation of response inhibition processes. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wojenski, Andrzej; Kasprowicz, Grzegorz; Pozniak, Krzysztof T.; Romaniuk, Ryszard
2013-10-01
The paper describes a concept of automatic firmware generation for reconfigurable measurement systems which use FPGA devices and measurement cards in the FMC standard. The following topics are described in detail: automatic HDL code generation for FPGA devices, automatic implementation of communication interfaces, HDL drivers for measurement cards, automatic serial connection between multiple measurement backplane boards, automatic building of the memory map (address space), and management of the automatically generated firmware. The presented solutions are required in many advanced measurement systems, such as Beam Position Monitors or GEM detectors. This work is part of a wider project for automatic firmware generation and management of reconfigurable systems. The solutions presented in this paper build on a previous SPIE publication.
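As a hedged illustration of the general idea of generating HDL from a higher-level description, the sketch below emits a Verilog read-register stub and assigns consecutive word addresses, mimicking an automatically built memory map; the module name, register list and address scheme are invented and do not reflect the paper's actual generator or templates.

```python
from string import Template

# Hypothetical template for a read-only register block.
VERILOG_TMPL = Template("""\
module ${name}_regs (
    input  wire        clk,
    input  wire [31:0] addr,
    output reg  [31:0] rdata
);
    always @(posedge clk) begin
        case (addr)
${cases}
            default: rdata <= 32'hDEAD_BEEF;
        endcase
    end
endmodule
""")

def generate_regs(name, registers, base=0x0):
    # Build the automatic memory map: each register gets the next word address.
    cases = "\n".join(
        f"            32'h{base + 4 * i:08X}: rdata <= {reg};"
        for i, reg in enumerate(registers)
    )
    return VERILOG_TMPL.substitute(name=name, cases=cases)

print(generate_regs("bpm_card", ["adc_sample_0", "adc_sample_1", "status_word"]))
```

In a full system the card descriptors, communication interfaces and backplane connections would feed the same kind of generator, so that the firmware and its address space stay consistent by construction.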
Batchu, S; Narasimhachar, H; Mayeda, J C; Hall, T; Lopez, J; Nguyen, T; Banister, R E; Lie, D Y C
2017-07-01
Doppler-based non-contact vital signs (NCVS) sensors can monitor heart rates, respiration rates, and motions of patients without physically touching them. We have developed a novel single-board Doppler-based phased-array antenna NCVS biosensor system that can perform robust overnight continuous NCVS monitoring with intelligent automatic subject tracking and optimal beam steering algorithms. Our NCVS sensor achieved overnight continuous vital signs monitoring with an impressive heart-rate monitoring accuracy of over 94% (i.e., within ±5 Beats-Per-Minute vs. a reference sensor), analyzed from over 400,000 data points collected during each overnight monitoring period of ~ 6 hours at a distance of 1.75 meters. The data suggests our intelligent phased-array NCVS sensor can be very attractive for continuous monitoring of low-acuity patients.
Acoustic energy transmission in cast iron pipelines
NASA Astrophysics Data System (ADS)
Kiziroglou, Michail E.; Boyle, David E.; Wright, Steven W.; Yeatman, Eric M.
2015-12-01
In this paper we propose acoustic power transfer as a method for the remote powering of pipeline sensor nodes. A theoretical framework of acoustic power propagation in the ceramic transducers and the metal structures is drawn, based on the Mason equivalent circuit. The effect of mounting on the electrical response of piezoelectric transducers is studied experimentally. Using two identical transducer structures, power transmission of 0.33 mW through a 1 m long, 118 mm diameter cast iron pipe with 8 mm wall thickness is demonstrated, at 1 V received voltage amplitude. A near-linear relationship between input and output voltage is observed. These results show that it is possible to deliver significant power to sensor nodes through acoustic waves in solid structures. The proposed method may enable the implementation of acoustically powered wireless sensor nodes for structural and operational monitoring of pipeline infrastructure.
Crack detection and leakage monitoring on reinforced concrete pipe
NASA Astrophysics Data System (ADS)
Feng, Qian; Kong, Qingzhao; Huo, Linsheng; Song, Gangbing
2015-11-01
Reinforced concrete underground pipelines are among the most widely used types of structures in water transportation systems. Cracks and leakage are the leading causes of pipeline structural failures, which directly result in economic losses and environmental hazards. In this paper, the authors propose a piezoceramic-based active sensing approach to detect cracks and subsequent leakage in concrete pipelines. Due to its piezoelectric properties, piezoceramic material can be utilized as both the actuator and the sensor in the active sensing approach. The piezoceramic patch, which is sandwiched between protective materials to form a 'smart aggregate', can be safely embedded into concrete structures. Circumferential and axial cracks were investigated. A wavelet packet-based energy analysis was developed to distinguish the type of crack and detect subsequent leakage, based on the different attenuation of stress wave energy propagating through the cracks.
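The wavelet packet energy idea can be sketched with the PyWavelets package: decompose a recorded stress-wave signal into frequency bands and compare band energies before and after damage. The signals, wavelet choice and attenuation index below are illustrative assumptions, not the authors' exact analysis.

```python
import numpy as np
import pywt

def band_energies(signal, wavelet="db4", level=3):
    """Decompose a stress-wave record into wavelet-packet bands and
    return the energy in each terminal node."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(np.square(node.data))
                     for node in wp.get_level(level, order="natural")])

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2048)
baseline = np.sin(2 * np.pi * 50 * t)                     # intact-pipe record
cracked = 0.4 * baseline + 0.01 * rng.standard_normal(t.size)  # attenuated wave

e_base, e_crack = band_energies(baseline), band_energies(cracked)
attenuation_index = 1.0 - e_crack.sum() / e_base.sum()
print(f"energy attenuation index: {attenuation_index:.2f}")
```

A larger attenuation index would indicate more stress-wave energy lost crossing the crack, which is the physical signal the smart-aggregate pairs exploit.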
Automated monitoring of recovered water quality
NASA Technical Reports Server (NTRS)
Misselhorn, J. E.; Hartung, W. H.; Witz, S. W.
1974-01-01
A laboratory prototype water quality monitoring system provides automatic online monitoring of the chemical, physical, and bacteriological properties of recovered water and signals malfunctions in the water recovery system. The monitor incorporates, wherever possible, commercially available sensors, suitably modified.
NASA Astrophysics Data System (ADS)
Cook, K.; Alcock, C.; Allsman, R.; Axelrod, T.; Bennett, D.; Marshall, S.; Stubbs, C.; Griest, K.; Perlmutter, S.; Sutherland, W.; Freeman, K.; Peterson, B.; Quinn, P.; Rodgers, A.
1992-12-01
This collaboration, dubbed the MACHO Project (an acronym for MAssive Compact Halo Objects), has refurbished the 1.27-m Great Melbourne Telescope at Mt. Stromlo and equipped it with a corrected 1° field of view. The prime focus corrector yields a red and a blue beam for simultaneous imaging in two passbands, 4500-6100 Å and 6100-7900 Å. Each beam is imaged by a 2x2 array of 2048x2048 pixel CCDs which are simultaneously read out from two amplifiers on each CCD. A 32-megapixel dual-color image of 0.5 square degree is clocked directly into computer memory in less than 70 seconds. We are using this system to monitor more than $10^7$ stars in the Magellanic Clouds for gravitational microlensing events and will soon monitor an additional $10^7$ stars in the bulge of our galaxy. Image data goes directly into a reduction pipeline where photometry for stars in an image is determined and stored in a database. An early version of this pipeline has used a simple aperture photometry code, and results from this will be presented. A more sophisticated PSF-fitting photometry code is currently being installed in the pipeline and results should also be available at the meeting. The PSF-fitting code has also been used to produce ~$10^7$ photometric measurements outside of the pipeline. This poster will present details of the instrumentation, data pipeline, observing conditions (weather and seeing), reductions and analyses for the first six months of dual-color observing. Eventually, we expect to be able to determine whether MACHOs are a significant component of the galactic halo in the mass range $10^{-6} M_{\odot} < M \lesssim 100 M_{\odot}$.
OrthoSelect: a protocol for selecting orthologous groups in phylogenomics.
Schreiber, Fabian; Pick, Kerstin; Erpenbeck, Dirk; Wörheide, Gert; Morgenstern, Burkhard
2009-07-16
Phylogenetic studies using expressed sequence tags (EST) are becoming a standard approach to answer evolutionary questions. Such studies are usually based on large sets of newly generated, unannotated, and error-prone EST sequences from different species. A first crucial step in EST-based phylogeny reconstruction is to identify groups of orthologous sequences. From these data sets, appropriate target genes are selected, and redundant sequences are eliminated to obtain suitable sequence sets as input data for tree-reconstruction software. Generating such data sets manually can be very time consuming. Thus, software tools are needed that carry out these steps automatically. We developed a flexible and user-friendly software pipeline, running on desktop machines or computer clusters, that constructs data sets for phylogenomic analyses. It automatically searches assembled EST sequences against databases of orthologous groups (OG), assigns ESTs to these predefined OGs, translates the sequences into proteins, eliminates redundant sequences assigned to the same OG, creates multiple sequence alignments of identified orthologous sequences and offers the possibility to further process this alignment in a last step by excluding potentially homoplastic sites and selecting sufficiently conserved parts. Our software pipeline can be used as it is, but it can also be adapted by integrating additional external programs. This makes the pipeline useful for non-bioinformaticians as well as bioinformatics experts. The software pipeline is designed especially for ESTs, but it can also handle protein sequences. OrthoSelect is a tool that produces orthologous gene alignments from assembled ESTs. Our tests show that OrthoSelect detects orthologs in EST libraries with high accuracy. In the absence of a gold standard for orthology prediction, we compared predictions by OrthoSelect to a manually created and published phylogenomic data set. Our tool was not only able to rebuild the data set with a specificity of 98%, but it detected four percent more orthologous sequences. Furthermore, the results OrthoSelect produces are in absolute agreement with the results of other programs, but our tool offers a significant speedup and additional functionality, e.g. handling of ESTs, computing sequence alignments, and refining them. To our knowledge, there is currently no other fully automated and freely available tool for this purpose. Thus, OrthoSelect is a valuable tool for researchers in the field of phylogenomics who deal with large quantities of EST sequences. OrthoSelect is written in Perl and runs on Linux/Mac OS X. The tool can be downloaded at (http://gobics.de/fabian/orthoselect.php).
Prototype Technology for Monitoring Volatile Organics. Volume 1.
1988-03-01
117, pp. 285-294. Grote, J.O. and Westendorf, R.G., "An Automatic Purge and Trap Concentrator," American Laboratory, December 1979. Khromchenko, Y.L. ... Environmental Monitoring and Support Laboratory, Office of Research and Development, Cincinnati, OH. Westendorf, R.G., "Closed-loop Stripping Analysis ... Technique and Applications," American Laboratory, December 1982. Westendorf, R.G., "Development Application of a Semi-Automatic Purge and Trap Concentrator"
Designing a reliable leak bio-detection system for natural gas pipelines.
Batzias, F A; Siontorou, C G; Spanidis, P-M P
2011-02-15
Monitoring of natural gas (NG) pipelines is an important task for economical and safe operation, loss prevention and environmental protection. Timely and reliable leak detection in gas pipelines therefore plays a key role in the overall integrity management of the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, research on new detector systems is still thriving. Biosensors are considered worldwide as a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained with the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge-based approach and relies on Fuzzy Multicriteria Analysis (FMCA) for selecting the biosensor design that best suits both the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.
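As a hedged sketch of how a fuzzy multicriteria selection might rank candidate biosensor designs, the snippet below aggregates fuzzy scores with fixed criterion weights; the designs, criteria and weights are invented for illustration, and a real FMCA would typically use fuzzy weights and more elaborate aggregation operators.

```python
# Candidate designs scored on fuzzy criteria, each membership degree in [0, 1].
designs = {
    "enzyme-based": {"sensitivity": 0.8, "robustness": 0.4, "cost": 0.9},
    "whole-cell":   {"sensitivity": 0.6, "robustness": 0.7, "cost": 0.8},
    "immunosensor": {"sensitivity": 0.9, "robustness": 0.5, "cost": 0.4},
}
weights = {"sensitivity": 0.5, "robustness": 0.3, "cost": 0.2}

def fuzzy_score(criteria):
    # Weighted aggregation of fuzzy membership degrees.
    return sum(weights[k] * v for k, v in criteria.items())

for name, criteria in designs.items():
    print(f"{name:13s} score = {fuzzy_score(criteria):.2f}")
best = max(designs, key=lambda d: fuzzy_score(designs[d]))
print("selected design:", best)
```

In the paper's setting the criteria would reflect the target analyte and the operational micro-environment of each pipeline segment, so the selected design can differ from one network location to another.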
There's Method in Our Madness: Interpersonal Attraction as a Multidimensional Construct
ERIC Educational Resources Information Center
Latta, R. Michael
1976-01-01
Examines the similarity-attraction relationship using rating scale and bogus pipeline (a pseudophysiological monitoring device) techniques within the context of individual differences in social desirability biases and variations in experimental demands. (Author/RK)
Crucial considerations for pipelines to validate circulating biomarkers for breast cancer.
Ewaisha, Radwa; Gawryletz, Chelsea D; Anderson, Karen S
2016-01-01
Despite decades of progress in breast imaging, breast cancer remains the second most common cause of cancer mortality in women. The rapidly proliferative breast cancers that are associated with high relapse rates and mortality frequently present in younger women, in unscreened individuals, or in the intervals between screening mammography. Biomarkers exist for monitoring metastatic disease, such as CEA, CA27.29 and CA15-3, but there are no circulating biomarkers clinically available for early detection, prognosis, or monitoring for clinical relapse. There has been significant progress in the discovery of potential circulating biomarkers, including proteins, autoantibodies, nucleic acids, exosomes, and circulating tumor cells, but the vast majority of these biomarkers have not progressed beyond initial research discovery, and none have yet been approved for clinical use in early stage disease. Here, the authors review the crucial considerations of developing pipelines for the rapid evaluation of circulating biomarkers for breast cancer.
Low-cost failure sensor design and development for water pipeline distribution systems.
Khan, K; Widdop, P D; Day, A J; Wood, A S; Mounce, S R; Machell, J
2002-01-01
This paper describes the design and development of a new sensor which is low-cost to manufacture and install and is reliable in operation, with sufficient accuracy, resolution and repeatability for use in newly developed systems for pipeline monitoring and leakage detection. To provide an appropriate signal, the concept of a "failure" sensor is introduced, in which the output is not necessarily proportional to the input but is unmistakably affected when an unusual event occurs. The design of this failure sensor is based on water opacity, which can be indicative of an unusual event in a water distribution network. The laboratory work and field trials necessary to design and prove out this type of failure sensor are described here. It is concluded that a low-cost failure sensor of this type has good potential for use in a comprehensive water monitoring and management system based on Artificial Neural Networks (ANN).
Monitoring of pipelines in nuclear power plants by measuring laser-based mechanical impedance
NASA Astrophysics Data System (ADS)
Lee, Hyeonseok; Sohn, Hoon; Yang, Suyoung; Yang, Jinyeol
2014-06-01
Using laser-based mechanical impedance (LMI) measurement, this study proposes a damage detection technique that enables structural health monitoring of pipelines under the high-temperature and radioactive environments of nuclear power plants (NPPs). The application of conventional electromechanical impedance (EMI) based techniques to NPPs has been limited, mainly due to the contact nature of piezoelectric transducers, which cannot survive the high-temperature, high-radiation environments of NPPs. The proposed LMI measurement technique aims to overcome the limitations of the EMI techniques by utilizing noncontact laser beams for both ultrasound generation and sensing. An Nd:YAG pulsed laser is used for ultrasound generation, and a laser Doppler vibrometer is employed for the measurement of the corresponding ultrasound responses. For the monitoring of pipes covered by insulation layers, this study utilizes optical fibers to guide the laser beams to specific target locations. Then, an outlier analysis is adopted for autonomous damage diagnosis. Validation of the proposed LMI technique is carried out on a carbon steel pipe elbow under varying temperatures. A corrosion defect chemically engraved in the specimen is successfully detected.
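As a hedged sketch of the outlier-analysis step, a simple control-limit rule on a scalar damage index could be implemented as follows; the baseline statistics, 3-sigma threshold and index values are invented for illustration and are not the authors' exact procedure.

```python
import numpy as np

def outlier_threshold(baseline_indices: np.ndarray, sigma: float = 3.0) -> float:
    """Upper control limit derived from damage indices of the intact pipe."""
    mu = baseline_indices.mean()
    sd = baseline_indices.std(ddof=1)
    return mu + sigma * sd

rng = np.random.default_rng(1)
baseline = rng.normal(1.0, 0.05, size=50)   # intact-condition damage indices
limit = outlier_threshold(baseline)

new_index = 1.4  # index from a hypothetical current LMI measurement
print("damage detected" if new_index > limit else "no damage indicated")
```

Building the baseline distribution across the expected temperature range is one way such a rule could stay robust to the varying temperatures mentioned in the validation.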
43 CFR 2881.5 - What acronyms and terms are used in the regulations in this part?
Code of Federal Regulations, 2012 CFR
2012-10-01
... of 1920, as amended (30 U.S.C. 185). TAPS means the Trans-Alaska Oil Pipeline System. TUP means a... Federal government expends or uses in processing a right-of-way application or in monitoring the... authorizations issued under FLPMA (43 U.S.C. 1761 et seq.). Monitoring means those actions, subject to § 2886.11...
43 CFR 2881.5 - What acronyms and terms are used in the regulations in this part?
Code of Federal Regulations, 2014 CFR
2014-10-01
... of 1920, as amended (30 U.S.C. 185). TAPS means the Trans-Alaska Oil Pipeline System. TUP means a... Federal government expends or uses in processing a right-of-way application or in monitoring the... authorizations issued under FLPMA (43 U.S.C. 1761 et seq.). Monitoring means those actions, subject to § 2886.11...
43 CFR 2881.5 - What acronyms and terms are used in the regulations in this part?
Code of Federal Regulations, 2013 CFR
2013-10-01
... of 1920, as amended (30 U.S.C. 185). TAPS means the Trans-Alaska Oil Pipeline System. TUP means a... Federal government expends or uses in processing a right-of-way application or in monitoring the... authorizations issued under FLPMA (43 U.S.C. 1761 et seq.). Monitoring means those actions, subject to § 2886.11...
43 CFR 2881.5 - What acronyms and terms are used in the regulations in this part?
Code of Federal Regulations, 2011 CFR
2011-10-01
... of 1920, as amended (30 U.S.C. 185). TAPS means the Trans-Alaska Oil Pipeline System. TUP means a... Federal government expends or uses in processing a right-of-way application or in monitoring the... authorizations issued under FLPMA (43 U.S.C. 1761 et seq.). Monitoring means those actions, subject to § 2886.11...
A Little Sensor That Packs a Wallop
NASA Technical Reports Server (NTRS)
2000-01-01
A gas sensor originally built for NASA to measure the composition of the atmosphere of Earth and Mars has been commercialized by SpectraSensors. The commercial tunable diode laser (TDL) gas sensor can be used for oil and gas pipeline monitoring, aircraft safety, environmental monitoring and medicine. The TDL technology is good at detecting gases at low concentrations, from parts per million down to parts per billion.
Characterization of Microbial Communities in Gas Industry Pipelines
Zhu, Xiang Y.; Lubeck, John; Kilbane, John J.
2003-01-01
Culture-independent techniques, denaturing gradient gel electrophoresis (DGGE) analysis, and random cloning of 16S rRNA gene sequences amplified from community DNA were used to determine the diversity of microbial communities in gas industry pipelines. Samples obtained from natural gas pipelines were used directly for DNA extraction, inoculated into sulfate-reducing bacterium medium, or used to inoculate a reactor that simulated a natural gas pipeline environment. The variable V2-V3 (average size, 384 bp) and V3-V6 (average size, 648 bp) regions of bacterial and archaeal 16S rRNA genes, respectively, were amplified from genomic DNA isolated from nine natural gas pipeline samples and analyzed. A total of 106 bacterial 16S rDNA sequences were derived from DGGE bands, and these formed three major clusters: beta and gamma subdivisions of Proteobacteria and gram-positive bacteria. The most frequently encountered bacterial species was Comamonas denitrificans, which was not previously reported to be associated with microbial communities found in gas pipelines or with microbially influenced corrosion. The 31 archaeal 16S rDNA sequences obtained in this study were all related to those of methanogens and phylogenetically fall into three clusters: order I, Methanobacteriales; order III, Methanomicrobiales; and order IV, Methanosarcinales. Further microbial ecology studies are needed to better understand the relationship among bacterial and archaeal groups and the involvement of these groups in the process of microbially influenced corrosion in order to develop improved ways of monitoring and controlling microbially influenced corrosion. PMID:12957923
The Gemini Recipe System: a dynamic workflow for automated data reduction
NASA Astrophysics Data System (ADS)
Labrie, Kathleen; Allen, Craig; Hirst, Paul; Holt, Jennifer; Allen, River; Dement, Kaniela
2010-07-01
Gemini's next-generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real-time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. The data reduction process is defined in a Recipe, written in a science-oriented (as opposed to computer-oriented) language, and consists of a sequence of data reduction steps, called Primitives, which are written in Python and can be launched from the PyRAF user interface by users wishing to use them interactively for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run both within the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control, allowing decisions regarding processing and calibration to be made automatically, based on the pixel and metadata properties of the dataset at the stage in processing where the decision is being made, and on the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines.
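Since the Recipe System is described as Python, a minimal sketch of the recipe-of-primitives idea with metadata-driven flow control might look like the following; the primitive names, the dataset dictionary and the filter-based skip rule are invented for illustration and are not the actual Gemini API.

```python
# Primitives operate on a dataset dict carrying pixel data and metadata.
def subtract_bias(ds):
    ds["data"] = ds["data"] - ds["meta"].get("bias_level", 0.0)
    return ds

def flat_correct(ds):
    ds["data"] = ds["data"] / ds["meta"].get("flat", 1.0)
    return ds

def fringe_correct(ds):
    ds["fringe_corrected"] = True
    return ds

def run_recipe(ds, recipe):
    for primitive in recipe:
        # Dynamic flow control: skip fringe correction unless the metadata
        # says the observation used a filter that actually fringes.
        if primitive is fringe_correct and ds["meta"].get("filter") != "z":
            continue
        ds = primitive(ds)
    return ds

dataset = {"data": 100.0, "meta": {"bias_level": 2.0, "flat": 0.98, "filter": "g"}}
print(run_recipe(dataset, [subtract_bias, flat_correct, fringe_correct]))
```

Because the recipe is just an ordered list of callables, the same primitives can be invoked one at a time in an interactive session or executed end to end by a pipeline, which mirrors the dual-use design described in the abstract.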
Automatic Extraction of Metadata from Scientific Publications for CRIS Systems
ERIC Educational Resources Information Center
Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan
2011-01-01
Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…
The Chandra Source Catalog : Automated Source Correlation
NASA Astrophysics Data System (ADS)
Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-01-01
Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed a different number of times, at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
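As a hedged sketch of the correlation step, the snippet below associates new detections with master sources by positional proximity and flags the ambiguous many-to-one case for more elaborate handling; the 2-arcsecond match radius, the record layout and the function names are assumptions, not the actual CSC pipeline logic.

```python
import math

def separation_arcsec(a, b):
    """Small-angle separation between two (ra, dec) tuples in degrees."""
    dra = (a[0] - b[0]) * math.cos(math.radians((a[1] + b[1]) / 2))
    ddec = a[1] - b[1]
    return math.hypot(dra, ddec) * 3600.0

def update_master(master, detections, match_radius=2.0):
    """Fold a new observation's detections into the master source list."""
    for det in detections:
        matches = [m for m in master
                   if separation_arcsec(m["pos"], det["pos"]) < match_radius]
        if len(matches) == 1:
            matches[0]["obs"].append(det)            # straightforward update
        elif not matches:
            master.append({"pos": det["pos"], "obs": [det]})  # new source
        else:
            # One detection overlapping several master sources needs the
            # more elaborate resolution logic described in the abstract.
            pass
    return master

master = update_master([], [{"pos": (150.0010, 2.0001)}, {"pos": (150.0500, 2.0300)}])
master = update_master(master, [{"pos": (150.0011, 2.0002)}])
print(len(master), "master sources;", len(master[0]["obs"]), "observations of the first")
```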
A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines.
Khan, Arif Ul Maula; Mikut, Ralf; Reischl, Markus
2016-01-01
The parametrization of automatic image processing routines is time-consuming when many image processing parameters are involved. An expert can tune parameters sequentially to get the desired results, but this may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels are present in an image or when images vary in their characteristics due to different acquisition conditions. In such cases, parameters need to be tuned simultaneously. We propose a framework that improves standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is evaluated on a benchmark data set that contains image distortions of increasing severity. This enables us to compare different standard image segmentation algorithms in feedback vs. feedforward implementations by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates the robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts.
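A minimal sketch of feedback-based parameter adaptation, assuming a single threshold parameter and a Jaccard-overlap feedback signal, might look as follows; real use cases would tune several parameters at once and could rely on abstract rather than pixel-level ground truth.

```python
import numpy as np

def segment(image, threshold):
    return image > threshold

def quality(mask, reference):
    """Feedback signal: Jaccard overlap with the reference segmentation."""
    inter = np.logical_and(mask, reference).sum()
    union = np.logical_or(mask, reference).sum()
    return inter / union if union else 0.0

def adapt_threshold(image, reference, lo=0.0, hi=1.0, steps=20):
    # Coarse feedback search: evaluate quality and keep the best parameter.
    best_t, best_q = lo, -1.0
    for t in np.linspace(lo, hi, steps):
        q = quality(segment(image, t), reference)
        if q > best_q:
            best_t, best_q = t, q
    return best_t, best_q

rng = np.random.default_rng(2)
truth = np.zeros((64, 64), dtype=bool)
truth[16:48, 16:48] = True
img = truth * 0.7 + 0.1 * rng.random((64, 64))  # noisy, shaded test image
t, q = adapt_threshold(img, truth)
print(f"adapted threshold {t:.2f}, quality {q:.2f}")
```

Running the same loop on a graded series of increasingly distorted images is one way to compare the robustness of feedback versus feedforward pipelines, as the abstract proposes.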
Automatic detection and quantitative analysis of cells in the mouse primary motor cortex
NASA Astrophysics Data System (ADS)
Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui
2014-09-01
Neuronal cells play a very important role in metabolic regulation and mechanism control, so cell number is a fundamental determinant of brain function. Combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant variations in cellular density distribution across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines
Mikut, Ralf; Reischl, Markus
2016-01-01
The parametrization of automatic image processing routines is time-consuming when many image processing parameters are involved. An expert can tune parameters sequentially to get the desired results, but this may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels are present in an image or when images vary in their characteristics due to different acquisition conditions. In such cases, parameters need to be tuned simultaneously. We propose a framework that improves standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is evaluated on a benchmark data set that contains image distortions of increasing severity. This enables us to compare different standard image segmentation algorithms in feedback vs. feedforward implementations by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates the robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts. PMID:27764213
Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging
Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-Studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895
Grid Computing Application for Brain Magnetic Resonance Image Processing
NASA Astrophysics Data System (ADS)
Valdivia, F.; Crépeault, B.; Duchesne, S.
2012-02-01
This work emphasizes the use of grid computing and web technology for automatic post-processing of brain magnetic resonance images (MRI) in the context of neuropsychiatric (Alzheimer's disease) research. Post-acquisition image processing is achieved through the interconnection of several individual processes into pipelines. Each process has input and output data ports, options and execution parameters, and performs single tasks such as: a) extracting individual image attributes (e.g. dimensions, orientation, center of mass), b) performing image transformations (e.g. scaling, rotation, skewing, intensity standardization, linear and non-linear registration), c) performing image statistical analyses, and d) producing the necessary quality control images and/or files for user review. The pipelines are built to perform specific sequences of tasks on the alphanumeric data and MRIs contained in our database. The web application is coded in PHP and allows the creation of scripts to create, store and execute pipelines and their instances either on our local cluster or on high-performance computing platforms. To run an instance on an external cluster, the web application opens a communication tunnel through which it copies the necessary files, submits the execution commands and collects the results. We present results of system tests for the processing of a set of 821 brain MRIs from the Alzheimer's Disease Neuroimaging Initiative study via a nonlinear registration pipeline composed of 10 processes. Our results show successful execution on both local and external clusters, and a 4-fold increase in performance when using the external cluster. However, the latter's performance does not scale linearly, as queue waiting times and execution overhead increase with the number of tasks to be executed.
RAP: RNA-Seq Analysis Pipeline, a new cloud-based NGS web application.
D'Antonio, Mattia; D'Onorio De Meo, Paolo; Pallocca, Matteo; Picardi, Ernesto; D'Erchia, Anna Maria; Calogero, Raffaele A; Castrignanò, Tiziana; Pesole, Graziano
2015-01-01
The study of RNA has been dramatically improved by the introduction of Next Generation Sequencing platforms allowing massive and cheap sequencing of selected RNA fractions, also providing information on strand orientation (RNA-Seq). The complexity of transcriptomes and of their regulatory pathways makes RNA-Seq one of the most complex fields of NGS applications, addressing several aspects of the expression process (e.g. identification and quantification of expressed genes and transcripts, alternative splicing and polyadenylation, fusion genes and trans-splicing, post-transcriptional events, etc.). In order to provide researchers with an effective and friendly resource for analyzing RNA-Seq data, we present here RAP (RNA-Seq Analysis Pipeline), a cloud computing web application implementing a complete but modular analysis workflow. This pipeline integrates both state-of-the-art bioinformatics tools for RNA-Seq analysis and in-house developed scripts to offer the user a comprehensive strategy for data analysis. RAP is able to perform quality checks (adopting FastQC and NGS QC Toolkit), identify and quantify expressed genes and transcripts (with TopHat, Cufflinks and HTSeq), detect alternative splicing events (using SpliceTrap) and chimeric transcripts (with ChimeraScan). The pipeline is also able to identify splicing junctions and constitutive or alternative polyadenylation sites (implementing custom analysis modules) and to call statistically significant differences in gene and transcript expression, splicing pattern and polyadenylation site usage (using Cuffdiff2 and DESeq). Through a user-friendly web interface, the RAP workflow can be suitably customized by the user, and it is automatically executed on our cloud computing environment. This strategy allows access to bioinformatics tools and computational resources without specific bioinformatics and IT skills. RAP provides a set of tabular and graphical results that can be helpful to browse, filter and export analyzed data according to the user's needs.
Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines
Mikut, Ralf
2017-01-01
Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID:29095927
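To make the fuzzy encoding of prior knowledge concrete, here is a minimal sketch in which a trapezoidal membership function turns an expected range of segment volumes into a confidence score for each detection; the feature, break points and detections are invented for illustration and are not the paper's actual parameters.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership encoding prior knowledge of a feature."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Prior: plausible segment volumes (in voxels) for a cell nucleus.
def volume_membership(volume):
    return trapezoid(volume, a=50, b=150, c=400, d=800)

detections = [{"id": 1, "volume": 30},
              {"id": 2, "volume": 220},
              {"id": 3, "volume": 650}]
for det in detections:
    det["confidence"] = volume_membership(det["volume"])
    print(det)
```

Downstream operators such as multiview fusion or tracking can then weight each detection by this membership degree instead of treating all detections as equally reliable, which is the core of the uncertainty-propagation idea.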
Fast, accurate and easy-to-pipeline methods for amplicon sequence processing
NASA Astrophysics Data System (ADS)
Antonielli, Livio; Sessitsch, Angela
2016-04-01
Next generation sequencing (NGS) technologies have been established for years as an essential resource in microbiology. While on the one hand metagenomic studies can benefit from the continuously increasing throughput of the Illumina (Solexa) technology, on the other hand the spread of third generation sequencing technologies (PacBio, Oxford Nanopore) is taking whole genome sequencing beyond the assembly of fragmented draft genomes, making it now possible to finish bacterial genomes even without short read correction. Besides (meta)genomic analysis, next-gen amplicon sequencing is still fundamental for microbial studies. Amplicon sequencing of the 16S rRNA gene and ITS (Internal Transcribed Spacer) remains a well-established and widespread method for a multitude of purposes concerning the identification and comparison of archaeal/bacterial (16S rRNA gene) and fungal (ITS) communities occurring in diverse environments. Numerous pipelines have been developed to process NGS-derived amplicon sequences, among which Mothur, QIIME and USEARCH are the best-known and most-cited ones. The entire process, from initial raw sequence data through read error correction, paired-end read assembly, primer stripping, quality filtering, clustering, OTU taxonomic classification and BIOM table rarefaction, as well as alternative "normalization" methods, will be addressed. An effective and accurate strategy will be presented using state-of-the-art bioinformatic tools, and the example of a straightforward one-script pipeline for 16S rRNA gene or ITS MiSeq amplicon sequencing will be provided. Finally, instructions on how to automatically retrieve nucleotide sequences from NCBI and therefore apply the pipeline to targets other than 16S rRNA gene (Greengenes, SILVA) and ITS (UNITE) will be discussed.
Simultaneous analysis and quality assurance for diffusion tensor imaging.
Lauzon, Carolyn B; Asman, Andrew J; Esparza, Michael L; Burns, Scott S; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W; Davis, Nicole; Cutting, Laurie E; Landman, Bennett A
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-Studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible.
DTS: The NOAO Data Transport System
NASA Astrophysics Data System (ADS)
Fitzpatrick, M.; Semple, T.
2014-05-01
The NOAO Data Transport System (DTS) provides high-throughput, reliable data transfer between telescopes, pipelines and archive centers located in the Northern and Southern hemispheres. It is a distributed application using XML-RPC for command and control, and either parallel-TCP or UDT protocols for bulk data transport. The system is data-agnostic, allowing arbitrary files or directories to be moved using the same infrastructure. Data paths are configured in the system by connecting nodes as the source or destination of data in a queue. Each leg of a data path may be configured independently based on the network environment between the sites. A queueing model is currently implemented to manage the automatic movement of data; a streaming model is planned to support arbitrarily large transfers (e.g. as in a disk recovery scenario) or to provide a 'pass-thru' interface to minimize overheads. A web-based monitor allows anyone to get a graphical overview of the DTS system as it runs, and operators will be able to control individual nodes in the system. Through careful tuning of the network paths DTS is able to achieve in excess of 80 percent of the nominal wire speed using only commodity networks, making it ideal for long-haul transport of large volumes of data.
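As a hedged illustration of the command-and-control side only (the bulk parallel-TCP/UDT transport is not shown), a queue-managing control node could expose XML-RPC methods along these lines using Python's standard library; the method names, record layout and port are invented, not DTS's actual interface.

```python
from xmlrpc.server import SimpleXMLRPCServer

QUEUE = []

def queue_file(path, dest_node):
    """Register a file for transfer toward the destination node."""
    QUEUE.append({"path": path, "dest": dest_node, "status": "queued"})
    return len(QUEUE)

def queue_status():
    """Return the current queue so a monitor can render an overview."""
    return QUEUE

server = SimpleXMLRPCServer(("localhost", 7001), allow_none=True)
server.register_function(queue_file)
server.register_function(queue_status)
print("DTS-like control node listening on :7001")
server.serve_forever()  # blocks; run in its own process

# A client could then drive it with:
#   import xmlrpc.client
#   dts = xmlrpc.client.ServerProxy("http://localhost:7001")
#   dts.queue_file("/data/n20140501.fits", "archive_north")
```

Keeping control traffic on a lightweight RPC channel while the heavy payload flows over separately tuned transport protocols is the design choice that lets each leg of a data path be configured independently.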
Uniform Data Management and Access to Near Real-Time Seismic Data (Invited)
NASA Astrophysics Data System (ADS)
Casey, R.; Ahern, T. K.; Benson, R. B.; Karstens, R.; Stromme, S.; Trabant, C. M.; Weertman, B. R.
2010-12-01
The IRIS Data Management Center has its ears to the ground, receiving relayed seismic telemetry from all parts of the globe with delay times as short as a few seconds from sensor to data center. This immediacy of always-on geophysical information has spawned a demand for ready access to persistent data streams, quality assurance metrics, and the automatic production of data products based on specific triggers. Over the last ten years, the IRIS DMC has developed an effective near real-time data pipeline that serves the needs of seismic networks requiring a central data management system, as well as of the scientific community, which needs the ability to monitor and respond to events that occurred only moments before. A number of accessible applications have been developed that provide useful data both through the web and through freely available software. Metrics and products of the raw data are cataloged and managed as a chain of events that occur in near-real time. The technical challenges faced by such a system are common across the data management community. Delayed transmission of packetized data, out-of-order data transmissions, verification of complete data transmission, and data flow concurrency have all been areas of focus in order to provide the best possible level of service to scientists and educators.
3D modeling of building indoor spaces and closed doors from imagery and point clouds.
Díaz-Vilariño, Lucía; Khoshelham, Kourosh; Martínez-Sánchez, Joaquín; Arias, Pedro
2015-02-03
3D models of indoor environments are increasingly gaining importance due to the wide range of applications to which they can be subjected: from redesign and visualization to monitoring and simulation. These models usually exist only for newly constructed buildings; therefore, the development of automatic approaches for reconstructing 3D indoor models from imagery and/or point clouds can make the process easier, faster and cheaper. Among the constructive elements defining a building interior, doors are very common, and their detection can be very useful for understanding the environment structure, performing efficient navigation, or planning appropriate evacuation routes. The fact that doors are topologically connected to walls by being coplanar, together with the unavoidable presence of clutter and occlusions indoors, increases the inherent complexity of automating the recognition process. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors based on point clouds and images. The methodology analyses the visibility problem of indoor environments and goes into depth on door candidate detection. The presented approach is tested on real data sets, showing its potential with a high door detection rate and its applicability for robust and efficient envelope reconstruction.
Characterization of Stress Corrosion Cracking Using Laser Ultrasonics
DOT National Transportation Integrated Search
2008-08-31
In-service inspection of gas and oil pipelines is a subject of great current interest. Issues of safety and fitness for service have driven extensive efforts to develop effective monitoring and inspection techniques. A number of effective NDT techniq...
Automating usability of ATLAS Distributed Computing resources
NASA Astrophysics Data System (ADS)
Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration
2014-06-01
The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions which improve the reliability of the system. From this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool provides a suitable solution by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool was the first step in a comprehensive review of storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board, with both dedicated metrics and views. The resulting structure allows the storage resource status to be monitored with fine time-granularity, and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up on problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interaction achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
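A toy version of such outcome-history inference, with an invented sliding window and invented blacklist/whitelist thresholds rather than SAAB's actual algorithm, could look like this:

```python
from collections import deque

class StorageAreaMonitor:
    """Toy inference over a sliding window of test outcomes (True = passed)."""

    def __init__(self, window=10, blacklist_ratio=0.5, whitelist_ratio=0.9):
        self.history = deque(maxlen=window)
        self.blacklisted = False
        self.blacklist_ratio = blacklist_ratio
        self.whitelist_ratio = whitelist_ratio

    def record(self, passed: bool) -> bool:
        self.history.append(passed)
        pass_ratio = sum(self.history) / len(self.history)
        if not self.blacklisted and pass_ratio < self.blacklist_ratio:
            self.blacklisted = True    # automatic outage handling
        elif self.blacklisted and pass_ratio >= self.whitelist_ratio:
            self.blacklisted = False   # site recovered, re-enable it
        return self.blacklisted

monitor = StorageAreaMonitor()
outcomes = [True, True, False, False, False, False,
            True, True, True, True, True, True, True, True]
for outcome in outcomes:
    state = monitor.record(outcome)
print("currently blacklisted:", state)
```

The hysteresis between the two thresholds keeps a flapping storage area from being toggled on every single test result, which is the kind of unambiguous, automatic decision rule the abstract describes.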
Automatic diet monitoring: a review of computer vision and wearable sensor-based methods.
Hassannejad, Hamid; Matrella, Guido; Ciampolini, Paolo; De Munari, Ilaria; Mordonini, Monica; Cagnoni, Stefano
2017-09-01
Food intake and eating habits have a significant impact on people's health. Widespread diseases, such as diabetes and obesity, are directly related to eating habits. Therefore, monitoring diet can be a substantial basis for developing methods and services to promote a healthy lifestyle and improve personal and national health economy. Studies have demonstrated that manual reporting of food intake is inaccurate and often impractical. Thus, several methods have been proposed to automate the process. This article reviews the most relevant and recent research on automatic diet monitoring, discussing strengths and weaknesses. In particular, the article reviews two approaches to this problem, which account for most of the work in the area. The first approach is based on image analysis and aims at extracting information about food content automatically from food images. The second relies on wearable sensors and has the detection of eating behaviours as its main goal.
a Continuous Health Monitoring Guided Wave Fmd System for Retrofit to Existing Offshore Oilrigs
NASA Astrophysics Data System (ADS)
Mijarez, R.; Solis, L.; Martinez, F.
2010-02-01
An automatic health monitoring guided wave flood member detection (FMD) system for retrofit to existing offshore oilrigs is presented. The system employs a microcontroller-based piezoelectric (PZT) transmitter and a receiver instrumentation package composed of a PZT 40 kHz ultrasound transducer and a digital signal processor (DSP) module connected to a PC via USB for monitoring purposes. The transmitter and receiver were attached, non-intrusively, to the external wall of a steel tube of 1 m × 27 cm × 2 mm. Experiments performed in the laboratory have successfully and automatically identified flooded tubes.
A Mobile Multi-Agent Information System for Ubiquitous Fetal Monitoring
Su, Chuan-Jun; Chu, Ta-Wei
2014-01-01
Electronic fetal monitoring (EFM) systems integrate many previously separate clinical activities related to fetal monitoring. Promoting the use of ubiquitous fetal monitoring services with real-time status assessments requires a robust information platform equipped with an automatic diagnosis engine. This paper presents the design and development of a mobile multi-agent platform-based open information system (IMAIS) with an automated diagnosis engine to support intensive and distributed ubiquitous fetal monitoring. The automatic diagnosis engine that we developed is capable of analyzing data in both traditional paper-based and digital formats. Issues related to interoperability, scalability, and openness in heterogeneous e-health environments are addressed through the adoption of a FIPA2000 standard compliant agent development platform—the Java Agent Development Environment (JADE). Integrating the IMAIS with light-weight, portable fetal monitor devices allows for continuous long-term monitoring without interfering with a patient's everyday activities and without restricting her mobility. The system architecture can also be applied to other monitoring scenarios, such as elder care and vital sign monitoring. PMID:24452256
Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A
2005-04-01
The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
Almeida, Diogo F; Ruben, Rui B; Folgado, João; Fernandes, Paulo R; Audenaert, Emmanuel; Verhegghe, Benedict; De Beule, Matthieu
2016-12-01
Femur segmentation can be an important tool in orthopedic surgical planning. However, to overcome the need for an experienced user with extensive knowledge of the techniques, segmentation should be fully automatic. In this paper a new fully automatic femur segmentation method for CT images is presented. The method is also able to define the medullary canal automatically and performs well even on low-resolution CT scans. Fully automatic femoral segmentation was performed by adapting a template mesh of the femoral volume to the medical images. To achieve this, an adaptation of the active shape model (ASM) technique based on the statistical shape model (SSM) and local appearance model (LAM) of the femur, with a novel initialization method, was used to drive the template mesh deformation to fit the in-image femoral shape in a time-effective approach. With the proposed method a 98% convergence rate was achieved. For the high-resolution CT image group the average error is less than 1 mm. For the low-resolution image group the results are also accurate, with an average error of less than 1.5 mm. The proposed segmentation pipeline is accurate, robust and completely user-free. The method is robust to patient orientation, image artifacts and poorly defined edges. The results excelled even in CT images with a significant slice thickness, i.e., above 5 mm. Medullary canal segmentation increases the geometric information that can be used in orthopedic surgical planning or in finite element analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
Fast and robust segmentation in the SDO-AIA era
NASA Astrophysics Data System (ADS)
Verbeeck, Cis; Delouille, Véronique; Mampaey, Benjamin; Hochedez, Jean-François; Boyes, David; Barra, Vincent
Solar images from the Atmospheric Imaging Assembly (AIA) aboard the Solar Dynamics Observatory (SDO) will flood the solar physics community with a wealth of information on solar variability, of great importance both in solar physics and in view of Space Weather applications. Obtaining this information, however, requires the ability to automatically process large amounts of data in an objective fashion. In previous work, we have proposed an unsupervised spatially constrained multi-channel fuzzy clustering algorithm (SPoCA) that automatically segments EUV solar images into Active Regions (AR), Coronal Holes (CH), and Quiet Sun (QS). This algorithm will run in near real time on AIA data as part of the SDO Feature Finding Project, a suite of software pipeline modules for automated feature recognition and analysis of SDO imagery. After correcting for the limb-brightening effect, SPoCA computes an optimal clustering with respect to the regions of interest, using fuzzy logic on a quality criterion to manage the various noises present in the images and the imprecision in the definition of the above regions. Next, the algorithm applies a morphological opening operation, smoothing the cluster edges while preserving their general shape. The process is fast and automatic. A lower size limit is used to distinguish AR from Bright Points. As the algorithm segments the coronal images according to their brightness, an AR may be detected as several disjoint pieces if the brightness in between is somewhat lower. Morphological dilation is employed to reconstruct the AR from their constituent pieces. Combining SPoCA's detection of AR, CH, and QS on subsequent images allows automatic tracking and naming of any region of interest. In the SDO software pipeline, SPoCA will automatically populate the Heliophysics Events Knowledgebase (HEK) with Active Region events. Further, the algorithm has great potential for correct and automatic identification of AR, CH, and QS in any study that aims to address properties of those specific regions in the corona. SPoCA is now ready and waiting to tackle solar cycle 24 using SDO data. While we presently apply SPoCA to EUV data, the method is generic enough to allow the introduction of other channels or data, e.g., Differential Emission Measure (DEM) maps. Because of the unprecedented challenges brought up by the quantity of SDO data, European partners have gathered within an ISSI team on 'Mining and Exploiting the NASA Solar Dynamics Observatory data in Europe' (a.k.a. Soldyneuro). Its aim is to provide automated feature recognition algorithms for scanning the SDO archive, as well as to conduct scientific studies that combine different algorithms' outputs. Within the Soldyneuro project, we will use data from the EUV Variability Experiment (EVE) spectrometer to estimate the full-Sun DEM. This DEM will next be used to estimate the total flux from AIA images so as to provide a validation for the calibration of AIA.
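As a toy illustration of the segment-then-clean idea described above, the sketch below runs a plain single-channel fuzzy c-means on pixel intensities and then applies morphological opening and dilation; SPoCA's actual criterion is richer (spatial constraints, multi-channel data, noise handling), so this is only a conceptual stand-in:

```python
import numpy as np
from scipy import ndimage

def fuzzy_cmeans(x, c=3, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means on flattened pixel intensities (illustrative)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                       # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)    # membership-weighted centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / d ** (2 / (m - 1))         # standard FCM membership update
        u /= u.sum(axis=0)
    return u, centers

# Synthetic "EUV image": quiet background, dark CH patch, bright AR patch
img = np.full((64, 64), 100.0)
img[40:55, 40:55] = 500.0                    # bright active-region patch
img[5:20, 5:20] = 10.0                       # dark coronal-hole patch
img += np.random.default_rng(1).normal(0, 5, img.shape)

u, centers = fuzzy_cmeans(img.ravel())
labels = u.argmax(axis=0).reshape(img.shape)
ar_mask = labels == centers.argmax()         # brightest cluster ~ AR
# Opening smooths edges; dilation reconnects an AR split into pieces
ar_mask = ndimage.binary_opening(ar_mask, iterations=1)
ar_mask = ndimage.binary_dilation(ar_mask, iterations=2)
print(ar_mask.sum(), "AR pixels")
```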
Roadway weather information system and automatic vehicle location (AVL) coordination.
DOT National Transportation Integrated Search
2011-02-28
Roadway Weather Information System and Automatic Vehicle Location Coordination involves the development of an Inclement Weather Console that provides a new capability for the state of Oklahoma to monitor weather-related roadway conditions. The go...
Automatic Calculation of Hydrostatic Pressure Gradient in Patients with Head Injury: A Pilot Study.
Moss, Laura; Shaw, Martin; Piper, Ian; Arvind, D K; Hawthorne, Christopher
2016-01-01
The non-surgical management of patients with traumatic brain injury is the treatment and prevention of secondary insults, such as low cerebral perfusion pressure (CPP). Most clinical pressure monitoring systems measure pressure relative to atmospheric pressure. If a patient is managed with their head tilted up relative to their arterial pressure transducer, then a hydrostatic pressure gradient (HPG) can act against arterial pressure and cause significant errors in calculated CPP. To correct for HPG, the arterial pressure transducer should be placed level with the intracranial pressure transducer. However, this is not always achieved. In this chapter, we describe a pilot study investigating the application of speckled computing (or "specks") for the automatic monitoring of the patient's head tilt and subsequent automatic calculation of HPG. In future applications this will allow us to automatically correct CPP to take into account any HPG.
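The physics behind the correction is just the hydrostatic column between the two transducers. A minimal sketch, assuming a blood density of 1060 kg/m³ and a nominal 30 cm transducer-to-head distance (both assumed values; the study's exact geometry and correction are not given in the abstract):

```python
import math

RHO_BLOOD = 1060.0      # kg/m^3, approximate blood density (assumed)
G = 9.81                # m/s^2
PA_PER_MMHG = 133.322   # pascals per mmHg

def hpg_mmhg(tilt_deg: float, distance_m: float) -> float:
    """Hydrostatic pressure gradient (mmHg) between the arterial transducer
    and the head, for a given head-up tilt angle and transducer-to-head
    distance along the body axis."""
    height = distance_m * math.sin(math.radians(tilt_deg))
    return RHO_BLOOD * G * height / PA_PER_MMHG

def corrected_cpp(map_mmhg, icp_mmhg, tilt_deg, distance_m=0.30):
    # Uncorrected CPP = MAP - ICP; subtract the hydrostatic error
    return map_mmhg - icp_mmhg - hpg_mmhg(tilt_deg, distance_m)

print(round(hpg_mmhg(30, 0.30), 1))         # ~11.7 mmHg at 30 degrees head-up
print(round(corrected_cpp(90, 15, 30), 1))  # naive 75 mmHg -> ~63.3 corrected
```

At a 30-degree head-up tilt the error is already on the order of 10 mmHg, which is clinically significant for CPP targets; this is why automatic tilt monitoring is worthwhile.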
Reevaluation of pollen quantitation by an automatic pollen counter.
Muradil, Mutarifu; Okamoto, Yoshitaka; Yonekura, Syuji; Chazono, Hideaki; Hisamitsu, Minako; Horiguchi, Shigetoshi; Hanazawa, Toyoyuki; Takahashi, Yukie; Yokota, Kunihiko; Okumura, Satoshi
2010-01-01
Accurate and detailed pollen monitoring is useful for selection of medication and for allergen avoidance in patients with allergic rhinitis. Burkard and Durham pollen samplers are commonly used, but are labor and time intensive. In contrast, automatic pollen counters allow simple real-time pollen counting; however, these instruments have difficulty in distinguishing pollen from small nonpollen airborne particles. Misidentification and underestimation rates for an automatic pollen counter were examined to improve the accuracy of the pollen count. The characteristics of the automatic pollen counter were determined in a chamber study with exposure to cedar pollens or soil grains. The cedar pollen counts were monitored in 2006 and 2007, and compared with those from a Durham sampler. The pollen counts from the automatic counter showed a good correlation (r > 0.7) with those from the Durham sampler when pollen dispersal was high, but a poor correlation (r < 0.5) when pollen dispersal was low. The new correction method, which took into account the misidentification and underestimation, improved this correlation to r > 0.7 during the pollen season. The accuracy of automatic pollen counting can be improved using a correction to include rates of underestimation and misidentification in a particular geographical area.
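One plausible form of such a correction, shown purely as a sketch (the paper's exact formula is not given in the abstract): discount the raw count by the local misidentification rate, then scale up for underestimation.

```python
def corrected_count(raw_count, misid_rate, underest_rate):
    """Correct an automatic pollen count for misidentification (fraction of
    counted particles that are not pollen) and underestimation (fraction of
    true pollen the counter misses). The form of this correction is
    illustrative only; rates would be calibrated per geographical area."""
    true_positives = raw_count * (1.0 - misid_rate)
    return true_positives / (1.0 - underest_rate)

# Example with assumed rates for one area
print(round(corrected_count(120, misid_rate=0.25, underest_rate=0.40)))  # -> 150
```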
LOOP marine and estuarine monitoring program, 1978-95 : volume 5 : demersal nekton.
DOT National Transportation Integrated Search
1998-01-01
The Louisiana Offshore Oil Port (LOOP) facilities in coastal Louisiana provide the United States with the country's only Superport for off-loading deep draft tankers. The facilities transport oil ashore through pipelines, and temporarily store oil be...
Smartphone data as an electronic biomarker of illness activity in bipolar disorder.
Faurholt-Jepsen, Maria; Vinberg, Maj; Frost, Mads; Christensen, Ellen Margrethe; Bardram, Jakob E; Kessing, Lars Vedel
2015-11-01
Objective methods are lacking for continuous monitoring of illness activity in bipolar disorder. Smartphones offer unique opportunities for continuous monitoring and automatic collection of real-time data. The objectives of the paper were to test the hypotheses that (i) daily electronic self-monitored data and (ii) automatically generated objective data collected using smartphones correlate with clinical ratings of depressive and manic symptoms in patients with bipolar disorder. Software for smartphones (the MONARCA I system) that collects automatically generated objective data and self-monitored data on illness activity in patients with bipolar disorder was developed by the authors. A total of 61 patients aged 18-60 years with a diagnosis of bipolar disorder according to ICD-10 used the MONARCA I system for six months. Depressive and manic symptoms were assessed monthly using the 17-item Hamilton Depression Rating Scale (HDRS-17) and the Young Mania Rating Scale (YMRS), respectively. Data are representative of over 400 clinical ratings. Analyses were computed using linear mixed-effects regression models allowing for both between-individual variation and within-individual variation over time. Analyses showed significant positive correlations between the duration of incoming and outgoing calls/day and scores on the HDRS-17, and significant positive correlations between the number and duration of incoming calls/day and scores on the YMRS, the number and duration of outgoing calls/day and scores on the YMRS, and the number of outgoing text messages/day and scores on the YMRS. Analyses showed significant negative correlations between self-monitored data (i.e., mood and activity) and scores on the HDRS-17, and significant positive correlations between self-monitored data (i.e., mood and activity) and scores on the YMRS. Finally, the automatically generated objective data were able to discriminate between affective states. Automatically generated objective data and self-monitored data collected using smartphones correlate with clinically rated depressive and manic symptoms and differ between affective states in patients with bipolar disorder. Smartphone apps represent an easy and objective way to monitor illness activity with real-time data in bipolar disorder and may serve as an electronic biomarker of illness activity. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
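A minimal sketch of the kind of mixed-effects analysis described, using statsmodels with a random intercept per patient; the column names and toy data are illustrative, not the MONARCA I dataset:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per monthly rating per patient,
# with a smartphone-derived feature aggregated over the preceding period.
data = pd.DataFrame({
    "patient_id":   [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "hdrs17":       [4, 9, 14, 6, 11, 7, 18, 13, 8],
    "incoming_min": [10.0, 16.0, 25.0, 12.0, 20.0, 14.0, 30.0, 22.0, 15.0],
})

# Linear mixed-effects model: fixed effect of daily incoming-call duration
# on HDRS-17, random intercept per patient, capturing between- and
# within-individual variation over time.
model = smf.mixedlm("hdrs17 ~ incoming_min", data, groups=data["patient_id"])
result = model.fit()
print(result.summary())
```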
Wiesel, Joseph; Salomone, Thomas J
2017-10-15
Early detection of asymptomatic atrial fibrillation (AF) provides an opportunity to treat patients to reduce their risk of stroke. Long-term residents of skilled nursing facilities frequently have multiple risk factors for strokes due to AF and may benefit from screening for AF. Patients 65 years and older in a skilled nursing facility, without a history of AF and without a pacemaker or defibrillator, were evaluated using a Microlife WatchBP Home A automatic blood pressure monitor that can detect AF when set to a triple-reading mode. Those with readings positive for AF were evaluated with a standard 12-lead electrocardiogram (ECG) or a 30-second single-channel ECG to confirm the presence of AF. A total of 101 patients were screened, with an average age of 78 years; 48 (48%) were female. Nine automatic blood pressure monitor readings were positive for possible AF. Of those, 7 (6.9%, 95% confidence interval 3.0% to 14.2%) had AF confirmed with ECG. Only 2 (2%, 95% confidence interval 0.3% to 7.7%) were false-positive readings. One-time screening for AF using an automatic blood pressure monitor in a skilled nursing facility resulted in a high number of patients with newly diagnosed AF. Copyright © 2017 Elsevier Inc. All rights reserved.
Rapid and Reliable Damage Proxy Map from InSAR Coherence
NASA Technical Reports Server (NTRS)
Yun, Sang-Ho; Fielding, Eric; Simons, Mark; Agram, Piyush; Rosen, Paul; Owen, Susan; Webb, Frank
2012-01-01
Future radar satellites will visit SoCal within a day after a disaster event. Data acquisition latency in 2015-2020 is 8 to approximately 15 hours, while data transfer latency, which often involves human/agency intervention, far exceeds the data acquisition latency; interagency cooperation is needed to establish an automatic pipeline for data transfer. The algorithm is tested with ALOS PALSAR data of Pasadena, California. Quantitative quality assessment is being pursued, including a meeting with Pasadena City Hall computer engineers to obtain a complete list of demolition/construction projects, in order to (1) estimate the probability of detection and probability of false alarm and (2) estimate the optimal threshold value.
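The core of a coherence-based damage proxy map can be sketched as a simple drop test between a pre-event and a co-event coherence map; this is a simplification of the published approach, and the threshold here is an assumed placeholder for the value that the probability-of-detection/false-alarm analysis would optimize:

```python
import numpy as np

def damage_proxy(coh_pre, coh_co, threshold=0.3):
    """Flag pixels where interferometric coherence drops sharply between a
    pre-event pair and a co-event pair; a large drop suggests surface change
    (e.g., building damage). 'threshold' would be tuned against ground truth."""
    drop = coh_pre - coh_co
    return drop > threshold

rng = np.random.default_rng(0)
coh_pre = rng.uniform(0.6, 0.9, (100, 100))   # synthetic coherence maps
coh_co = coh_pre.copy()
coh_co[40:60, 40:60] -= 0.5                   # simulated damaged block
print(damage_proxy(coh_pre, coh_co).sum(), "pixels flagged")   # -> 400
```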
Multiple-Nozzle Spray Head Applies Foam Insulation
NASA Technical Reports Server (NTRS)
Walls, Joe T.
1993-01-01
Spray head equipped with four-nozzle turret mixes two reactive components of polyurethane and polyisocyanurate foam insulating material and sprays reacting mixture onto surface to be insulated. If nozzle in use becomes clogged, fresh one automatically rotated into position, with minimal interruption of spraying process. Incorporates features recirculating and controlling pressures of reactive components to maintain quality of foam by ensuring proper blend at outset. Also used to spray protective coats on or in ships, aircraft, and pipelines. Sprays such reactive adhesives as epoxy/polyurethane mixtures. Components of spray contain solid-particle fillers for strength, fire retardance, toughness, resistance to abrasion, or radar absorption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-02-01
Upper East Fork Poplar Creek Operable Unit 2 consists of the Abandoned Nitric Acid Pipeline (ANAP). This pipeline was installed in 1951 to transport liquid wastes approximately 4800 ft from Buildings 9212, 9215, and 9206 to the S-3 Ponds. Materials known to have been discharged through the pipeline include nitric acid, depleted and enriched uranium, various metal nitrates, salts, and lead skimmings. During the mid-1980s, sections of the pipeline were removed during various construction projects. A total of 19 locations were chosen to be investigated along the pipeline for the first phase of this Remedial Investigation. Sampling consisted of drilling down to obtain a soil sample at a depth immediately below the pipeline. Additional samples were obtained deeper in the subsurface depending upon the depth of the pipeline, the depth of the water table, and the point of auger refusal. The 19 samples collected below the pipeline were analyzed by the Oak Ridge Y-12 Plant's laboratory for metals, nitrate/nitrite, and isotopic uranium. Samples collected from three boreholes were also analyzed for volatile organic compounds because these samples produced a response with organic vapor monitoring equipment. Uranium activities in the soil samples ranged from 0.53 to 13.0 pCi/g for 238U, from 0.075 to 0.75 pCi/g for 235U, and from 0.71 to 5.0 pCi/g for 234U. Maximum total values for lead, chromium, and nickel were 75.1 mg/kg, 56.3 mg/kg, and 53.0 mg/kg, respectively. The maximum nitrate/nitrite value detected was 32.0 mg-N/kg. One sample obtained adjacent to a sewer line contained various organic compounds, at least some of which were tentatively identified as fragrance chemicals commonly associated with soaps and cleaning solutions. The results of the baseline human health risk assessment for the ANAP contaminants of potential concern show no unacceptable risks to human health.
Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing
NASA Astrophysics Data System (ADS)
Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson
2014-07-01
As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, as well as to operators who can create master calibration products and produce standardized calibrated data with a short turn-around time. Upon completion, the data are ingested into the archive and portal and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products, all while allowing the user to monitor the process status and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.
NASA Astrophysics Data System (ADS)
Vetrov, A.
2009-05-01
The condition of underground constructions, communication and supply systems in cities has to be periodically monitored and controlled in order to prevent their breakage, which can result in serious accidents, especially in urban areas. At most risk of damage are underground constructions made of steel, such as the pipelines widely used for water, gas and heat supply. To ensure pipeline survivability it is necessary to carry out operative and inexpensive control of pipeline condition. Induced electromagnetic methods of geophysics can be applied to provide such diagnostics. The highly developed surface in urban areas is one of the causes hampering the application of electromagnetic diagnostic methods. The main problem is finding an appropriate place for the source line and electrodes on a limited surface area, and their optimal position relative to the observation path, so as to minimize their influence on the observed data. The author made a number of experiments on diagnostics of an underground heating-system pipeline using different positions of the source line and electrodes. The experiments were made on a 200-meter section over a pipeline 2 meters deep. The admissible length of the source line and the angle between the source line and the observation path were determined. For the experimental conditions and accuracy, the minimal length of the source line was 30 meters, and the maximum admissible angular departure from the perpendicular position was 30 degrees. The work was undertaken in cooperation with the diagnostics company DIsSO, Saint-Petersburg, Russia.
ERIC Educational Resources Information Center
Harris, Julian; Maurer, Hermann
An investigation into high-level event monitoring within the scope of a well-known multimedia application, HyperCard (a program on the Macintosh computer), is carried out. A monitoring system is defined as a system which automatically monitors usage of some activity and gathers statistics based on what it has observed. Monitor systems can give the…
LOOP marine and estuarine monitoring program, 1978-95 : volume 4 : zooplankton and ichthyoplankton.
DOT National Transportation Integrated Search
1998-01-01
The Louisiana Offshore Oil Port (LOOP) facilities in coastal Louisiana provide the United States with the country's only Superport for off-loading deep draft tankers. The three single-point mooring (SPM) structures connected by pipelines to a platfor...
NASA Astrophysics Data System (ADS)
Wu, Huijuan; Sun, Zhenshi; Qian, Ya; Zhang, Tao; Rao, Yunjiang
2015-07-01
A hydrostatic leak test of a water pipeline with a distributed optical fiber vibration sensing (DOVS) system based on phase-sensitive OTDR technology is studied in this paper. By monitoring one end of a common communication optical fiber cable laid on the inner wall of the pipe, water leakages can be detected and located easily. Different apertures under different pressures were tested, and the results show that the DOVS responds well when the aperture is equal to or larger than 4 mm and the inner pressure reaches 0.2 MPa for a steel pipe with DN 91 cm × EN 2 cm.
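Localization in an OTDR-based system follows from the round-trip time of the backscattered probe pulse. A minimal sketch, assuming a typical group refractive index for silica fiber (the paper's instrument parameters are not given):

```python
C = 299_792_458.0       # speed of light in vacuum, m/s
N_GROUP = 1.468         # typical group refractive index of silica fiber (assumed)

def event_distance_m(round_trip_s: float) -> float:
    """Distance along the fiber to a disturbance, from the round-trip time
    of the backscattered pulse in a phase-sensitive OTDR."""
    return C * round_trip_s / (2.0 * N_GROUP)

# A disturbance whose backscatter returns 10 microseconds after launch
print(round(event_distance_m(10e-6)))   # ~1021 m along the cable
```

The factor of two accounts for the pulse travelling out to the disturbance and the backscatter returning, which is what lets a single-ended measurement locate a leak anywhere along the cable.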
NASA Astrophysics Data System (ADS)
Kang, Jidong; Gianetto, James A.; Tyson, William R.
2018-03-01
Fracture toughness measurement is an integral part of the structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading, where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines that allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edge-notched tension specimens, SENT or SE(T). We focus our review on test procedure development and automation, round-robin test results, and some common concerns such as the effect of the crack tip, crack-size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.
Assessment on inflow and infiltration in sewerage systems of Kuantan, Pahang.
Yap, Hiew Thong; Ngien, Su Kong
2017-12-01
Inflow and infiltration are important aspects of sewerage systems that need to be considered during the design stage and constantly monitored once the sewerage system is in operation. The aim of this research is to analyse the relationship of rainfall and inflow/infiltration with sewage flow patterns through data collected from fieldwork. Three sewer pipelines were selected at the residential areas of Taman Lepar Hilir Saujana, Bandar Putra and Kota Sas for data collection. Sewage flow data were collected in terms of flowrate, velocity and depth of flow using flowmeters with ultrasonic sensors that utilize the continuous Doppler effect in the sewer pipelines, while rainfall intensity data were collected using rain gauges installed at the study locations. Based on the results, the average infiltration rates at Qpeak and Qave for the locations were 17% and 21%, which exceeded the respective values of 5% and 10% stated in Hammer and Hammer. The flowrate of wastewater in the sewer pipelines was found to be directly proportional to rainfall. These findings indicate that the sewer pipelines in the study areas may have been affected by capacity reduction, whereas the sewerage treatment plants receiving the wastewater influent may have been overloaded.
NASA Astrophysics Data System (ADS)
Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier
2015-03-01
With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
Vishnyakova, Dina; Pasche, Emilie; Ruch, Patrick
2012-01-01
We report on the original integration of an automatic text categorization pipeline, so-called ToxiCat (Toxicogenomic Categorizer), that we developed to perform biomedical document classification and prioritization in order to speed up the curation of the Comparative Toxicogenomics Database (CTD). The task can basically be described as a binary classification task, where a scoring function is used to rank a selected set of articles. Then components of a question-answering system are used to extract CTD-specific annotations from the ranked list of articles. The ranking function is generated using a Support Vector Machine, which combines three main modules: an information retrieval engine for MEDLINE (EAGLi), a gene normalization service (NormaGene) developed for a previous BioCreative campaign, and finally a set of answering components and an entity recognizer for diseases and chemicals. The main components of the pipeline are publicly available both as a web application and as web services. The specific integration performed for the BioCreative competition is available via a web user interface at http://pingu.unige.ch:8080/Toxicat.
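A toy stand-in for the triage step, assuming bag-of-words features only (ToxiCat combines much richer signals: retrieval scores, gene normalization, entity recognition): train an SVM on labelled abstracts and rank new ones by the decision score so curators see likely-relevant articles first.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Illustrative training data: 1 = curatable, 0 = not relevant
train_texts = [
    "cadmium exposure induces liver toxicity in rats",
    "gene expression changes after benzene inhalation",
    "a survey of museum visitor satisfaction",
    "annual report of the library board",
]
train_labels = [1, 1, 0, 0]

vec = TfidfVectorizer()
clf = LinearSVC().fit(vec.fit_transform(train_texts), train_labels)

new_texts = [
    "arsenic alters kinase signalling in exposed workers",
    "minutes of the budget committee meeting",
]
# SVM decision scores serve directly as a prioritization ranking
scores = clf.decision_function(vec.transform(new_texts))
for score, text in sorted(zip(scores, new_texts), reverse=True):
    print(f"{score:+.2f}  {text}")
```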
NASA Astrophysics Data System (ADS)
Brouwer, Albert; Brown, David; Tomuta, Elena
2017-04-01
To detect nuclear explosions, waveform data from over 240 seismic, hydroacoustic and infrasound (SHI) stations worldwide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes these data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made, and regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
NASA Astrophysics Data System (ADS)
Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.
2017-12-01
Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection; along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and acquiring data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016. Over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also make location estimates, potentially in near real time. In this presentation, we review the data recorded for these two events recorded at Stanford and in Mexico. We compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. While many such fiber-optic networks are already in place, new arrays can be created on demand, using existing fiber-optic telecom cables, for specific monitoring situations such as recording aftershocks of a large earthquake or monitoring induced seismicity.
Data visualization as a tool for improved decision making within transit agencies
DOT National Transportation Integrated Search
2007-02-01
TriMet, the regional transit provider in the Portland, OR, area has been a leader in bus transit performance monitoring using data collected via automatic vehicle location and automatic passenger counter technologies. This information is collected an...
DOT National Transportation Integrated Search
1977-06-01
In 1975, to further the development and to refine and dmonstrate multiuser Automatic Vehicle Monitoring (AVM) application, the Urban Mass Transportation Administration and the Transportation Systems Center (TSC) initiated a two-phase program. Phase I...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doug Cathro
The Lake Charles CCS Project is a large-scale industrial carbon capture and sequestration (CCS) project which will demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. Specifically, the Lake Charles CCS Project will accelerate commercialization of large-scale CO2 storage from industrial sources by leveraging synergy between a proposed petroleum coke to chemicals plant (the LCC Gasification Project) and the largest integrated anthropogenic CO2 capture, transport, and monitored sequestration program in the U.S. Gulf Coast Region. The Lake Charles CCS Project will promote the expansion of EOR in Texas and Louisiana and supply greater energy security by expanding domestic energy supplies. The capture, compression, pipeline, injection, and monitoring infrastructure will continue to sequester CO2 for many years after the completion of the term of the DOE agreement. The objectives of this project are expected to be fulfilled by working through two distinct phases. The overall objective of Phase 1 was to develop a fully definitive project basis for a competitive Renewal Application process to proceed into Phase 2 - Design, Construction and Operations. Phase 1 includes the studies attached hereto that will establish: the engineering design basis for the capture, compression and transportation of CO2 from the LCC Gasification Project, and the criteria and specifications for a monitoring, verification and accounting (MVA) plan at the Hastings oil field in Texas. The overall objective of Phase 2, provided a successful competitive down-selection, is to execute design, construction and operations of three capital projects: (1) the CO2 capture and compression equipment, (2) a Connector Pipeline from the LCC Gasification Project to the Green Pipeline owned by Denbury and an affiliate of Denbury, and (3) a comprehensive MVA system at the Hastings oil field.
Ultrasonic guided wave interpretation for structural health inspections
NASA Astrophysics Data System (ADS)
Bingham, Jill Paisley
Structural Health Management (SHM) combines the use of onboard sensors with artificial intelligence algorithms to automatically identify and monitor structural health issues. A fully integrated approach to SHM systems demands an understanding of the sensor output relative to the structure, along with sophisticated prognostic systems that automatically draw conclusions about structural integrity issues. Ultrasonic guided wave methods allow us to examine the interaction of multimode signals within key structural components. Since they propagate relatively long distances within plate- and shell-like structures, guided waves allow inspection of greater areas with fewer sensors, making this technique attractive for a variety of applications. This dissertation describes the experimental development of automatic guided wave interpretation for three real-world applications. Using the guided wave theories for idealized plates, we have systematically developed techniques for identifying the mass loading of underwater limpet mines on US Navy ship hulls, characterizing the type and bonding of protective coatings on large-diameter pipelines, and detecting the thinning effects of corrosion on aluminum aircraft structural stringers. In each of these circumstances the received signals are too complex to interpret without knowledge of the guided wave physics. We employ a signal processing technique called the Dynamic Wavelet Fingerprint Technique (DWFT) in order to render the guided wave mode information in two-dimensional binary images. The use of wavelets allows us to keep track of both time and scale features from the original signals. With simple image processing we have developed automatic extraction algorithms for features that correspond to the arrival times of the guided wave modes of interest for each of the applications. Due to the dispersive nature of the guided wave modes, the mode arrival times give details of the structure in the propagation path. For further understanding of how the guided wave modes propagate through real structures, we have developed parallel-processing 3D elastic wave simulations using the elastodynamic finite integration technique (EFIT). This full-field numerical simulation technique easily examines models too complex for analytical solutions. We have developed the algorithm to handle built-up 3D structures as well as layers with different material properties and surface detail. The simulations produce informative visualizations of the guided wave modes in the structures, as well as the output from sensors placed in the simulation space to mimic their placement in the experiments. Using the previously developed mode-extraction algorithms, we were then able to compare our 3D EFIT data to their experimental counterparts with consistency.
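A crude sketch of the fingerprinting idea, assuming a continuous wavelet transform whose normalized magnitude is sliced at a few levels to form a binary time-scale image; the published technique uses a more elaborate ridge/valley construction, so treat this as conceptual only:

```python
import numpy as np
import pywt

def wavelet_fingerprint(signal, scales=np.arange(1, 33), levels=(0.6, 0.8)):
    """Binary time-scale 'fingerprint' of a waveform: CWT magnitude,
    normalized and thresholded at a few contour levels (illustrative)."""
    coefs, _ = pywt.cwt(signal, scales, "morl")
    mag = np.abs(coefs)
    mag /= mag.max() + 1e-12
    return np.stack([mag > t for t in levels]).any(axis=0).astype(np.uint8)

# Synthetic two-mode record: a slow and a fast wave packet arriving apart
t = np.linspace(0, 1, 1024)
sig = (np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 0.3) ** 2) / 0.002)
       + np.sin(2 * np.pi * 120 * t) * np.exp(-((t - 0.6) ** 2) / 0.002))
fp = wavelet_fingerprint(sig)
# Each mode arrival shows up as a distinct blob along the time axis,
# which simple image processing can then pick out as an arrival time.
print(fp.shape, fp.sum())
```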
Compile-time estimation of communication costs in multicomputers
NASA Technical Reports Server (NTRS)
Gupta, Manish; Banerjee, Prithviraj
1991-01-01
An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.
Li, Yingxue; Hu, Yiying; Yang, Jingang; Li, Xiang; Liu, Haifeng; Xie, Guotong; Xu, Meilin; Hu, Jingyi; Yang, Yuejin
2017-01-01
Treatment effectiveness plays a fundamental role in patient therapies. In most observational studies, researchers design an analysis pipeline for a specific treatment based on the study cohort. To evaluate other treatments in the data set, much repeated and multifarious work, including cohort construction and statistical analysis, needs to be done. In addition, as treatments often have an intrinsic hierarchical relationship, many rational comparable treatment pairs can be derived as new treatment variables besides the original single-treatment one from the original cohort data set. In this paper, we propose an automatic treatment effectiveness analysis approach to solve this problem. With our approach, clinicians can assess the effect of treatments not only more conveniently but also more thoroughly and comprehensively. We applied this method to a real-world case of estimating drug effectiveness on the Chinese Acute Myocardial Infarction (CAMI) data set, and some meaningful results were obtained for potential improvement of patient treatments.
NASA Astrophysics Data System (ADS)
Valdes-Abellan, Javier; Jiménez-Martínez, Joaquin; Candela, Lucila
2013-04-01
For monitoring the vadose zone, different strategies can be chosen depending on the objectives and the scale of observation. The effects of non-conventional water use on the vadose zone might produce impacts in porous media which could lead to changes in soil hydraulic properties, among others. Controlling these possible effects requires an accurate monitoring strategy that tracks the volumetric water content, θ, and soil pressure, h, along the studied profile. According to the available literature, different monitoring systems have been deployed independently, but comparative studies between techniques have received less attention. An experimental plot of 9 × 5 m² was set up with automatic and non-automatic sensors to measure θ and h down to 1.5 m depth. The non-automatic system consisted of ten Jet Fill tensiometers at 30, 45, 60, 90 and 120 cm (Soil Moisture®) and a 44 mm (i.d.) polycarbonate access tube for soil moisture measurements with a TRIME FM TDR portable probe (IMKO®). Vertical installation was carefully performed; measurements with this system were manual, twice a week for θ and three times per week for h. The automatic system was composed of five 5TE sensors (Decagon Devices®) installed at 20, 40, 60, 90 and 120 cm for θ measurements and one MPS1 sensor (Decagon Devices®) at 60 cm depth for h. Installation took place laterally, in a 40-50 cm long hole bored in the side of an excavated trench. All automatic sensors recorded hourly and stored their readings in a data-logger. Boundary conditions were controlled with a volume-meter and a meteorological station; ET was modelled with the Penman-Monteith equation. Soil characterization included bulk density, gravimetric water content, grain size distribution, saturated hydraulic conductivity and soil water retention curves, determined following laboratory standards. Soil mineralogy was determined by X-ray diffractometry. Unsaturated soil hydraulic parameters were model-fitted with the SWRC-fit code and with ROSETTA based on soil textural fractions. Water flow was simulated with HYDRUS-1D independently for the automatic and non-automatic data. Good agreement between the collected automatic and non-automatic data and the modelled results can be recognized: the general trend was captured, except for outlier values, as expected. Slight differences were found between the hydraulic properties obtained from laboratory determinations and from inverse modelling of the two approaches, and differences of up to 14% in the flux through the lower boundary were detected between the two strategies. According to the results, the automatic sensors have higher resolution and are therefore more appropriate for detecting subtle changes in soil hydraulic properties. Nevertheless, if the aim of the research is to track the general trend of water dynamics, no significant differences were observed between the two systems.
The design of the intelligent monitoring system for dam safety
NASA Astrophysics Data System (ADS)
Yuan, Chun-qiao; Jiang, Chen-guang; Wang, Guo-hui
2008-12-01
Being a vital manmade water-control structure, a dam plays a very important role in human life and production. To make a dam run safely, the best design and superior construction quality are paramount; moreover, as the working period increases, various dynamic, alternating and adverse loads gradually generate distortions in the dam structure, which can lead to potential safety problems or even a disaster (dam burst). There are many signs before the occurrence of a dam accident, so timely and effective surveying of dam distortion is important. On this basis, two intelligent (automatic) monitoring systems for dam safety, based on RTK-GPS technology and on a measuring robot, have been developed. The basic principle, monitoring method and monitoring process of these two intelligent (automatic) monitoring systems are introduced. Monitoring examples are presented, and basic rules for dam warning based on actual monitoring data are put forward.
Dormann, H; Criegee-Rieck, M; Neubert, A; Egger, T; Levy, M; Hahn, E G; Brune, K
2004-02-01
To investigate the effectiveness of a computer monitoring system that detects adverse drug reactions (ADRs) by laboratory signals in gastroenterology. A prospective, 6-month, pharmaco-epidemiological survey was carried out on a gastroenterological ward at the University Hospital Erlangen-Nuremberg. Two methods were used to identify ADRs. (i) All charts were reviewed daily by physicians and clinical pharmacists. (ii) A computer monitoring system generated a daily list of automatic laboratory signals and alerts of ADRs, including patient data and dates of events. One hundred and nine ADRs were detected in 474 admissions (377 patients). The computer monitoring system generated 4454 automatic laboratory signals from 39 819 laboratory parameters tested, and issued 2328 alerts, 914 (39%) of which were associated with ADRs; 574 (25%) were associated with ADR-positive admissions. Of all the alerts generated, signals of hepatotoxicity (1255), followed by coagulation disorders (407) and haematological toxicity (207), were prevalent. Correspondingly, the prevailing ADRs were concerned with the metabolic and hepato-gastrointestinal system (61). The sensitivity was 91%: 69 of 76 ADR-positive patients were indicated by an alert. The specificity of alerts was increased from 23% to 76% after implementation of an automatic laboratory signal trend monitoring algorithm. This study shows that a computer monitoring system is a useful tool for the systematic and automated detection of ADRs in gastroenterological patients.
HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation
NASA Astrophysics Data System (ADS)
Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.
2006-03-01
As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, living or deceased. HIPAA requires security services supporting implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. These controls, proposed in the HIPAA Security Standards, are realized here as audit trails. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation, or forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed the HIPAA-Compliant Automatic Monitoring System for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running in each PACS component computer, and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can monitor and control the entire operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation and gives preliminary results obtained with this monitoring system on a clinical RIS-integrated PACS.
Design of an automatic production monitoring system on job shop manufacturing
NASA Astrophysics Data System (ADS)
Prasetyo, Hoedi; Sugiarto, Yohanes; Rosyidi, Cucuk Nur
2018-02-01
Every production process requires a monitoring system, so that the desired efficiency and productivity can be monitored at any time. Such a system is also needed in job-shop manufacturing, which is mainly influenced by the manufacturing lead time. Processing time is one of the factors that affect the manufacturing lead time. In a conventional company, the recording of processing time is done manually by the operator on a sheet of paper. This method is prone to errors. This paper aims to overcome this problem by creating a system which is able to record and monitor the processing time automatically. The solution is realized by utilizing an electric current sensor, barcodes, RFID, a wireless network and a Windows-based application. An automatic monitoring device is attached to the production machine and equipped with a touch-screen LCD so that the operator can use it easily. Operator identity is recorded through an RFID tag embedded in the operator's ID card. The workpiece data are retrieved from the database by scanning the barcode listed on its monitoring sheet. A sensor is mounted on the machine to measure the actual machining time. The system's outputs are the actual processing time and machine capacity information. The system is connected wirelessly to a workshop planning application belonging to the firm. Test results indicated that all functions of the system run properly. The system successfully enables supervisors, PPIC staff and higher-level management to monitor processing times quickly and with better accuracy.
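The current-sensor idea reduces to a small state machine: the machine is "cutting" while the measured current exceeds a threshold, and the actual machining time is the sum of the above-threshold intervals. A minimal sketch with an assumed threshold (the paper does not give its sensor parameters):

```python
import time

CURRENT_THRESHOLD_A = 1.5   # above this the machine is assumed to be cutting (assumed)

class ProcessingTimer:
    """Accumulate actual machining time from periodic current readings."""
    def __init__(self):
        self.running = False
        self.started_at = 0.0
        self.total_s = 0.0

    def update(self, amps: float, now: float | None = None) -> None:
        now = time.monotonic() if now is None else now
        if amps > CURRENT_THRESHOLD_A and not self.running:
            self.running, self.started_at = True, now     # machining started
        elif amps <= CURRENT_THRESHOLD_A and self.running:
            self.running = False
            self.total_s += now - self.started_at         # machining stopped

timer = ProcessingTimer()
for t, amps in [(0, 0.2), (1, 3.1), (2, 3.0), (3, 0.1), (4, 2.8), (6, 0.3)]:
    timer.update(amps, now=float(t))
print(timer.total_s, "seconds of actual machining")   # -> 4.0
```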
The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction
NASA Astrophysics Data System (ADS)
Labrie, K.; Hirst, P.; Allen, C.
2011-07-01
Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.
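A minimal illustration of the recipe-of-primitives concept described above: a recipe is an ordered list of primitive names, primitives are plain functions, and a shared context lets one step's result steer later ones (dynamic flow control). This is a sketch of the concept only, not the actual Gemini Recipe System API.

```python
# Primitives operate on a shared context dictionary.
def subtract_bias(ctx):
    ctx["data"] = [v - ctx.get("bias", 100) for v in ctx["data"]]

def flag_quality(ctx):
    ctx["bad"] = any(v < 0 for v in ctx["data"])   # feeds later decisions

def stack(ctx):
    if ctx["bad"]:                 # decision based on current data state
        print("skipping stack: quality flag set")
        return
    ctx["stacked"] = sum(ctx["data"]) / len(ctx["data"])

PRIMITIVES = {"subtractBias": subtract_bias,
              "flagQuality": flag_quality,
              "stack": stack}
RECIPE = ["subtractBias", "flagQuality", "stack"]   # science-oriented sequence

ctx = {"data": [150.0, 180.0, 210.0]}
for name in RECIPE:
    PRIMITIVES[name](ctx)
print(ctx.get("stacked"))   # -> 80.0
```

Keeping the recipe as data rather than code is what allows the same primitives to run unchanged in a pipeline or interactively.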
The agile alert system for gamma-ray transients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bulgarelli, A.; Trifoglio, M.; Gianotti, F.
2014-01-20
In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.
The AGILE Alert System for Gamma-Ray Transients
NASA Astrophysics Data System (ADS)
Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Parmiggiani, N.; Fioretti, V.; Chen, A. W.; Vercellone, S.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Beneventano, D.; Argan, A.; Trois, A.; Scalise, E.; Longo, F.; Pellizzoni, A.; Pucella, G.; Colafrancesco, S.; Conforti, V.; Tempesta, P.; Cerone, M.; Sabatini, P.; Annoni, G.; Valentini, G.; Salotti, L.
2014-01-01
In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. As proper flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.
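The threshold-alert step itself is simple to picture. A sketch of the idea, with placeholder source names, thresholds and a stand-in dispatch function (none of these are AGILE pipeline values or interfaces):

```python
# Per-source alert thresholds in ph/cm^2/s; values are illustrative only.
THRESHOLDS = {"CRAB": 5.0e-6, "3C454.3": 2.0e-6}

def check_downlink(fluxes: dict[str, float]) -> list[str]:
    """Compare each measured flux from a downlink with its threshold."""
    alerts = []
    for source, flux in fluxes.items():
        limit = THRESHOLDS.get(source)
        if limit is not None and flux > limit:
            alerts.append(f"ALERT {source}: flux {flux:.2e} > {limit:.2e}")
    return alerts

def dispatch(alerts: list[str]) -> None:
    for msg in alerts:
        print("SMS/e-mail/push:", msg)   # stand-in for the notification channels

dispatch(check_downlink({"CRAB": 6.1e-6, "3C454.3": 1.2e-6}))
```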
DICOM router: an open source toolbox for communication and correction of DICOM objects.
Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich
2005-03-01
Today, the exchange of medical images and clinical information is well defined by the Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open-source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (XSL) file; using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically in different application areas: the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
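A minimal sketch of one correction stage of such a pipeline, written in Python with pydicom rather than the toolbox's Java plug-in API (the file names and the specific fix are examples, not the toolbox's actual rules):

```python
import pydicom

def correct(in_path: str, out_path: str) -> None:
    """Read a DICOM object, normalize a non-conforming attribute, and
    write it out for forwarding to the next pipeline stage."""
    ds = pydicom.dcmread(in_path)
    # Example fix: some legacy modalities emit an empty or missing
    # PatientID, which downstream systems reject.
    if not getattr(ds, "PatientID", ""):
        ds.PatientID = "UNKNOWN"
    ds.save_as(out_path)

correct("incoming.dcm", "outgoing.dcm")   # hypothetical file names
```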
Li, Yuanyao; Huang, Jinsong; Jiang, Shui-Hua; Huang, Faming; Chang, Zhilu
2017-12-07
It is important to monitor displacement time series and to explore the failure mechanisms of reservoir landslides for early warning. Traditionally, it has been a challenge to monitor landslide displacements in real time and automatically. The Global Positioning System (GPS) is considered the best real-time monitoring technology; however, the accuracy of GPS-monitored landslide displacements has not been assessed effectively. In this study, a web-based GPS system is developed to monitor landslide displacements in real time and automatically, and the discrete wavelet transform (DWT) is proposed to assess the accuracy of the GPS monitoring displacements. The Wangmiao landslide in the Three Gorges Reservoir area in China is used as a case study. The results show that the web-based GPS system offers high precision, real-time operation, remote control, and automation for landslide monitoring; the root mean square errors of the monitored landslide displacements are less than 5 mm. The results also show that a rapidly falling reservoir water level can trigger the reactivation of the Wangmiao landslide; heavy rainfall is an important factor as well, but not a crucial one.
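A minimal sketch of the accuracy-assessment idea follows, assuming the PyWavelets package and treating the residual between the raw displacement series and its wavelet-denoised trend as measurement noise; the wavelet, decomposition level, and thresholding rule are illustrative choices, not necessarily those of the paper.

    # Sketch of assessing GPS displacement accuracy with a DWT: the
    # residual after wavelet denoising approximates measurement noise.
    import numpy as np
    import pywt

    def dwt_rmse(displacements, wavelet="db4", level=4):
        """displacements: 1-D numpy array (mm). Returns RMSE of the
        noise residual, an accuracy estimate for the GPS series."""
        coeffs = pywt.wavedec(displacements, wavelet, level=level)
        # Universal threshold estimated from the finest detail level.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(len(displacements)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        trend = pywt.waverec(coeffs, wavelet)[: len(displacements)]
        noise = displacements - trend
        return float(np.sqrt(np.mean(noise ** 2)))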
Pipeline Processing with an Iterative, Context-based Detection Model
2014-04-19
... stripping the incoming data stream of repeating and irrelevant signals prior to running primary detectors; adaptive beamforming and matched field processing ... framework. Keywords: pattern detectors, correlation detectors, subspace detectors, matched field detectors, nuclear explosion monitoring.
Functionalized multi-walled carbon nanotube based sensors for distributed methane leak detection
This paper presents a highly sensitive, energy-efficient, and low-cost distributed methane (CH4) sensor system (DMSS) for continuous monitoring, detection, and localization of CH4 leaks in natural gas infrastructure such as transmission and distribution pipelines, wells, and produc...
DOT National Transportation Integrated Search
2016-10-01
This University of Maryland (UMD) project, in cooperation with Starodub Inc., had the following objectives: 1) provide data analysis support for 40 bridge decks; 2) develop the analysis pipeline for producing structural reports according to the ...
Extending the Fermi-LAT Data Processing Pipeline to the Grid
NASA Astrophysics Data System (ADS)
Zimmer, S.; Arrabito, L.; Glanzman, T.; Johnson, T.; Lavalley, C.; Tsaregorodtsev, A.
2012-12-01
The Data Handling Pipeline (“Pipeline”) has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been used to completely automate the production of data quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also substantial loads on the pipeline and computing resources; unlike Level 1, these can run continuously for weeks or months at a time. The Pipeline also receives heavy use for production Monte Carlo tasks. In daily use it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing before the next download arrives. The need for manual intervention has been reduced to less than 0.01% of submitted jobs. The Pipeline software is written almost entirely in Java and comprises several modules. It provides web services for online monitoring, with charts summarizing workflow and performance information. The server supports communication with several batch systems, such as LSF and BQS and, more recently, Sun Grid Engine and Condor. This is accomplished through dedicated job-control services that, for Fermi, run at SLAC and at the other computing site involved in this large-scale framework, the Lyon computing center of IN2P3. In addition, a separate interface to the DIRAC system, different in its task logic, is being evaluated in order to communicate with EGI sites and utilize Grid resources, relying on dedicated Grid-optimized systems rather than developing our own. More recently, the Pipeline and its associated data catalog have been generalized for use by other experiments; they are currently used by the Enriched Xenon Observatory (EXO) and Cryogenic Dark Matter Search (CDMS) experiments, as well as for Monte Carlo simulations for the future Cherenkov Telescope Array (CTA).
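As a toy illustration of the download-driven fan-out described above, the sketch below (Python, with a local thread pool standing in for a batch farm) shows how each downlink can spawn a batch of jobs whose failures are counted for operator follow-up; all names and numbers are illustrative assumptions, not the actual Pipeline interfaces.

    # Toy sketch: each new downlink spawns a batch of processing jobs;
    # a thread pool stands in for LSF/Condor-style batch submission.
    import concurrent.futures

    def process_chunk(download_id, chunk):
        # Stand-in for one Level 1 reconstruction job.
        return f"{download_id}:chunk{chunk}:done"

    def handle_download(download_id, n_jobs=2000):
        with concurrent.futures.ThreadPoolExecutor(max_workers=64) as pool:
            futures = [pool.submit(process_chunk, download_id, c)
                       for c in range(n_jobs)]
        # Executor shutdown waits for completion; count failed jobs.
        failed = [f for f in futures if f.exception() is not None]
        # With <0.01% manual intervention, nearly all jobs complete
        # cleanly; failures would be retried or escalated.
        return len(failed)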
Instrument Performance Monitoring at Gemini North
NASA Astrophysics Data System (ADS)
Emig, Kimberly; Pohlen, M.; Chene, A.
2014-01-01
An instrument performance monitoring (IPM) project at the Gemini North Observatory evaluates the delivered throughput and sensitivity of, among other instruments, the Near-Infrared Integral Field Spectrometer (NIFS), the Gemini Near-Infrared Spectrograph (GNIRS), and the Gemini Multi-Object Spectrograph (GMOS-N). Systematic observations of standard stars allow the quality of the instruments and mirror to be assessed periodically. An automated pipeline has been implemented to process and analyze data obtained with NIFS, the GNIRS cross-dispersed (XD) and long-slit (LS) modes, and GMOS (photometry and spectroscopy). This poster focuses on NIFS and GNIRS. We present the spectroscopic throughput determined for the ZJHK bands on NIFS, the XJHKLM bands for the GNIRS XD mode, and the K band for GNIRS LS. Additionally, the sensitivity is available for the JHK bands in NIFS and GNIRS XD, and for the K band in GNIRS LS. We consider data taken as early as March 2011. Furthermore, the pipeline setup and the methods used to determine throughput and sensitivity are described.
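A hedged sketch of how such periodic standard-star measurements might feed an automatic degradation flag is shown below; the baseline rule, threshold, and sample values are assumptions for illustration, not the Gemini pipeline's actual criteria.

    # Sketch: flag throughput degradation against a running baseline.
    import statistics

    def flag_degradation(history, latest, drop=0.10):
        """Flag when the latest standard-star throughput falls more
        than `drop` (fractional) below the median of past epochs."""
        baseline = statistics.median(history)
        return latest < baseline * (1.0 - drop)

    past = [0.41, 0.40, 0.42, 0.39, 0.41]  # per-epoch K-band throughput
    print(flag_degradation(past, 0.33))    # -> True, worth investigating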
Report on Phase 1 Tests of Fairchild Automatic Vehicle Monitoring (AVM) System
DOT National Transportation Integrated Search
1977-08-01
During the winter of 1976-77 four different techniques for automatically locating land vehicles were tested in both the low and high-rise regions in Philadelphia, Pennsylvania. The tests were carried out by four different companies under separate con...
Automatic optometer operates with infrared test pattern
NASA Technical Reports Server (NTRS)
Cornsweet, T. N.; Crane, H. D.
1970-01-01
The refractive strength of the human eye is monitored by an optometer that automatically and continuously images an infrared test pattern onto the retina. The eye's condition of focus at any instant is determined from the optometer settings needed to maintain focus of the pattern on the retina.
NASA Astrophysics Data System (ADS)
Yusop, Hanafi M.; Ghazali, M. F.; Yusof, M. F. M.; Remli, M. A. Pi; Kamarulzaman, M. H.
2017-10-01
In recent studies, the analysis of pressure transient signals has been shown to be an accurate and low-cost method for leak and feature detection in water distribution systems. Transient phenomena occur due to sudden changes in fluid propagation in pipeline systems, caused by rapid pressure and flow fluctuations during events such as the rapid closing and opening of valves or pump failure. In this paper, the feasibility of the Hilbert-Huang transform (HHT) in analysing pressure transient signals is presented and discussed. HHT decomposes a signal into intrinsic mode functions (IMFs); its drawback, however, is the difficulty of selecting the IMF best suited to the subsequent post-processing step, the Hilbert transform (HT). This paper shows that applying the integrated kurtosis-based algorithm for z-filter technique (I-Kaz) to the kurtosis ratio (I-Kaz-kurtosis) enables automatic selection of the IMF to be used. The technique is demonstrated on a 57.90-meter medium-density polyethylene (MDPE) pipe installed with a single artificial leak. The analysis results using the I-Kaz-kurtosis ratio confirm that the method can select the IMF automatically even when the noise-level ratio of the signal is low. The I-Kaz-kurtosis ratio method is therefore recommended as an automatic IMF selection technique for HHT analysis.
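The following is a simplified sketch of automatic IMF selection, assuming the IMFs have already been produced by an EMD step (e.g., via the PyEMD package) and using plain kurtosis as a stand-in for the paper's I-Kaz-kurtosis ratio, whose exact formulation is not reproduced here.

    # Simplified sketch: score each IMF by kurtosis and pick the most
    # impulsive one for Hilbert demodulation. Plain kurtosis is used
    # here as a stand-in for the I-Kaz-kurtosis ratio.
    import numpy as np
    from scipy.stats import kurtosis
    from scipy.signal import hilbert

    def select_imf(imfs):
        """imfs: iterable of 1-D arrays from an EMD decomposition.
        Returns the index of the highest-kurtosis IMF (the one most
        likely to carry the leak transient) and its envelope."""
        scores = [kurtosis(imf) for imf in imfs]
        best = int(np.argmax(scores))
        envelope = np.abs(hilbert(imfs[best]))
        return best, envelope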
NCBI prokaryotic genome annotation pipeline.
Tatusova, Tatiana; DiCuccio, Michael; Badretdin, Azat; Chetvernin, Vyacheslav; Nawrocki, Eric P; Zaslavsky, Leonid; Lomsadze, Alexandre; Pruitt, Kim D; Borodovsky, Mark; Ostell, James
2016-08-19
Recent technological advances have opened unprecedented opportunities for large-scale sequencing and analysis of populations of pathogenic species in disease outbreaks, as well as for large-scale diversity studies aimed at expanding our knowledge across the whole domain of prokaryotes. To meet the challenge of timely interpretation of the structure, function, and meaning of this vast genetic information, a comprehensive approach to automatic genome annotation is critically needed. In collaboration with Georgia Tech, NCBI has developed a new approach to genome annotation that combines alignment-based methods with methods that predict protein-coding and RNA genes and other functional elements directly from sequence. A new gene-finding tool, GeneMarkS+, uses the combined evidence of protein and RNA placement by homology as an initial map of annotation to generate and modify ab initio gene predictions across the whole genome. Thus, NCBI's new Prokaryotic Genome Annotation Pipeline (PGAP) relies more on sequence similarity when confident comparative data are available, and more on statistical predictions in the absence of external evidence. The pipeline provides a framework for generation and analysis of annotation across the full breadth of prokaryotic taxonomy. For additional information on PGAP see https://www.ncbi.nlm.nih.gov/genome/annotation_prok/ and the NCBI Handbook, https://www.ncbi.nlm.nih.gov/books/NBK174280/.
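As a toy illustration of the evidence-combination principle (favoring homology-supported calls and falling back on ab initio predictions only where no supported call exists), consider the sketch below; intervals are simple (start, end) pairs, and the function is an assumption for illustration, not PGAP or GeneMarkS+ code.

    # Toy sketch: keep homology-supported gene calls and add ab initio
    # predictions only where they do not overlap a supported call.
    def merge_calls(homology, ab_initio):
        def overlaps(a, b):
            return a[0] <= b[1] and b[0] <= a[1]
        merged = list(homology)
        for gene in ab_initio:
            if not any(overlaps(gene, kept) for kept in merged):
                merged.append(gene)
        return sorted(merged)

    print(merge_calls([(100, 400), (900, 1500)],
                      [(120, 390), (2000, 2600)]))
    # -> [(100, 400), (900, 1500), (2000, 2600)]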