Sample records for stream processing platform

  1. Learning Analytics Platform, towards an Open Scalable Streaming Solution for Education

    ERIC Educational Resources Information Center

    Lewkow, Nicholas; Zimmerman, Neil; Riedesel, Mark; Essa, Alfred

    2015-01-01

    Next generation digital learning environments require delivering "just-in-time feedback" to learners and those who support them. Unlike traditional business intelligence environments, streaming data requires resilient infrastructure that can move data at scale from heterogeneous data sources, process the data quickly for use across…

  2. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes

    PubMed Central

    2016-01-01

    The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition. PMID:27983788
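The just-in-time idea described above — compress each data file as it finishes acquisition and stream it to the processing server, rather than waiting for the whole batch — can be sketched as follows. This is our illustration, not XCMS Stream code; the file names and the `send` callback are invented.

```python
# Hypothetical sketch of the XCMS Stream idea: as each raw file finishes
# acquisition, compress it and ship it to a processing server immediately.
import gzip
import io

def compress_file_bytes(raw: bytes) -> bytes:
    """Gzip-compress one acquired data file before streaming it."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        gz.write(raw)
    return buf.getvalue()

def stream_as_acquired(acquired_files, send):
    """Compress and send each file as soon as it is acquired (just-in-time)."""
    for name, raw in acquired_files:
        send(name, compress_file_bytes(raw))

# Usage: pretend two sample files finish acquisition one after another.
sent = []
stream_as_acquired(
    [("sample_001.mzml", b"mz data " * 1000),
     ("sample_002.mzml", b"mz data " * 1000)],
    send=lambda name, payload: sent.append((name, payload)),
)
```

Because compression and transfer overlap with the instrument's acquisition time, the processing servers can begin work while later samples are still being measured.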

  3. Data streaming for metabolomics: Accelerating data processing and analysis from days to minutes

    DOE PAGES

    Montenegro-Burke, J. Rafael; Aisporna, Aries E.; Benton, H. Paul; ...

    2016-12-16

The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Here, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  4. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.

    PubMed

    Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary

    2017-01-17

    The speed and throughput of analytical platforms has been a driving force in recent years in the "omics" technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  5. COTS Correlator Platform

    NASA Astrophysics Data System (ADS)

    Schaaf, Kjeld; Overeem, Ruud

    2004-06-01

Moore’s law is best exploited by using consumer market hardware. In particular, the gaming industry pushes the limit of processor performance, thus reducing the cost per raw flop even faster than Moore’s law predicts. Next to the cost benefits of Commercial-Off-The-Shelf (COTS) processing resources, there is a rapidly growing experience pool in cluster-based processing. The typical Beowulf cluster-of-PCs supercomputers are well known, and multiple examples exist of specialised cluster computers based on more advanced server nodes or even gaming stations. All these cluster machines build upon the same knowledge about cluster software management, scheduling, middleware libraries and mathematical libraries. In this study, we have integrated COTS processing resources and cluster nodes into a very high performance processing platform suitable for streaming data applications, in particular to implement a correlator. The required processing power for the correlator in modern radio telescopes is in the range of the larger supercomputers, which motivates the usage of supercomputer technology. Raw processing power is provided by graphical processors and is combined with an Infiniband host bus adapter with integrated data stream handling logic. With this processing platform a scalable correlator can be built with continuously growing processing power at consumer market prices.
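The correlator workload this record describes can be illustrated with a minimal FX-correlator sketch: each antenna's time stream is channelized with an FFT (the "F" stage), then cross-multiplied channel by channel and accumulated into visibilities (the "X" stage). This is a generic textbook sketch, not the authors' GPU/Infiniband design; the antenna signals below are synthetic.

```python
# Minimal FX correlator: FFT each antenna stream into spectra, then
# accumulate cross-power for every antenna pair, per frequency channel.
import numpy as np

def fx_correlate(streams: np.ndarray, nchan: int) -> np.ndarray:
    """streams: (n_ant, n_samples) -> visibilities (n_ant, n_ant, nchan)."""
    n_ant, n_samp = streams.shape
    n_blocks = n_samp // nchan
    blocks = streams[:, : n_blocks * nchan].reshape(n_ant, n_blocks, nchan)
    spectra = np.fft.fft(blocks, axis=-1)                     # F stage
    # X stage: average S_a * conj(S_b) over blocks for all pairs (a, b).
    return np.einsum("aij,bij->abj", spectra, np.conj(spectra)) / n_blocks

rng = np.random.default_rng(0)
common = rng.normal(size=4096)                # shared "sky" signal
streams = np.stack([common + 0.1 * rng.normal(size=4096) for _ in range(3)])
vis = fx_correlate(streams, nchan=64)
```

The `einsum` cross-multiply is the step that dominates compute cost and is what the paper offloads to consumer graphics processors.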

  6. Near real time water resources data for river basin management

    NASA Technical Reports Server (NTRS)

    Paulson, R. W. (Principal Investigator)

    1973-01-01

The author has identified the following significant results. Twenty Data Collection Platforms (DCP) are being field installed on USGS water resources stations in the Delaware River Basin. DCP's have been successfully installed and are operating well on five stream gaging stations, three observation wells, and one water quality monitor in the basin. DCP's have been installed at nine additional water quality monitors, and work is progressing on interfacing the platforms to the monitors. ERTS-related water resources data from the platforms are being provided in near real time, by the Goddard Space Flight Center to the Pennsylvania district, Water Resources Division, U.S. Geological Survey. On a daily basis, the data are computer processed by the Survey and provided to the Delaware River Basin Commission. Each daily summary contains data that were relayed during 4 or 5 of the 15 orbits made by ERTS-1 during the previous day. Water resources parameters relayed by the platforms include dissolved oxygen concentrations, temperature, pH, specific conductance, well level, and stream gage height, which is used to compute stream flow for the daily summary.
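The last processing step mentioned above — converting relayed gage height to stream flow for the daily summary — is conventionally done with a stage-discharge rating curve. The power-law form below is the standard shape; the coefficients are purely illustrative, not a real Delaware Basin rating.

```python
# Hedged sketch of a stage-discharge rating: Q = a * (h - h0)**b,
# with illustrative coefficients (not a calibrated USGS rating).

def discharge_from_stage(stage_ft: float, a: float = 50.0, b: float = 1.8,
                         stage_of_zero_flow: float = 1.0) -> float:
    """Convert gage height (ft) to discharge (cfs) via a power-law rating."""
    effective = max(stage_ft - stage_of_zero_flow, 0.0)
    return a * effective ** b

# Usage: mean daily flow from four relayed gage-height readings.
daily_stages = [3.2, 3.5, 3.1, 4.0]
daily_mean_q = sum(discharge_from_stage(h) for h in daily_stages) / len(daily_stages)
```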

  7. The research and realization of digital management platform for ultra-precision optical elements within life-cycle

    NASA Astrophysics Data System (ADS)

    Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun

    2014-08-01

    In order to solve the information fusion, process integration, collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform which is based on product data and business processes by adopting the modern manufacturing technique, information technique and modern management technique. The architecture and system integration of the digital management platform are discussed in this paper. The digital management platform can realize information sharing and interaction for information-flow, control-flow and value-stream from user's needs to offline in life-cycle, and it can also enhance process control, collaborative research and service ability of ultra-precision optical elements.

  8. Real-Time Processing of Continuous Physiological Signals in a Neurocritical Care Unit on a Stream Data Analytics Platform.

    PubMed

    Bai, Yong; Sow, Daby; Vespa, Paul; Hu, Xiao

    2016-01-01

    Continuous high-volume and high-frequency brain signals such as intracranial pressure (ICP) and electroencephalographic (EEG) waveforms are commonly collected by bedside monitors in neurocritical care. While such signals often carry early signs of neurological deterioration, detecting these signs in real time with conventional data processing methods mainly designed for retrospective analysis has been extremely challenging. Such methods are not designed to handle the large volumes of waveform data produced by bedside monitors. In this pilot study, we address this challenge by building a prototype system using the IBM InfoSphere Streams platform, a scalable stream computing platform, to detect unstable ICP dynamics in real time. The system continuously receives electrocardiographic and ICP signals and analyzes ICP pulse morphology looking for deviations from a steady state. We also designed a Web interface to display in real time the result of this analysis in a Web browser. With this interface, physicians are able to ubiquitously check on the status of their patients and gain direct insight into and interpretation of the patient's state in real time. The prototype system has been successfully tested prospectively on live hospitalized patients.
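The core streaming computation described here — watching a per-beat feature of the ICP pulse and flagging departures from a steady state — can be sketched with an online mean/variance (Welford) detector. This is our illustration of the pattern, not the IBM InfoSphere Streams implementation; the feature values and threshold are invented.

```python
# Online steady-state deviation detector: maintain running mean/variance
# of a per-beat feature and alarm when a beat deviates by > k std devs.
import math

class SteadyStateDetector:
    def __init__(self, k: float = 3.0):
        self.n, self.mean, self.m2, self.k = 0, 0.0, 0.0, k

    def update(self, x: float) -> bool:
        """Feed one feature value; return True if it deviates from baseline."""
        if self.n >= 10:                       # require a baseline first
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) > self.k * std:
                return True                    # unstable dynamics: raise alarm
        # Welford's single-pass update of mean and sum of squared deviations.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return False

det = SteadyStateDetector()
alarms = [det.update(v) for v in [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1,
                                  10.2, 9.9, 10.0, 10.1, 25.0]]
```

Because the state is just three numbers, the detector runs in constant memory per signal, which is what makes it suitable for a continuous bedside stream.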

  9. Platform for intraoperative analysis of video streams

    NASA Astrophysics Data System (ADS)

    Clements, Logan; Galloway, Robert L., Jr.

    2004-05-01

    Interactive, image-guided surgery (IIGS) has proven to increase the specificity of a variety of surgical procedures. However, current IIGS systems do not compensate for changes that occur intraoperatively and are not reflected in preoperative tomograms. Endoscopes and intraoperative ultrasound, used in minimally invasive surgery, provide real-time (RT) information in a surgical setting. Combining the information from RT imaging modalities with traditional IIGS techniques will further increase surgical specificity by providing enhanced anatomical information. In order to merge these techniques and obtain quantitative data from RT imaging modalities, a platform was developed to allow both the display and processing of video streams in RT. Using a Bandit-II CV frame grabber board (Coreco Imaging, St. Laurent, Quebec) and the associated library API, a dynamic link library was created in Microsoft Visual C++ 6.0 such that the platform could be incorporated into the IIGS system developed at Vanderbilt University. Performance characterization, using two relatively inexpensive host computers, has shown the platform capable of performing simple image processing operations on frames captured from a CCD camera and displaying the processed video data at near RT rates both independent of and while running the IIGS system.

  10. Floating sample-collection platform with stage-activated automatic water sampler for streams with large variation in stage

    USGS Publications Warehouse

    Tarte, Stephen R.; Schmidt, A.R.; Sullivan, Daniel J.

    1992-01-01

A floating sample-collection platform is described for stream sites where the vertical or horizontal distance between the stream-sampling point and a safe location for the sampler exceeds the suction head of the sampler. The platform allows continuous water sampling over the entire storm-runoff hydrograph. The platform was developed for a site in southern Illinois.

  11. USGS tethered ACP platforms: New design means more safety and accuracy

    USGS Publications Warehouse

    Morlock, S.E.; Stewart, J.A.; Rehmel, M.S.

    2004-01-01

The US Geological Survey has developed an innovative tethered platform that supports an Acoustic Current Profiler (ACP) in making stream-flow measurements (the term ACP here refers to a class of instruments, not a specific brand name or model). The tethered platform reduces the hazards involved in conventional methods of stream-flow measurement: its use reduces or eliminates the time personnel spend in streams and boats or on bridges and cableways, and stream-flow measurement accuracy is increased.

  12. A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging

    NASA Astrophysics Data System (ADS)

    Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc

    2015-06-01

High-speed X-ray imaging applications play a crucial role in non-destructive investigations of dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated in a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
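An image-based trigger of the kind demonstrated here can be sketched as a frame-difference test: the stream runs continuously, and a trigger fires only when consecutive frames change by more than a threshold. This is an assumed illustration of the concept (in the paper the test runs in FPGA logic, not Python); the frames and threshold below are synthetic.

```python
# Image-based trigger sketch: fire when the mean absolute difference
# between consecutive frames exceeds a threshold.
import numpy as np

def image_trigger(frames, threshold: float):
    """Yield the index of each frame whose change from the previous
    frame exceeds the threshold."""
    prev = None
    for i, frame in enumerate(frames):
        cur = frame.astype(float)
        if prev is not None and np.abs(cur - prev).mean() > threshold:
            yield i
        prev = cur

quiet = np.zeros((8, 8), dtype=np.uint8)
event = np.full((8, 8), 200, dtype=np.uint8)      # sudden bright event
triggers = list(image_trigger([quiet, quiet, event, quiet], threshold=50.0))
```

Note that this simple criterion fires on both the onset and the decay of the event; a real trigger would typically latch on the first crossing and record a window around it.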

  13. 40-Gbps optical backbone network deep packet inspection based on FPGA

    NASA Astrophysics Data System (ADS)

    Zuo, Yuan; Huang, Zhiping; Su, Shaojing

    2014-11-01

In the information era, big data brings with it problems of high-speed transmission, storage, and real-time analysis and processing. As the main medium for data transmission, the Internet is a significant part of big data processing research. With the large-scale usage of the Internet, network data streaming is increasing rapidly. Speeds on today's main fiber-optic links have reached 40 Gbps, even 100 Gbps, so data on the optical backbone network shows the features of massive data. Generally, data services are provided via IP packets on the optical backbone network, which is built on SDH (Synchronous Digital Hierarchy); the method of mapping IP packets directly into the SDH payload is named POS (Packet over SDH) technology. Aiming at the problem of real-time processing of high-speed massive data, this paper designs a processing platform based on ATCA for 40 Gbps POS signal data stream recognition and packet content capture, with an FPGA as the central processor. This platform offers pre-processing for clustering algorithms, service traffic identification, and data mining for the subsequent big data storage and analysis with high efficiency; the operational procedure is also proposed in this paper. Four channels of 10 Gbps POS signal, decomposed by the FPGA-based analysis module, are input to the flow classification module and a pattern matching component based on TCAM. Based on the properties of the payload length and network flows, buffer management is added to the platform to keep the key flow information. According to data stream analysis, DPI (deep packet inspection), and flow load balancing, the signal is transmitted to the back-end machine through Gigabit Ethernet ports on the back board. Practice shows that the proposed platform is superior to traditional applications based on ASIC and NP.
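The flow-classification step mentioned above conventionally keys packets on their 5-tuple (source, destination, ports, protocol) — the same key a TCAM-backed classifier would match in hardware. A minimal software sketch of that grouping, with invented packet data:

```python
# 5-tuple flow classification sketch: group packets into flows and
# accumulate per-flow byte counts (invented packets, not captured traffic).
from collections import defaultdict

def classify_flows(packets):
    """Map each 5-tuple flow key to its total payload bytes."""
    flows = defaultdict(int)
    for pkt in packets:
        key = (pkt["src"], pkt["dst"], pkt["sport"], pkt["dport"], pkt["proto"])
        flows[key] += len(pkt["payload"])
    return dict(flows)

packets = [
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 1234, "dport": 80,
     "proto": "TCP", "payload": b"GET / HTTP/1.1"},
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 1234, "dport": 80,
     "proto": "TCP", "payload": b"Host: example"},
    {"src": "10.0.0.3", "dst": "10.0.0.2", "sport": 5353, "dport": 53,
     "proto": "UDP", "payload": b"dns query"},
]
flows = classify_flows(packets)
```

Per-flow state like these byte counts is what the platform's buffer management must retain so that DPI can inspect complete flows rather than isolated packets.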

  14. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.

  15. SPAIDE: A Real-time Research Platform for the Clarion CII/90K Cochlear Implant

    NASA Astrophysics Data System (ADS)

    Van Immerseel, L.; Peeters, S.; Dykmans, P.; Vanpoucke, F.; Bracke, P.

    2005-12-01

SPAIDE (sound-processing algorithm integrated development environment) is a real-time platform of Advanced Bionics Corporation (Sylmar, Calif, USA) to facilitate advanced research on sound-processing and electrical-stimulation strategies with the Clarion CII and 90K implants. The platform is meant for testing in the laboratory. SPAIDE is conceptually based on a clear separation of the sound-processing and stimulation strategies and, in particular, on the distinction between sound-processing and stimulation channels and electrode contacts. The development environment has a user-friendly interface to specify sound-processing and stimulation strategies, and includes the possibility to simulate the electrical stimulation. SPAIDE allows for real-time sound capturing from file or audio input on a PC, sound processing and application of the stimulation strategy, and streaming of the results to the implant. The platform is able to cover a broad range of research applications, from noise reduction and mimicking of normal hearing, through complex (simultaneous) stimulation strategies, to psychophysics. The hardware setup consists of a personal computer, an interface board, and a speech processor. The software is both expandable and to a great extent reusable in other applications.

  16. Geomorphic effectiveness of a long profile shape and the role of inherent geological controls in the Himalayan hinterland area of the Ganga River basin, India

    NASA Astrophysics Data System (ADS)

    Sonam; Jain, Vikrant

    2018-03-01

    Long profiles of rivers provide a platform to analyse interaction between geological and geomorphic processes operating at different time scales. Identification of an appropriate model for river long profile becomes important in order to establish a quantitative relationship between the profile shape, its geomorphic effectiveness, and inherent geological characteristics. This work highlights the variability in the long profile shape of the Ganga River and its major tributaries, its impact on stream power distribution pattern, and role of the geological controls on it. Long profile shapes are represented by the sum of two exponential functions through the curve fitting method. We have shown that coefficients of river long profile equations are governed by the geological characteristics of subbasins. These equations further define the spatial distribution pattern of stream power and help to understand stream power variability in different geological terrains. Spatial distribution of stream power in different geological terrains successfully explains spatial variability in geomorphic processes within the Himalayan hinterland area. In general, the stream power peaks of larger rivers lie in the Higher Himalaya, and rivers in the eastern hinterland area are characterised by the highest magnitude of stream power.
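The model named in this abstract, a long profile expressed as the sum of two exponentials, leads directly to an analytic slope and hence to a stream power distribution (Ω = ρgQS). A sketch with illustrative coefficients (not the fitted Ganga values, which the paper derives by curve fitting):

```python
# Long profile as z(x) = a*exp(-b*x) + c*exp(-d*x); slope S(x) = -dz/dx,
# total stream power Omega(x) = rho * g * Q * S(x). Coefficients invented.
import numpy as np

def profile(x, a, b, c, d):
    return a * np.exp(-b * x) + c * np.exp(-d * x)

def slope(x, a, b, c, d):
    return a * b * np.exp(-b * x) + c * d * np.exp(-d * x)   # -dz/dx

def stream_power(x, discharge, a, b, c, d, rho=1000.0, g=9.81):
    return rho * g * discharge * slope(x, a, b, c, d)

x = np.linspace(0.0, 200e3, 500)               # distance downstream, m
a, b, c, d = 3000.0, 5e-5, 800.0, 5e-6         # illustrative coefficients
omega = stream_power(x, discharge=500.0, a=a, b=b, c=c, d=d)
```

With discharge held constant this toy profile puts peak stream power at the upstream end; in the paper the downstream growth of discharge is what shifts the observed peaks into the Higher Himalaya.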

  17. Mapping of MPEG-4 decoding on a flexible architecture platform

    NASA Astrophysics Data System (ADS)

    van der Tol, Erik B.; Jaspers, Egbert G.

    2001-12-01

In the field of consumer electronics, the advent of new features such as Internet, games, video conferencing, and mobile communication has triggered the convergence of television and computer technologies. This requires a generic media-processing platform that enables simultaneous execution of very diverse tasks such as high-throughput stream-oriented data processing and highly data-dependent irregular processing with complex control flows. As a representative application, this paper presents the mapping of a Main Visual profile MPEG-4 for High-Definition (HD) video onto a flexible architecture platform. A stepwise approach is taken, going from the decoder application toward an implementation proposal. First, the application is decomposed into separate tasks with self-contained functionality, clear interfaces, and distinct characteristics. Next, a hardware-software partitioning is derived by analyzing the characteristics of each task such as the amount of inherent parallelism, the throughput requirements, the complexity of control processing, and the reuse potential over different applications and different systems. Finally, a feasible implementation is proposed that includes amongst others a very-long-instruction-word (VLIW) media processor, one or more RISC processors, and some dedicated processors. The mapping study of the MPEG-4 decoder proves the flexibility and extensibility of the media-processing platform. This platform enables an effective HW/SW co-design yielding a high performance density.

  18. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing.

    PubMed

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2014-10-01

Push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels.
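A toy occupancy-style model in the spirit of this abstract (our assumption, not the authors' model): a compute-bound kernel's runtime scales with the number of "waves" of thread blocks the GPU must retire, where a wave is the set of blocks that can be resident concurrently.

```python
# Wave-based runtime model for a compute-bound kernel: runtime is the
# number of block waves times the time one wave takes. All figures invented.
import math

def predicted_kernel_time(n_blocks: int, n_sms: int,
                          blocks_per_sm: int, wave_time_ms: float) -> float:
    """Time = ceil(blocks / concurrent block slots) * time per wave."""
    slots = n_sms * blocks_per_sm
    waves = math.ceil(n_blocks / slots)
    return waves * wave_time_ms

# Two kernels sharing the GPU via CUDA streams: once their combined
# blocks exceed the available slots, extra waves serialize.
t_one = predicted_kernel_time(n_blocks=120, n_sms=30, blocks_per_sm=4, wave_time_ms=2.0)
t_two = predicted_kernel_time(n_blocks=240, n_sms=30, blocks_per_sm=4, wave_time_ms=2.0)
```

This is the simplest possible occupancy model; the paper's model additionally accounts for resource consumption (registers, shared memory) that determines `blocks_per_sm`.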

  19. JS-MS: a cross-platform, modular javascript viewer for mass spectrometry signals.

    PubMed

    Rosen, Jebediah; Handy, Kyle; Gillan, André; Smith, Rob

    2017-11-06

Despite the ubiquity of mass spectrometry (MS), data processing tools can be surprisingly limited. To date, there is no stand-alone, cross-platform 3-D visualizer for MS data. Available visualization toolkits require large libraries with multiple dependencies and are not well suited for custom MS data processing modules, such as MS storage systems or data processing algorithms. We present JS-MS, a 3-D, modular JavaScript client application for viewing MS data. JS-MS provides several advantages over existing MS viewers, such as a dependency-free, browser-based, one click, cross-platform install and better navigation interfaces. The client includes a modular Java backend with a novel streaming .mzML parser to demonstrate the API-based serving of MS data to the viewer. JS-MS enables custom MS data processing and evaluation by providing fast, 3-D visualization using improved navigation without dependencies. JS-MS is publicly available with a GPLv2 license at github.com/optimusmoose/jsms.

  20. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    PubMed

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R-peak detection algorithm on the adaptively sampled ECG signals consumes only 45.6 μW and leads to 36% less overall system power consumption.
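The adaptive-sampling principle described here can be sketched in a few lines: keep a sample only when the signal has moved enough since the last kept sample, so the flat ECG baseline is sampled sparsely while steep QRS edges are sampled densely. This is an illustrative software analogue, not the chip's actual logic; the toy waveform and threshold are invented.

```python
# Rate-of-change adaptive sampling sketch: retain a sample only when it
# differs from the last kept sample by at least a threshold.

def adaptive_sample(samples, threshold: float):
    """Return (index, value) pairs kept by the rate-of-change criterion."""
    kept = [(0, samples[0])]
    for i in range(1, len(samples)):
        if abs(samples[i] - kept[-1][1]) >= threshold:
            kept.append((i, samples[i]))
    return kept

# Toy "heartbeat": flat baseline, a sharp peak, then baseline again.
signal = [0.0, 0.01, 0.02, 0.01, 0.9, 1.8, 0.7, 0.02, 0.01, 0.0]
kept = adaptive_sample(signal, threshold=0.5)
```

Only the samples around the peak survive, which is the mechanism behind the reduced ADC and radio power consumption reported above.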

  1. MMX-I: data-processing software for multimodal X-ray imaging and tomography.

    PubMed

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-05-01

A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline, even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments.

  2. A Software Defined Radio Based Architecture for the Reagan Test Site Telemetry Modernization (RTM) Program

    DTIC Science & Technology

    2015-10-26

    platforms and are quickly using up available spectrum. The national need in the commercial sector with emerging technologies such as 5G is pushing for...recovered and post processed later. The Front End Server also sends selected data stream across a high speed network link to the centralized

  3. Experiences with the Twitter Health Surveillance (THS) System

    PubMed Central

    Rodríguez-Martínez, Manuel

    2018-01-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that often times are tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype. PMID:29607412
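The THS design described above — a tweet stream flowing through built-in steps plus user-defined functions (UDFs) — can be sketched as a minimal pipeline. This is our reading of the idea, not THS code (the real system runs atop the Apache Hadoop ecosystem); the tweets and UDFs below are invented.

```python
# Minimal stream-plus-UDF pipeline: apply each UDF in order; a UDF
# returning None drops the tweet from the monitored stream.

def run_pipeline(tweets, udfs):
    out = []
    for tweet in tweets:
        for udf in udfs:
            tweet = udf(tweet)
            if tweet is None:
                break                 # tweet filtered out by this UDF
        else:
            out.append(tweet)         # survived every UDF
    return out

# Example UDFs: a health-topic filter and an enrichment step.
keep_flu = lambda t: t if "flu" in t["text"].lower() else None
tag_user = lambda t: {**t, "user_tag": t["user"].upper()}

tweets = [
    {"user": "ana", "text": "Flu season is here again"},
    {"user": "bo", "text": "Nice weather today"},
]
monitored = run_pipeline(tweets, [keep_flu, tag_user])
```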

  4. Experiences with the Twitter Health Surveillance (THS) System.

    PubMed

    Rodríguez-Martínez, Manuel

    2017-06-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that often times are tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype.

  5. On-Board Mining in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Tanner, S.; Conover, H.; Graves, S.; Ramachandran, R.; Rushing, J.

    2004-12-01

    On-board data mining can contribute to many research and engineering applications, including natural hazard detection and prediction, intelligent sensor control, and the generation of customized data products for direct distribution to users. The ability to mine sensor data in real time can also be a critical component of autonomous operations, supporting deep space missions, unmanned aerial and ground-based vehicles (UAVs, UGVs), and a wide range of sensor meshes, webs and grids. On-board processing is expected to play a significant role in the next generation of NASA, Homeland Security, Department of Defense and civilian programs, providing for greater flexibility and versatility in measurements of physical systems. In addition, the use of UAV and UGV systems is increasing in military, emergency response and industrial applications. As research into the autonomy of these vehicles progresses, especially in fleet or web configurations, the applicability of on-board data mining is expected to increase significantly. Data mining in real time on board sensor platforms presents unique challenges. Most notably, the data to be mined is a continuous stream, rather than a fixed store such as a database. This means that the data mining algorithms must be modified to make only a single pass through the data. In addition, the on-board environment requires real time processing with limited computing resources, thus the algorithms must use fixed and relatively small amounts of processing time and memory. The University of Alabama in Huntsville is developing an innovative processing framework for the on-board data and information environment. The Environment for On-Board Processing (EVE) and the Adaptive On-board Data Processing (AODP) projects serve as proofs-of-concept of advanced information systems for remote sensing platforms. The EVE real-time processing infrastructure will upload, schedule and control the execution of processing plans on board remote sensors. 
These plans provide capabilities for autonomous data mining, classification and feature extraction using both streaming and buffered data sources. A ground-based testbed provides a heterogeneous, embedded hardware and software environment representing both space-based and ground-based sensor platforms, including wireless sensor mesh architectures. The AODP project explores the EVE concepts in the world of sensor-networks, including ad-hoc networks of small sensor platforms.
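The single-pass, fixed-memory constraint described above is exactly what online algorithms address. As a generic illustration (not code from the EVE or AODP projects), Welford's method computes the mean and variance of a sensor stream in one pass with O(1) memory:

```python
# Single-pass statistics with O(1) memory (Welford's algorithm): the kind
# of streaming-friendly computation that on-board mining constraints call
# for. Each sample is seen exactly once and then discarded.

class RunningStats:
    def __init__(self):
        self.n = 0          # samples seen so far
        self.mean = 0.0     # running mean
        self.m2 = 0.0       # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # population variance of the samples seen so far
        return self.m2 / self.n if self.n > 1 else 0.0

stats = RunningStats()
for reading in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(reading)
# stats.mean is approximately 5.0, stats.variance approximately 4.0
```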

  6. A real-time remote video streaming platform for ultrasound imaging.

    PubMed

    Ahmadi, Mehdi; Gross, Warren J; Kadoury, Samuel

    2016-08-01

    Ultrasound is a viable imaging technology in remote and resource-limited areas. Ultrasonography is a user-dependent modality that requires a high degree of training and hands-on experience. However, only a limited number of skillful sonographers are located in remote areas. In this work, we aim to develop a real-time video streaming platform which allows specialist physicians to remotely monitor ultrasound exams. To this end, an ultrasound stream is captured and transmitted through a wireless network to remote computers, smart-phones and tablets. In addition, the system is equipped with a camera to track the position of the ultrasound probe. The main advantage of our work is the use of an open source platform for video streaming, which gives us more control over streaming parameters than the available commercial products. The transmission delays of the system are evaluated for several ultrasound video resolutions, and the results show that ultrasound videos close to high-definition (HD) resolution can be received and displayed on an Android tablet with a delay of 0.5 seconds, which is acceptable for accurate real-time diagnosis.

  7. WDM mid-board optics for chip-to-chip wavelength routing interconnects in the H2020 ICT-STREAMS

    NASA Astrophysics Data System (ADS)

    Kanellos, G. T.; Pleros, N.

    2017-02-01

    Multi-socket server boards have emerged to increase the processing power density at the board level and further flatten data center networks beyond leaf-spine architectures. Scaling the number of processors per board, however, challenges current electronic technologies, as it requires high-bandwidth interconnects and high-throughput switches with an increased number of ports that are currently unavailable. On-board optical interconnects have proven their potential to efficiently satisfy the bandwidth needs, but their use has been limited to parallel links without any smart routing functionality. With CWDM optical interconnects already a commodity, cyclical wavelength routing, previously proposed to fit datacom needs for rack-to-rack and board-to-board communication, now becomes a promising on-board routing platform. ICT-STREAMS is a European research project that aims to combine WDM parallel on-board transceivers with a cyclical AWGR in order to create a new board-level, chip-to-chip interconnection paradigm that will leverage WDM parallel transmission into a powerful wavelength routing platform capable of interconnecting multiple processors with unprecedented bandwidth and throughput capacity. Direct, any-to-any, on-board interconnection of multiple processors will significantly contribute to further flattening data centers and facilitating east-west communication. In the present communication, we present the ICT-STREAMS on-board wavelength routing architecture for multiple chip-to-chip interconnections and evaluate the overall system performance in terms of throughput and latency for several schemes and traffic profiles. We also review recent advances in the ICT-STREAMS platform's key enabling technologies, which span from Si in-plane lasers and polymer-based electro-optical circuit boards to silicon photonics transceivers and photonic-crystal amplifiers.

  8. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing

    PubMed Central

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2015-01-01

    Push-based database management system (DBMS) is a new type of data processing software that streams large volume of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogenous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA’s CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels. PMID:26566545
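As background to the abstract above: a CUDA stream is an in-order queue of kernels, and kernels from different streams may execute concurrently when resources allow. The toy Python simulation below (purely illustrative, not the paper's performance model) shows why splitting independent work across streams can shorten the makespan:

```python
# Toy model of CUDA streams as independent in-order kernel queues (pure
# Python; the real scheduling is done by the GPU runtime). Kernels in the
# same stream run in issue order; kernels in different streams may run
# concurrently, here capped at max_concurrent at a time.

def makespan(streams, max_concurrent):
    """streams: list of FIFOs of kernel durations. Greedy simulation:
    the front kernel of up to max_concurrent streams runs at once."""
    queues = [list(s) for s in streams]
    t = 0
    while any(queues):
        active = [q for q in queues if q][:max_concurrent]
        step = min(q[0] for q in active)   # advance to the next completion
        for q in active:
            q[0] -= step
            if q[0] == 0:
                q.pop(0)
        t += step
    return t

serial = makespan([[3, 2, 4, 1]], max_concurrent=2)        # one stream
overlapped = makespan([[3, 4], [2, 1]], max_concurrent=2)  # two streams
# same total work, but the two streams overlap: overlapped < serial
```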

  9. MMX-I: data-processing software for multimodal X-ray imaging and tomography

    PubMed Central

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-01-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159

  10. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  11. Stepwise calibration procedure for regional coupled hydrological-hydrogeological models

    NASA Astrophysics Data System (ADS)

    Labarthe, Baptiste; Abasq, Lena; de Fouquet, Chantal; Flipo, Nicolas

    2014-05-01

    Stream-aquifer interaction is a complex process that depends on both regional and local processes. Indeed, the groundwater component of the hydrosystem and large-scale heterogeneities control the regional flows towards the alluvial plains and the rivers. In turn, the local distribution of the stream bed permeabilities controls the dynamics of stream-aquifer water fluxes within the alluvial plain, and therefore the near-river piezometric head distribution. In order to better understand water circulation and pollutant transport in watersheds, the integration of these multi-dimensional processes into modelling platforms has to be performed. Thus, the nested interfaces concept in continental hydrosystem modelling (where regional fluxes, simulated by large-scale models, are imposed at local stream-aquifer interfaces) was presented in Flipo et al. (2014). This concept has been implemented in the EauDyssée modelling platform for a large alluvial plain model (900km2), part of an 11000km2 multi-layer aquifer system located in the Seine basin (France). The hydrosystem modelling platform is composed of four spatially distributed modules (Surface, Sub-surface, River and Groundwater), corresponding to four components of the terrestrial water cycle. Considering the large number of parameters to be inferred simultaneously, the calibration process of coupled models is highly computationally demanding and therefore hardly applicable to a real case study of 10000km2. In order to improve the efficiency of the calibration process, a stepwise calibration procedure is proposed. The stepwise methodology involves determining optimal parameters for all components of the coupled model, to provide near-optimum prior information for the global calibration. It starts with the calibration of the surface component parameters. The surface parameters are optimised based on the comparison between simulated and observed discharges (or filtered discharges) at various locations. 
Once the surface parameters have been determined, the groundwater component is calibrated. The calibration procedure is performed under a steady-state hypothesis (to minimize the procedure's run time) using recharge rates given by the surface component calibration and imposed-flux boundary conditions given by the regional model. The calibration is performed using pilot points, where the prior variogram is calculated from observed transmissivity values. This procedure uses PEST (http://www.pesthomepage.org/Home.php) as the inverse modelling tool and EauDyssée as the direct model. During the stepwise calibration process, each module, although the modules are actually dependent on each other, is run and calibrated independently; the contributions between modules therefore have to be determined. For the surface module, the groundwater and runoff contributions have been determined by hydrograph separation. Among the automated base-flow separation methods, the one-parameter Chapman filter (Chapman 1999) has been chosen. This filter expresses the current base-flow as a combination of the previous base-flow and the current discharge, weighted by functions of the recession coefficient. For the groundwater module, the recharge has been determined from the surface and sub-surface modules. References: Flipo, N., Mourhi, A., Labarthe, B., and Biancamaria, S. (2014). Continental hydrosystem modelling: the concept of nested stream-aquifer interfaces. Hydrol. Earth Syst. Sci. Discuss. 11, 451-500. Chapman, T.G. (1999). A comparison of algorithms for stream flow recession and base-flow separation. Hydrological Processes 13, 701-714.
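The one-parameter filter mentioned in the abstract weights the previous base-flow against the current discharge using the recession constant. A minimal sketch, assuming the common textbook form of the one-parameter (Chapman) filter with recession constant k (the initialisation at the first discharge value is an arbitrary choice for illustration):

```python
# One-parameter baseflow filter in the form commonly attributed to
# Chapman: b_i = k/(2-k) * b_{i-1} + (1-k)/(2-k) * q_i, with the
# constraint that baseflow cannot exceed total streamflow.

def chapman_baseflow(q, k):
    """q: total streamflow series; k: recession constant in (0, 1)."""
    b = [q[0]]  # initialise baseflow at the first discharge value
    for qi in q[1:]:
        bi = (k / (2.0 - k)) * b[-1] + ((1.0 - k) / (2.0 - k)) * qi
        b.append(min(bi, qi))  # clip: baseflow <= total flow
    return b

flow = [5.0, 20.0, 12.0, 8.0, 6.0, 5.5]
base = chapman_baseflow(flow, k=0.9)
runoff = [qi - bi for qi, bi in zip(flow, base)]  # quickflow component
```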

  12. Performing Play: Cultural Production on Twitch.tv

    ERIC Educational Resources Information Center

    Pellicone, Anthony James

    2017-01-01

    Streaming is an emerging practice of videogame culture, where a player broadcasts a live capture of their game-play to an audience. Every day Twitch.tv, the most popular streaming platform, features thousands of streams broadcast to millions of viewers. Streams are detailed multimedia artifacts, and their study allows us to understand how the…

  13. Continuous Succinic Acid Production by Actinobacillus succinogenes on Xylose-Enriched Hydrolysate

    DOE PAGES

    Bradfield, Michael F. A.; Mohagheghi, Ali; Salvachua, Davinia; ...

    2015-11-14

    Bio-manufacturing of high-value chemicals in parallel to renewable biofuels has the potential to dramatically improve the overall economic landscape of integrated lignocellulosic biorefineries. However, this will require the generation of carbohydrate streams from lignocellulose in a form suitable for efficient microbial conversion and downstream processing appropriate to the desired end use, making overall process development, along with selection of appropriate target molecules, crucial to the integrated biorefinery. Succinic acid (SA), a high-value target molecule, can be biologically produced from sugars and has the potential to serve as a platform chemical for various chemical and polymer applications. However, the feasibility of microbial SA production at industrially relevant productivities and yields from lignocellulosic biorefinery streams has not yet been reported.

  14. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    NASA Astrophysics Data System (ADS)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
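The enrich-on-ingest step described above (sentiment plus census/admin coding) can be sketched as follows; the word lists and bounding-box lookup are hypothetical stand-ins, not the CGA's actual enrichment services:

```python
# Hypothetical sketch of "enrich on ingest" for a geo-tweet: attach a
# sentiment score and a census/admin boundary code before indexing.
# The scoring and the lookup below are toy stand-ins for illustration.

POSITIVE = {"great", "good", "love"}
NEGATIVE = {"bad", "awful", "hate"}

def sentiment(text):
    """Crude lexicon score: positive hits minus negative hits."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def admin_code(lat, lon):
    """Stand-in for a point-in-polygon lookup against admin boundaries."""
    return "US-MA" if 41.0 < lat < 43.0 and -74.0 < lon < -69.0 else "UNKNOWN"

def enrich(tweet):
    tweet["sentiment"] = sentiment(tweet["text"])
    tweet["admin"] = admin_code(tweet["lat"], tweet["lon"])
    return tweet

t = enrich({"text": "Love this great city", "lat": 42.36, "lon": -71.06})
# t carries sentiment 2 and admin code "US-MA"
```

In the real pipeline this step would sit between the Kafka consumer and the Solr indexer; here it is a single function call.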

  15. CAOS: the nested catchment soil-vegetation-atmosphere observation platform

    NASA Astrophysics Data System (ADS)

    Weiler, Markus; Blume, Theresa

    2016-04-01

    Most catchment-based observations linking hydrometeorology, ecohydrology, soil hydrology and hydrogeology are typically not integrated with each other and lack a consistent and appropriate spatial-temporal resolution. Within the research network CAOS (Catchments As Organized Systems), we have initiated and developed a novel and integrated observation platform in several catchments in Luxembourg. In 20 nested catchments covering three distinct geologies, the subscale processes at the bedrock-soil-vegetation-atmosphere interface are being monitored at 46 sensor cluster locations. Each sensor cluster is designed to observe a variety of different fluxes and state variables above and below ground, in the saturated and unsaturated zone. The number of sensors is chosen to capture the spatial variability as well as the average dynamics. At each of these sensor clusters, three soil moisture profiles with sensors at different depths, four soil temperature profiles, as well as matric potential, air temperature, relative humidity, global radiation, rainfall/throughfall, sapflow and shallow groundwater and stream water levels are measured continuously. In addition, most sensors also measure temperature (water, soil, atmosphere) and electrical conductivity. This setup allows us to determine the local water and energy balance at each of these sites. The discharge gauging sites in the nested catchments are also equipped with automatic water samplers to monitor water quality and water stable isotopes continuously. Furthermore, water temperature and electrical conductivity observations are extended to over 120 locations distributed across the entire stream network to capture the energy exchange between the groundwater, stream water and atmosphere. The measurements at the sensor clusters are complemented by hydrometeorological observations (rain radar, a network of disdrometers and a dense network of precipitation gauges) and linked with high resolution meteorological models. 
In this presentation, we will highlight the potential of this integrated observation platform to estimate energy and water exchange between the terrestrial and aquatic systems and the atmosphere, to trace water flow pathways in the unsaturated and saturated zone, and to understand the organization of processes and fluxes and thus runoff generation at different temporal and spatial scales.

  16. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.

  17. X-Graphs: Language and Algorithms for Heterogeneous Graph Streams

    DTIC Science & Technology

    2017-09-01

    Methods, assumptions, and procedures (recovered fragments): software abstractions for graph analytic applications; high-performance platforms for graph processing; graph data stored in a distributed file system. Implementations of novel methods for network analysis: several methods for detection of overlapping communities, personalized PageRank, node embeddings…

  18. Investigation of Matlab® as platform in navigation and control of an Automatic Guided Vehicle utilising an omnivision sensor.

    PubMed

    Kotze, Ben; Jordaan, Gerrit

    2014-08-25

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed.

  19. Investigation of Matlab® as Platform in Navigation and Control of an Automatic Guided Vehicle Utilising an Omnivision Sensor

    PubMed Central

    Kotze, Ben; Jordaan, Gerrit

    2014-01-01

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed. PMID:25157548
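The extract-path-then-steer idea in the abstract can be sketched in a few lines. This toy version (not the authors' Matlab code) works on a hand-made region of interest where 1 marks a path-coloured pixel, and steering follows the centroid of those pixels:

```python
# Toy illustration of vision-based path following: threshold the region
# of interest to path pixels, then steer toward their centroid column.

def steering_offset(roi):
    """Return the centroid column of path pixels minus the image centre.
    Negative -> steer left, positive -> steer right, None -> path lost."""
    cols, count = 0, 0
    for row in roi:
        for x, pixel in enumerate(row):
            if pixel == 1:          # 1 marks a path-coloured pixel
                cols += x
                count += 1
    if count == 0:
        return None                 # no path visible in the ROI
    centre = (len(roi[0]) - 1) / 2.0
    return cols / count - centre

roi = [
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 0],
]
# path centroid at column 2.5, image centre at 2.0 -> steer slightly right
```

A real implementation would obtain the binary mask by colour-thresholding each camera frame, which is what makes the colour-signed routes of the paper reconfigurable without reprogramming.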

  20. Stream Discharge Measurements From Cableways

    USGS Publications Warehouse

    Nolan, K. Michael; Sultz, Lucky

    2000-01-01

    Cableways have been used for decades as a platform for making stream discharge measurements. Use of cableways eliminates the need to expose personnel to hazards associated with working from highway bridges. In addition, cableways allow sites to be selected that offer the best possible hydraulic characteristics for measuring stream discharge. This training presentation describes methods currently used by the U.S. Geological Survey to make stream discharge measurements from cableways.

  1. Simulation and analysis of main steam control system based on heat transfer calculation

    NASA Astrophysics Data System (ADS)

    Huang, Zhenqun; Li, Ruyan; Feng, Zhongbao; Wang, Songhan; Li, Wenbo; Cheng, Jiwei; Jin, Yingai

    2018-05-01

    In this paper, a 300 MW boiler of a thermal power plant was studied. Matlab was used to write a program that calculates the heat transfer between the main steam and the boiler flue gas, and the amount of water needed to keep the main steam temperature at its target value. The heat transfer calculation program was then introduced into the Simulink simulation platform, on which a control system based on multiple-model switching and heat transfer calculation was built. The results show that the multiple-model switching control system based on heat transfer calculation not only overcomes the large inertia and large hysteresis characteristics of the main steam temperature, but also adapts to boiler load changes.

  2. Robust media processing on programmable power-constrained systems

    NASA Astrophysics Data System (ADS)

    McVeigh, Jeff

    2005-03-01

    To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
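The buffer-fullness-driven voltage/frequency adjustment described above can be sketched as a simple controller; the thresholds and frequency steps below are invented for illustration:

```python
# Sketch of the buffer-fullness idea: scale processor frequency with how
# empty the stream playout buffer is. Thresholds and P-states are made up.

FREQ_STEPS_MHZ = [200, 400, 600, 800]   # hypothetical P-states, low to high

def pick_frequency(fullness, low=0.25, high=0.75):
    """fullness in [0, 1]: fraction of the stream buffer that is filled."""
    if fullness < low:
        return FREQ_STEPS_MHZ[-1]   # buffer draining: run fast to refill it
    if fullness > high:
        return FREQ_STEPS_MHZ[0]    # buffer ample: slow down, save power
    # in between: map fullness linearly onto the intermediate steps
    span = (fullness - low) / (high - low)
    idx = round((1.0 - span) * (len(FREQ_STEPS_MHZ) - 1))
    return FREQ_STEPS_MHZ[int(idx)]

fast = pick_frequency(0.10)   # nearly empty buffer -> highest frequency
slow = pick_frequency(0.90)   # nearly full buffer -> lowest frequency
```

The controller would be invoked each time the decoder refills or drains the buffer, trading a little latency headroom for lower average power.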

  3. A novel land surface-hydrologic-sediment dynamics model for stream corridor conservation assessment and its first application

    NASA Astrophysics Data System (ADS)

    Smithgall, K.; Shen, C.; Langendoen, E. J.; Johnson, P. A.

    2015-12-01

    Nationally and in the Chesapeake Bay (CB), stream corridor restoration consumes an unsustainable amount of public resources, yet decisions are often made with inadequate knowledge of regional-scale system behavior. Bank erosion is a significant issue relevant to sediment and nutrient pollution, aquatic and riparian habitat, and stream health. Existing modeling efforts either focus only on reach-scale responses or overly simplify the description of bank failure mechanics. In this work we present a novel regional-scale process model integrating hydrology, vegetation dynamics, hydraulics, bank mechanics and sediment transport, based on a coupling between the Community Land Model, the Process-based Adaptive Watershed Simulator and the CONservational Channel Evolution and Pollutant Transport System (CLM + PAWS + CONCEPTS, CPC). We illustrate the feasibility of this modeling platform in a Valley and Ridge basin in Pennsylvania, USA, with channel geometry data collected in 2004 and 2014. The simulations are able to reproduce the essential patterns of the observed trends. We study the causes of the noticeable evolution of a relocated channel and its hydrologic controls. Bridging processes on multiple scales, the CPC model creates a new, integrated system that may serve as a confluence point for inter-disciplinary research.

  4. High-speed limnology: using advanced sensors to investigate spatial variability in biogeochemistry and hydrology.

    PubMed

    Crawford, John T; Loken, Luke C; Casson, Nora J; Smith, Colin; Stone, Amanda G; Winslow, Luke A

    2015-01-06

    Advanced sensor technology is widely used in aquatic monitoring and research. Most applications focus on temporal variability, whereas spatial variability has been challenging to document. We assess the capability of water chemistry sensors embedded in a high-speed water intake system to document spatial variability. This new sensor platform continuously samples surface water at a range of speeds (0 to >45 km h-1) resulting in high-density, mesoscale spatial data. These novel observations reveal previously unknown variability in physical, chemical, and biological factors in streams, rivers, and lakes. By combining multiple sensors into one platform, we were able to detect terrestrial-aquatic hydrologic connections in a small dystrophic lake, to infer the role of main-channel vs backwater nutrient processing in a large river and to detect sharp chemical changes across aquatic ecosystem boundaries in a stream/lake complex. Spatial sensor data were verified in our examples by comparing with standard lab-based measurements of selected variables. Spatial fDOM data showed strong correlation with wet chemistry measurements of DOC, and optical NO3 concentrations were highly correlated with lab-based measurements. High-frequency spatial data similar to our examples could be used to further understand aquatic biogeochemical fluxes, ecological patterns, and ecosystem processes, and will both inform and benefit from fixed-site data.

  5. High-speed limnology: Using advanced sensors to investigate spatial variability in biogeochemistry and hydrology

    USGS Publications Warehouse

    Crawford, John T.; Loken, Luke C.; Casson, Nora J.; Smith, Collin; Stone, Amanda G.; Winslow, Luke A.

    2015-01-01

    Advanced sensor technology is widely used in aquatic monitoring and research. Most applications focus on temporal variability, whereas spatial variability has been challenging to document. We assess the capability of water chemistry sensors embedded in a high-speed water intake system to document spatial variability. This new sensor platform continuously samples surface water at a range of speeds (0 to >45 km h–1) resulting in high-density, mesoscale spatial data. These novel observations reveal previously unknown variability in physical, chemical, and biological factors in streams, rivers, and lakes. By combining multiple sensors into one platform, we were able to detect terrestrial–aquatic hydrologic connections in a small dystrophic lake, to infer the role of main-channel vs backwater nutrient processing in a large river and to detect sharp chemical changes across aquatic ecosystem boundaries in a stream/lake complex. Spatial sensor data were verified in our examples by comparing with standard lab-based measurements of selected variables. Spatial fDOM data showed strong correlation with wet chemistry measurements of DOC, and optical NO3 concentrations were highly correlated with lab-based measurements. High-frequency spatial data similar to our examples could be used to further understand aquatic biogeochemical fluxes, ecological patterns, and ecosystem processes, and will both inform and benefit from fixed-site data.

  6. A New Data Access Mechanism for HDFS

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Sun, Zhenyu; Wei, Zhanchen; Sun, Gongxing

    2017-10-01

    With the era of big data emerging, Hadoop has become the de facto standard big data processing platform. However, it is still difficult to get legacy applications, such as High Energy Physics (HEP) applications, to run efficiently on the Hadoop platform. There are two reasons for these difficulties: firstly, random access is not supported on the Hadoop File System (HDFS); secondly, it is difficult to make legacy applications adapt to the HDFS streaming data processing mode. In order to address these two issues, a new read and write mechanism for HDFS is proposed. With this mechanism, data access is done on the local file system instead of through the HDFS streaming interfaces. To enable users to modify files, three attributes, namely permissions, owner and group, are imposed on Block objects. Blocks stored on Datanodes have the same attributes as the file they belong to. Users can modify blocks when the Map task runs locally, and HDFS is responsible for updating the remaining replicas after the block modification finishes. To further improve the performance of the Hadoop system, a complete localized task execution mechanism is implemented for I/O-intensive jobs. Test results show that average CPU utilization is improved by 10% with the new task selection strategy, and data read and write performance is improved by about 10% and 30%, respectively.

  7. Enabling Incremental Query Re-Optimization.

    PubMed

    Liu, Mengmeng; Ives, Zachary G; Loo, Boon Thau

    2016-01-01

    As declarative query processing techniques expand to the Web, data streams, network routers, and cloud platforms, there is an increasing need to re-plan execution in the presence of unanticipated performance changes. New runtime information may affect which query plan we prefer to run. Adaptive techniques require innovation both in terms of the algorithms used to estimate costs, and in terms of the search algorithm that finds the best plan. We investigate how to build a cost-based optimizer that recomputes the optimal plan incrementally given new cost information, much as a stream engine constantly updates its outputs given new data. Our implementation especially shows benefits for stream processing workloads. It lays the foundations upon which a variety of novel adaptive optimization algorithms can be built. We start by leveraging the recently proposed approach of formulating query plan enumeration as a set of recursive datalog queries; we develop a variety of novel optimization approaches to ensure effective pruning in both static and incremental cases. We further show that the lessons learned in the declarative implementation can be equally applied to more traditional optimizer implementations.
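    The incremental flavor of this idea can be illustrated with a toy optimizer that caches each candidate plan's total cost and, when a cost estimate changes at run time, adjusts only the plans containing the updated operator rather than re-enumerating everything. The plan and operator names below are invented for illustration; the paper's approach works over recursive datalog, not this simple delta propagation:

```python
# Toy incremental re-optimizer: candidate plans are flat lists of operators
# whose cost estimates may be revised at run time. Only the cached totals of
# plans containing the updated operator are touched, in the spirit of
# incremental maintenance over the plan-enumeration results.
def total(ops, costs):
    return sum(costs[op] for op in ops)

class IncrementalOptimizer:
    def __init__(self, plans, costs):
        self.plans, self.costs = plans, dict(costs)
        self.totals = {name: total(ops, self.costs) for name, ops in plans.items()}

    def best(self):
        return min(self.totals, key=self.totals.get)

    def update_cost(self, op, new_cost):
        delta = new_cost - self.costs[op]
        self.costs[op] = new_cost
        for name, ops in self.plans.items():
            if op in ops:                      # incremental: only affected plans
                self.totals[name] += delta * ops.count(op)
```

A revised estimate for one join operator can flip the preferred plan without recomputing the totals of unaffected alternatives.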

  8. Enabling Incremental Query Re-Optimization

    PubMed Central

    Liu, Mengmeng; Ives, Zachary G.; Loo, Boon Thau

    2017-01-01

    As declarative query processing techniques expand to the Web, data streams, network routers, and cloud platforms, there is an increasing need to re-plan execution in the presence of unanticipated performance changes. New runtime information may affect which query plan we prefer to run. Adaptive techniques require innovation both in terms of the algorithms used to estimate costs, and in terms of the search algorithm that finds the best plan. We investigate how to build a cost-based optimizer that recomputes the optimal plan incrementally given new cost information, much as a stream engine constantly updates its outputs given new data. Our implementation especially shows benefits for stream processing workloads. It lays the foundations upon which a variety of novel adaptive optimization algorithms can be built. We start by leveraging the recently proposed approach of formulating query plan enumeration as a set of recursive datalog queries; we develop a variety of novel optimization approaches to ensure effective pruning in both static and incremental cases. We further show that the lessons learned in the declarative implementation can be equally applied to more traditional optimizer implementations. PMID:28659658

  9. A programming framework for data streaming on the Xeon Phi

    NASA Astrophysics Data System (ADS)

    Chapeland, S.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shut-down of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. This data will be processed on the fly so that the stream to permanent storage does not exceed 90 GB/s peak, the raw data being discarded. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
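    The host-side shape of such an offload pipeline (an input FIFO feeding a pool of permanent worker threads, with processed buffers returned through an output FIFO) can be sketched as follows. This is a Python sketch of the pattern only; the actual framework uses C++11 threads, memory pools, and explicit MIC offload:

```python
import queue, threading

# Minimal pipeline sketch: a FIFO feeds a pool of permanent worker threads
# (standing in for the MIC-side threads); processed buffers flow back to the
# host through an output FIFO. A None sentinel shuts each worker down.
def run_pipeline(buffers, process, n_workers=4):
    in_q, out_q = queue.Queue(), queue.Queue()

    def worker():
        while True:
            item = in_q.get()
            if item is None:          # sentinel: terminate this worker
                break
            out_q.put(process(item))

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for b in buffers:                 # stream buffers into the input FIFO
        in_q.put(b)
    for _ in threads:
        in_q.put(None)
    for t in threads:
        t.join()
    return sorted(out_q.queue)        # completion order is not preserved
</imports>```

As in the framework described above, the user supplies only the `process` callable; the FIFOs and thread lifetime management are generic building blocks.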

  10. Moving People from Science Adjacent to Science Doers with Twitch.tv

    NASA Astrophysics Data System (ADS)

    Gay, Pamela L.; CosmoQuest

    2017-10-01

    The CosmoQuest community is testing the ability to attract people from playing online videogames to doing fully online citizen science by engaging people through the Twitch.tv streaming platform. Twitch.tv launched in 2011 as an online platform for video gamers to stream their gameplay while providing narrative. In its six years of regular growth, the platform has added support for people playing non-video games, and for those participating in non-game activities. As part of this expansion, in April 2017, Twitch.tv hosted a science week during which they streamed the Cosmos series and allowed different feeds to provide real-time commentary. They also hosted panel discussions on a variety of science topics. CosmoQuest participated in this event and used it as a jumping-off point for beginning to interact with Twitch.tv community members online. With CosmoQuest’s beta launch of Image Detectives, they expanded their use of this streaming platform to include regular “office hours”, during which team members did science with CosmoQuest’s online projects, took questions from community members, and otherwise promoted the CosmoQuest community. This presentation examines this case study, and looks at how well different kinds of Twitter engagements attracted audiences, the conversion rate from viewer to subscriber, and at how effectively CosmoQuest was able to migrate users from viewing citizen science on Twitch.tv to participating in citizen science on CosmoQuest.org. This project was supported through NASA cooperative agreement NNX17AD20A.

  11. XML in an Adaptive Framework for Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy J.

    2004-01-01

    NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.

  12. Constructing temporary sampling platforms for hydrologic studies

    Treesearch

    Manuel H. Martinez; Sandra E. Ryan

    2000-01-01

    This paper presents instructions for constructing platforms that span the width of stream channels to accommodate the measurement of hydrologic parameters over a wide range of discharges. The platforms provide a stable, safe, noninvasive, easily constructed, and relatively inexpensive means for permitting data collection without wading in the flow. We have used the...

  13. MODIS Cloud Microphysics Product (MOD_PR06OD) Data Collection 6 Updates

    NASA Technical Reports Server (NTRS)

    Wind, Gala; Platnick, Steven; King, Michael D.

    2014-01-01

    The MODIS Cloud Optical and Microphysical Product (MOD_PR06OD) for Data Collection 6 has entered full-scale production. Aqua reprocessing is almost complete and Terra reprocessing will begin shortly. Unlike previous collections, the CHIMAERA code base allows for simultaneous processing for multiple sensors, and the operational CHIMAERA 6.0.76 stream is also available for the VIIRS and SEVIRI sensors and for our E-MAS airborne platform.

  14. Utilising Raspberry Pi as a cheap and easy do it yourself streaming device for astronomy

    NASA Astrophysics Data System (ADS)

    Maulana, F.; Soegijoko, W.; Yamani, A.

    2016-11-01

    Recent developments in personal computing platforms have been revolutionary. With the advent of the Raspberry Pi series and the Arduino series, sub-USD 100 computing platforms have changed the playing field altogether. It used to be that you would need a PC or an FPGA platform costing thousands of USD to create a dedicated device for a dedicated task. Combining a PiCam with the Raspberry Pi allows for smaller budgets to be able to stream live images to the internet and to the public in general. This paper traces our path in designing and adapting the PiCam to a common-sized eyepiece and telescope in preparation for the TSE in Indonesia this past March.

  15. Serial Interface through Stream Protocol on EPICS Platform for Distributed Control and Monitoring

    NASA Astrophysics Data System (ADS)

    Das Gupta, Arnab; Srivastava, Amit K.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    Remote operation of any equipment or device is implemented in distributed systems in order to control and properly monitor process values. For such remote operations, the Experimental Physics and Industrial Control System (EPICS) is used as an important software tool for control and monitoring of a wide range of scientific parameters. A hardware interface was developed for implementation of the EPICS software so that different equipment, such as data converters, power supplies, and pump controllers, could be remotely operated through a stream protocol. The EPICS base was set up on Windows as well as Linux operating systems for control and monitoring, while EPICS modules such as asyn and StreamDevice were used to interface the equipment over standard RS-232/RS-485 protocols. StreamDevice communicates with the serial line through an interface to asyn drivers. A graphical user interface and alarm handling were implemented with the Motif Editor and Display Manager (MEDM) and the Alarm Handler (ALH) command-line channel access utility tools. This paper describes the developed application, which was tested with different equipment and devices serially interfaced to PCs on a distributed network.

  16. Multi-mode sensor processing on a dynamically reconfigurable massively parallel processor array

    NASA Astrophysics Data System (ADS)

    Chen, Paul; Butts, Mike; Budlong, Brad; Wasson, Paul

    2008-04-01

    This paper introduces a novel computing architecture that can be reconfigured in real time to adapt on demand to multi-mode sensor platforms' dynamic computational and functional requirements. This 1 teraOPS reconfigurable Massively Parallel Processor Array (MPPA) has 336 32-bit processors. The programmable 32-bit communication fabric provides streamlined inter-processor connections with deterministically high performance. Software programmability, scalability, ease of use, and fast reconfiguration time (ranging from microseconds to milliseconds) are the most significant advantages over FPGAs and DSPs. This paper introduces the MPPA architecture, its programming model, and methods of reconfigurability. An MPPA platform for reconfigurable computing is based on a structural object programming model. Objects are software programs running concurrently on hundreds of 32-bit RISC processors and memories. They exchange data and control through a network of self-synchronizing channels. A common application design pattern on this platform, called a work farm, is a parallel set of worker objects, with one input and one output stream. Statically configured work farms with homogeneous and heterogeneous sets of workers have been used in video compression and decompression, network processing, and graphics applications.
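    The work-farm pattern described above (one input stream, a parallel set of worker objects, one output stream) can be sketched in a few lines; here `ThreadPoolExecutor` stands in for the MPPA's worker objects and self-synchronizing channels, which is an illustrative substitution, not the platform's programming model:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of a homogeneous work farm: items from a single input stream are
# distributed across parallel workers, and results are emitted as a single
# output stream. Executor.map preserves input order, mimicking the farm's
# ordered output channel.
def work_farm(worker, input_stream, n_workers=8):
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        yield from pool.map(worker, input_stream)
```

A heterogeneous farm, as used in the video and network-processing applications mentioned above, would dispatch each item to a worker chosen by item type rather than to identical workers.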

  17. Real-time depth processing for embedded platforms

    NASA Astrophysics Data System (ADS)

    Rahnama, Oscar; Makarov, Aleksej; Torr, Philip

    2017-05-01

    Obtaining depth information of a scene is an important requirement in many computer-vision and robotics applications. For embedded platforms, passive stereo systems have many advantages over their active counterparts (i.e. LiDAR, infrared). They are power efficient, cheap, robust to lighting conditions, and inherently synchronized to the RGB images of the scene. However, stereo depth estimation is a computationally expensive task that operates over large amounts of data. For embedded applications, which are often constrained by power consumption, obtaining accurate results in real time is a challenge. We demonstrate a computationally and memory efficient implementation of a stereo block-matching algorithm on an FPGA. The computational core achieves a throughput of 577 fps at standard VGA resolution whilst consuming less than 3 W of power. The data is processed using an in-stream approach that minimizes memory-access bottlenecks and best matches the raster-scan readout of modern digital image sensors.
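    The core cost computation of stereo block matching can be shown with a minimal sum-of-absolute-differences (SAD) kernel over a single image row. The FPGA implementation above streams 2-D windows in raster order; this 1-D sketch is for intuition only:

```python
# Minimal SAD block matching on one row: for each pixel in the left row,
# slide a window along the right row over candidate disparities and keep
# the disparity with the lowest matching cost.
def disparity_row(left, right, window=3, max_disp=4):
    half = window // 2
    out = []
    for x in range(half, len(left) - half):
        patch = left[x - half:x + half + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(min(max_disp, x - half) + 1):
            cand = right[x - d - half:x - d + half + 1]
            cost = sum(abs(a - b) for a, b in zip(patch, cand))
            if cost < best_cost:
                best_d, best_cost = d, cost
        out.append(best_d)
    return out
```

In hardware, the per-disparity costs for a pixel are computed in parallel and window sums are updated incrementally as pixels stream in, which is what avoids the memory-access bottlenecks mentioned above.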

  18. Design features of offshore oil production platforms influence their susceptibility to biocorrosion.

    PubMed

    Duncan, Kathleen E; Davidova, Irene A; Nunn, Heather S; Stamps, Blake W; Stevenson, Bradley S; Souquet, Pierre J; Suflita, Joseph M

    2017-08-01

    Offshore oil-producing platforms are designed for efficient and cost-effective separation of oil from water. However, design features and operating practices may create conditions that promote the proliferation and spread of biocorrosive microorganisms. The microbial communities and their potential for metal corrosion were characterized for three oil production platforms that varied in their oil-water separation processes, fluid recycling practices, and history of microbially influenced corrosion (MIC). Microbial diversity was evaluated by 16S rRNA gene sequencing, and numbers of total bacteria, archaea, and sulfate-reducing bacteria (SRB) were estimated by qPCR. Rates from a ³⁵S sulfate reduction assay (SRA) were measured as a proxy for metal biocorrosion potential. A variety of microorganisms common to oil production facilities were found, but distinct communities were associated with the design of the platform and varied with different locations in the processing stream. Stagnant, lower-temperature (<37 °C) sites in all platforms had more SRB and higher SRA compared to samples from sites with higher temperatures and flow rates. However, high (5 mmol L⁻¹) levels of hydrogen sulfide and high numbers (10⁷ mL⁻¹) of SRB were found in only one platform. This platform alone contained large separation tanks with long retention times and recycled fluids from stagnant sites to the beginning of the oil separation train, thus promoting distribution of biocorrosive microorganisms. These findings indicate that tracking microbial sulfate-reducing activity and community composition on offshore oil production platforms can be used to identify operational practices that inadvertently promote the proliferation, distribution, and activity of biocorrosive microorganisms.

  19. Understanding the physical dynamics and ecological interactions in tidal stream energy environments

    NASA Astrophysics Data System (ADS)

    Fraser, Shaun; Williamson, Benjamin J.; Nikora, Vladimir; Scott, Beth E.

    2017-04-01

    Tidal stream energy devices are intended to operate in energetic physical environments characterised by high flows and extreme turbulence. These environments are often of ecological importance to a range of marine species. Understanding the physical dynamics and ecological interactions at fine scales in such sites is essential for device/array design and to understand environmental impacts. However, investigating fine scale characteristics requires high resolution field measurements which are difficult to attain and interpret, with data often confounded by interference related to turbulence. Consequently, field observations in tidal stream energy environments are limited and require the development of specialised analysis methods and so significant knowledge gaps are still present. The seabed mounted FLOWBEC platform is addressing these knowledge gaps using upward facing instruments to collect information from around marine energy infrastructure. Multifrequency and multibeam echosounder data provide detailed information on the distribution and interactions of biological targets, such as fish and diving seabirds, while simultaneously recording the scales and intensity of turbulence. Novel processing methodologies and instrument integration techniques have been developed which combine different data types and successfully separates signal from noise to reveal new evidence about the behaviour of mobile species and the structure of turbulence at all speeds of the tide and throughout the water column. Multiple platform deployments in the presence and absence of marine energy infrastructure reveal the natural characteristics of high energy sites, and enable the interpretation of the physical and biological impacts of tidal stream devices. These methods and results are relevant to the design and consenting of marine renewable energy technologies, and provide novel information on the use of turbulence for foraging opportunities in high energy sites by mobile species.

  20. Dimensions and dynamics of citizen observatories: The case of online amateur weather networks

    NASA Astrophysics Data System (ADS)

    Gharesifard, Mohammad; Wehn, Uta; van der Zaag, Pieter

    2016-04-01

    Crowd-sourced environmental observations are increasingly considered as having the potential to enhance the spatial and temporal resolution of current data streams from terrestrial and aerial sensors. The rapid diffusion of ICTs during the past decades has facilitated the process of data collection and sharing by the general public (so-called citizen science) and has resulted in the formation of various online environmental citizen observatory networks. Online amateur weather networks are a particular example of such ICT-mediated citizen observatories, and one of the oldest and most widely practiced citizen science activities. The objective of this paper is to introduce a conceptual framework that enables a systematic review of different dimensions of these expanding networks. These dimensions include the geographic scope and types of network participants; the network's establishment mechanism, revenue stream(s), and existing communication paradigm; efforts required by citizens and support offered by platform providers; and issues such as data accessibility, availability, and quality. An in-depth understanding of these dimensions helps to analyze various dynamics such as interactions between different stakeholders, motivations to run these networks, sustainability of the platforms, data ownership, and the level of transparency of each network. This framework is then utilized to perform a critical and normative review of six existing online amateur weather networks based on publicly available data. The main findings of this analysis suggest that: (1) There are several key stakeholders, such as emergency services and local authorities, that are not (yet) engaged in these networks. (2) The revenue stream(s) of online amateur weather networks is one of the least discussed but most important dimensions, crucial for the sustainability of these networks. (3) Although all of the networks included in this study have one or more explicit patterns of two-way communication, there is no sign (yet) of interactive information exchange among the triangle of weather observers, data aggregators, and policy makers. Keywords: Citizen Science, Citizen Observatories, ICT-enabled citizen participation, online amateur weather networks

  1. Big data processing in the cloud - Challenges and platforms

    NASA Astrophysics Data System (ADS)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide for dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.

  2. Taking Science On-air with Google+

    NASA Astrophysics Data System (ADS)

    Gay, P.

    2014-01-01

    Cost has long been a deterrent when trying to stream live events to large audiences. While streaming providers like UStream have free options, they include advertising and typically limit broadcasts to originating from a single location. In the autumn of 2011, Google premiered a new, free, video streaming tool -- Hangouts on Air -- as part of their Google+ social network. This platform allows up to ten different computers to stream live content to an unlimited audience, and automatically archives that content to YouTube. In this article we discuss best practices for using this technology to stream events over the internet.

  3. Organic waste as a sustainable feedstock for platform chemicals.

    PubMed

    Coma, M; Martinez-Hernandez, E; Abeln, F; Raikova, S; Donnelly, J; Arnot, T C; Allen, M J; Hong, D D; Chuck, C J

    2017-09-21

    Biorefineries have been established since the 1980s for biofuel production, and there has been a switch lately from first to second generation feedstocks in order to avoid the food versus fuel dilemma. To a lesser extent, many opportunities have been investigated for producing chemicals from biomass using by-products of the present biorefineries, simple waste streams. Current facilities apply intensive pre-treatments to deal with single substrate types such as carbohydrates. However, most organic streams such as municipal solid waste or algal blooms present a high complexity and variable mixture of molecules, which makes specific compound production and separation difficult. Here we focus on flexible anaerobic fermentation and hydrothermal processes that can treat complex biomass as a whole to obtain a range of products within an integrated biorefinery concept.

  4. Organic waste as a sustainable feedstock for platform chemicals

    PubMed Central

    Martinez-Hernandez, E.; Abeln, F.; Raikova, S.; Donnelly, J.; Arnot, T. C.; Allen, M. J.; Hong, D. D.; Chuck, C. J.

    2017-01-01

    Biorefineries have been established since the 1980s for biofuel production, and there has been a switch lately from first to second generation feedstocks in order to avoid the food versus fuel dilemma. To a lesser extent, many opportunities have been investigated for producing chemicals from biomass using by-products of the present biorefineries, simple waste streams. Current facilities apply intensive pre-treatments to deal with single substrate types such as carbohydrates. However, most organic streams such as municipal solid waste or algal blooms present a high complexity and variable mixture of molecules, which makes specific compound production and separation difficult. Here we focus on flexible anaerobic fermentation and hydrothermal processes that can treat complex biomass as a whole to obtain a range of products within an integrated biorefinery concept. PMID:28654113

  5. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  6. Layout Study and Application of Mobile App Recommendation Approach Based On Spark Streaming Framework

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.

    2018-05-01

    For the domain of mobile App recommendation, an item-based collaborative filtering approach combined with a weighted Slope One algorithm is adopted to further improve on the cold-start and data-matrix-sparsity problems of traditional collaborative filtering. The recommendation algorithm is parallelized on the Spark platform, and the Spark Streaming real-time computing framework is introduced to improve the real-time performance of the App recommendations.
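    Weighted Slope One, the item-based scheme referenced above, predicts a user's rating for a target item from the average rating deviations between item pairs, weighted by how many users co-rated each pair. A compact single-machine sketch (the paper parallelizes this computation on Spark):

```python
from collections import defaultdict

# Weighted Slope One: for every other item, accumulate the deviation between
# the target item's rating and that item's rating over all co-rating users,
# then combine the per-item predictions weighted by co-rating counts.
def slope_one_predict(ratings, user, target):
    dev, freq = defaultdict(float), defaultdict(int)
    for r in ratings.values():
        if target in r:
            for item, score in r.items():
                if item != target:
                    dev[item] += r[target] - score
                    freq[item] += 1
    num = den = 0.0
    for item, score in ratings[user].items():
        if item in freq:
            num += (dev[item] / freq[item] + score) * freq[item]
            den += freq[item]
    return num / den if den else None
```

On Spark, the deviation and frequency accumulation maps naturally onto per-item-pair aggregations, which is what makes the algorithm attractive for the streaming setting described above.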

  7. Implementation of a quantum cascade laser-based gas sensor prototype for sub-ppmv H2S measurements in a petrochemical process gas stream.

    PubMed

    Moser, Harald; Pölz, Walter; Waclawek, Johannes Paul; Ofner, Johannes; Lendl, Bernhard

    2017-01-01

    The implementation of a sensitive, selective, and industry-ready gas sensor prototype based on wavelength modulation spectroscopy with second-harmonic detection (2f-WMS), employing an 8-μm continuous-wave distributed-feedback quantum cascade laser (CW-DFB-QCL) for monitoring hydrogen sulfide (H2S) at sub-ppm levels, is reported. Regarding applicability for analytical and industrial process purposes in petrochemical environments, a synthetic methane (CH4) matrix of up to 1000 ppmv together with a varying H2S content was chosen as the model environment for the laboratory-based performance evaluation performed at TU Wien. A noise-equivalent absorption sensitivity (NEAS) of 8.419 × 10⁻¹⁰ cm⁻¹ Hz⁻¹/² was found for the H2S absorption line at 1247.2 cm⁻¹, and a limit of detection (LOD) of 150 ppbv H2S was achieved. The sensor prototype was then deployed for on-site measurements at the petrochemical research hydrogenation platform of the industrial partner OMV AG. In order to meet the company's on-site safety regulations, the H2S sensor platform was installed in an industry rack and equipped with the required safety infrastructure for protected operation in hazardous and explosive environments. The work reports the suitability of the sensor prototype for simultaneous monitoring of H2S and CH4 content in the process streams of a research hydrodesulfurization (HDS) unit. Concentration readings were obtained every 15 s and revealed process dynamics not observed previously.

  8. Radio Astronomy Software Defined Receiver Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, Bogdan; Leech, Marcus; Oxley, Paul

    The paper describes a Radio Astronomy Software Defined Receiver (RASDR) that is currently under development. RASDR is targeted for use by amateurs and small institutions where cost is a primary consideration. The receiver will operate from HF through 2.8 GHz. Front-end components such as preamps, block down-converters, and pre-select bandpass filters are outside the scope of this development and will be provided by the user. The receiver includes RF amplifiers and attenuators, synthesized LOs, quadrature down-converters, dual 8-bit ADCs, and a signal processor that provides firmware processing of the digital bit stream. RASDR will interface to a user's PC via a USB or higher-speed Ethernet LAN connection. The PC will run software that provides processing of the bit stream, a graphical user interface, as well as data analysis and storage. The software should support Mac OS, Windows, and Linux platforms and will focus on such radio astronomy applications as total power measurements, pulsar detection, and spectral line studies.

  9. The PubChem chemical structure sketcher

    PubMed Central

    2009-01-01

    PubChem is an important public, Web-based information source for chemical and bioactivity information. In order to provide convenient structure search methods on compounds stored in this database, one mandatory component is a Web-based drawing tool for interactive sketching of chemical query structures. Web-enabled chemical structure sketchers are not new, being in existence for years; however, solutions available rely on complex technology like Java applets or platform-dependent plug-ins. Due to general policy and support incident rate considerations, Java-based or platform-specific sketchers cannot be deployed as a part of public NCBI Web services. Our solution: a chemical structure sketching tool based exclusively on CGI server processing, client-side JavaScript functions, and image sequence streaming. The PubChem structure editor does not require the presence of any specific runtime support libraries or browser configurations on the client. It is completely platform-independent and verified to work on all major Web browsers, including older ones without support for Web2.0 JavaScript objects. PMID:20298522

  10. Design method of ARM based embedded iris recognition system

    NASA Astrophysics Data System (ADS)

    Wang, Yuanbo; He, Yuqing; Hou, Yushi; Liu, Ting

    2008-03-01

    With the advantages of non-invasiveness, uniqueness, stability, and a low false recognition rate, iris recognition has been successfully applied in many fields. Up to now, most iris recognition systems have been based on PCs. However, a PC is not portable and consumes more power. In this paper, we propose an embedded iris recognition system based on ARM. Considering the requirements of iris image acquisition and the recognition algorithm, we analyzed the design method of the iris image acquisition module, designed the ARM processing module and its peripherals, studied the Linux platform and the recognition algorithm based on this platform, and finally realized the design of an ARM-based iris imaging and recognition system. Experimental results show that the ARM platform we used is fast enough to run the iris recognition algorithm, and the data stream can flow smoothly between the camera and the ARM chip based on the embedded Linux system. It is an effective method of using ARM to realize a portable embedded iris recognition system.

  11. Nitrate dynamics within a stream-lake network through time and space

    NASA Astrophysics Data System (ADS)

    Loken, L. C.; Crawford, J. T.; Childress, E. S.; Casson, N. J.; Stanley, E. H.

    2014-12-01

    Nitrate dynamics in streams are governed by biology, hydrology, and geomorphology, and the ability to parse these drivers apart has improved with the development of accurate high-frequency sensors. By combining a stationary Eulerian and a quasi-Lagrangian sensor platform, we investigated the timing of nitrate flushing and identified locations of elevated biogeochemical cycling along a stream-lake network in Northern Wisconsin, USA. Two years of continuous oxygen, carbon dioxide, and discharge measurements were used to compute gross primary production (GPP) and ecosystem respiration (ER) downstream of a wetland reach of Allequash Creek. Metabolic rates and flow patterns were compared with nitrate concentrations measured every 30 minutes using an optical sensor. Additionally, we floated a sensor array from the headwater spring ponds through a heterogeneous stream reach consisting of wetlands, beaver ponds, forested segments, and two lakes. Two distinct temporal patterns of stream nitrate concentrations were observed. During high flow events such as spring snowmelt and summer rain events, nitrate concentrations increased from ~5 μM (baseflow) to 12 μM, suggesting flushing from catchment sources. During baseflow conditions, nitrate followed a diel cycle with a 0.3-1.0 μM daytime draw down. Daily nitrate reduction was positively correlated with GPP calculated from oxygen and carbon dioxide records. Lastly, spatial analyses revealed lowest nitrate concentrations in the wetland reach, approximately 2-3 μM lower than the upstream spring ponds, and downstream lakes and forested reaches. This snapshot implies greater nitrate removal potential in the wetland reach likely driven by denitrification in organic rich sediments and macrophyte uptake in the open canopy stream segment. 
Taken together, the temporal and spatial results show that hydrology, geomorphology, and biology jointly influence nitrate delivery and variability in ecosystem processing through a stream-lake system. Future ecosystem studies could benefit from including multiple reference frameworks to better assess processes not captured by a single-station approach.

  12. A Streamlined Approach for the Payload Customer in Identifying Payload Design Requirements

    NASA Technical Reports Server (NTRS)

    Miller, Ladonna J.; Schneider, Walter F.; Johnson, Dexer E.; Roe, Lesa B.

    2001-01-01

    NASA payload developers from across various disciplines were asked to identify areas where process changes would simplify their task of developing and flying flight hardware. Responses to this query included a central location for consistent hardware design requirements for middeck payloads. The multidisciplinary team assigned to review the numerous payload interface design documents is assessing the Space Shuttle middeck, the SPACEHAB Inc. locker, as well as the MultiPurpose Logistics Module (MPLM) and EXpedite the PRocessing of Experiments to Space Station (EXPRESS) rack design requirements for the payloads. They are comparing the multiple carriers and platform requirements and developing a matrix which illustrates the individual requirements, and where possible, the envelope that encompasses all of the possibilities. The matrix will be expanded to form an overall envelope that the payload developers will have the option to utilize when designing their payload's hardware. This will optimize the flexibility for payload hardware and ancillary items to be manifested on multiple carriers and platforms with minimal impact to the payload developer.

  13. A flexible microbial co-culture platform for simultaneous utilization of methane and carbon dioxide from gas feedstocks

    DOE PAGES

    Hill, Eric A.; Chrisler, William B.; Beliaev, Alex S.; ...

    2017-01-03

    A new co-cultivation technology is presented that converts the greenhouse gases CH4 and CO2 into microbial biomass. The methanotrophic bacterium Methylomicrobium alcaliphilum 20z was coupled to a cyanobacterium, Synechococcus PCC 7002, via oxygenic photosynthesis. The system exhibited robust growth on diverse gas mixtures ranging from biogas to those representative of a natural gas feedstock. A continuous process was developed on a synthetic natural gas feed that achieved steady state by imposing coupled light and O2 limitations on the cyanobacterium and methanotroph, respectively. Continuous co-cultivation resulted in an O2-depleted reactor and eliminates the need for CH4/O2 mixtures to be fed into the system, thereby enhancing process safety over traditional methanotroph mono-culture platforms. This co-culture technology is scalable with respect to its ability to utilize different gas streams, and its biological components are constructed from model bacteria that can be metabolically customized to produce a range of biofuels and bioproducts.

  14. A flexible microbial co-culture platform for simultaneous utilization of methane and carbon dioxide from gas feedstocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, Eric A.; Chrisler, William B.; Beliaev, Alex S.

    A new co-cultivation technology is presented that converts the greenhouse gases CH4 and CO2 into microbial biomass. The methanotrophic bacterium Methylomicrobium alcaliphilum 20z was coupled to a cyanobacterium, Synechococcus PCC 7002, via oxygenic photosynthesis. The system exhibited robust growth on diverse gas mixtures ranging from biogas to those representative of a natural gas feedstock. A continuous process was developed on a synthetic natural gas feed that achieved steady state by imposing coupled light and O2 limitations on the cyanobacterium and methanotroph, respectively. Continuous co-cultivation resulted in an O2-depleted reactor and eliminates the need for CH4/O2 mixtures to be fed into the system, thereby enhancing process safety over traditional methanotroph mono-culture platforms. This co-culture technology is scalable with respect to its ability to utilize different gas streams, and its biological components are constructed from model bacteria that can be metabolically customized to produce a range of biofuels and bioproducts.

  15. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  16. The hospital tech laboratory: quality innovation in a new era of value-conscious care.

    PubMed

    Keteyian, Courtland K; Nallamothu, Brahmajee K; Ryan, Andrew M

    2017-08-01

    For decades, the healthcare industry has been incentivized to develop new diagnostic technologies, but this limitless progress fueled rapidly growing expenditures. With an emphasis on value, the future will favor information synthesis and processing over pure data generation, and hospitals will play a critical role in developing these systems. A Michigan Medicine, IBM, and AirStrip partnership created a robust streaming analytics platform tasked with creating predictive algorithms for critical care with the potential to support clinical decisions and deliver significant value.

  17. Presenting the master of all conductivity meters, and how it tastes streamflow

    NASA Astrophysics Data System (ADS)

    Weijs, S. V.; Parlange, M. B.

    2012-04-01

    For measuring streamflow in small Alpine streams, the salt dilution method is suitable and often used. By injecting a known mass of salt into the stream and measuring the downstream salt concentration as a function of time, we can obtain the streamflow by integration of the time signal. The underlying assumption is that the salt is well mixed within the stream cross-section. In this method, the salt concentration is usually measured through its relation with conductivity. Several commercial systems exist to perform these conductivity measurements and automatically process the results. The problem we encountered when using these systems, however, is that uncertainty is often hidden under the hood. Because the processing happens onboard, researchers may be tempted to put too much trust in the final measurement outcomes. This is somewhat remediated by using a system with two probes, each individually processed to a streamflow outcome. We found that the salt wave was shaped differently in the faster part of the stream compared to the sides, and therefore gave different readings for the discharge. To arrive at a more probabilistic characterization of streamflow, and to know what is under the hood, we decided to build our own conductivity meter, equipped with eight probes covering the cross-section. This enables quantifying some of the uncertainty in the streamflow measurements, which is important for testing hydrological models. This poster shows the first results and the hardware setup. We based our hardware on the open-source hardware platform Arduino, and believe that by sharing both the design and the drawbacks, we contribute to the evolution of better measurement equipment, or at least a better understanding of its shortcomings.
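    The discharge computation the abstract describes (injected salt mass divided by the time integral of the excess concentration) can be sketched as below. This is a generic illustration, not the authors' firmware; the function name and the linear conductivity-to-concentration calibration constant `k` are assumptions.

    ```python
    # Salt dilution gauging: Q = M / integral(c(t) dt), where M is the
    # injected salt mass and c(t) the downstream salt concentration in
    # excess of background. Assumes a linear calibration k converting
    # excess conductivity (uS/cm) to concentration (kg/m^3) -- k here
    # is a placeholder value, not a measured calibration.

    def discharge_from_conductivity(times_s, cond_uS_cm, background_uS_cm,
                                    salt_mass_kg, k=5.0e-4):
        """Estimate streamflow (m^3/s) from one probe's conductivity trace."""
        # Excess conductivity converted to salt concentration (kg/m^3).
        conc = [k * max(c - background_uS_cm, 0.0) for c in cond_uS_cm]
        # Trapezoidal integration of concentration over time (kg*s/m^3).
        integral = sum(0.5 * (conc[i] + conc[i + 1]) * (times_s[i + 1] - times_s[i])
                       for i in range(len(times_s) - 1))
        return salt_mass_kg / integral
    ```

    With several probes across the section, as in the eight-probe design described, the spread of the per-probe discharge estimates gives a rough measure of the mixing-related uncertainty.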

  18. Research on quality metrics of wireless adaptive video streaming

    NASA Astrophysics Data System (ADS)

    Li, Xuefei

    2018-04-01

    With the development of wireless networks and intelligent terminals, video traffic has increased dramatically. Adaptive video streaming has become one of the most promising video transmission technologies. For this type of service, good QoS (Quality of Service) in the wireless network does not always guarantee that all customers have a good experience. Thus, new quality metrics have been widely studied recently. Taking this into account, the objective of this paper is to investigate quality metrics for wireless adaptive video streaming. In this paper, a wireless video streaming simulation platform with a DASH mechanism and a multi-rate video generator is established. Based on this platform, a PSNR model, an SSIM model and a Quality Level model are implemented. The Quality Level model considers QoE (Quality of Experience) factors such as image quality, stalling and switching frequency, while the PSNR and SSIM models mainly consider the quality of the video. To evaluate the performance of these QoE models, three performance metrics (SROCC, PLCC and RMSE), which compare subjective and predicted MOS (Mean Opinion Score), are calculated. From these performance metrics, the monotonicity, linearity and accuracy of the quality metrics can be observed.
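    The three performance metrics named above are standard definitions and can be computed as in this self-contained sketch (pure Python; the Spearman ranking here ignores ties for brevity; this is not the paper's code):

    ```python
    import math

    def pearson(x, y):
        """PLCC: Pearson linear correlation coefficient (linearity)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def spearman(x, y):
        """SROCC: Pearson correlation of rank-transformed data (monotonicity).
        Ties are not averaged in this simplified ranking."""
        def ranks(v):
            order = sorted(range(len(v)), key=lambda i: v[i])
            r = [0.0] * len(v)
            for rank, i in enumerate(order):
                r[i] = float(rank)
            return r
        return pearson(ranks(x), ranks(y))

    def rmse(predicted, subjective):
        """RMSE between predicted and subjective MOS (accuracy)."""
        return math.sqrt(sum((p - s) ** 2 for p, s in zip(predicted, subjective))
                         / len(predicted))
    ```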

  19. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    NASA Astrophysics Data System (ADS)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data-assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico, where geologic features provide a platform for unique coral reef ecosystems. Comparison metrics of the products against in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  20. Real-time WAMI streaming target tracking in fog

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Blasch, Erik; Chen, Ning; Deng, Anna; Ling, Haibin; Chen, Genshe

    2016-05-01

    Real-time information fusion based on WAMI (Wide-Area Motion Imagery), FMV (Full Motion Video), and text data is highly desired for many mission-critical emergency or security applications. Cloud Computing has been considered promising for achieving big data integration from multi-modal sources. In many mission-critical tasks, however, powerful Cloud technology cannot satisfy tight latency tolerances because the servers are located far from the sensing platform; in fact, there is no guaranteed connection in emergency situations. Therefore, data processing, information fusion, and decision making are required to be executed on-site (i.e., near the data collection). Fog Computing, a recently proposed extension and complement to Cloud Computing, enables computing on-site without outsourcing jobs to a remote Cloud. In this work, we investigated the feasibility of processing streaming WAMI in the Fog for real-time, online, uninterrupted target tracking. Using a single-target tracking algorithm, we studied the performance of a Fog Computing prototype. The experimental results are very encouraging and validate the effectiveness of our Fog approach in achieving real-time frame rates.

  1. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems

    PubMed Central

    Abu-Nimeh, Faisal T.; Choong, Woon-Seng

    2017-01-01

    Link efficiency, data integrity, and continuity are crucial for high-throughput and real-time systems. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which achieves 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card's ring buffers, bypassing the operating system kernel's networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity. PMID:28630948
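    The producer/consumer pattern in (c), where a fast receive thread hands packets off to a synchronized queue so the ring buffer slots are freed immediately while a worker thread processes at its own pace, can be sketched generically (this is not the OpenPET code; the "ring buffer" and packet contents here are simulated):

    ```python
    import queue
    import threading

    def receiver(ring_buffer, q):
        # In the real system this thread drains Netmap ring slots; handing
        # each packet to the queue frees the slot as quickly as possible.
        for packet in ring_buffer:
            q.put(packet)
        q.put(None)                      # sentinel: end of stream

    def worker(q, results):
        # Single consumer preserves arrival order (data flow continuity).
        while True:
            packet = q.get()
            if packet is None:
                break
            results.append(packet.upper())   # stand-in for event processing
            q.task_done()

    ring_buffer = [f"event{i}" for i in range(5)]   # simulated packets
    q = queue.Queue(maxsize=64)                     # bounded, synchronized
    results = []
    t_rx = threading.Thread(target=receiver, args=(ring_buffer, q))
    t_wk = threading.Thread(target=worker, args=(q, results))
    t_rx.start(); t_wk.start()
    t_rx.join(); t_wk.join()
    ```

    The bounded queue applies back-pressure on the receiver if the worker falls behind, mirroring the paper's goal of preserving integrity without dropping packets.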

  2. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems.

    PubMed

    Abu-Nimeh, Faisal T; Choong, Woon-Seng

    2017-03-01

    Link efficiency, data integrity, and continuity are crucial for high-throughput and real-time systems. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which achieves 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card's ring buffers, bypassing the operating system kernel's networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity.

  3. Valorization of industrial waste and by-product streams via fermentation for the production of chemicals and biopolymers.

    PubMed

    Koutinas, Apostolis A; Vlysidis, Anestis; Pleissner, Daniel; Kopsahelis, Nikolaos; Lopez Garcia, Isabel; Kookos, Ioannis K; Papanikolaou, Seraphim; Kwan, Tsz Him; Lin, Carol Sze Ki

    2014-04-21

    The transition from a fossil fuel-based economy to a bio-based economy necessitates the exploitation of synergies, scientific innovations and breakthroughs, and step changes in the infrastructure of chemical industry. Sustainable production of chemicals and biopolymers should be dependent entirely on renewable carbon. White biotechnology could provide the necessary tools for the evolution of microbial bioconversion into a key unit operation in future biorefineries. Waste and by-product streams from existing industrial sectors (e.g., food industry, pulp and paper industry, biodiesel and bioethanol production) could be used as renewable resources for both biorefinery development and production of nutrient-complete fermentation feedstocks. This review focuses on the potential of utilizing waste and by-product streams from current industrial activities for the production of chemicals and biopolymers via microbial bioconversion. The first part of this review presents the current status and prospects on fermentative production of important platform chemicals (i.e., selected C2-C6 metabolic products and single cell oil) and biopolymers (i.e., polyhydroxyalkanoates and bacterial cellulose). In the second part, the qualitative and quantitative characteristics of waste and by-product streams from existing industrial sectors are presented. In the third part, the techno-economic aspects of bioconversion processes are critically reviewed. Four case studies showing the potential of case-specific waste and by-product streams for the production of succinic acid and polyhydroxyalkanoates are presented. It is evident that fermentative production of chemicals and biopolymers via refining of waste and by-product streams is a highly important research area with significant prospects for industrial applications.

  4. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with the understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework.
The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.

  5. An intelligent surveillance platform for large metropolitan areas with dense sensor deployment.

    PubMed

    Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A; Smilansky, Zeev

    2013-06-07

    This paper presents an intelligent surveillance platform based on large numbers of inexpensive sensors, designed and developed within the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs to a minimum, the surveillance platform is based on inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and the distribution of alarms and video streams to the emergency teams. The resulting surveillance system is extremely suitable for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage.

  6. Metabolic engineering of industrial platform microorganisms for biorefinery applications--optimization of substrate spectrum and process robustness by rational and evolutive strategies.

    PubMed

    Buschke, Nele; Schäfer, Rudolf; Becker, Judith; Wittmann, Christoph

    2013-05-01

    Bio-based production promises a sustainable route to myriads of chemicals, materials and fuels. With regard to eco-efficiency, its future success strongly depends on a next level of bio-processes using raw materials beyond glucose. Such renewables, i.e., polymers, complex substrate mixtures and diluted waste streams, often cannot be metabolized naturally by the producing organisms. This particularly holds for well-known microorganisms from traditional sugar-based biotechnology, including Escherichia coli, Corynebacterium glutamicum and Saccharomyces cerevisiae, which have been engineered successfully to produce a broad range of products from glucose. In order to make full use of their production potential within the bio-refinery value chain, they have to be adapted to the various feedstocks of interest. This review focuses on the strategies to be applied for this purpose, which combine rational and evolutive approaches. Here, the three industrial platform microorganisms E. coli, C. glutamicum and S. cerevisiae are highlighted due to their particular importance. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A Cloud Based Real-Time Collaborative Platform for eHealth.

    PubMed

    Ionescu, Bogdan; Gadea, Cristian; Solomon, Bogdan; Ionescu, Dan; Stoicu-Tivadar, Vasile; Trifan, Mircea

    2015-01-01

    For more than a decade, the eHealth initiative has been a government concern in many countries. In an Electronic Health Record (EHR) system, there is a need to share data with a group of specialists simultaneously. Collaborative platforms alone are only part of a solution; a collaborative platform with parallel editing capabilities and synchronized data streaming is stringently needed. In this paper, the design and implementation of a collaborative platform used in healthcare is introduced by describing the high-level architecture and its implementation. A series of eHealth services are identified and usage examples in a healthcare environment are given.

  8. Escherichia coli counting using lens-free imaging for sepsis diagnosis

    NASA Astrophysics Data System (ADS)

    Moon, Sangjun; Manzur, Fahim; Manzur, Tariq; Klapperich, Catherine; Demirci, Utkan

    2009-09-01

    Sepsis causes 9.3% of overall deaths in the United States. To diagnose sepsis, cell/bacteria capture and culturing methods have been widely investigated in the medical field. Escherichia coli (E. coli) is used as a model organism for sepsis in the blood stream, since a wide variety of antibodies are established and the genetic modification process for fluorescent tagging is well documented. In point-of-care testing (POCT) applications, sepsis diagnostics require fast monitoring, inexpensive testing, and reliable results in resource-limited settings, e.g., the battlefield or home care for dialysis. However, cells/E. coli are hard to capture and observe directly at the point of care because of their small size (2 μm long and 0.5 μm in diameter), and the bacteria are rare in the blood stream in sepsis. Here, we propose a novel POCT platform to image and enumerate cells/E. coli on a microfluidic surface to diagnose sepsis under resource-limited conditions. We demonstrate that target cells are captured from 5 μl of whole blood using specific antibodies and that E. coli are imaged using a lens-free imaging platform with a 2.2 μm pixel CMOS-based imaging sensor. This POCT cell/bacteria capture and enumeration approach can further be used for medical diagnostics of sepsis. We also show approaches to rapidly quantify white blood cell counts from blood, which can be used to monitor immune response.

  9. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing multiple-physiological-parameter real-time monitoring systems have had problems such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters, with clustered background data storage and processing based on cloud computing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The process included the resource virtualization of the IaaS layer for the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of the data stream at the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The results achieved real-time physiological information transmission, together with storage and analysis of a large amount of data. The simulation test results showed that the remote multiple physiological parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance, and lack of extensibility. Technical support was thus provided to facilitate a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring of multiple physiological parameters.

  10. Accelerated Analyte Uptake on Single Beads in Microliter-scale Batch Separations using Acoustic Streaming: Plutonium Uptake by Anion Exchange for Analysis by Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paxton, Walter F.; O'Hara, Matthew J.; Peper, Shane M.

    2008-06-01

    The use of acoustic streaming as a non-contact mixing platform to accelerate mass transport-limited diffusion processes in small-volume heterogeneous reactions has been investigated. Single-bead anion exchange of plutonium at nanomolar and sub-picomolar concentrations in 20 microliter liquid volumes was used to demonstrate the effect of acoustic mixing. Pu uptake rates on individual ~760 micrometer diameter AG 1x4 anion exchange resin beads were determined using acoustic mixing and compared with Pu uptake rates achieved by static diffusion alone. An 82 MHz surface acoustic wave (SAW) device was placed in contact with the underside of a 384-well microplate containing flat-bottomed semiconical wells. Acoustic energy was coupled into the solution in the well, inducing acoustic streaming. Pu uptake rates were determined from the plutonium remaining in solution after specific elapsed time intervals, using liquid scintillation counting (LSC) for the nanomolar concentrations and thermal ionization mass spectrometry (TIMS) analysis for the sub-picomolar concentration experiments. It was found that this small batch uptake reaction could be accelerated by a factor of about five-fold or more, depending on the acoustic power applied.

  11. Distributing Data to Hand-Held Devices in a Wireless Network

    NASA Technical Reports Server (NTRS)

    Hodges, Mark; Simmons, Layne

    2008-01-01

    ADROIT is a developmental computer program for real-time distribution of complex data streams for display on Web-enabled, portable terminals held by members of an operational team of a spacecraft command-and-control center who may be located away from the center. Examples of such terminals include personal digital assistants, laptop computers, and cellular telephones. ADROIT would make it unnecessary to equip each terminal with platform-specific software for access to the data streams or with software that implements the information-sharing protocol used to deliver telemetry data to clients in the center. ADROIT is a combination of middleware plus software specific to the center. (Middleware enables one application program to communicate with another by performing such functions as conversion, translation, consolidation, and/or integration.) ADROIT translates a data stream (voice, video, or alphanumerical data) from the center into Extensible Markup Language, effectuates a subscription process to determine who gets what data when, and presents the data to each user in real time. Thus, ADROIT is expected to enable distribution of operations and to reduce the cost of operations by reducing the number of persons required to be in the center.
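    The translate-and-subscribe flow described above can be sketched generically. ADROIT's actual XML schema and subscription model are not given in this abstract, so the element names, field names, and channel-matching rule below are purely illustrative assumptions:

    ```python
    import xml.etree.ElementTree as ET

    # Illustrative sketch (not ADROIT itself): translate a telemetry record
    # into XML, then deliver it only to subscribers of that channel.
    # All names here are hypothetical.

    def to_xml(channel, values):
        root = ET.Element("telemetry", channel=channel)
        for name, value in values.items():
            ET.SubElement(root, "field", name=name).text = str(value)
        return ET.tostring(root, encoding="unicode")

    def distribute(record_channel, subscriptions):
        """Return the subscriber IDs whose subscription covers this channel."""
        return [uid for uid, channels in subscriptions.items()
                if record_channel in channels]

    subs = {"pda-1": {"power", "thermal"}, "laptop-2": {"thermal"}}
    payload = to_xml("thermal", {"sensor_a": 271.4})
    recipients = distribute("thermal", subs)   # both terminals subscribe
    ```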

  12. Virtual Exploitation Environment Demonstration for Atmospheric Missions

    NASA Astrophysics Data System (ADS)

    Natali, Stefano; Mantovani, Simone; Hirtl, Marcus; Santillan, Daniel; Triebnig, Gerhard; Fehr, Thorsten; Lopes, Cristiano

    2017-04-01

    The scientific and industrial communities are being confronted with a strong increase of Earth Observation (EO) satellite missions and related data. This is in particular the case for the Atmospheric Sciences communities, with the upcoming Copernicus Sentinel-5 Precursor, Sentinel-4, -5 and -3, and ESA's Earth Explorers scientific satellites ADM-Aeolus and EarthCARE. The challenge is not only to manage the large volume of data generated by each mission / sensor, but to process and analyze the data streams. Creating synergies among the different datasets will be key to exploit the full potential of the available information. As a preparation activity supporting scientific data exploitation for Earth Explorer and Sentinel atmospheric missions, ESA funded the "Technology and Atmospheric Mission Platform" (TAMP) [1] [2] project; a scientific and technological forum (STF) has been set-up involving relevant European entities from different scientific and operational fields to define the platform's requirements. Data access, visualization, processing and download services have been developed to satisfy users' needs; use cases defined with the STF, such as study of the SO2 emissions for the Holuhraun eruption (2014) by means of two numerical models, two satellite platforms and ground measurements, global Aerosol analyses from long time series of satellite data, and local Aerosol analysis using satellite and LIDAR, have been implemented to ensure acceptance of TAMP by the atmospheric sciences community. The platform pursues the "virtual workspace" concept: all resources (data, processing, visualization, collaboration tools) are provided as "remote services", accessible through a standard web browser, to avoid the download of big data volumes and for allowing utilization of provided infrastructure for computation, analysis and sharing of results. Data access and processing are achieved through standardized protocols (WCS, WPS).
As an evolution toward a pre-operational environment, the "Virtual Exploitation Environment Demonstration for Atmospheric Missions" (VEEDAM) aims at maintaining, running and evolving the platform, demonstrating e.g. the possibility to perform massive processing over heterogeneous data sources. This work presents the VEEDAM concepts and provides pre-operational examples, stressing the interoperability achievable by exposing standardized data access and processing services (e.g. making data and processing resources from different VREs accessible). [1] TAMP platform landing page http://vtpip.zamg.ac.at/ [2] TAMP introductory video https://www.youtube.com/watch?v=xWiy8h1oXQY
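
    The standardized access mentioned above can be illustrated with a minimal sketch of a WCS 2.0 KVP GetCoverage request. The endpoint, coverage name, and time window below are placeholders for illustration, not the actual TAMP/VEEDAM services:

    ```python
    from urllib.parse import urlencode

    def wcs_getcoverage_url(endpoint, coverage_id, subset_time,
                            fmt="application/netcdf"):
        """Build a WCS 2.0 KVP GetCoverage URL (endpoint/coverage are placeholders)."""
        params = {
            "service": "WCS",
            "version": "2.0.1",
            "request": "GetCoverage",
            "coverageId": coverage_id,
            "format": fmt,
        }
        # WCS 2.0 expresses trims as 'subset' key-value parameters appended to the query.
        return endpoint + "?" + urlencode(params) + "&subset=" + subset_time

    url = wcs_getcoverage_url(
        "https://example.org/wcs",           # hypothetical endpoint
        "SO2_column",                        # hypothetical coverage name
        'ansi("2014-09-01","2014-09-30")',   # e.g. a Holuhraun eruption window
    )
    ```

    A client would then fetch `url` over HTTP and receive the trimmed coverage in the requested format, so no bulk download of the full dataset is needed.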

  13. The Sensor Management for Applied Research Technologies (SMART) Project

    NASA Technical Reports Server (NTRS)

    Goodman, Michael; Jedlovec, Gary; Conover, Helen; Botts, Mike; Robin, Alex; Blakeslee, Richard; Hood, Robbie; Ingenthron, Susan; Li, Xiang; Maskey, Manil

    2007-01-01

    NASA seeks on-demand data processing and analysis of Earth science observations to facilitate timely decision-making that can lead to the realization of the practical benefits of satellite instruments, airborne and surface remote sensing systems. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep "learning curve" associated with each sensor, data type, and associated products. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output.

  14. Pseudo-spectral methodology for a quantitative assessment of the cover of in-stream vegetation in small streams

    NASA Astrophysics Data System (ADS)

    Hershkovitz, Yaron; Anker, Yaakov; Ben-Dor, Eyal; Schwartz, Guy; Gasith, Avital

    2010-05-01

    In-stream vegetation is a key ecosystem component in many fluvial ecosystems, having cascading effects on stream conditions and biotic structure. Traditionally, ground-level surveys (e.g. grid and transect analyses) are commonly used for estimating cover of aquatic macrophytes. Nonetheless, this methodological approach is highly time consuming and usually yields information which is practically limited to habitat and sub-reach scales. In contrast, remote-sensing techniques (e.g. satellite imagery and airborne photography) enable collection of large datasets over section, stream and basin scales, in relatively short time and at reasonable cost. However, the commonly used high spatial resolution (1 m) is often inadequate for examining aquatic vegetation at habitat or sub-reach scales. We examined the utility of a pseudo-spectral methodology, using RGB digital photography, for estimating the cover of in-stream vegetation in a small Mediterranean-climate stream. We compared this methodology with the traditional ground-level grid methodology and with an airborne hyper-spectral remote-sensing survey (AISA-ES). The study was conducted along a 2 km section of an intermittent stream (Taninim stream, Israel). When studied, the stream was dominated by patches of watercress (Nasturtium officinale) and mats of filamentous algae (Cladophora glomerata). The extent of vegetation cover at the habitat and section scales (10⁰ and 10⁴ m, respectively) was estimated by the pseudo-spectral methodology, using an airborne Roli camera with a Phase-One P 45 (39 MP) CCD image acquisition unit. The swaths were taken at an elevation of about 460 m, giving a spatial resolution of about 4 cm (nadir). For measuring vegetation cover at the section scale (10⁴ m) we also used a 'push-broom' AISA-ES hyper-spectral swath with a sensor configuration of 182 bands (350-2500 nm) at an elevation of ca. 1,200 m (i.e. a spatial resolution of ca. 1 m). 
Simultaneously with every swath, we used an Analytical Spectral Device (ASD) to measure hyper-spectral signatures (2150-band configuration; 350-2500 nm) of selected ground-level targets (located by GPS): soil, water, vegetation (common reed, watercress, filamentous algae) and standard EVA foam colored sheets (red, green, blue, black and white). Processing and analysis of the data were performed on an ITT ENVI platform. The hyper-spectral image underwent radiometric calibration according to the flight and sensor calibration parameters on the CALIGEO platform, and the raw DN scale was converted into a radiance scale. A ground-level visual survey of vegetation cover and height was applied at the habitat scale (10⁰ m) by placing 1 m² netted grids (10×10 cm cells) along 'bank-to-bank' transects (in triplicate). Estimates of plant cover obtained by the pseudo-spectral methodology at the habitat scale were 35-61% for the watercress, 0.4-25% for the filamentous algae and 27-51% for plant-free patches. The respective estimates by the ground-level visual survey were 26-50%, 14-43% and 36-50%. The pseudo-spectral methodology also yielded estimates for the section scale (10⁴ m) of ca. 39% for the watercress, ca. 32% for the filamentous algae and 6% for plant-free patches. The respective estimates obtained by the hyper-spectral swath were 38%, 26% and 8%. Validation against ground-level measurements showed that the pseudo-spectral methodology gives reasonably good estimates of in-stream plant cover. Therefore, this methodology can serve as a substitute for ground-level estimates at small stream scales and for the lower-resolution hyper-spectral methodology at larger scales.

  15. The NOvA software testing framework

    NASA Astrophysics Data System (ADS)

    Tamsett, M.; C Group

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector-generated files amounting to more than 1 PB in size. These data are divided between a number of parallel streams, such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. It is fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software, and thus ensure that data is available for physics analysis in a timely and robust manner.
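
    The pattern of Python modules wrapping and monitoring a chain of software tiers can be sketched minimally as below. This is an illustrative stand-in, not NOvA's actual framework; the tier names and commands are hypothetical:

    ```python
    import subprocess
    import sys
    import time

    def run_tier(name, cmd):
        """Run one software tier as a subprocess and return a simple test record."""
        start = time.time()
        proc = subprocess.run(cmd, capture_output=True, text=True)
        return {
            "tier": name,
            "ok": proc.returncode == 0,       # pass/fail from the exit code
            "seconds": round(time.time() - start, 3),
            "stdout_bytes": len(proc.stdout),  # crude benchmark of output volume
        }

    def run_stream(tiers):
        """Process one input stream through its tiers, stopping at the first failure."""
        results = []
        for name, cmd in tiers:
            rec = run_tier(name, cmd)
            results.append(rec)
            if not rec["ok"]:
                break
        return results

    # Hypothetical two-tier chain standing in for real processing stages.
    results = run_stream([
        ("calibration", [sys.executable, "-c", "print('calibrated')"]),
        ("reconstruction", [sys.executable, "-c", "print('reconstructed')"]),
    ])
    ```

    A front end would then aggregate such records per stream and render them, which is the kind of reporting the abstract describes.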

  16. An Intelligent Surveillance Platform for Large Metropolitan Areas with Dense Sensor Deployment

    PubMed Central

    Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A.; Smilansky, Zeev

    2013-01-01

    This paper presents an intelligent surveillance platform based on the usage of large numbers of inexpensive sensors designed and developed inside the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on the usage of inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches which are able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and alarm and video stream distribution towards the emergency teams. The resulting surveillance system is well suited to deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage. PMID:23748169

  17. Inferring Aquifer Transmissivity from River Flow Data

    NASA Astrophysics Data System (ADS)

    Trichakis, Ioannis; Pistocchi, Alberto

    2016-04-01

    Daily streamflow data is the measurable result of many different hydrological processes within a basin; therefore, it includes information about all these processes. In this work, recession analysis applied to a pan-European dataset of measured streamflow was used to estimate hydrogeological parameters of the aquifers that contribute to the stream flow. Under the assumption that base-flow in times of no precipitation is mainly due to groundwater, we estimated parameters of European shallow aquifers connected with the stream network, identified on the basis of the 1:1,500,000 scale Hydrogeological Map of Europe. To this end, master recession curves (MRCs) were constructed based on the RECESS model of the USGS for 1601 stream gauge stations across Europe. The process consists of three stages. Firstly, the model analyses the stream flow time-series. Then, it uses regression to calculate the recession index. Finally, it infers characteristics of the aquifer from the recession index. During time-series analysis, the model identifies those segments where the number of successive recession days is above a certain threshold. The reason for this pre-processing lies in the necessity for an adequate number of points when performing regression at a later stage. The recession index derives from the semi-logarithmic plot of stream flow over time, and the post-processing involves the calculation of geometrical parameters of the watershed through a GIS platform. The program scans the full stream flow dataset of all the stations. For each station, it identifies the segments with continuous recession that exceed a predefined number of days. When the algorithm finds all the segments of a certain station, it analyses them and calculates the best linear fit between time and the logarithm of flow. The algorithm repeats this procedure for the full number of segments, and thus calculates many different values of the recession index for each station. 
After the program has found all the recession segments, it performs calculations to determine the expression for the MRC. Further processing of the MRCs can yield estimates of transmissivity or response time representative of the aquifers upstream of the station. These estimates can be useful for large-scale (e.g. continental) groundwater modelling. The above procedure allowed the calculation of transmissivity values for a large share of European aquifers, ranging from Tmin = 4.13E-04 m²/d to Tmax = 8.12E+03 m²/d, with an average value Taverage = 9.65E+01 m²/d. These results are in line with the literature, indicating that the procedure may provide realistic results for large-scale groundwater modelling. In this contribution we present the results in the perspective of their application for the parameterization of a pan-European bi-dimensional shallow groundwater flow model.
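
    The segment-finding and regression steps described above can be sketched as follows. This is a simplified illustration on synthetic daily flows, not the USGS RECESS code; the threshold and recession model are assumptions:

    ```python
    import numpy as np

    def recession_segments(q, min_days=10):
        """Yield (start, stop) index pairs (stop exclusive) of runs of declining flow."""
        start = None
        for i in range(1, len(q)):
            if q[i] < q[i - 1]:
                if start is None:
                    start = i - 1
            else:
                # Keep only runs with enough points for a stable regression.
                if start is not None and i - start >= min_days:
                    yield start, i
                start = None
        if start is not None and len(q) - start >= min_days:
            yield start, len(q)

    def recession_index(q, t):
        """Fit log10(flow) vs time; return days per log-cycle of flow decline."""
        slope, _ = np.polyfit(t, np.log10(q), 1)
        return -1.0 / slope

    # Synthetic exponential recession: q(t) = 100 * 10**(-t / 30), i.e. one
    # log-cycle of decline every 30 days.
    t = np.arange(40, dtype=float)
    q = 100.0 * 10 ** (-t / 30.0)
    segs = list(recession_segments(q, min_days=10))
    K = recession_index(q, t)  # recovers ~30 days per log-cycle
    ```

    On real gauge data each station yields many such segments and hence many recession-index values, which are then combined into the station's MRC.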

  18. Real-Time Earthquake Intensity Estimation Using Streaming Data Analysis of Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.

    2017-06-01

    Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total loss and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches using additional data sources or that combine sources from both data types tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data is acquired from the U.S. Geological Survey (USGS) seismic network in California and the social sensor data is based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The second implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake. 
Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher resolution over a larger area, than either data source alone.
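
    The tweet-rate side of such a pipeline reduces to counting social-sensor events in a sliding time window. A minimal sketch is below; the window length is an arbitrary assumption, and the paper's empirical tweet-rate/MMI relationship is not reproduced here:

    ```python
    from collections import deque

    class RateEstimator:
        """Streaming event counter over a sliding time window (seconds)."""

        def __init__(self, window_s=60.0):
            self.window_s = window_s
            self.times = deque()  # timestamps of events still inside the window

        def add(self, t):
            """Record one event (e.g. an earthquake-related tweet) at time t."""
            self.times.append(t)
            self._evict(t)

        def rate(self, now):
            """Number of events within the window ending at 'now'."""
            self._evict(now)
            return len(self.times)

        def _evict(self, now):
            # Drop timestamps that have fallen out of the window.
            while self.times and now - self.times[0] > self.window_s:
                self.times.popleft()

    r = RateEstimator(window_s=60.0)
    for t in [0, 10, 20, 30, 70]:   # synthetic tweet timestamps in seconds
        r.add(t)
    count = r.rate(now=70)          # the event at t=0 has left the window
    ```

    In a real deployment one such counter per spatial cell would feed the empirical rate-to-MMI relationship to build the intensity map.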

  19. Metals removal and recovery in bioelectrochemical systems: A review.

    PubMed

    Nancharaiah, Y V; Venkata Mohan, S; Lens, P N L

    2015-11-01

    Metal-laden wastes and contamination pose a threat to ecosystem well-being and human health. Metal-containing waste streams are also a valuable resource for recovery of precious and scarce elements. Although biological methods are inexpensive and effective for treating metal wastewaters and in situ bioremediation of metal(loid) contamination, little progress has been made towards metal(loid) recovery. Bioelectrochemical systems are emerging as a new technology platform for removal and recovery of metal ions from metallurgical wastes, process streams and wastewaters. Biodegradation of organic matter by electroactive biofilms at the anode has been successfully coupled to cathodic reduction of metal ions. Until now, leaching of Co(II) from LiCoO2 particles, and removal of metal ions, i.e. Co(III/II), Cr(VI), Cu(II), Hg(II), Ag(I), Se(IV), and Cd(II), from aqueous solutions has been demonstrated. This article reviews the state-of-the-art research on bioelectrochemical systems for removal and recovery of metal(loid) ions and the pertaining removal mechanisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  1. Microbial utilization of lignin: available biotechnologies for its degradation and valorization.

    PubMed

    Palazzolo, Martín A; Kurina-Sanz, Marcela

    2016-10-01

    Lignocellulosic biomasses, either from non-edible plants or from agricultural residues, stock biomacromolecules that can be processed to produce both energy and bioproducts. Therefore, they become major candidates to replace petroleum as the main source of energy. However, to shift the fossil-based economy to a bio-based one, it is imperative to develop robust biotechnologies to efficiently convert lignocellulosic streams into power and platform chemicals. Although most biomass processing facilities use celluloses and hemicelluloses to produce bioethanol and paper, no consolidated bioprocess is currently available to produce valuable compounds from lignin at industrial scale. Usually, lignin is burned to provide heat or remains as a by-product in different streams, raising environmental concerns. In this way, the biorefinery concept is not carried to completion. Because Nature offers an arsenal of biotechnological tools, through microorganisms, to accomplish lignin valorization or degradation, an increasing number of projects dealing with these tasks have been described recently. In this review, outstanding reports from the last 6 years are described, comprising the microbial utilization of lignin to produce a variety of valuable compounds as well as to diminish its ecological impact. Furthermore, perspectives on these topics are given.

  2. Water survey of Canada: Application for use of ERTS-A for retransmission of water resources data

    NASA Technical Reports Server (NTRS)

    Halliday, R. A. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Nine sites in isolated regions in Canada have been selected for installation of ERTS data collection platforms. Seven platforms were installed in 1972, one of which did not operate. The six operating platforms transmitted over 7000 water level readings from stream gauging stations. This data is available on a near real time basis through the Canada Center for Remote Sensing and is used for river flow forecasting. The practicability of using satellite retransmission as a means of obtaining data from remote areas has been demonstrated.

  3. Production experience with the ATLAS Event Service

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real-time delivery of fine-grained workloads to running payload applications, which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture, the AES is currently being used by grid sites for assigning low-priority workloads to otherwise idle computing resources; for harvesting HPC resources in an efficient back-fill mode; and for massively scaling out to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and the architecture of the Event Service, we will report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending ES application beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
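
    The fine-grained "event range" idea can be illustrated with a toy generator that splits a job into small dispatchable units. This sketch models only the splitting step; AES's real dispatch protocol and object-store output are not reproduced:

    ```python
    def event_ranges(n_events, chunk):
        """Split [0, n_events) into inclusive (first, last) event ranges of
        at most 'chunk' events, suitable for fine-grained dispatch."""
        for first in range(0, n_events, chunk):
            yield (first, min(first + chunk, n_events) - 1)

    # A 10-event job dispatched in ranges of up to 4 events each.
    ranges = list(event_ranges(10, 4))  # [(0, 3), (4, 7), (8, 9)]
    ```

    Small ranges are what make short-lived resources usable: if a spot instance disappears, only the in-flight range is lost and re-dispatched, not the whole job.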

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rehder, J.B.

    The project focuses on an appropriate technology for small-scale hydro power: floating waterwheels and turbines. For background, relic and existing systems such as early floating mills, traditional Amish waterwheels, and micro-hydro systems are examined. In the design phase of the project, new designs for Floating Hydro Power Systems include: an analysis of flotation materials and systems; a floating undershot waterwheel design; a floating cylinder (fiberglass storage tank) design; a submerged tube design; and a design for a floating platform with submerged propellers. Finally, in the applications phase, stream flow data from East Tennessee streams are used in a discussion of the potential applications of floating hydro power systems in small streams.

  5. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, and inter- and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.

  6. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
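
    At its core, counting colonies in a binarized image reduces to labeling connected foreground regions. The sketch below is a much-simplified stand-in on a toy binary grid; OpenCFU's actual filters (circularity, size, intensity) are not reproduced:

    ```python
    def count_blobs(grid, min_area=1):
        """Count connected foreground regions (4-connectivity) in a binary grid."""
        rows, cols = len(grid), len(grid[0])
        seen = [[False] * cols for _ in range(rows)]
        count = 0
        for r0 in range(rows):
            for c0 in range(cols):
                if grid[r0][c0] and not seen[r0][c0]:
                    # Flood fill to visit and measure this region.
                    stack, area = [(r0, c0)], 0
                    seen[r0][c0] = True
                    while stack:
                        r, c = stack.pop()
                        area += 1
                        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            nr, nc = r + dr, c + dc
                            if (0 <= nr < rows and 0 <= nc < cols
                                    and grid[nr][nc] and not seen[nr][nc]):
                                seen[nr][nc] = True
                                stack.append((nr, nc))
                    if area >= min_area:
                        count += 1
        return count

    # Two separate "colonies" in a 3x5 binary image.
    image = [
        [0, 1, 1, 0, 0],
        [0, 1, 1, 0, 1],
        [0, 0, 0, 0, 1],
    ]
    n = count_blobs(image)  # 2
    ```

    A real tool additionally thresholds the photograph first and rejects regions whose shape is insufficiently circular, which is where most of the robustness comes from.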

  7. A portable platform to collect and review behavioral data simultaneously with neurophysiological signals.

    PubMed

    Tianxiao Jiang; Siddiqui, Hasan; Ray, Shruti; Asman, Priscella; Ozturk, Musa; Ince, Nuri F

    2017-07-01

    This paper presents a portable platform to collect and review behavioral data simultaneously with neurophysiological signals. The whole system is comprised of four parts: a sensor data acquisition interface, a socket server for real-time data streaming, a Simulink system for real-time processing and an offline data review and analysis toolbox. A low-cost microcontroller is used to acquire data from external sensors such as an accelerometer and a hand dynamometer. The microcontroller transfers the data either directly through USB or wirelessly through a Bluetooth module to a data server written in C++ for MS Windows. The data server also interfaces with the digital glove and captures HD video from a webcam. The acquired sensor data are streamed under the User Datagram Protocol (UDP) to other applications such as Simulink/Matlab for real-time analysis and recording. Neurophysiological signals such as electroencephalography (EEG), electrocorticography (ECoG) and local field potential (LFP) recordings can be collected simultaneously in Simulink and fused with behavioral data. In addition, we developed a customized Matlab Graphical User Interface (GUI) software to review, annotate and analyze the data offline. The software provides a fast, user-friendly data visualization environment with a synchronized video playback feature. The software is also capable of reviewing long-term neural recordings. Other featured functions such as fast preprocessing with multithreaded filters, annotation, montage selection, power-spectral density (PSD) estimation, time-frequency maps and spatial spectral maps are also implemented.
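
    Streaming sensor samples over UDP, as the data server does, can be sketched with the standard socket API. The JSON payload and field names below are illustrative assumptions, not the paper's actual wire format:

    ```python
    import json
    import socket

    def send_sample(sock, addr, sample):
        """Serialize one sensor sample as JSON and send it as a UDP datagram."""
        sock.sendto(json.dumps(sample).encode("utf-8"), addr)

    # Loopback demonstration: receiver and sender in one process.
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("127.0.0.1", 0))   # let the OS pick a free port
    recv.settimeout(5)
    addr = recv.getsockname()

    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sample(send, addr, {"sensor": "grip_force", "t": 0.01, "value": 12.5})

    payload, _ = recv.recvfrom(4096)
    sample = json.loads(payload.decode("utf-8"))
    send.close()
    recv.close()
    ```

    UDP fits this use because a lost sample matters less than added latency; a Simulink/Matlab consumer would simply bind the agreed port and decode each datagram as it arrives.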

  8. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network. 
Real-time video stream packets are captured for analytical purposes on the mobile user node. Experimental results are obtained and analysed. Future work is identified towards further improvement of the current design and implementation. With this new mobile video networking concept and paradigm implemented and evaluated, results and observations obtained from this study would form the basis of a more in-depth, comprehensive understanding of various challenges and opportunities in supporting high-quality real-time video streaming in mobile cloud over heterogeneous wireless networks.

  9. Immersion and contact freezing experiments in the Mainz wind tunnel laboratory

    NASA Astrophysics Data System (ADS)

    Eppers, Oliver; Mayer, Amelie; Diehl, Karoline; Mitra, Subir; Borrmann, Stephan; Szakáll, Miklós

    2016-04-01

    Immersion and contact freezing are among the most important ice nucleation processes in mixed-phase clouds. Experimental studies are carried out in the Mainz vertical wind tunnel laboratory in order to characterize these nucleation processes for different ice nucleating particles (INP), such as mineral dust or biological particles. Immersion freezing is investigated in our laboratory with two different experimental techniques, both attaining contact-free levitation of liquid droplets and cooling of the surrounding air down to about -25 °C. In an acoustic levitator placed in the cold room of our laboratory, drops with diameters of 2 mm are investigated. In the vertical air stream of the wind tunnel, droplets with a diameter of 700 micron are freely floated at their terminal velocities, simulating the flow conditions of the free atmosphere. Furthermore, the wind tunnel offers a unique platform for contact freezing experiments. Supercooled water droplets are floated in the vertical air stream at their terminal velocities and INP are injected into the tunnel air stream upstream of them. As soon as an INP collides with a supercooled droplet, contact freezing is initiated. The first results of immersion and contact freezing experiments with cellulose particles, both in the acoustic levitator and in the wind tunnel, will be presented. Cellulose is considered a typical INP of biological origin and a macrotracer for plant debris. Nucleating properties of cellulose will be provided, mainly focusing on the temperature, INP concentration, and specific surface area dependences of the freezing processes. A direct comparison between the different experimental techniques (acoustic levitator and wind tunnel), as well as between nucleation modes (immersion and contact freezing), will be presented. The work is carried out within the framework of the German research unit INUIT.

  10. Embedded Streaming Deep Neural Networks Accelerator With Applications.

    PubMed

    Dundar, Aysegul; Jin, Jonghoon; Martini, Berin; Culurciello, Eugenio

    2017-07-01

    Deep convolutional neural networks (DCNNs) have become a very powerful tool in visual perception. DCNNs have applications in autonomous robots, security systems, mobile phones, and automobiles, where high throughput of the feedforward evaluation phase and power efficiency are important. Because of this increased usage, many field-programmable gate array (FPGA)-based accelerators have been proposed. In this paper, we present an optimized streaming method for DCNNs' hardware accelerator on an embedded platform. The streaming method acts as a compiler, transforming a high-level representation of DCNNs into operation codes to execute applications in a hardware accelerator. The proposed method utilizes the maximum computational resources available, based on a novel scheduled routing topology that combines data reuse and data concatenation. It is tested with a hardware accelerator implemented on the Xilinx Kintex-7 XC7K325T FPGA. The system fully explores weight-level and node-level parallelizations of DCNNs and achieves a peak performance of 247 G-ops while consuming less than 4 W of power. We test our system with applications on object classification and object detection in real-world scenarios. Our results indicate high performance efficiency, outperforming all other presented platforms while running these applications.

  11. Discretized Streams: A Fault-Tolerant Model for Scalable Stream Processing

    DTIC Science & Technology

    2012-12-14

    Matei Zaharia; Tathagata Das; Haoyuan Li; Timothy Hunter; Scott Shenker; Ion... However, current programming models for distributed stream processing are relatively low-level, often leaving the user to worry about consistency of...

  12. Cellular-enabled water quality measurements

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Kerkez, B.

    2013-12-01

    While the past decade has seen significant improvements in our ability to measure nutrients and other water quality parameters, the use of these sensors has yet to gain traction due to their cost-prohibitive nature and the deployment expertise required on the part of researchers. Furthermore, an extra burden is incurred when real-time data access becomes an experimental requirement. We present an open-source hardware design to facilitate real-time, low-cost, and robust measurements of water quality across large urbanized areas. Our hardware platform interfaces an embedded, highly configurable, high-precision, ultra-low-power measurement system with a low-power cellular module. Each sensor station is configured with an IP address, permitting reliable streaming of sensor data to off-site locations as measurements are made. We discuss the role of high-quality hardware components during extreme event scenarios, and present preliminary performance metrics that validate the ability of the platform to provide streaming access to sensor measurements.

  13. Erosion and deposition on a beach raised by the 1964 earthquake, Montague Island, Alaska: Chapter H in The Alaska earthquake, March 27, 1964: regional effects

    USGS Publications Warehouse

    Kirkby, M.J.; Kirkby, Anne V.

    1969-01-01

    During the 1964 Alaska earthquake, tectonic deformation uplifted the southern end of Montague Island as much as 33 feet or more. The uplifted shoreline is rapidly being modified by subaerial and marine processes. The new raised beach is formed in bedrock, sand, gravel, and deltaic bay-head deposits, and the effect of each erosional process was measured in each material. Fieldwork was concentrated in two areas—MacLeod Harbor on the northwest side and Patton Bay on the southeast side of Montague Island. In the unconsolidated deltaic deposits of MacLeod Harbor, 97 percent of the erosion up to June 1965, 15 months after the earthquake, was fluvial, 2.2 percent was by rainwash, and only 0.8 percent was marine; 52 percent of the total available raised beach material had already been removed. The volume removed by stream erosion was proportional to low-flow discharge raised to the power of 0.75 to 0.95, and this volume increased as the bed material became finer. Stream response to the relative fall in base level was very rapid, most of the downcutting in unconsolidated materials occurring within 48 hours of the uplift for streams with low flows greater than 10 cubic feet per second. Since then, erosion by these streams has been predominantly lateral. Streams with lower discharges, in unconsolidated materials, still had knickpoints after 15 months. No response to uplift could be detected in stream courses above the former preearthquake sea level. Where the raised beach is in bedrock, it is being destroyed principally by marine action but at such a low rate that no appreciable erosion of bedrock was found 15 months after the earthquake. A dated rock platform raised earlier has eroded at a mean rate of 0.49 foot per year. In this area the factor limiting the rate of erosion was rock resistance rather than the transporting capacity of the waves. 
The break in slope between the top of the raised beach and the former seacliff is being obliterated by debris which is accumulating at the base of the cliffs and which is no longer being removed by the sea. Current cliff retreat by rockfall, mudflows, and landslides was estimated at 0.7 to 2.0 feet per year, and in parts of Patton Bay the accumulation of debris has obliterated 78 percent of the original break in slope in 15 months. Evidence of two relative sea-level changes before 1964 was found in Patton Bay. At a high stand of sea level lasting until about 2000 B.P. (before present), an older raised beach was formed which, over a distance of 5 miles, shows 40 feet of deformation relative to the present sea level. Peat deposits exposed by the 1964 uplift also record a low sea level that lasted until at least 600 B.P. The 1964 raised beach was used to test the accuracy of identification of former sea-level elevations from raised beach features. The Pre-1964 sea level could be accurately determined from the height of the former barnacle line, so an independent check on high-water level was available. The most reliable topographic indicator was the elevation of the break in slope at the top of a beach between a bedrock platform and a cliff. Even here, the former sea level could only be identified within 5 feet. The breaks in slope at the top of gravel beaches were found to be poor indicators of former sea level. On Montague Island, evidence of former high sea levels appeared to be best preserved (1) as raised bedrock platforms on rocks of moderate resistance in slightly sheltered locations and (2) as raised storm beaches where the relief immediately inland was very low.
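The reported power-law dependence of eroded volume on low-flow discharge (exponent 0.75 to 0.95) can be applied directly to compare streams; the sketch below uses the midpoint exponent, and the discharge values are illustrative.

```python
def relative_erosion_volume(q1, q2, exponent=0.85):
    """Ratio of eroded volumes for two streams under the reported
    power law V proportional to Q^n, with n between 0.75 and 0.95
    (midpoint 0.85 used as the default here)."""
    return (q1 / q2) ** exponent

# A stream with 10x the low-flow discharge erodes roughly 7x the volume
ratio = relative_erosion_volume(100.0, 10.0)
```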

  14. Data-intensive computing on numerically-insensitive supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, James P; Fasel, Patricia K; Habib, Salman

    2010-12-03

    With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.

  15. An open-source wireless sensor stack: from Arduino to SDI-12 to Water One Flow

    NASA Astrophysics Data System (ADS)

    Hicks, S.; Damiano, S. G.; Smith, K. M.; Olexy, J.; Horsburgh, J. S.; Mayorga, E.; Aufdenkampe, A. K.

    2013-12-01

    Implementing a large-scale streaming environmental sensor network has previously been limited by the high cost of the datalogging and data communication infrastructure. The Christina River Basin Critical Zone Observatory (CRB-CZO) is overcoming the obstacles to large near-real-time data collection networks by using Arduino, an open-source electronics platform, in combination with XBee ZigBee wireless radio modules. These extremely low-cost and easy-to-use open-source electronics are at the heart of the new DIY movement and have provided solutions to countless projects by over half a million users worldwide. However, their use in environmental sensing is in its infancy. At present a primary limitation to widespread deployment of open-source electronics for environmental sensing is the lack of a simple, open-source software stack to manage streaming data from heterogeneous sensor networks. Here we present a functioning prototype software stack that receives sensor data over a self-meshing ZigBee wireless network from over a hundred sensors, stores the data locally, and serves it on demand as a CUAHSI Water One Flow (WOF) web service. We highlight a few new, innovative components, including: (1) a versatile open data logger design based on the Arduino electronics platform and ZigBee radios; (2) a software library implementing the SDI-12 communication protocol between any Arduino platform and SDI-12-enabled sensors without the need for additional hardware (https://github.com/StroudCenter/Arduino-SDI-12); and (3) 'midStream', a lightweight set of Python code that receives streaming sensor data, appends it with metadata on the fly by querying a relational database structured on an early version of the Observations Data Model version 2.0 (ODM2), and uses the WOFpy library to serve the data as WaterML via SOAP and REST web services.
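The SDI-12 library in (2) is Arduino C++; as a language-neutral illustration of the protocol's data format, the sketch below parses the kind of signed-value response SDI-12 sensors return (a one-character sensor address followed by concatenated signed decimals). The example string is invented, and real responses also carry terminators and optional CRCs.

```python
import re

def parse_sdi12_values(response):
    """Parse an SDI-12-style data response like '0+21.5-0.038+112'
    into (address, [values]): the first character is the sensor
    address, followed by '+'/'-'-prefixed decimal values."""
    address, body = response[0], response[1:].strip()
    values = [float(v) for v in re.findall(r'[+-]\d+(?:\.\d+)?', body)]
    return address, values

# e.g. water temperature, a small negative offset, and conductivity
addr, vals = parse_sdi12_values("0+21.5-0.038+112")
```

A receiver like midStream would then attach metadata (site, variable, units) to each parsed value before serving it as WaterML.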

  16. An Extensible Processing Framework for Eddy-covariance Data

    NASA Astrophysics Data System (ADS)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data-collecting networks has led to an increase not only in the amount of available information, but also in the complexity of analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, Ameriflux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing and to be used by community members alike. In order to control the processing environment, provide a proficient parallel processing structure, and ensure dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained, and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the GitHub distributed version control system. The capability to process large data sets relies on efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
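The "directory-like" HDF5 layout can be sketched as nested groups addressed by slash-separated paths; the toy store below mimics that structure in plain Python. The site code and product path used here are hypothetical, not actual NEON identifiers.

```python
from collections import defaultdict

def tree():
    """An arbitrarily nested dict: each missing key becomes a subgroup."""
    return defaultdict(tree)

def put(root, path, value):
    """Store a value under a '/'-separated, HDF5-style group path,
    e.g. '/SITE/dp04/data/fluxCo2' (hypothetical naming)."""
    *groups, leaf = path.strip("/").split("/")
    node = root
    for g in groups:
        node = node[g]  # descend, creating intermediate groups as needed
    node[leaf] = value

root = tree()
put(root, "/HARV/dp04/data/fluxCo2", [1.2, 1.3])
```

In a real pipeline the same path convention maps onto h5py groups and datasets, with metadata carried as attributes on each group.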

  17. COSBID-M3: a platform for multimodal monitoring, data collection, and research in neurocritical care.

    PubMed

    Wilson, J Adam; Shutter, Lori A; Hartings, Jed A

    2013-01-01

    Neuromonitoring in patients with severe brain trauma and stroke is often limited to intracranial pressure (ICP); advanced neuroscience intensive care units may also monitor brain oxygenation (partial pressure of brain tissue oxygen, PbtO2), electroencephalogram (EEG), cerebral blood flow (CBF), or neurochemistry. For example, cortical spreading depolarizations (CSDs) recorded by electrocorticography (ECoG) are associated with delayed cerebral ischemia after subarachnoid hemorrhage and are an attractive target for novel therapeutic approaches. However, to better understand pathophysiologic relations and realize the potential of multimodal monitoring, a common platform for data collection and integration is needed. We have developed a multimodal system that integrates clinical, research, and imaging data into a single research and development (R&D) platform. Our system is adapted from the widely used BCI2000, a brain-computer interface tool which is written in the C++ language and supports over 20 data acquisition systems. It is optimized for real-time analysis of multimodal data using advanced time and frequency domain analyses and is extensible for research development using a combination of C++, MATLAB, and Python languages. Continuous streams of raw and processed data, including blood pressure (BP), ICP, PbtO2, CBF, ECoG, EEG, and patient video are stored in an open binary data format. Selected events identified in raw (e.g., ICP) or processed (e.g., CSD) measures are displayed graphically, can trigger alarms, or can be sent to researchers or clinicians via text message. For instance, algorithms for automated detection of CSD have been incorporated, and processed ECoG signals are projected onto three-dimensional (3D) brain models based on patient magnetic resonance imaging (MRI) and computed tomographic (CT) scans, allowing real-time correlation of pathoanatomy and cortical function. 
This platform will provide clinicians and researchers with an advanced tool to investigate pathophysiologic relationships and novel measures of cerebral status, as well as implement treatment algorithms based on such multimodal measures.
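A minimal sketch of the kind of sustained-threshold event detection such a platform performs on raw streams like ICP. The cutoff and run length below are illustrative parameters, not clinical settings from the system.

```python
def alarm_events(samples, threshold=22.0, min_run=3):
    """Flag sustained threshold crossings in a monitoring stream:
    report the start index of every run of min_run consecutive
    samples above threshold (e.g. ICP in mmHg held above 22)."""
    events, run = [], 0
    for i, x in enumerate(samples):
        run = run + 1 if x > threshold else 0
        if run == min_run:               # fire once, when the run is confirmed
            events.append(i - min_run + 1)  # index where the run began
    return events

# Two sustained elevations in a toy ICP trace
starts = alarm_events([10, 25, 24, 23, 11, 30, 30, 30, 30])
```

Requiring a minimum run length is a simple way to keep single-sample artifacts from triggering alarms or text messages.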

  18. Prospects for energy recovery during hydrothermal and biological processing of waste biomass.

    PubMed

    Gerber Van Doren, Léda; Posmanik, Roy; Bicalho, Felipe A; Tester, Jefferson W; Sills, Deborah L

    2017-02-01

    Thermochemical and biological processes represent promising technologies for converting wet biomasses, such as animal manure, organic waste, or algae, to energy. To convert biomass to energy and bio-chemicals in an economical manner, internal energy recovery should be maximized to reduce the use of external heat and power. In this study, two conversion pathways that couple hydrothermal liquefaction with anaerobic digestion or catalytic hydrothermal gasification were compared. Each of these platforms is followed by two alternative processes for gas utilization: 1) combined heat and power; and 2) combustion in a boiler. Pinch analysis was applied to integrate thermal streams among unit processes and improve the overall system efficiency. A techno-economic analysis was conducted to compare the feasibility of the four modeled scenarios under different market conditions. Our results show that a systems approach designed to recover internal heat and power can reduce external energy demands and increase the overall process sustainability. Copyright © 2016 Elsevier Ltd. All rights reserved.
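The pinch idea of matching thermal streams can be sketched for a single hot/cold pair: the recoverable duty in one counter-current exchanger is bounded by each stream's approach to the other's inlet temperature. This is a simplified single-exchanger bound, not the full problem-table cascade of a complete pinch analysis, and the stream values are invented.

```python
def max_heat_recovery(cp_hot, th_in, cp_cold, tc_in, dt_min=10.0):
    """Maximum duty [kW] recoverable between one hot stream
    (heat-capacity flow rate cp_hot in kW/K, inlet th_in in C) and
    one cold stream in counter-current flow, with minimum approach
    temperature dt_min."""
    q_hot_limit = cp_hot * (th_in - (tc_in + dt_min))    # hot can cool to tc_in + dt_min
    q_cold_limit = cp_cold * ((th_in - dt_min) - tc_in)  # cold can heat to th_in - dt_min
    return max(0.0, min(q_hot_limit, q_cold_limit))

# e.g. a hot liquefaction effluent heating a cold digester feed
q = max_heat_recovery(cp_hot=2.0, th_in=250.0, cp_cold=3.0, tc_in=40.0)
```

Every kW recovered this way directly reduces the external heat demand that the techno-economic comparison penalizes.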

  19. Map Classification In Image Data

    DTIC Science & Technology

    2015-09-25

    showing the significant portion of image and video data transfers via YouTube, Facebook, and Flickr as primary platforms from Infographic (2015) digital...reserves • hydrography: lakes, rivers, streams, swamps, coastal flats • relief: mountains, valleys, slopes, depressions • vegetation: wooded and cleared

  20. Method for enhanced atomization of liquids

    DOEpatents

    Thompson, Richard E.; White, Jerome R.

    1993-01-01

    In a process for atomizing a slurry or liquid process stream in which a slurry or liquid is passed through a nozzle to provide a primary atomized process stream, an improvement which comprises subjecting the liquid or slurry process stream to microwave energy as the liquid or slurry process stream exits the nozzle, wherein sufficient microwave heating is provided to flash vaporize the primary atomized process stream.

  1. Geomorphic effectiveness of long profile shape and role of inherent geological controls, Ganga River Basin, India

    NASA Astrophysics Data System (ADS)

    Sonam, Sonam; Jain, Vikrant

    2017-04-01

    River long profile is one of the fundamental geomorphic parameters, providing a platform to study the interaction of geological and geomorphic processes at different time scales. Long profile shape is governed by geological processes at the 10^5-10^6-year time scale, and it controls modern-day (10^0-10^1 year) fluvial processes by controlling the spatial variability of channel slope. Identification of an appropriate model for the river long profile may provide a tool to analyse the quantitative relationship between basin geology, profile shape, and its geomorphic effectiveness. A systematic analysis of long profiles has been carried out for the Himalayan tributaries of the Ganga River basin. Long profile shape and stream power distribution patterns are derived using SRTM DEM data (90 m spatial resolution). Peak discharge data from 34 stations are used for hydrological analysis. Lithological variability and major thrusts are marked along the river long profile. The best fit of the long profile is analysed for power, logarithmic, and exponential functions. A second-order exponential function provides the best representation of long profiles. The second-order exponential equation is Z = K1*exp(-β1*L) + K2*exp(-β2*L), where Z is the elevation of the channel long profile, L is the length, and K and β are coefficients of the exponential function. K1 and K2 are the proportions of elevation change of the long profile represented by the β1 (fast) and β2 (slow) decay coefficients. Different values of the coefficients express the variability in long profile shapes and are related to the litho-tectonic variability of the study area. Channel slope along the long profile is estimated by taking the derivative of the exponential function. The stream power distribution pattern along the long profile is estimated by superimposing the discharge and the long profile slope. 
Sensitivity of the stream power distribution to the decay coefficients of the second-order exponential equation is evaluated for a range of coefficient values. Our analysis suggests that the amplitude of the stream power peak depends on K1, the proportion of elevation change under the fast decay exponent, while the location of the stream power peak depends on the long profile decay coefficient (β1). Different long profile shapes owing to litho-tectonic variability across the Himalayas are responsible for the spatial variability of the stream power distribution pattern. Most of the stream power peaks lie in the Higher Himalaya. In general, eastern rivers have higher stream power in the hinterland area and low stream power in the alluvial plains. This is responsible for (1) higher erosion rates and sediment supply in the hinterland of eastern rivers, (2) the incised and stable nature of channels in the western alluvial plains, and (3) aggrading channels with a dynamic nature in the eastern alluvial plains. Our study shows that the spatial variability of litho-units defines the coefficients of the long profile function, which in turn controls the position and magnitude of the stream power maxima and hence the geomorphic variability in a fluvial system.
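The fitted two-term exponential profile gives channel slope and stream power analytically: S = -dZ/dL = K1*β1*exp(-β1*L) + K2*β2*exp(-β2*L), and cross-sectional stream power Ω = ρ g Q S. The coefficient values below are invented for illustration, not fitted Himalayan values.

```python
import math

def profile_elevation(L, k1, b1, k2, b2):
    """Second-order exponential long profile Z(L) = K1*exp(-b1*L) + K2*exp(-b2*L),
    with L in meters and Z in meters."""
    return k1 * math.exp(-b1 * L) + k2 * math.exp(-b2 * L)

def profile_slope(L, k1, b1, k2, b2):
    """Channel slope S = -dZ/dL, the analytic derivative of the profile."""
    return k1 * b1 * math.exp(-b1 * L) + k2 * b2 * math.exp(-b2 * L)

def stream_power(L, discharge, k1, b1, k2, b2, rho=1000.0, g=9.81):
    """Cross-sectional stream power Omega = rho * g * Q * S [W/m]."""
    return rho * g * discharge * profile_slope(L, k1, b1, k2, b2)

# 50 km downstream, 500 m^3/s peak discharge, invented coefficients
omega = stream_power(50000.0, 500.0, 4000.0, 2e-5, 1000.0, 2e-6)
```

Because the slope is analytic, the sensitivity of Ω to K1 and β1 can be explored by sweeping those coefficients directly rather than differencing a DEM-derived profile.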

  2. Coupling biophysical processes and water rights to simulate spatially distributed water use in an intensively managed hydrologic system

    NASA Astrophysics Data System (ADS)

    Han, Bangshuai; Benner, Shawn G.; Bolte, John P.; Vache, Kellie B.; Flores, Alejandro N.

    2017-07-01

    Humans have significantly altered the redistribution of water in intensively managed hydrologic systems, shifting the spatiotemporal patterns of surface water. Evaluating water availability requires integration of hydrologic processes and associated human influences. In this study, we summarize the development and evaluation of an extensible hydrologic model that explicitly integrates water rights to spatially distribute irrigation waters in a semi-arid agricultural region in the western US, using the Envision integrated modeling platform. The model captures both human and biophysical systems, particularly the diversion of water from the Boise River, which is the main water source that supports irrigated agriculture in this region. In agricultural areas, water demand is estimated as a function of crop type and local environmental conditions. Surface water to meet crop demand is diverted from the stream reaches, constrained by the amount of water available in the stream, the water-rights-appropriated amount, and the priority dates associated with particular places of use. Results, measured by flow rates at gaged stream and canal locations within the study area, suggest that the impacts of irrigation activities on the magnitude and timing of flows through this intensively managed system are well captured. The multi-year averaged diverted water from the Boise River matches observations well, reflecting the appropriation of water according to the water rights database. Because of the spatially explicit implementation of surface water diversion, the model can help diagnose places and times where water resources are likely insufficient to meet agricultural water demands, and inform future water management decisions.
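The priority-date constraint described above amounts to prior-appropriation allocation: senior rights are satisfied first from the available streamflow, and junior rights take whatever remains. A minimal sketch of that logic (not Envision code; the dates and demands are invented):

```python
def allocate(available, rights):
    """Allocate streamflow to diversion rights in priority-date order
    (prior appropriation). Each right is a (priority_date, demand)
    tuple; returns the allocations in the original input order."""
    order = sorted(range(len(rights)), key=lambda i: rights[i][0])
    alloc = [0.0] * len(rights)
    for i in order:                      # most senior (earliest date) first
        take = min(rights[i][1], available)
        alloc[i] = take
        available -= take
    return alloc

# 8 units available; the 1890 right is senior and is filled first
a = allocate(8.0, [("1902-06-01", 5.0), ("1890-04-15", 6.0)])
```

Running this reach by reach, constrained by the water actually present in each stream segment, is how the model can diagnose where and when junior rights go unmet.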

  3. Pre-Hardware Optimization of Spacecraft Image Processing Software Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Petrick, David J.; Day, John H. (Technical Monitor)

    2001-01-01

    Spacecraft telemetry rates have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image processing application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms and re-configurable computing hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processing (DSP). It has been shown in [1] and [2] that this configuration can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft. However, since this technology is still maturing, intensive pre-hardware steps are necessary to achieve the benefits of hardware implementation. This paper describes these steps for the GOES-8 application, a software project developed using Interactive Data Language (IDL) (Trademark of Research Systems, Inc.) on a Workstation/UNIX platform. The solution involves converting the application to a PC/Windows/RC platform, selected mainly by the availability of low cost, adaptable high-speed RC hardware. In order for the hybrid system to run, the IDL software was modified to account for platform differences. It was interesting to examine the gains and losses in performance on the new platform, as well as unexpected observations before implementing hardware. After substantial pre-hardware optimization steps, the necessity of hardware implementation for bottleneck code in the PC environment became evident and solvable beginning with the methodology described in [1], [2], and implementing a novel methodology for this specific application [6]. 
The PC-RC interface bandwidth problem for the class of applications with moderate input-output data rates but large intermediate multi-thread data streams has been addressed and mitigated. This opens a new class of satellite image processing applications for bottleneck problems solution using RC technologies. The issue of a science algorithm level of abstraction necessary for RC hardware implementation is also described. Selected Matlab functions already implemented in hardware were investigated for their direct applicability to the GOES-8 application with the intent to create a library of Matlab and IDL RC functions for ongoing work. A complete class of spacecraft image processing applications using embedded re-configurable computing technology to meet real-time requirements, including performance results and comparison with the existing system, is described in this paper.

  4. OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects

    PubMed Central

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446
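Colony counting ultimately reduces to labeling connected blobs in a thresholded image. A toy connected-component counter on a binary grid is sketched below; OpenCFU itself additionally applies shape and regularity filters to real images, which this sketch omits.

```python
def count_colonies(grid):
    """Count connected groups of 1s in a binary image (4-connectivity)
    using an iterative flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                count += 1                  # a new, unvisited blob
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols) or not grid[y][x]:
                        continue
                    seen.add((y, x))
                    stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

# Two separate "colonies" in a tiny thresholded image
n = count_colonies([[1, 1, 0, 0],
                    [0, 1, 0, 1],
                    [0, 0, 0, 1]])
```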

  5. Apparatus for the liquefaction of natural gas and methods relating to same

    DOEpatents

    Wilding, Bruce M [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID; Turner, Terry D [Ammon, ID; Carney, Francis H [Idaho Falls, ID

    2009-09-29

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through an expander creating work output. A compressor may be driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream.

  6. Toward a Rapid Synthesis of Field and Desktop Data for Classifying Streams in the Pacific Northwest: Guiding the Sampling and Management of Salmonid Habitat

    NASA Astrophysics Data System (ADS)

    Kasprak, A.; Wheaton, J. M.; Bouwes, N.; Weber, N. P.; Trahan, N. C.; Jordan, C. E.

    2012-12-01

    River managers often seek to understand habitat availability and quality for riverine organisms within the physical template provided by their landscape. Yet the large amount of natural heterogeneity in landscapes gives rise to stream systems which are highly variable over small spatial scales, potentially complicating site selection for surveying aquatic habitat while simultaneously making a simple, wide-reaching management strategy elusive. This is particularly true in the rugged John Day River Basin of northern Oregon, where efforts as part of the Columbia Habitat Monitoring Program to conduct site-based surveys of physical habitat for endangered steelhead salmon (Oncorhynchus mykiss) are underway. As a complete understanding of the type and distribution of habitat available to these fish would require visits to all streams in the basin (impractical due to its large size), here we develop an approach for classifying channel types which combines remote desktop GIS analyses with rapid field-based stream and landscape surveys. At the core of this method, we build off of the River Styles Framework, an open-ended and process-based approach for classifying streams and informing management decisions. This framework is combined with on-the-ground fluvial audits, which aim to quickly and continuously map sediment dynamics and channel behavior along selected channels. Validation of this classification method is completed by on-the-ground stream surveys using a digital iPad platform and by rapid small aircraft overflights to confirm or refine predictions. We further compare this method with existing channel classification approaches for the region (e.g. Beechie, Montgomery and Buffington). 
The results of this study will help guide both the refinement of site stratification and selection for salmonid habitat monitoring within the basin, and will be vital in designing and prioritizing restoration and management strategies tailored to the distribution of river styles found across the region.

  7. Methods of natural gas liquefaction and natural gas liquefaction plants utilizing multiple and varying gas streams

    DOEpatents

    Wilding, Bruce M; Turner, Terry D

    2014-12-02

    A method of natural gas liquefaction may include cooling a gaseous NG process stream to form a liquid NG process stream. The method may further include directing a first tail gas stream out of a plant at a first pressure and directing a second tail gas stream out of the plant at a second pressure. An additional method of natural gas liquefaction may include separating CO.sub.2 from a liquid NG process stream and processing the CO.sub.2 to provide a CO.sub.2 product stream. Another method of natural gas liquefaction may include combining a marginal gaseous NG process stream with a secondary substantially pure NG stream to provide an improved gaseous NG process stream. Additionally, a NG liquefaction plant may include a first tail gas outlet, and at least a second tail gas outlet, the at least a second tail gas outlet separate from the first tail gas outlet.

  8. Femtosecond laser fabrication of fiber based optofluidic platform for flow cytometry applications

    NASA Astrophysics Data System (ADS)

    Serhatlioglu, Murat; Elbuken, Caglar; Ortac, Bulend; Solmaz, Mehmet E.

    2017-02-01

    Miniaturized optofluidic platforms play an important role in bio-analysis, detection, and diagnostic applications. The advantages of such miniaturized devices are extremely low sample requirements, low-cost development, and rapid analysis capabilities. Fused silica is advantageous for optofluidic systems due to properties such as being chemically inert, mechanically stable, and optically transparent across a wide spectrum of light. As a three-dimensional manufacturing method, femtosecond laser scanning followed by chemical etching shows great potential for fabricating glass-based optofluidic chips. In this study, we demonstrate fabrication of an all-fiber-based optofluidic flow cytometer in fused silica glass by femtosecond laser machining. 3D particle focusing was achieved through a straightforward planar chip design with two separately fabricated fused silica glass slides thermally bonded together. Bioparticles in a fluid stream pass through an optical interrogation region specifically designed to accommodate a 405 nm single-mode fiber laser source and two multimode collection fibers for forward-scatter (FSC) and side-scatter (SSC) signal detection. Detected signals were collected with an oscilloscope and post-processed with a MATLAB script. We were able to count events at rates above 4000 events/s and obtain the size distribution of 5.95 μm monodisperse polystyrene beads using the FSC and SSC signals. Our platform shows promise for optical and fluidic miniaturization of flow cytometry systems.
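Counting events from a scatter trace amounts to detecting rising-edge threshold crossings in the detector voltage. A minimal sketch with an invented trace and threshold (the authors' actual processing was done in MATLAB):

```python
def count_events(trace, threshold):
    """Count particles as rising-edge crossings of a detector trace:
    an event is registered each time the signal rises above threshold
    after having been below it."""
    events = 0
    above = False
    for v in trace:
        if v > threshold and not above:
            events += 1        # new rising edge
        above = v > threshold
    return events

# Two bead transits in a toy forward-scatter voltage trace
n = count_events([0.1, 0.9, 0.8, 0.1, 0.2, 1.1, 0.1], threshold=0.5)
```

Pairing each FSC event's peak height with the coincident SSC value is what yields the per-particle size distribution.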

  9. Closed-Loop, Multichannel Experimentation Using the Open-Source NeuroRighter Electrophysiology Platform

    PubMed Central

    Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.

    2013-01-01

    Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047
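The plugin pattern described above can be sketched as a callback registry attached to the data stream: each plugin receives incoming sample buffers and may return a stimulation command. The class and method names here are hypothetical illustrations, not NeuroRighter's actual C# API.

```python
class ClosedLoopPlugin:
    """Minimal closed-loop plugin host: handlers registered against a
    data stream are invoked on every incoming sample buffer."""
    def __init__(self):
        self.handlers = []

    def register(self, handler):
        """Add a callable that maps a sample buffer to an optional command."""
        self.handlers.append(handler)

    def on_samples(self, samples):
        """Push a buffer through every registered handler and collect
        whatever commands (e.g. stimulation requests) they return."""
        return [h(samples) for h in self.handlers]

plugin = ClosedLoopPlugin()
plugin.register(lambda s: "stim" if max(s) > 1.0 else None)  # toy amplitude trigger
out = plugin.on_samples([0.2, 1.4, 0.3])
```

Keeping the handler interface this narrow is what lets experiment logic be swapped in as plugins without touching the acquisition or stimulation subsystems.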

  10. Monitoring/characterization of stickies contaminants coming from a papermaking plant--Toward an innovative exploitation of the screen rejects to levulinic acid.

    PubMed

    Licursi, Domenico; Antonetti, Claudia; Martinelli, Marco; Ribechini, Erika; Zanaboni, Marco; Raspolli Galletti, Anna Maria

    2016-03-01

    Recycled paper requires numerous mechanical/chemical treatments before it can be re-used in the papermaking process. Some of these operations produce considerable rejected waste fractions, such as "screen rejects", which include both cellulose fibers and non-fibrous organic contaminants, or "stickies", the latter representing a drawback both for the papermaking process and for the quality of the final product. In turn, the accepted fractions coming from these unit operations become progressively poorer in contaminants and richer in cellulose. Here, input and output streams from the mechanical screening systems of a papermaking plant using recycled paper for cardboard production were sampled and analyzed directly and after solvent extraction, confirming the abundant presence of styrene-butadiene rubber (SBR) and ethylene vinyl acetate (EVA) copolymers in the rejected output stream and of cellulose in the accepted one. Despite some significant drawbacks, the "screen reject" fraction can traditionally be used as fuel for energy recovery within the paper mill, in agreement with the integrated recycled paper mill approach. This waste, which still contains a cellulose fraction, can also be exploited by means of the hydrothermal route to give levulinic acid, a high-value-added platform chemical. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Apparatus for the liquefaction of natural gas and methods relating to same

    DOEpatents

    Wilding, Bruce M [Idaho Falls, ID; Bingham, Dennis N [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID; Turner, Terry D [Ammon, ID; Raterman, Kevin T [Idaho Falls, ID; Palmer, Gary L [Shelley, ID; Klingler, Kerry M [Idaho Falls, ID; Vranicar, John J [Concord, CA

    2007-05-22

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.

  12. Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same

    DOEpatents

    Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.

    2005-11-08

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.

  13. Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same

    DOEpatents

    Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.

    2005-05-03

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.

  14. Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same

    DOEpatents

    Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.

    2003-06-24

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.

  15. VOTable JAVA Streaming Writer and Applications.

    NASA Astrophysics Data System (ADS)

    Kulkarni, P.; Kembhavi, A.; Kale, S.

    2004-07-01

    Virtual Observatory related tools use a new standard for data transfer called the VOTable format. This is a variant of the XML format that enables easy transfer of data over the web. We describe a streaming interface that can bridge the VOTable format, through a user-friendly graphical interface, with the FITS and ASCII formats, which are commonly used by astronomers. A streaming interface is important for efficient use of memory because of the large size of catalogues. The tools are developed in JAVA to provide a platform-independent interface. We have also developed a stand-alone version that can be used to convert data stored in ASCII or FITS format on a local machine. The streaming writer is successfully being used in VOPlot (see Kale et al. 2004 for a description of VOPlot). We present the test results of converting large FITS and ASCII data sets into the VOTable format on machines that have only limited memory.
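
    The core idea of a streaming writer, emitting VOTable XML row by row so a large catalogue never has to be held in memory, can be sketched as follows. The described tool is written in Java; this minimal Python sketch uses an illustrative subset of the VOTable markup and invented field names:

```python
import io

def stream_votable(rows, fields, out):
    """Write a minimal VOTable incrementally, one row at a time,
    so the full catalogue never needs to reside in memory."""
    out.write('<?xml version="1.0"?>\n<VOTABLE version="1.1">\n'
              '<RESOURCE><TABLE>\n')
    for name, dtype in fields:
        out.write(f'<FIELD name="{name}" datatype="{dtype}"/>\n')
    out.write('<DATA><TABLEDATA>\n')
    for row in rows:                      # rows may be a generator
        cells = ''.join(f'<TD>{v}</TD>' for v in row)
        out.write(f'<TR>{cells}</TR>\n')
    out.write('</TABLEDATA></DATA></TABLE></RESOURCE></VOTABLE>\n')

# Convert an ASCII-style row generator without buffering it
def ascii_rows():
    for line in ["10.5 2.3", "11.1 2.9"]:
        yield line.split()

buf = io.StringIO()
stream_votable(ascii_rows(), [("ra", "double"), ("dec", "double")], buf)
```

Because the rows argument can be a generator reading a FITS or ASCII file line by line, memory use stays constant regardless of catalogue size, which is the property the abstract highlights.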

  16. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Beaver, Justin M.; Bogen II, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to its high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  17. Removal of hydrogen sulfide as ammonium sulfate from hydropyrolysis product vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marker, Terry L.; Felix, Larry G.; Linck, Martin B.

    A system and method for processing biomass into hydrocarbon fuels that includes processing a biomass in a hydropyrolysis reactor resulting in hydrocarbon fuels and a process vapor stream and cooling the process vapor stream to a condensation temperature resulting in an aqueous stream. The aqueous stream is sent to a catalytic reactor where it is oxidized to obtain a product stream containing ammonia and ammonium sulfate. A resulting cooled product vapor stream includes non-condensable process vapors comprising H2, CH4, CO, CO2, ammonia and hydrogen sulfide.

  18. Removal of hydrogen sulfide as ammonium sulfate from hydropyrolysis product vapors

    DOEpatents

    Marker, Terry L; Felix, Larry G; Linck, Martin B; Roberts, Michael J

    2014-10-14

    A system and method for processing biomass into hydrocarbon fuels that includes processing a biomass in a hydropyrolysis reactor resulting in hydrocarbon fuels and a process vapor stream and cooling the process vapor stream to a condensation temperature resulting in an aqueous stream. The aqueous stream is sent to a catalytic reactor where it is oxidized to obtain a product stream containing ammonia and ammonium sulfate. A resulting cooled product vapor stream includes non-condensable process vapors comprising H2, CH4, CO, CO2, ammonia and hydrogen sulfide.

  19. Prioritized Contact Transport Stream

    NASA Technical Reports Server (NTRS)

    Hunt, Walter Lee, Jr. (Inventor)

    2015-01-01

    A detection process, contact recognition process, classification process, and identification process are applied to raw sensor data to produce an identified contact record set containing one or more identified contact records. A prioritization process is applied to the identified contact record set to assign a contact priority to each contact record in the identified contact record set. Data are removed from the contact records in the identified contact record set based on the contact priorities assigned to those contact records. A first contact stream is produced from the resulting contact records. The first contact stream is streamed in a contact transport stream. The contact transport stream may include and stream additional contact streams. The contact transport stream may be varied dynamically over time based on parameters such as available bandwidth, contact priority, presence/absence of contacts, system state, and configuration parameters.
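
    The prioritize-then-trim idea described above can be sketched as follows. The field names, threat scores, and byte-cost model below are hypothetical illustrations, not details from the patent:

```python
def prioritize_and_trim(contacts, bandwidth_budget):
    """Assign priorities to identified contact records, then drop bulky
    detail fields from the lowest-priority records until the stream
    fits the available bandwidth."""
    # Higher threat score -> higher priority (priority 1 is highest)
    ranked = sorted(contacts, key=lambda c: c["threat"], reverse=True)
    for priority, c in enumerate(ranked, start=1):
        c["priority"] = priority
    cost = sum(len(c.get("raw", b"")) for c in ranked)
    # Strip raw sensor payloads from the lowest-priority records first
    for c in reversed(ranked):
        if cost <= bandwidth_budget:
            break
        cost -= len(c.pop("raw", b""))
    return ranked

contacts = [
    {"id": "A", "threat": 0.9, "raw": b"x" * 100},
    {"id": "B", "threat": 0.2, "raw": b"x" * 100},
    {"id": "C", "threat": 0.5, "raw": b"x" * 100},
]
stream = prioritize_and_trim(contacts, bandwidth_budget=150)
```

Re-running this step as bandwidth, contact priorities, or system state change is one way to realize the dynamic variation of the transport stream that the record describes.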

  20. Mobile magnetic particles as solid-supports for rapid surface-based bioanalysis in continuous flow.

    PubMed

    Peyman, Sally A; Iles, Alexander; Pamme, Nicole

    2009-11-07

    An extremely versatile microfluidic device is demonstrated in which multi-step (bio)chemical procedures can be performed in continuous flow. The system operates by generating several co-laminar flow streams, which contain reagents for specific (bio)reactions across a rectangular reaction chamber. Functionalized magnetic microparticles are employed as mobile solid-supports and are pulled from one side of the reaction chamber to the other by use of an external magnetic field. As the particles traverse the co-laminar reagent streams, binding and washing steps are performed on their surface in one operation in continuous flow. The applicability of the platform was first demonstrated by performing a proof-of-principle binding assay between streptavidin coated magnetic particles and biotin in free solution with a limit of detection of 20 ng mL(-1) of free biotin. The system was then applied to a mouse IgG sandwich immunoassay as a first example of a process involving two binding steps and two washing steps, all performed within 60 s, a fraction of the time required for conventional testing.

  1. Spacecube: A Family of Reconfigurable Hybrid On-Board Science Data Processors

    NASA Technical Reports Server (NTRS)

    Flatley, Thomas P.

    2015-01-01

    SpaceCube is a family of Field Programmable Gate Array (FPGA) based on-board science data processing systems developed at the NASA Goddard Space Flight Center (GSFC). The goal of the SpaceCube program is to provide 10x to 100x improvements in on-board computing power while lowering relative power consumption and cost. SpaceCube is based on the Xilinx Virtex family of FPGAs, which include processor, FPGA logic and digital signal processing (DSP) resources. These processing elements are leveraged to produce a hybrid science data processing platform that accelerates the execution of algorithms by distributing computational functions to the most suitable elements. This approach enables the implementation of complex on-board functions that were previously limited to ground based systems, such as on-board product generation, data reduction, calibration, classification, event/feature detection, data mining and real-time autonomous operations. The system is fully reconfigurable in flight, including data parameters, software and FPGA logic, through either ground commanding or autonomously in response to detected events/features in the instrument data stream.

  2. Melter Throughput Enhancements for High-Iron HLW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, A. A.; Gan, Hoa; Joseph, Innocent

    2012-12-26

    This report describes work performed to develop and test new glass and feed formulations in order to increase glass melting rates in high waste loading glass formulations for HLW with high concentrations of iron. Testing was designed to identify glass and melter feed formulations that optimize waste loading and waste processing rate while meeting all processing and product quality requirements. The work included preparation and characterization of crucible melts to assess melt rate using a vertical gradient furnace system and to develop new formulations with enhanced melt rate. Testing evaluated the effects of waste loading on glass properties and the maximum waste loading that can be achieved. The results from crucible-scale testing supported subsequent DuraMelter 100 (DM100) tests designed to examine the effects of enhanced glass and feed formulations on waste processing rate and product quality. The DM100 was selected as the platform for these tests due to its extensive previous use in processing rate determination for various HLW streams and glass compositions.

  3. EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) - development of e-research platform

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata

    2017-04-01

    TCS AH is based on the IS-EPOS Platform. The Platform facilitates research on anthropogenic hazards and is available online, free of charge, at https://tcs.ah-epos.eu/. The Platform is a final product of the IS-EPOS project, funded by the national programme POIG, which was implemented in 2013-2015 (POIG.02.03.00-14-090/13-00). The platform is the result of the joint work of the scientific community and industrial partners. Currently, the development of TCS AH is carried out under the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The Platform is an open virtual access point for researchers and Ph.D. students interested in anthropogenic seismicity and related hazards. This environment is designed to give the researcher the maximum possible liberty for experimentation by providing a virtual laboratory in which the researcher can design their own processing streams and process the data integrated on the platform. TCS AH integrates data and specific high-level services. Data are gathered in so-called "episodes", each comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. Seven sets of seismic, geological and technological data have been made available on the Platform. The data come from Poland, Germany, the UK and Vietnam, and refer to underground mining, reservoir impoundment, shale gas exploitation and geothermal energy production. At least 19 further episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are being integrated within the framework of the EPOS IP project. The heterogeneous multi-disciplinary data (seismic, displacement, geomechanical, production data, etc.) are transformed into unified structures to form integrated and validated datasets. To deal with these varied data, problem-oriented services were designed and implemented.
Particular attention was devoted, in service preparation, to methods analyzing correlations between technology, geophysical response and the resulting hazard. TCS AH contains a number of computing and data visualization services, which give the opportunity to make graphical presentations of the available data. Further development of the Platform, besides the integration of new episodes of all types of anthropogenic hazards, will gradually cover the implementation of new services. The TCS AH platform is open to the whole research community. The platform is also designed to be used in research projects; e.g., it serves the "Shale gas exploration and exploitation induced risks (SHEER)" project (Horizon 2020, call LCE 16-2014). In addition, it is also meant to serve the public sector with expert knowledge and background information. In order to fulfill this aim, services for outreach, dissemination and communication will be implemented. TCS AH has been used as a teaching tool in Ph.D. education within the IG PAS seismology course for Ph.D. candidates and Interdisciplinary Polar Studies, as well as in several workshops for Polish and international students. Additionally, the platform is used within the educational project ERIS (Exploitation of Research results In School practice), aimed at junior high and high schools and funded with support from the European Commission within the ERASMUS+ Programme.

  4. A platform for real-time online health analytics during spaceflight

    NASA Astrophysics Data System (ADS)

    McGregor, Carolyn

    Monitoring the health and wellbeing of astronauts during spaceflight is an important aspect of any manned mission. To date the monitoring has been based on a sequential set of discontinuous samplings of physiological data to support initial studies on aspects such as weightlessness, and its impact on the cardiovascular system and to perform proactive monitoring for health status. The research performed and the real-time monitoring has been hampered by the lack of a platform to enable a more continuous approach to real-time monitoring. While any spaceflight is monitored heavily by Mission Control, an important requirement within the context of any spaceflight setting and in particular where there are extended periods with a lack of communication with Mission Control, is the ability for the mission to operate in an autonomous manner. This paper presents a platform to enable real-time astronaut monitoring for prognostics and health management within space medicine using online health analytics. The platform is based on extending previous online health analytics research known as the Artemis and Artemis Cloud platforms which have demonstrated their relevance for multi-patient, multi-diagnosis and multi-stream temporal analysis in real-time for clinical management and research within Neonatal Intensive Care. Artemis and Artemis Cloud source data from a range of medical devices capable of transmission of the signal via wired or wireless connectivity and hence are well suited to process real-time data acquired from astronauts. A key benefit of this platform is its ability to monitor their health and wellbeing onboard the mission as well as enabling the astronaut's physiological data, and other clinical data, to be sent to the platform components at Mission Control at each stage when that communication is available. 
As a result, researchers at Mission Control would be able to simulate, deploy and tailor predictive analytics and diagnostics during the same spaceflight for greater medical support.

  5. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  6. Progress on the CWU READI Analysis Center

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.

    2015-12-01

    Real-time GPS position streams are desirable for a variety of seismic monitoring and hazard mitigation applications. We report on progress in our development of a comprehensive real-time GPS-based seismic monitoring system for the Cascadia subduction zone. This system is based on 1 Hz point position estimates computed in the ITRF08 reference frame. Convergence from phase and range observables to point position estimates is accelerated using a Kalman-filter-based, on-line stream editor that produces independent estimates of carrier phase integer biases and other parameters. Positions are then estimated using a short-arc approach and algorithms from JPL's GIPSY-OASIS software, with satellite clock and orbit products from the International GNSS Service (IGS). The resulting positions show typical RMS scatter of 2.5 cm in the horizontal and 5 cm in the vertical, with latencies below 2 seconds. To facilitate the use of these point position streams for applications such as seismic monitoring, we broadcast real-time positions and covariances using custom-built aggregation-distribution software based on the RabbitMQ messaging platform. This software is capable of buffering 24-hour streams for hundreds of stations and providing them through a RESTful web interface. To demonstrate the power of this approach, we have developed a Java-based front-end that provides a real-time visual display of time series, displacement vector fields, and map-view contoured peak ground displacement. This Java-based front-end is available for download through the PANGA website. We are currently analyzing 80 PBO and PANGA stations along the Cascadia margin and gearing up to process all 400+ real-time stations operating in the Pacific Northwest, many of which are currently telemetered in real time to CWU. These are milestones toward our over-arching goal of extending our processing to include all available real-time streams from the Pacific rim.
In addition, we have developed a Kalman filter to combine CWU real-time PPP solutions with those from Scripps Institution of Oceanography's PPP-AR real-time solutions as well as real-time solutions from the USGS. These combined products should improve the robustness and reliability of real-time point-position streams in the near future.
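
    The solution-combination step described above can be illustrated by its static core: inverse-variance weighting of independent position estimates. This is only the measurement-fusion part of such a filter (the operational version also models dynamics), and the numbers below are invented for illustration:

```python
def combine_estimates(estimates):
    """Fuse independent position solutions for the same epoch by
    inverse-variance weighting: the fused estimate is the weighted
    mean, and its variance is smaller than any input variance."""
    w = [1.0 / var for _, var in estimates]
    fused = sum(x * wi for (x, _), wi in zip(estimates, w)) / sum(w)
    fused_var = 1.0 / sum(w)
    return fused, fused_var

# Three solutions for the same east-component displacement (metres),
# e.g. from three analysis centers, with their variances
solutions = [(0.021, 0.025**2), (0.025, 0.025**2), (0.030, 0.05**2)]
pos, var = combine_estimates(solutions)
```

The fused variance is always below the best single-solution variance, which is the robustness gain the abstract anticipates from combining CWU, Scripps, and USGS streams.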

  7. Characterizing Milky Way Tidal Streams and Dark Matter with MilkyWay@home

    NASA Astrophysics Data System (ADS)

    Newberg, Heidi Jo; Shelton, Siddhartha; Weiss, Jake

    2018-01-01

    MilkyWay@home is a 0.5 PetaFLOPS volunteer computing platform that is mapping out the density substructure of the Sagittarius Dwarf Tidal Stream, the so-called bifurcated portion of the Sagittarius Stream, and the Virgo Overdensity, using turnoff stars from the Sloan Digital Sky Survey. It is also using the density of stars along tidal streams such as the Orphan Stream to constrain properties of the dwarf galaxy progenitor of this stream, including the dark matter portion. Both of these programs are enabled by a specially-built optimization package that uses differential evolution or particle swarm methods to find the optimal model parameters to fit a set of data. To fit the density of tidal streams, 20 parameters are simultaneously fit to each 2.5-degree-wide stripe of SDSS data. Five parameters describing the stellar and dark matter profile of the Orphan Stream progenitor and the time that the dwarf galaxy has been evolved through the Galactic potential are used in an n-body simulation that is then fit to observations of the Orphan Stream. New results from MilkyWay@home will be presented. This project was supported by NSF grant AST 16-15688, the NASA/NY Space Grant fellowship, and contributions made by The Marvin Clan, Babette Josephs, Manit Limlamai, and the 2015 Crowd Funding Campaign to Support Milky Way Research.
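
    The differential evolution strategy mentioned above can be illustrated with a minimal rand/1/bin implementation applied to a toy two-parameter objective. The real MilkyWay@home fits involve 20 parameters and expensive likelihood or n-body evaluations distributed across volunteers; everything below is a simplified sketch:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=200, seed=1):
    """Minimal differential evolution (rand/1/bin): mutate with a
    scaled difference of two random members, crossover per dimension,
    and keep the trial only if it improves on the target."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if rng.random() < CR else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(v, lo), hi)
                     for v, (lo, hi) in zip(trial, bounds)]
            s = f(trial)
            if s < scores[i]:           # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# Toy objective: sphere function with minimum at (1, -2)
best, score = differential_evolution(
    lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
    bounds=[(-5, 5), (-5, 5)],
)
```

Population-based strategies like this parallelize naturally, since each candidate evaluation is independent; that is what makes them a good fit for a volunteer computing platform.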

  8. Intensity Maps Production Using Real-Time Joint Streaming Data Processing From Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Tiampo, K. F.; Qin, J.; Bauer, M.

    2015-12-01

    Intensity is one of the most useful measures of earthquake hazard, as it quantifies the strength of shaking produced at a given distance from the epicenter. Today, there are several data sources that could be used to determine intensity level, which can be divided into two main categories. The first category is represented by social data sources, in which the intensity values are collected by interviewing people who experienced the earthquake-induced shaking. In this case, specially developed questionnaires can be used in addition to personal observations published on social networks such as Twitter. These observations are assigned to the appropriate intensity level by correlating specific details and descriptions to the Modified Mercalli Scale. The second category of data sources is represented by observations from different physical sensors installed with the specific purpose of obtaining an instrumentally-derived intensity level. These are usually based on a regression of recorded peak acceleration and/or velocity amplitudes. This approach relates the recorded ground motions to the expected felt and damage distribution through empirical relationships. The goal of this work is to implement and evaluate streaming data processing, separately and jointly, from both social and physical sensors in order to produce near-real-time intensity maps, and to compare and analyze their quality and evolution through 10-minute time intervals immediately following an earthquake. Results are shown for the case study of the M6.0 2014 South Napa, CA earthquake that occurred on August 24, 2014. The use of innovative streaming and pipelining computing paradigms through the IBM InfoSphere Streams platform made it possible to read input data in real time for low-latency computation of combined intensity levels and production of combined intensity maps in near real time. The results compare three types of intensity maps created from physical, social and combined data sources.
Here we correlate the count and density of Tweets with intensity level and show the importance of processing combined data sources at the earliest stages after an earthquake occurs. This method can supplement existing approaches to intensity estimation, especially in regions with a high number of Twitter users and a low density of seismic networks.
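
    Joint processing of the two source categories might be sketched per map cell as follows. The blending weights and the fallback rule below are hypothetical illustrations, not the authors' actual combination scheme:

```python
def combined_intensity(grid_cells):
    """Merge instrumentally derived and tweet-derived MMI estimates
    per map cell: blend where both exist (favouring the physical
    sensor), otherwise fall back to whichever source is available."""
    out = {}
    for cell, obs in grid_cells.items():
        inst = obs.get("instrumental")
        social = obs.get("social")
        if inst is not None and social is not None:
            out[cell] = 0.8 * inst + 0.2 * social   # assumed weights
        elif inst is not None:
            out[cell] = inst
        else:
            out[cell] = social
    return out

# Toy 10-minute snapshot: one cell has both sources, two have one each
cells = {
    (0, 0): {"instrumental": 6.0, "social": 5.0},
    (0, 1): {"instrumental": None, "social": 4.5},
    (1, 0): {"instrumental": 5.5, "social": None},
}
mmi = combined_intensity(cells)
```

Cells covered only by tweets still receive an estimate, which is why combined maps help most where seismic networks are sparse but Twitter use is high.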

  9. Feature integration and object representations along the dorsal stream visual hierarchy

    PubMed Central

    Perry, Carolyn Jeane; Fallah, Mazyar

    2014-01-01

    The visual system is split into two processing streams: a ventral stream that receives color and form information and a dorsal stream that receives motion information. Each stream processes that information hierarchically, with each stage building upon the previous. In the ventral stream this leads to the formation of object representations that ultimately allow for object recognition regardless of changes in the surrounding environment. In the dorsal stream, this hierarchical processing has classically been thought to lead to the computation of complex motion in three dimensions. However, there is evidence to suggest that there is integration of both dorsal and ventral stream information into motion computation processes, giving rise to intermediate object representations, which facilitate object selection and decision making mechanisms in the dorsal stream. First we review the hierarchical processing of motion along the dorsal stream and the building up of object representations along the ventral stream. Then we discuss recent work on the integration of ventral and dorsal stream features that lead to intermediate object representations in the dorsal stream. Finally we propose a framework describing how and at what stage different features are integrated into dorsal visual stream object representations. Determining the integration of features along the dorsal stream is necessary to understand not only how the dorsal stream builds up an object representation but also which computations are performed on object representations instead of local features. PMID:25140147

  10. Energy Efficient, Cross-Layer Enabled, Dynamic Aggregation Networks for Next Generation Internet

    NASA Astrophysics Data System (ADS)

    Wang, Michael S.

    Today, Internet traffic is growing at a near exponential rate, driven predominantly by data center-based applications and Internet-of-Things services. This fast-paced growth in Internet traffic calls into question the ability of the existing optical network infrastructure to support this continued growth. The overall optical networking equipment efficiency has not been able to keep up with the traffic growth, creating an energy gap that makes energy and cost expenditures scale linearly with the traffic growth. The implication of this energy gap is that it is infeasible to continue using existing networking equipment to meet the growing bandwidth demand. A redesign of the optical networking platform is needed. The focus of this dissertation is on the design and implementation of energy efficient, cross-layer enabled, dynamic optical networking platforms, which is a promising approach to address the exponentially growing Internet bandwidth demand. Chapter 1 explains the motivation for this work by detailing the huge Internet traffic growth and the unsustainable energy growth of today's networking equipment. Chapter 2 describes the challenges and objectives of enabling agile, dynamic optical networking platforms and the vision of the Center for Integrated Access Networks (CIAN) to realize these objectives; the research objectives of this dissertation and the large body of related work in this field are also summarized. Chapter 3 details the design and implementation of dynamic networking platforms that support wavelength switching granularity. The main contribution of this work involves the experimental validation of deep cross-layer communication across the optical performance monitoring (OPM), data, and control planes. The first experiment shows QoS-aware video streaming over a metro-scale test-bed through optical power monitoring of the transmission wavelength and cross-layer feedback control of the power level.
The second experiment extends the performance monitoring capabilities to include real-time monitoring of OSNR and polarization mode dispersion (PMD) to enable dynamic wavelength switching and selective restoration. Chapter 4 explains the author's contributions in designing dynamic networking at the sub-wavelength switching granularity, which can provide greater network efficiency due to its finer granularity. To support dynamic switching, regeneration, adding/dropping, and control decisions on each individual packet, the cross-layer enabled node architecture is enhanced with an FPGA controller that brings much more precise timing and control to the switching, OPM, and control planes. Furthermore, QoS-aware packet protection and dynamic switching, dropping, and regeneration functionalities were experimentally demonstrated in a multi-node network. Chapter 5 describes a technique to perform optical grooming, a process of optically combining multiple incoming data streams into a single data stream, which can simultaneously achieve greater bandwidth utilization and increased spectral efficiency. In addition, an experimental demonstration highlighting a fully functioning multi-node, agile optical networking platform is detailed. Finally, a summary and discussion of future work is provided in Chapter 6. The future of the Internet is very exciting, filled with not-yet-invented applications and services driven by cloud computing and the Internet-of-Things. The author is cautiously optimistic that agile, dynamically reconfigurable optical networking is the solution to realizing this future.

  11. Device for staged carbon monoxide oxidation

    DOEpatents

    Vanderborgh, Nicholas E.; Nguyen, Trung V.; Guante, Jr., Joseph

    1993-01-01

    A method and apparatus for selectively oxidizing carbon monoxide in a hydrogen rich feed stream. The method comprises mixing a feed stream consisting essentially of hydrogen, carbon dioxide, water and carbon monoxide with a first predetermined quantity of oxygen (air). The temperature of the mixed feed/oxygen stream is adjusted in a first heat exchanger assembly (20) to a first temperature. The mixed feed/oxygen stream is sent to reaction chambers (30,32) having an oxidation catalyst contained therein. The carbon monoxide of the feed stream preferentially adsorbs on the catalyst at the first temperature to react with the oxygen in the chambers (30,32) with minimal simultaneous reaction of the hydrogen, forming an intermediate hydrogen rich process stream having a lower carbon monoxide content than the feed stream. The elevated outlet temperature of the process stream is carefully controlled in a second heat exchanger assembly (42) to a second temperature above the first temperature. The process stream is then mixed with a second predetermined quantity of oxygen (air). The carbon monoxide of the process stream preferentially reacts with the second quantity of oxygen in a second stage reaction chamber (56) with minimal simultaneous reaction of the hydrogen in the process stream. The reaction produces a hydrogen rich product stream having a lower carbon monoxide content than the process stream. The product stream is then cooled in a third heat exchanger assembly (72) to a third predetermined temperature. Three or more stages may be desirable, each with metered oxygen injection.

  12. NDSI products system based on Hadoop platform

    NASA Astrophysics Data System (ADS)

    Zhou, Yan; Jiang, He; Yang, Xiaoxia; Geng, Erhui

    2015-12-01

    Snow is the solid state of water on Earth and plays an important role in human life. Satellite remote sensing is well suited to snow extraction because it is periodic, large-scale, comprehensive, objective, and timely. With the continuous development of remote sensing technology, data are now acquired from multiple platforms, multiple sensors, and multiple viewing angles, and the demand for compute-intensive processing of these data is growing accordingly. However, current production systems for remote sensing products operate in a serial mode, are designed mostly for professional remote sensing researchers, and rarely support automatic or semi-automatic production. Faced with massive remote sensing data, such low-efficiency serial production systems can no longer process the data in a timely and efficient manner. To improve the production efficiency of NDSI products and meet the demand for timely, efficient processing of large-scale remote sensing data, this paper builds an NDSI production system based on the Hadoop platform; the system comprises a remote sensing image management module, an NDSI production module, and a system service module. The main research contents and results are as follows: (1) Remote sensing image management module: includes two parts, image import and image metadata management. Massive base IRS images and NDSI product images (the output of the system's production tasks) are imported into the HDFS file system; at the same time, the corresponding orbit row/column numbers, maximum/minimum longitude and latitude, product date, HDFS storage path, Hadoop task ID (for NDSI products), and other metadata are read, thumbnails are created, a unique ID is assigned to each record, and the records are imported into the base/product image metadata database. 
(2) NDSI production module: includes index calculation as well as production task submission and monitoring. HDF images related to a production task are read as byte streams, and the Beam library is used to parse each image byte stream into a Product object; the MapReduce distributed framework executes the production tasks while their status is monitored; when a production task completes, the remote sensing image management module is called to store the NDSI products. (3) System service module: includes image search and NDSI product download. Image metadata attributes described in JSON format are matched to return the sequence IDs of images existing in the HDFS file system; for a given MapReduce task ID, the NDSI products output by the task are packaged into a ZIP file and a download link is returned. (4) System evaluation: massive remote sensing data were downloaded and processed with the system to produce NDSI products as a performance test; the results show that the system has high extensibility, strong fault tolerance, and fast production speed, and that the image processing results are highly accurate.
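The index computed by the production module is a standard band ratio. A minimal per-pixel sketch, assuming green and shortwave-infrared reflectance arrays as inputs (the band names and the epsilon guard are illustrative assumptions, not details from the paper):

```python
import numpy as np

def ndsi(green: np.ndarray, swir: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Normalized Difference Snow Index: (green - swir) / (green + swir).

    Values near +1 indicate snow; the eps guard avoids division by zero
    over pixels where both bands are dark.
    """
    green = green.astype(np.float64)
    swir = swir.astype(np.float64)
    return (green - swir) / (green + swir + eps)

# A snow-like pixel (bright green band, dark SWIR) scores high:
print(ndsi(np.array([0.8]), np.array([0.1])))  # ~0.78
```

In a MapReduce setting, each map task would apply a function like this to one image tile, with the job framework collecting tile outputs into a product file.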

  13. TERMINAL ELECTRON ACCEPTING PROCESSES IN THE ALLUVIAL SEDIMENTS OF A HEADWATER STREAM

    EPA Science Inventory

    Chemical fluxes between catchments and streams are influenced by biochemical processes in the groundwater-stream water (GW-SW) ecotone, the interface between stream surface water and groundwater. Terminal electron accepting processes (TEAPs) that are utilized in respiration of ...

  14. Iowa Flood Information System: Towards Integrated Data Management, Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and to flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. 
The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  15. Iowa Flood Information System

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2011-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and to flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. 
The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage. This presentation provides an overview of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  16. Flood Risk Management in Iowa through an Integrated Flood Information System

    NASA Astrophysics Data System (ADS)

    Demir, Ibrahim; Krajewski, Witold

    2013-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and to flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 1100 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. 
The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  17. Extraction and downstream processing of plant-derived recombinant proteins.

    PubMed

    Buyel, J F; Twyman, R M; Fischer, R

    2015-11-01

    Plants offer the tantalizing prospect of low-cost automated manufacturing processes for biopharmaceutical proteins, but several challenges must be addressed before such goals are realized and the most significant hurdles are found during downstream processing (DSP). In contrast to the standardized microbial and mammalian cell platforms embraced by the biopharmaceutical industry, there are many different plant-based expression systems vying for attention, and those with the greatest potential to provide inexpensive biopharmaceuticals are also the ones with the most significant drawbacks in terms of DSP. This is because the most scalable plant systems are based on the expression of intracellular proteins in whole plants. The plant tissue must therefore be disrupted to extract the product, challenging the initial DSP steps with an unusually high load of both particulate and soluble contaminants. DSP platform technologies can accelerate and simplify process development, including centrifugation, filtration, flocculation, and integrated methods that combine solid-liquid separation, purification and concentration, such as aqueous two-phase separation systems. Protein tags can also facilitate these DSP steps, but they are difficult to transfer to a commercial environment and more generic, flexible and scalable strategies to separate target and host cell proteins are preferable, such as membrane technologies and heat/pH precipitation. In this context, clarified plant extracts behave similarly to the feed stream from microbes or mammalian cells and the corresponding purification methods can be applied, as long as they are adapted for plant-specific soluble contaminants such as the superabundant protein RuBisCO. 
Plant-derived pharmaceutical proteins cannot yet compete directly with established platforms but they are beginning to penetrate niche markets that allow the beneficial properties of plants to be exploited, such as the ability to produce 'biobetters' with tailored glycans, the ability to scale up production rapidly for emergency responses and the ability to produce commodity recombinant proteins on an agricultural scale. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Technology Readiness Level (TRL) Advancement of the MSPI On-Board Processing Platform for the ACE Decadal Survey Mission

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.; Wilson, Thor O.

    2011-01-01

    The Xilinx Virtex-5QV is a new Single-event Immune Reconfigurable FPGA (SIRF) device that is targeted as the spaceborne processor for the NASA Decadal Survey Aerosol-Cloud-Ecosystem (ACE) mission's Multiangle SpectroPolarimetric Imager (MSPI) instrument, currently under development at JPL. A key technology needed for MSPI is on-board processing (OBP) to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's ESTO1 AIST2 Program, JPL is demonstrating how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multi-angle cameras can be reduced to 0.45 Mbytes/sec, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information. This is done via a least-squares fitting algorithm implemented on the Virtex-5 FPGA operating in real-time on the raw video data stream.
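The reduction from 95 Mbytes/sec to 0.45 Mbytes/sec comes from replacing raw samples with fitted coefficients. A minimal sketch of that general idea, using a plain polynomial least-squares fit per channel (the actual MSPI algorithm, its basis functions, and the FPGA implementation are not specified here):

```python
import numpy as np

def reduce_samples(samples: np.ndarray, degree: int = 3) -> np.ndarray:
    """Fit a polynomial to a channel's raw sample stream and keep only
    the coefficients, discarding the raw data (lossy in general, but
    exact when the signal is itself polynomial)."""
    t = np.linspace(0.0, 1.0, samples.size)
    return np.polyfit(t, samples, degree)

# 1024 raw samples of a smooth signal collapse to 4 coefficients:
t = np.linspace(0.0, 1.0, 1024)
raw = 2.0 * t**2 - 3.0 * t + 1.0
coeffs = reduce_samples(raw, degree=3)
recon = np.polyval(coeffs, t)
print(coeffs.size, np.max(np.abs(recon - raw)))
```

Downlinking only the coefficients (plus any residual statistics the science requires) is what turns a raw video stream into a compact data product.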

  19. Microencapsulation of curcumin in PLGA microcapsules by coaxial flow focusing

    NASA Astrophysics Data System (ADS)

    Lei, Fan; Si, Ting; Luo, Xisheng; Xu, Ronald X.

    2014-03-01

    Curcumin-loaded PLGA microcapsules are fabricated by a liquid-driven coaxial flow focusing device. In the process, a stable coaxial cone-jet configuration is formed under the action of a coflowing liquid stream, and the coaxial liquid jet eventually breaks up into microcapsules because of flow instability. This process can be well controlled by adjusting the flow rates of the three phases: the driving PVA water solution, the outer PLGA ethyl acetate solution, and the inner curcumin propylene glycol solution. Confocal and SEM imaging methods clearly indicate the core-shell structure of the resultant microcapsules. The encapsulation rate of curcumin in PLGA is measured to be more than 70%, which is much higher than that of traditional methods such as emulsion. The size distribution of the resultant microcapsules under different conditions is presented and compared. An in vitro release simulation platform is further developed to verify the feasibility and reliability of the method.

  20. Visual and visuomotor processing of hands and tools as a case study of cross talk between the dorsal and ventral streams.

    PubMed

    Almeida, Jorge; Amaral, Lénia; Garcea, Frank E; Aguiar de Sousa, Diana; Xu, Shan; Mahon, Bradford Z; Martins, Isabel Pavão

    2018-05-24

    A major principle of organization of the visual system is between a dorsal stream that processes visuomotor information and a ventral stream that supports object recognition. Most research has focused on dissociating processing across these two streams. Here we focus on how the two streams interact. We tested neurologically-intact and impaired participants in an object categorization task over two classes of objects that depend on processing within both streams-hands and tools. We measured how unconscious processing of images from one of these categories (e.g., tools) affects the recognition of images from the other category (i.e., hands). Our findings with neurologically-intact participants demonstrated that processing an image of a hand hampers the subsequent processing of an image of a tool, and vice versa. These results were not present in apraxic patients (N = 3). These findings suggest local and global inhibitory processes working in tandem to co-register information across the two streams.

  1. Accelerating bacterial growth detection and antimicrobial susceptibility assessment in integrated picoliter droplet platform.

    PubMed

    Kaushik, Aniruddha M; Hsieh, Kuangwen; Chen, Liben; Shin, Dong Jin; Liao, Joseph C; Wang, Tza-Huei

    2017-11-15

    There remains an urgent need for rapid diagnostic methods that can evaluate antibiotic resistance for pathogenic bacteria in order to deliver targeted antibiotic treatments. Toward this end, we present a rapid and integrated single-cell biosensing platform, termed dropFAST, for bacterial growth detection and antimicrobial susceptibility assessment. DropFAST utilizes a rapid resazurin-based fluorescent growth assay coupled with stochastic confinement of bacteria in 20 pL droplets to detect signal from growing bacteria after 1h incubation, equivalent to 2-3 bacterial replications. Full integration of droplet generation, incubation, and detection into a single, uninterrupted stream also renders this platform uniquely suitable for in-line bacterial phenotypic growth assessment. To illustrate the concept of rapid digital antimicrobial susceptibility assessment, we employ the dropFAST platform to evaluate the antibacterial effect of gentamicin on E. coli growth. Copyright © 2017 Elsevier B.V. All rights reserved.
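Because bacteria are stochastically confined in the 20 pL droplets, droplet occupancy follows Poisson statistics. The sketch below shows how a digital (positive/negative) droplet readout could be converted to a mean cell load; this estimator is standard for digital assays generally and is an illustration, not a method taken from the abstract:

```python
import math

def cells_per_droplet(n_positive: int, n_total: int) -> float:
    """Estimate the mean cell load per droplet from the fraction of
    droplets showing a growth signal, assuming Poisson loading:
    P(empty) = exp(-lam)  =>  lam = -ln(1 - positive_fraction)."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

# If 3,935 of 10,000 droplets fluoresce, the mean load is ~0.5 cells/droplet:
print(cells_per_droplet(3935, 10000))
```

Comparing this estimate between antibiotic-treated and untreated droplet streams is one way a digital growth readout can quantify susceptibility.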

  2. Scalable and responsive event processing in the cloud

    PubMed Central

    Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul

    2013-01-01

    Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multi-class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
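The flavor of such model-based prediction can be illustrated with the Pollaczek-Khinchine formula for a single M/G/1 queue; this is a simplification for illustration only, and the paper's multi-class extension is not reproduced here:

```python
def mg1_response_time(arrival_rate: float, mean_service: float,
                      second_moment_service: float) -> float:
    """Mean response time of an M/G/1 queue via Pollaczek-Khinchine:
    T = E[S] + lam * E[S^2] / (2 * (1 - rho)), with rho = lam * E[S]."""
    rho = arrival_rate * mean_service
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilization >= 1")
    wait = arrival_rate * second_moment_service / (2.0 * (1.0 - rho))
    return mean_service + wait

# Sanity check against M/M/1 (exponential service, rate mu = 5, lambda = 2):
# E[S] = 1/5, E[S^2] = 2/25, and the known M/M/1 result is 1/(mu - lambda) = 1/3.
print(mg1_response_time(2.0, 0.2, 0.08))
```

A scaler built on such a model would provision an extra node whenever the predicted response time for the forecast arrival rate exceeds the target.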

  3. Carbon recovery by fermentation of CO-rich off gases - Turning steel mills into biorefineries.

    PubMed

    Molitor, Bastian; Richter, Hanno; Martin, Michael E; Jensen, Rasmus O; Juminaga, Alex; Mihalcea, Christophe; Angenent, Largus T

    2016-09-01

    Technological solutions to reduce greenhouse gas (GHG) emissions from anthropogenic sources are required. Heavy industrial processes, such as steel making, contribute considerably to GHG emissions. Fermentation of carbon monoxide (CO)-rich off gases with wild-type acetogenic bacteria can be used to produce ethanol, acetate, and 2,3-butanediol, thereby, reducing the carbon footprint of heavy industries. Here, the processes for the production of ethanol from CO-rich off gases are discussed and a perspective on further routes towards an integrated biorefinery at a steel mill is given. Recent achievements in genetic engineering as well as integration of other biotechnology platforms to increase the product portfolio are summarized. Already, yields have been increased and the portfolio of products broadened. To develop a commercially viable process, however, the extraction from dilute product streams is a critical step and alternatives to distillation are discussed. Finally, another critical step is waste(water) treatment with the possibility to recover resources. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Valve For Extracting Samples From A Process Stream

    NASA Technical Reports Server (NTRS)

    Callahan, Dave

    1995-01-01

    Valve for extracting samples from process stream includes cylindrical body bolted to pipe that contains stream. Opening in valve body matched and sealed against opening in pipe. Used to sample process streams in variety of facilities, including cement plants, plants that manufacture and reprocess plastics, oil refineries, and pipelines.

  5. Membrane-based systems for carbon capture and hydrogen purification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berchtold, Kathryn A

    2010-11-24

    This presentation describes the activities being conducted at Los Alamos National Laboratory to develop carbon capture technologies for power systems. This work is aimed at continued development and demonstration of a membrane based pre- and post-combustion carbon capture technology and separation schemes. Our primary work entails the development and demonstration of an innovative membrane technology for pre-combustion capture of carbon dioxide that operates over a broad range of conditions relevant to the power industry while meeting the US DOE's Carbon Sequestration Program goals of 90% CO{sub 2} capture at less than a 10% increase in the cost of energy services. Separating and capturing carbon dioxide from mixed gas streams is a first and critical step in carbon sequestration. To be technically and economically viable, a successful separation method must be applicable to industrially relevant gas streams at realistic temperatures and pressures as well as be compatible with large gas volumes. Our project team is developing polymer membranes based on polybenzimidazole (PBI) chemistries that can purify hydrogen and capture CO{sub 2} at industrially relevant temperatures. Our primary objectives are to develop and demonstrate polymer-based membrane chemistries, structures, deployment platforms, and sealing technologies that achieve the critical combination of high selectivity, high permeability, chemical stability, and mechanical stability all at elevated temperatures (> 150 C) and packaged in a scalable, economically viable, high area density system amenable to incorporation into an advanced Integrated Gasification Combined-Cycle (IGCC) plant for pre-combustion CO{sub 2} capture. Stability requirements are focused on tolerance to the primary synthesis gas components and impurities at various locations in the IGCC process. 
Since the process stream compositions and conditions (temperature and pressure) vary throughout the IGCC process, the project is focused on the optimization of a technology that could be positioned upstream or downstream of one or more of the water-gas-shift reactors (WGSRs) or integrated with a WGSR.

  6. Real-time analysis for intensive care: development and deployment of the artemis analytic system.

    PubMed

    Blount, Marion; Ebling, Maria R; Eklund, J Mikael; James, Andrew G; McGregor, Carolyn; Percival, Nathan; Smith, Kathleen P; Sow, Daby

    2010-01-01

    The lives of many thousands of children born premature or ill at term around the world have been saved by those who work within neonatal intensive care units (NICUs). Modern-day neonatologists, together with nursing staff and other specialists within this domain, enjoy modern technologies for activities such as financial transactions, online purchasing, music, and video on demand. Yet, when they move into their workspace, in many cases, they are supported by nearly the same technology they used 20 years ago. Medical devices provide visual displays of vital signs through physiological streams such as electrocardiogram (ECG), heart rate, blood oxygen saturation (SpO2), and respiratory rate. Electronic health record initiatives around the world provide an environment for the electronic management of medical records, but they fail to support the high-frequency interpretation of streaming physiological data. We have taken a collaborative research approach to address this need to provide a flexible platform for the real-time online analysis of patients' data streams to detect medically significant conditions that precede the onset of medical complications. The platform supports automated or clinician-driven knowledge discovery to discover new relationships between physiological data stream events and latent medical conditions as well as to refine existing analytics. Patients benefit from the system because earlier detection of signs of the medical conditions may lead to earlier intervention that may potentially lead to improved patient outcomes and reduced lengths of stay. The clinician benefits from a decision support tool that provides insight into multiple streams of data that are too voluminous to assess with traditional methods. 
The remainder of this article summarizes the strengths of our research collaboration and the resulting environment known as Artemis, which is currently being piloted within the NICU of The Hospital for Sick Children (SickKids) in Toronto, Ontario, Canada. Although the discussion in this article focuses on a NICU, the technologies can be applied to any intensive care environment.

  7. Particle dispersing system and method for testing semiconductor manufacturing equipment

    DOEpatents

    Chandrachood, Madhavi; Ghanayem, Steve G.; Cantwell, Nancy; Rader, Daniel J.; Geller, Anthony S.

    1998-01-01

    The system and method prepare a gas stream comprising particles at a known concentration using a particle disperser for moving particles from a reservoir of particles into a stream of flowing carrier gas. The electrostatic charges on the particles entrained in the carrier gas are then neutralized or otherwise altered, and the resulting particle-laden gas stream is then diluted to provide an acceptable particle concentration. The diluted gas stream is then split into a calibration stream and the desired output stream. The particles in the calibration stream are detected to provide an indication of the actual size distribution and concentration of particles in the output stream that is supplied to a process chamber being analyzed. Particles flowing out of the process chamber within a vacuum pumping system are detected, and the output particle size distribution and concentration are compared with the particle size distribution and concentration of the calibration stream in order to determine the particle transport characteristics of a process chamber, or to determine the number of particles lodged in the process chamber as a function of manufacturing process parameters such as pressure, flowrate, temperature, process chamber geometry, particle size, particle charge, and gas composition.

  8. Sensing Slow Mobility and Interesting Locations for Lombardy Region (Italy): A Case Study Using Pointwise Geolocated Open Data

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.

    2016-06-01

    During the past years Web 2.0 technologies have caused the emergence of platforms where users can share data related to their activities which in some cases are then publicly released with open licenses. Popular categories for this include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, the sole use of these user-generated contents makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the available data on community sharing websites allows to collect near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to the data uploaded in Lombardy Region (Northern Italy) - corresponding to millions of pointwise data. Collected data was processed through the use of Free and Open Source Software (FOSS) in order to organize them into a suitable database. This allowed to run statistical analyses on data distribution in both time and space by enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.
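Detecting the most visited locations from pointwise data amounts to spatial aggregation. A minimal sketch using a plain latitude/longitude grid-cell count (the paper's actual FOSS toolchain and database queries are not specified here, and the grid size is an arbitrary choice):

```python
from collections import Counter

def top_cells(points, cell_deg=0.1, k=3):
    """Bin (lat, lon) points into cell_deg-sized grid cells and return
    the k most frequent cells: a crude proxy for places of interest."""
    counts = Counter(
        (int(lat // cell_deg), int(lon // cell_deg)) for lat, lon in points
    )
    return counts.most_common(k)

# Three points cluster near Milan's Duomo (~45.464 N, 9.190 E); one outlier:
pts = [(45.4641, 9.1900), (45.4642, 9.1904), (45.4640, 9.1899), (45.8, 9.0)]
print(top_cells(pts, cell_deg=0.1, k=1))  # the Duomo cluster dominates
```

Real analyses would typically use equal-area cells or density-based clustering rather than raw degree bins, and would add a time dimension to separate seasonal patterns.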

  9. Fabrication of uniform multi-compartment particles using microfluidic electrospray technology for cell co-culture studies.

    PubMed

    Liu, Zhou; Shum, Ho Cheung

    2013-01-01

    In this work, we demonstrate a robust and reliable approach to fabricate multi-compartment particles for cell co-culture studies. By taking advantage of the laminar flow within our microfluidic nozzle, multiple parallel streams of liquids flow towards the nozzle without significant mixing. Afterwards, the multiple parallel streams merge into a single stream, which is sprayed into air, forming monodisperse droplets under an electric field with a high field strength. The resultant multi-compartment droplets are subsequently cross-linked in a calcium chloride solution to form calcium alginate micro-particles with multiple compartments. Each compartment of the particles can be used for encapsulating different types of cells or biological cell factors. These hydrogel particles with cross-linked alginate chains provide a physical and mechanical environment similar to that of the extracellular matrix of biological cells. Thus, the multi-compartment particles provide a promising platform for cell studies and co-culture of different cells. In our study, cells are encapsulated in the multi-compartment particles and the viability of cells is quantified using a fluorescence microscope after the cells are stained for a live/dead assay. The high cell viability after encapsulation indicates the cytocompatibility and feasibility of our technique. Our multi-compartment particles have great potential as a platform for studying cell-cell interactions as well as interactions of cells with extracellular factors.

  10. Fabrication of uniform multi-compartment particles using microfluidic electrospray technology for cell co-culture studies

    PubMed Central

    Liu, Zhou; Shum, Ho Cheung

    2013-01-01

    In this work, we demonstrate a robust and reliable approach to fabricate multi-compartment particles for cell co-culture studies. By taking advantage of the laminar flow within our microfluidic nozzle, multiple parallel streams of liquids flow towards the nozzle without significant mixing. Afterwards, the multiple parallel streams merge into a single stream, which is sprayed into air, forming monodisperse droplets under an electric field with a high field strength. The resultant multi-compartment droplets are subsequently cross-linked in a calcium chloride solution to form calcium alginate micro-particles with multiple compartments. Each compartment of the particles can be used for encapsulating different types of cells or biological cell factors. These hydrogel particles with cross-linked alginate chains provide a physical and mechanical environment similar to that of the extracellular matrix of biological cells. Thus, the multi-compartment particles provide a promising platform for cell studies and co-culture of different cells. In our study, cells are encapsulated in the multi-compartment particles and the viability of cells is quantified using a fluorescence microscope after the cells are stained for a live/dead assay. The high cell viability after encapsulation indicates the cytocompatibility and feasibility of our technique. Our multi-compartment particles have great potential as a platform for studying cell-cell interactions as well as interactions of cells with extracellular factors. PMID:24404050

  11. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
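
The just-in-time computation model mentioned above can be illustrated generically. The sketch below is plain Python, not the Earth Engine API: operations merely record a deferred expression, and work happens per tile only when a preview (or full-extent batch evaluation) is requested:

```python
class LazyImage:
    """Deferred image computation: map() records functions without running
    them; nothing executes until a specific tile is requested."""
    def __init__(self, source, ops=()):
        self.source, self.ops = source, tuple(ops)

    def map(self, fn):
        # Returns a new lazy expression; no pixels are touched here.
        return LazyImage(self.source, self.ops + (fn,))

    def compute_tile(self, rows, cols):
        # Only the requested window is sliced out and processed.
        tile = [row[cols] for row in self.source[rows]]
        for fn in self.ops:
            tile = [[fn(v) for v in row] for row in tile]
        return tile

raster = [[i + j for j in range(4)] for i in range(4)]
img = LazyImage(raster).map(lambda v: v * 2).map(lambda v: v + 1)
print(img.compute_tile(slice(0, 2), slice(0, 2)))  # only a 2x2 tile is computed
```

The same expression object could be handed to a batch evaluator that walks every tile, which is the essence of supporting interactive preview and large-scale batch runs from one algorithm definition.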

  12. Effect of Culverts on Predator-Prey Interactions in a Tropical Stream.

    NASA Astrophysics Data System (ADS)

    Hein, C. L.; Kikkert, D. A.; Crowl, T. A.

    2005-05-01

    As part of a biocomplexity project in Puerto Rico, we use river and road networks as a platform to understand the interactions between stream biota, the physical environment, and human activity. Specifically, we ask if humans affect aquatic organisms through road building and recreational activities. Culverts have been documented to impede or slow migration of aquatic biota. This is especially important in these streams because all of the freshwater stream species have diadromous life cycles. If culverts do act as bottlenecks to shrimp migrations, we expect altered predator-prey interactions downstream through density-dependent predation dynamics. In order to determine how roads may affect predation rates on upstream migrating shrimp, we parameterized functional response curves for mountain mullet (Agonostomus monticola) consuming shrimp (Xiphocaris sp.) using artificial mesocosm experiments. We then used data obtained from underwater videography to determine how culverts decrease the rate and number of shrimp moving upstream. These data were combined in a predator-prey model to quantify the effects of culverts on localized shrimp densities and fish predation.
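
Functional response curves of the kind parameterized here are commonly fit with a Holling type II form, in which per-predator consumption saturates at high prey density. A sketch with purely hypothetical parameter values (the study's fitted values are not given in the abstract):

```python
def holling_type_ii(prey_density, attack_rate, handling_time):
    """Holling type II functional response: prey consumed per predator per
    unit time, saturating because handling time limits intake."""
    return (attack_rate * prey_density) / (
        1.0 + attack_rate * handling_time * prey_density
    )

# Illustrative parameters: attack rate 0.5, handling time 0.1
for density in (5, 20, 200):
    print(density, holling_type_ii(density, 0.5, 0.1))
```

Coupling such a curve to shrimp densities that pile up below a culvert is how density-dependent predation enters the kind of predator-prey model the abstract describes.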

  13. Vulnerable transportation and utility assets near actively migrating streams in Indiana

    USGS Publications Warehouse

    Sperl, Benjamin J.

    2017-11-02

    An investigation was completed by the U.S. Geological Survey in cooperation with the Indiana Office of Community and Rural Affairs, which found that 1,132 transportation and utility assets in Indiana are vulnerable to fluvial erosion hazards due to their close proximity to actively migrating streams. Locations of transportation assets (bridges, roadways, and railroad lines) and selected utility assets (high-capacity overhead power-transmission lines, underground pipelines, water treatment facilities, and in-channel dams) were determined using aerial imagery hosted by the Google Earth platform. Identified assets were aggregated by stream reach, county, and class. Accompanying the report is a polyline shapefile of the stream reaches documented by Robinson. The shapefile, derived from line work in the National Hydrography Dataset and attributed with channel migration rates, is released with complete Federal Geographic Data Committee metadata. The data presented in this report are intended to help stakeholders and others identify high-risk areas where transportation and utility assets may be threatened by fluvial erosion hazards, thus warranting consideration for mitigation strategies.

  14. Extrusion Processing of Raw Food Materials and by-products: A Review.

    PubMed

    Offiah, Vivian; Kontogiorgos, Vassilis; Falade, Kolawole O

    2018-05-22

    Extrusion technology has rapidly transformed the food industry with its numerous advantages over other processing methods. It offers a platform for processing different products from various food groups by modifying minor or major ingredients and processing conditions. Although cereals occupy a large portion of the extruded foods market, several other types of raw materials have been used. Extrusion processing of various food groups, including cereals and pseudo cereals, roots and tubers, pulses and oilseeds, fruits and vegetables, and animal products, as well as structural and nutritional changes in these food matrices are reviewed. Value addition by extrusion to food processing wastes and by-products from fruits and vegetables, dairy, meat and seafood, cereals and residues from starch, syrup and alcohol production, and oilseed processing are also discussed. Extrusion presents an economical technology for incorporating food processing residues and by-products back into the food stream. In contemporary scenarios, rising demand for extruded products with functional ingredients, attributed to evolving lifestyles and preferences, has led to innovations in the form, texture, color and content of extruded products. Information presented in this review would be of importance to processors and researchers as they seek to enhance nutritional quality and delivery of extruded products.

  15. Evaluating the Effects of Culvert Designs on Ecosystem Processes in Northern Wisconsin Streams

    Treesearch

    J. C. Olson; A. M. Marcarelli; A.L. Timm; S.L. Eggert; R.K. Kolka

    2017-01-01

    Culvert replacements are commonly undertaken to restore aquatic organism passage and stream hydrologic and geomorphic conditions, but their effects on ecosystem processes are rarely quantified. The objective of this study was to investigate the effects of two culvert replacement designs on stream ecosystem processes. The stream simulation design, where culverts...

  16. Towards benchmarking citizen observatories: Features and functioning of online amateur weather networks.

    PubMed

    Gharesifard, Mohammad; Wehn, Uta; van der Zaag, Pieter

    2017-05-15

    Crowd-sourced environmental observations are increasingly being considered as having the potential to enhance the spatial and temporal resolution of current data streams from terrestrial and aerial sensors. The rapid diffusion of ICTs during the past decades has facilitated the process of data collection and sharing by the general public and has resulted in the formation of various online environmental citizen observatory networks. Online amateur weather networks are a particular example of such ICT-mediated observatories that are rooted in one of the oldest and most widely practiced citizen science activities, namely amateur weather observation. The objective of this paper is to introduce a conceptual framework that enables a systematic review of the features and functioning of these expanding networks. This is done by considering distinct dimensions, namely the geographic scope and types of participants, the network's establishment mechanism, revenue stream(s), existing communication paradigm, efforts required by data sharers, support offered by platform providers, and issues such as data accessibility, availability and quality. An in-depth understanding of these dimensions helps to analyze various dynamics such as interactions between different stakeholders, motivations to run the networks, and their sustainability. This framework is then utilized to perform a critical review of six existing online amateur weather networks based on publicly available data. 
The main findings of this analysis suggest that: (1) there are several key stakeholders such as emergency services and local authorities that are not (yet) engaged in these networks; (2) the revenue stream(s) of online amateur weather networks is one of the least discussed but arguably most important dimensions, crucial for the sustainability of these networks; and (3) all of the networks included in this study have one or more explicit modes of bi-directional communication; however, this is limited to feedback mechanisms that are mainly designed to educate the data sharers.

  17. Apparatus and process for the refrigeration, liquefaction and separation of gases with varying levels of purity

    DOEpatents

    Bingham, Dennis N.; Wilding, Bruce M.; McKellar, Michael G.

    2002-01-01

    A process for the separation and liquefaction of component gases from a pressurized mixed gas stream is disclosed. The process involves cooling the pressurized mixed gas stream in a heat exchanger so as to condense one or more of the gas components having the highest condensation point; separating the condensed components from the remaining mixed gas stream in a gas-liquid separator; cooling the separated condensed component stream by passing it through an expander; and passing the cooled component stream back through the heat exchanger such that the cooled component stream functions as the refrigerant for the heat exchanger. The cycle is then repeated for the remaining mixed gas stream so as to draw off the next component gas and further cool the remaining mixed gas stream. The process continues until all of the component gases are separated from the desired gas stream. The final gas stream is then passed through a final heat exchanger and expander. The expander decreases the pressure on the gas stream, thereby cooling the stream and causing a portion of the gas stream to liquefy within a tank. The portion of the gas which is not liquefied is passed back through each of the heat exchangers, where it functions as a refrigerant.
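
At its core, the staged cool/separate/expand cycle repeatedly draws off whichever remaining component condenses first. A toy sketch of that ordering logic (the component list uses standard atmospheric-pressure boiling points for illustration; none of it is taken from the patent):

```python
def cascade_separate(mixture):
    """Return the order in which components are drawn off when each pass
    condenses out the component with the highest condensation point.

    mixture: {component_name: condensation_point_K}
    """
    remaining = dict(mixture)
    order = []
    while remaining:
        name = max(remaining, key=remaining.get)  # condenses first
        order.append(name)
        del remaining[name]
    return order

# Illustrative feed with approximate 1-atm boiling points (K)
feed = {"methane": 112, "ethane": 184, "propane": 231, "nitrogen": 77}
print(cascade_separate(feed))
```

A real design would of course work with pressure-dependent dew points and energy balances; this only captures the sequencing idea of the cascade.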

  18. Apparatus and process for the refrigeration, liquefaction and separation of gases with varying levels of purity

    DOEpatents

    Bingham, Dennis N.; Wilding, Bruce M.; McKellar, Michael G.

    2000-01-01

    A process for the separation and liquefaction of component gases from a pressurized mixed gas stream is disclosed. The process involves cooling the pressurized mixed gas stream in a heat exchanger so as to condense one or more of the gas components having the highest condensation point; separating the condensed components from the remaining mixed gas stream in a gas-liquid separator; cooling the separated condensed component stream by passing it through an expander; and passing the cooled component stream back through the heat exchanger such that the cooled component stream functions as the refrigerant for the heat exchanger. The cycle is then repeated for the remaining mixed gas stream so as to draw off the next component gas and further cool the remaining mixed gas stream. The process continues until all of the component gases are separated from the desired gas stream. The final gas stream is then passed through a final heat exchanger and expander. The expander decreases the pressure on the gas stream, thereby cooling the stream and causing a portion of the gas stream to liquefy within a tank. The portion of the gas which is not liquefied is passed back through each of the heat exchangers, where it functions as a refrigerant.

  19. Apparatus for the liquefaction of natural gas and methods relating to same

    DOEpatents

    Turner, Terry D [Ammon, ID; Wilding, Bruce M [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID

    2009-09-22

    An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through an expander creating work output. A compressor may be driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is expanded to liquefy the natural gas. A gas-liquid separator separates a vapor from the liquid natural gas. A portion of the liquid gas is used for additional cooling. Gas produced within the system may be recompressed for reintroduction into a receiving line or recirculation within the system for further processing.

  20. Estimating stream discharge from a Himalayan Glacier using coupled satellite sensor data

    NASA Astrophysics Data System (ADS)

    Child, S. F.; Stearns, L. A.; van der Veen, C. J.; Haritashya, U. K.; Tarpanelli, A.

    2015-12-01

    The 4th IPCC report highlighted our limited understanding of Himalayan glacier behavior and contribution to the region's hydrology. Seasonal snow and glacier melt in the Himalayas are important sources of water, but estimates greatly differ about the actual contribution of melted glacier ice to stream discharge. A more comprehensive understanding of the contribution of glaciers to stream discharge is needed because glacier-fed streams affect the livelihoods of a large part of the world's population. Most of the streams in the Himalayas are unmonitored because in situ measurements are logistically difficult and costly. This necessitates the use of remote sensing platforms to obtain estimates of river discharge for validating hydrological models. In this study, we estimate stream discharge using cost-effective methods via repeat satellite imagery from Landsat-8 and SENTINEL-1A sensors. The methodology is based on previous studies, which show that ratio values from optical satellite bands correlate well with measured stream discharge. While similar, our methodology relies on significantly higher resolution imagery (30 m) and utilizes bands that are in the blue and near-infrared spectrum as opposed to previous studies using 250 m resolution imagery and spectral bands only in the near-infrared. Higher resolution imagery is necessary for streams where the source is a glacier's terminus because the width of the stream is often only tens of meters. We validate our methodology using two rivers in the state of Kansas, where stream gauges are plentiful. We then apply our method to the Bhagirathi River, in the North-Central Himalayas, which is fed by the Gangotri Glacier and has a well monitored stream gauge. The analysis will later be used to couple river discharge and glacier flow and mass balance through an integrated hydrologic model in the Bhagirathi Basin.
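
The band-ratio approach ultimately reduces to regressing gauged discharge against a reflectance ratio at the calibration sites. A minimal ordinary-least-squares sketch on synthetic data (the linear form and the numbers are assumptions for illustration, not the study's fitted model):

```python
def fit_ratio_discharge(ratios, discharges):
    """Fit Q = a * ratio + b by ordinary least squares.

    ratios:     band-ratio values (e.g. NIR/blue) at gauge overpass times
    discharges: gauged discharge at the same times
    """
    n = len(ratios)
    mean_r = sum(ratios) / n
    mean_q = sum(discharges) / n
    cov = sum((r - mean_r) * (q - mean_q) for r, q in zip(ratios, discharges))
    var = sum((r - mean_r) ** 2 for r in ratios)
    a = cov / var
    return a, mean_q - a * mean_r

# Synthetic calibration pairs (ratio, discharge in m^3/s)
a, b = fit_ratio_discharge([1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(a, b)  # slope and intercept; predict Q for a new scene as a*ratio + b
```

Once fitted at gauged rivers, the same coefficients are applied to imagery over the ungauged reach to estimate its discharge.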

  1. WPSS: watching people security services

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Baan, Jan; Borsboom, Sander; van Zon, Kasper; Luo, Xinghan; Loke, Ben; Stoeller, Bram; van Kuilenburg, Hans; Dijk, Judith

    2013-10-01

    To improve security, the number of surveillance cameras is rapidly increasing. However, the number of human operators remains limited and only a selection of the video streams is observed. Intelligent software services can help to find people quickly, evaluate their behavior and show the most relevant and deviant patterns. We present a software platform that contributes to the retrieval and observation of humans and to the analysis of their behavior. The platform consists of mono- and stereo-camera tracking, re-identification, behavioral feature computation, track analysis, behavior interpretation and visualization. This system is demonstrated in a busy shopping mall with multiple cameras and different lighting conditions.

  2. Dual-stream modulation failure: a novel hypothesis for the formation and maintenance of delusions in schizophrenia.

    PubMed

    Speechley, William J; Ngan, Elton T C

    2008-01-01

    Delusions, a cardinal feature of schizophrenia, are characterized by the development and preservation of false beliefs despite reason and evidence to the contrary. A number of cognitive models have made important contributions to our understanding of delusions, though it remains unclear which core cognitive processes are malfunctioning to enable individuals with delusions to form and maintain erroneous beliefs. We propose a modified dual-stream processing model that provides a viable and testable mechanism that can account for this debilitating symptom. Dual-stream models divide decision-making into two streams: a fast, intuitive and automatic form of processing (Stream 1); and a slower, conscious and deliberative process (Stream 2). Our novel model proposes two key influences on the way these streams interact in everyday decision-making: conflict and emotion. Conflict: in most decision-making scenarios one obvious answer presents itself and the two streams converge onto the same conclusion. However, in instances where there are competing alternative possibilities, an individual often experiences dissonance, or a sense of conflict. The detection of this conflict biases processing towards the more deliberative Stream 2. Emotion: highly emotional states can result in behavior that is reflexive and action-oriented. This may be due to the power of emotionally valenced stimuli to bias reasoning towards Stream 1. We propose that in schizophrenia, an abnormal response to these two influences results in a pathological schism between Stream 1 and Stream 2, enabling erroneous intuitive explanations to coexist with contrary logical explanations of the same event. Specifically, we suggest that delusions are the result of a failure to reconcile the two streams due to both a failure of conflict to bias decision-making towards Stream 2 and an accentuated emotional bias towards Stream 1.
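
One way to make the proposed modulation concrete is a toy numerical sketch, entirely hypothetical and not the authors' formalism: conflict shifts decision weight toward Stream 2, emotional arousal shifts it toward Stream 1, and the proposed pathology corresponds to the conflict term failing to act:

```python
def decide(stream1, stream2, conflict, emotion, modulation=0.4):
    """Blend a fast intuitive judgment (stream1) with a slow deliberative
    one (stream2), each expressed as a belief strength in [0, 1].

    conflict and emotion are in [0, 1]; conflict pushes weight toward
    Stream 2, emotion toward Stream 1. Setting modulation near 0 mimics
    the hypothesized failure of conflict to engage deliberation."""
    w2 = 0.5 + modulation * (conflict - emotion)
    w2 = min(max(w2, 0.0), 1.0)  # keep the weight a valid proportion
    return (1 - w2) * stream1 + w2 * stream2

# Intuition says "true" (1.0), logic says "false" (0.0), high conflict:
print(decide(1.0, 0.0, conflict=1.0, emotion=0.0))                  # deliberation dominates
print(decide(1.0, 0.0, conflict=1.0, emotion=0.0, modulation=0.0))  # conflict fails to modulate
```

With intact modulation the erroneous intuition is largely overridden; with modulation disabled the two streams are simply averaged, letting the intuitive belief persist alongside the contrary evidence.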

  3. Using high-frequency nitrogen and carbon measurements to decouple temporal dynamics of catchment and in-stream transport and reaction processes in a headwater stream

    NASA Astrophysics Data System (ADS)

    Blaen, P.; Riml, J.; Khamis, K.; Krause, S.

    2017-12-01

    Within river catchments across the world, headwater streams represent important sites of nutrient transformation and uptake due to their high rates of microbial community processing and relative abundance in the landscape. However, separating the combined influence of in-stream transport and reaction processes from the overall catchment response can be difficult due to spatio-temporal variability in nutrient and organic matter inputs, flow regimes, and reaction rates. Recent developments in optical sensor technologies enable high-frequency, in situ nutrient measurements, and thus provide opportunities for greater insights into in-stream processes. Here, we use in-stream observations of hourly nitrate (NO3-N), dissolved organic carbon (DOC) and dissolved oxygen (DO) measurements from paired in situ sensors that bound a 1 km headwater stream reach in a mixed-use catchment in central England. We employ a spectral approach to decompose (1) variances in solute loading from the surrounding landscape, and (2) variances in reach-scale in-stream nutrient transport and reaction processes. In addition, we estimate continuous rates of reach-scale NO3-N and DOC assimilation/dissimilation, ecosystem respiration and primary production. Comparison of these results over a range of hydrological conditions (baseflow, variable storm events) and timescales (event-based, diel, seasonal) facilitates new insights into the physical and biogeochemical processes that drive in-stream nutrient dynamics in headwater streams.
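
Partitioning the variance of an hourly solute series across timescales can be sketched with a discrete Fourier transform and Parseval's theorem. This is a generic illustration, not the authors' spectral method; the synthetic diel signal is an assumption:

```python
import numpy as np

def band_variance(series, dt_hours, period_range):
    """Variance of `series` contributed by Fourier periods (in hours)
    inside period_range, computed from the detrended series."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x) ** 2
    spec[1:] *= 2  # fold in negative frequencies (Nyquist bin ignored here)
    freqs = np.fft.rfftfreq(len(x), d=dt_hours)
    with np.errstate(divide="ignore"):
        periods = np.where(freqs > 0, 1.0 / freqs, np.inf)
    lo, hi = period_range
    return spec[(periods >= lo) & (periods <= hi)].sum()

# Synthetic 10-day hourly record with a pure diel (24 h) cycle
t = np.arange(240)
x = np.sin(2 * np.pi * t / 24)
print(band_variance(x, 1.0, (20, 30)))  # variance in the diel band
```

Comparing such band variances between the upstream and downstream sensors is one way to attribute variability to catchment loading versus in-reach processing at each timescale.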

  4. Separation process using pervaporation and dephlegmation

    DOEpatents

    Vane, Leland M.; Mairal, Anurag P.; Ng, Alvin; Alvarez, Franklin R.; Baker, Richard W.

    2004-06-29

    A process for treating liquids containing organic compounds and water. The process includes a pervaporation step in conjunction with a dephlegmation step to treat at least a portion of the permeate vapor from the pervaporation step. The process yields a membrane residue stream, a stream enriched in the more volatile component (usually the organic) as the overhead stream from the dephlegmator and a condensate stream enriched in the less volatile component (usually the water) as a bottoms stream from the dephlegmator. Any of these may be the principal product of the process. The membrane separation step may also be performed in the vapor phase, or by membrane distillation.

  5. Coupled stream and population dynamics: Modeling the role beaver (Castor canadensis) play in generating juvenile steelhead (Oncorhynchus mykiss) habitat

    NASA Astrophysics Data System (ADS)

    Jordan, C.; Bouwes, N.; Wheaton, J. M.; Pollock, M.

    2013-12-01

    Over the past several centuries, the population of North American Beaver has been dramatically reduced through fur trapping. As a result, the geomorphic impacts long-term beaver occupancy and activity can have on fluvial systems have been lost, both from the landscape and from our collective memory such that physical and biological models of floodplain system function neither consider nor have the capacity to incorporate the role beaver can play in structuring the dynamics of streams. Concomitant with the decline in beaver populations was an increasing pressure on streams and floodplains through human activity, placing numerous species of stream rearing fishes in peril, most notably the ESA listing of trout and salmon populations across the entirety of the Western US. The rehabilitation of stream systems is seen as one of the primary means by which population and ecosystem recovery can be achieved, yet the methods of stream rehabilitation are applied almost exclusively with the expected outcome of a static idealized stream planform, occasionally with an acknowledgement of restoring processes rather than form and only rarely with the goal of a beaver dominated riverscape. We have constructed an individual based model of trout and beaver populations that allows the exploration of fish population dynamics as a function of stream habitat quality and quantity. We based the simulation tool on Bridge Creek (John Day River basin, Oregon) where we have implemented a large-scale restoration experiment using wooden posts to provide beavers with stable platforms for dam building and to simulate the dams themselves. Extensive monitoring captured geomorphic and riparian changes, as well as fish and beaver population responses; information we use to parameterize the model as to the geomorphic and fish response to dam building beavers. 
In the simulation environment, stream habitat quality and quantity can be manipulated directly through rehabilitation actions and indirectly through the dynamics of the co-occurring beaver population. The model allowed us to ask questions critical for designing restoration strategies based on dam building beaver activity, such as what beaver population growth rate is required to develop and maintain floodplain connectivity in an incised system, or what beaver population size is required to increase juvenile steelhead production? The model was sensitive to several variables including beaver colony size, dams and colony dynamics and site fidelity, and thus highlights further research needs to fill critical information gaps.
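
A minimal caricature of the coupled dynamic such a model explores (all parameters hypothetical, and far simpler than the individual-based simulation described) is logistic beaver colony growth driving pond habitat and hence juvenile rearing capacity:

```python
def simulate(years, beaver_growth=0.3, carrying_colonies=20,
             ponds_per_colony=3, fish_per_pond=150, base_fish=500):
    """Discrete logistic growth of beaver colonies; juvenile steelhead
    capacity rises with the pond habitat the colonies maintain.
    Returns a list of (colonies, fish_capacity) per year."""
    colonies = 1.0
    history = []
    for _ in range(years):
        colonies += beaver_growth * colonies * (1 - colonies / carrying_colonies)
        capacity = base_fish + colonies * ponds_per_colony * fish_per_pond
        history.append((colonies, capacity))
    return history

history = simulate(50)
print(history[0], history[-1])  # early vs. equilibrium habitat capacity
```

Sweeping `beaver_growth` in such a sketch is the simplest analogue of the model's question about what colony growth rate is needed to sustain a target level of juvenile production.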

  6. Concept, Implementation and Testing of PRESTo: Real-time experimentation in Southern Italy and worldwide applications

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo; Emolo, Antonio; Festa, Gaetano; Picozzi, Matteo; Elia, Luca; Martino, Claudio; Colombelli, Simona; Brondi, Piero; Caruso, Alessandro

    2016-04-01

    The past two decades have witnessed huge progress in the development, implementation and testing of Earthquake Early Warning Systems (EEWS) worldwide, as the result of a joint effort of the seismological and earthquake engineering communities to set up robust and efficient methodologies for real-time seismic risk mitigation. This work presents an overview of the worldwide applications of the system PRESTo (PRobabilistic and Evolutionary early warning SysTem), a highly configurable and easily portable platform for Earthquake Early Warning developed by the RISSCLab group of the University of Naples Federico II. In particular, we first present the results of the real-time experimentation of PRESTo on the data streams of the Irpinia Seismic Network (ISNet), in Southern Italy. ISNet is a dense, high-dynamic-range earthquake observing system, which operates in true real-time mode thanks to a mixed data transmission system based on proprietary digital terrestrial links, standard ADSL and UMTS technologies. Using the seedlink protocol, data are transferred to the network center unit, running the software platform PRESTo, which processes the real-time data streams, estimates source parameters and issues the alert. The software platform PRESTo uses a P-wave, network-based approach which has evolved and improved over time since its first release. In its original version, it consisted of a series of modules aimed at event detection/picking, probabilistic real-time earthquake location and magnitude estimation, and prediction of peak ground motion at distant sites through ground motion prediction equations for the area. In recent years, PRESTo has also been implemented at the accelerometric and broad-band seismic networks in South Korea, Romania, North-East Italy, and Turkey, and tested off-line in the Iberian Peninsula, Israel, and Japan. 
Moreover, the feasibility of a PRESTo-based EEWS at the national scale in Italy has been tested by evaluating its performance for the Italian Accelerometric Network. These testing experiments and the EEWS performance results will be summarized in the near-future perspective of building the next generation of early warning systems.

  7. A failure of conflict to modulate dual-stream processing may underlie the formation and maintenance of delusions.

    PubMed

    Speechley, W J; Murray, C B; McKay, R M; Munz, M T; Ngan, E T C

    2010-03-01

    Dual-stream information processing proposes that reasoning is composed of two interacting processes: a fast, intuitive system (Stream 1) and a slower, more logical process (Stream 2). In non-patient controls, divergence of these streams may result in the experience of conflict, modulating decision-making towards Stream 2, and initiating a more thorough examination of the available evidence. In delusional schizophrenia patients, a failure of conflict to modulate decision-making towards Stream 2 may reduce the influence of contradictory evidence, resulting in a failure to correct erroneous beliefs. Delusional schizophrenia patients and non-patient controls completed a deductive reasoning task requiring logical validity judgments of two-part conditional statements. Half of the statements were characterized by a conflict between logical validity (Stream 2) and content believability (Stream 1). Patients were significantly worse than controls in determining the logical validity of both conflict and non-conflict conditional statements. This between groups difference was significantly greater for the conflict condition. The results are consistent with the hypothesis that delusional schizophrenia patients fail to use conflict to modulate towards Stream 2 when the two streams of reasoning arrive at incompatible judgments. This finding provides encouraging preliminary support for the Dual-Stream Modulation Failure model of delusion formation and maintenance.

  8. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user cloud computing infrastructure hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with a spatial component. 
We tested the performance of the platform based on taxi trajectory analysis. Results suggested that GISpark achieves excellent run time performance in spatiotemporal big data applications.
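
    The abstract does not publish GISpark's API, but the core idea behind Spark-style spatial distribution — keying point data by grid cell so work can be spread across partitions — can be sketched in a few lines. All function and key names below are illustrative, not GISpark's actual interface.

```python
# Toy sketch of grid-based spatial partitioning, the kind of cell keying a
# distributed job might use to shuffle point data across workers.
# Names are illustrative; this is not GISpark code.

def cell_key(lon, lat, cell_deg=1.0):
    """Map a point to the (col, row) index of a regular lon/lat grid."""
    return (int(lon // cell_deg), int(lat // cell_deg))

def partition_points(points, cell_deg=1.0):
    """Group (lon, lat) points by grid cell, as a shuffle stage would."""
    cells = {}
    for lon, lat in points:
        cells.setdefault(cell_key(lon, lat, cell_deg), []).append((lon, lat))
    return cells

points = [(116.4, 39.9), (116.6, 39.2), (121.5, 31.2)]
cells = partition_points(points, cell_deg=1.0)
```

    In a real cluster the per-cell groups would become partitions processed in parallel; spatial joins and range queries then only touch the cells they overlap.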

  9. The Ocean Observatories Initiative: Data Acquisition Functions and Its Built-In Automated Python Modules

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Vardaro, M.; Crowley, M. F.; Glenn, S. M.; Schofield, O.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Fram, J. P.; Kerfoot, J.

    2016-02-01

    The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of oceanographic sensors. The Endurance Array in the Pacific Ocean consists of two separate lines off the coasts of Oregon and Washington. The Oregon line consists of 7 moorings, two cabled benthic experiment packages, and 6 underwater gliders; the Washington line comprises 6 moorings and 6 gliders. Each mooring is outfitted with a variety of instrument packages. The raw data from these instruments are sent to shore via satellite communication and, in some cases, via fiber optic cable. These raw data are then sent to the cyberinfrastructure (CI) group at Rutgers, where they are aggregated, parsed into thousands of different data streams, and integrated into a software package called uFrame. The OOI CI delivers the data to the general public via a web interface that outputs data into commonly used scientific data file formats such as JSON, netCDF, and CSV. The Rutgers data management team has developed a series of command-line Python tools that streamline data acquisition in order to facilitate the QA/QC review process. The first step in the process is querying the uFrame database for a list of all available platforms. From this list, a user can choose a specific platform and automatically download all of its available datasets. Each downloaded dataset is plotted using a generalized Python netCDF plotting routine built on the matplotlib visualization library. This routine loads each netCDF file separately and outputs plots for each available parameter. These Python tools have been uploaded to a GitHub repository that is openly available to help facilitate OOI data access and visualization.
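
    The per-parameter loop such a plotting routine runs can be sketched without netCDF or matplotlib dependencies. The dictionary below stands in for variables read from a downloaded file, and a summary stands in for a plot; none of the names are the actual OOI tools' API.

```python
# Illustrative sketch of a per-parameter plotting loop: skip coordinate
# variables, then produce one output (here a summary, in practice a plot)
# per remaining parameter. Data and names are synthetic stand-ins.

def plottable_parameters(variables, skip=("time", "lat", "lon")):
    """Return the parameter names worth plotting, excluding coordinates."""
    return [name for name in variables if name not in skip]

def summarize(values):
    """Minimal stand-in for a plot: range and sample count per parameter."""
    return {"min": min(values), "max": max(values), "n": len(values)}

dataset = {
    "time": [0, 1, 2, 3],
    "seawater_temperature": [10.1, 10.3, 10.2, 10.4],
    "practical_salinity": [33.9, 34.0, 34.1, 34.0],
}
report = {p: summarize(dataset[p]) for p in plottable_parameters(dataset)}
```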

  10. SOA approach to battle command: simulation interoperability

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Self, Mid; Miller, Gordon J.; McDonnell, Joseph S.

    2010-04-01

    NVESD is developing a Sensor Data and Management Services (SDMS) Service Oriented Architecture (SOA) that provides an innovative approach to achieve seamless application functionality across simulation and battle command systems. In 2010, CERDEC will conduct a SDMS Battle Command demonstration that will highlight the SDMS SOA capability to couple simulation applications to existing Battle Command systems. The demonstration will leverage RDECOM MATREX simulation tools and TRADOC Maneuver Support Battle Laboratory Virtual Base Defense Operations Center facilities. The battle command systems are those specific to the operation of a base defense operations center in support of force protection missions. The SDMS SOA consists of four components that will be discussed. An Asset Management Service (AMS) will automatically discover the existence, state, and interface definition required to interact with a named asset (a sensor or sensor platform, a process such as level-1 fusion, or an interface to a sensor or other network endpoint). A Streaming Video Service (SVS) will automatically discover the existence, state, and interfaces required to interact with a named video stream, and abstract the consumers of the video stream from the originating device. A Task Manager Service (TMS) will be used to automatically discover the existence of a named mission task, and will interpret, translate and transmit a mission command for the blue force unit(s) described in a mission order. JC3IEDM data objects and a software development kit (SDK) will be utilized as the basic data object definitions for the implemented web services.

  11. A neuro-inspired spike-based PID motor controller for multi-motor robots with low cost FPGAs.

    PubMed

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. The controller is focused on controlling DC motor speed, using only spikes for information representation, processing, and DC motor driving; it could be applied to other motors with proper driver adaptation. The controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time in a massively parallel information processing system built from specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results demonstrate the viability of spike-based controllers, and hardware synthesis shows low hardware requirements that allow replicating the controller in a large number of parallel instances working together for real-time robot control.
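
    The paper's design operates entirely on spike streams in hardware; as a software analogue, the closed loop it implements — a PID-style controller regulating the speed of a first-order DC-motor model — can be simulated numerically. Gains and plant constants below are illustrative only, not the VHDL design's values.

```python
# Numeric analogue of the closed-loop control idea: a PI controller driving
# a first-order DC-motor speed model toward a setpoint. Illustrative
# parameters; the actual controller computes with spikes on an FPGA.

def simulate(setpoint=100.0, steps=2000, dt=0.001):
    kp, ki = 0.8, 5.0          # illustrative PI gains
    tau, gain = 0.05, 1.0      # plant: tau * dv/dt = -v + gain * u
    v, integ = 0.0, 0.0        # motor speed and error integral
    for _ in range(steps):
        err = setpoint - v
        integ += err * dt
        u = kp * err + ki * integ      # controller output (drive signal)
        v += dt * (-v + gain * u) / tau
    return v
```

    With these constants the loop settles close to the setpoint within the simulated two seconds; the spike-based version reaches the same fixed point with spike rates standing in for the continuous signals.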

  12. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    PubMed

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
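
    The out-of-core pattern the toolkit relies on can be shown in miniature: stream a hyperspectral cube from disk in chunks, keeping only a running reduction in memory rather than the full image. numpy's memmap stands in for SIproc's streaming layer; sizes and names are illustrative.

```python
# Out-of-core mean spectrum: read a disk-backed hyperspectral cube in
# row-chunks and accumulate a running sum, never holding the whole cube
# in memory. Synthetic data; not SIproc's actual implementation.
import numpy as np
import tempfile, os

rows, cols, bands = 64, 64, 32
path = os.path.join(tempfile.mkdtemp(), "cube.dat")

# Write a synthetic cube to disk (in practice, the instrument file).
cube = np.memmap(path, dtype=np.float32, mode="w+", shape=(rows, cols, bands))
cube[:] = np.random.default_rng(0).random((rows, cols, bands))
cube.flush()

# Re-open read-only and stream in chunks of rows.
data = np.memmap(path, dtype=np.float32, mode="r", shape=(rows, cols, bands))
chunk = 8
total = np.zeros(bands, dtype=np.float64)
for r0 in range(0, rows, chunk):
    total += data[r0:r0 + chunk].reshape(-1, bands).sum(axis=0)
mean_spectrum = total / (rows * cols)
```

    The same chunked-reduction structure extends to the GPU case: each chunk is uploaded, reduced on the device, and discarded, so working memory stays bounded regardless of image size.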

  13. A Neuro-Inspired Spike-Based PID Motor Controller for Multi-Motor Robots with Low Cost FPGAs

    PubMed Central

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J.; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. The controller is focused on controlling DC motor speed, using only spikes for information representation, processing, and DC motor driving; it could be applied to other motors with proper driver adaptation. The controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time in a massively parallel information processing system built from specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results demonstrate the viability of spike-based controllers, and hardware synthesis shows low hardware requirements that allow replicating the controller in a large number of parallel instances working together for real-time robot control. PMID:22666004

  14. Process for recovering organic components from liquid streams

    DOEpatents

    Blume, Ingo; Baker, Richard W.

    1991-01-01

    A separation process for recovering organic components from liquid streams. The process is a combination of pervaporation and decantation. In cases where the liquid stream contains the organic to be separated in dissolved form, the pervaporation step is used to concentrate the organic to a point above the solubility limit, so that a two-phase permeate is formed and then decanted. In cases where the liquid stream is a two-phase mixture, the decantation step is performed first, to remove the organic product phase, and the residue from the decanter is then treated by pervaporation. The condensed permeate from the pervaporation unit is sufficiently concentrated in the organic component to be fed back to the decanter. The process can be tailored to produce only two streams: an essentially pure organic product stream suitable for reuse, and a residue stream for discharge or reuse.

  15. Developing Geospatial Intelligence Stewardship for Multinational Operations

    DTIC Science & Technology

    2010-06-11

    ...and chaired by the NGA's... It is important to note that the GEOINT data stream requires the largest bandwidth for full motion video, hyper-spectral... platforms, as in the phrase "Predator Porn." Yet, what should be used could be called an ISR-operational design, in an end-ways-means approach...

  16. AADL and Model-based Engineering

    DTIC Science & Technology

    2014-10-20

    ...and MBE, Feiler, Oct 20, 2014. © 2014 Carnegie Mellon University. We Rely on Software for Safe Aircraft Operation: embedded software systems... Developer, Compute Platform, Runtime Architecture, Application Software, Embedded SW System Engineer, Data Stream Characteristics, Latency... confusion, Hardware Engineer. Why do system-level failures still occur despite fault tolerance techniques being deployed in systems? Embedded software...

  17. Microbial electrolysis cells turning to be versatile technology: recent advances and future challenges.

    PubMed

    Zhang, Yifeng; Angelidaki, Irini

    2014-06-01

    Microbial electrolysis cells (MECs) are an electricity-mediated microbial bioelectrochemical technology, originally developed for high-efficiency biological hydrogen production from waste streams. Compared to traditional biological technologies, MECs can overcome thermodynamic limitations and achieve high-yield hydrogen production from a wide range of organic matter under relatively mild conditions. This approach greatly reduces the electric energy cost of hydrogen production in contrast to direct water electrolysis. In addition to hydrogen production, MECs may also support several energetically unfavorable biological and chemical reactions. This unique advantage has led to several alternative applications, such as chemical synthesis, recalcitrant pollutant removal, resource recovery, bioelectrochemical research platforms, and biosensors, which have greatly broadened the application scope of MECs. MECs are becoming a versatile platform technology and offer a new solution for emerging environmental issues related to waste stream treatment and energy and resource recovery. Unlike previous reviews that mainly focus on hydrogen production, this paper provides an up-to-date review of all the new applications of MECs and their resulting performance, current challenges, and future prospects. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors

    PubMed Central

    Castro-García, Juan A.; Lebrato-Vázquez, Clara

    2018-01-01

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise, and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, given as the amount of memory allocated and the execution time (number of clock cycles), was analyzed on the low-cost Arduino Genuino platform. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows, and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants indicated their intention to use this software because it was perceived as useful and very easy to use. The performance of the library shows that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception of this library is that it is easy to use and intuitive. PMID:29596394
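
    The sliding-window objects described are, at their core, fixed-size circular buffers with constant-time updates; here is a Python sketch of that design. The class and method names are illustrative, not the library's actual (Arduino C++) API.

```python
# Sliding-window moving average with an O(1) running-sum update, the kind
# of filter object described above. Illustrative sketch, not library code.
from collections import deque

class SlidingAverage:
    def __init__(self, size):
        self.buf = deque(maxlen=size)
        self.total = 0.0

    def add(self, sample):
        """Push one sample and return the current window average."""
        if len(self.buf) == self.buf.maxlen:
            self.total -= self.buf[0]   # oldest sample about to drop out
        self.buf.append(sample)
        self.total += sample
        return self.total / len(self.buf)

f = SlidingAverage(4)
out = [f.add(x) for x in [1, 1, 1, 1, 5]]
```

    Keeping a running sum instead of re-summing the window on every sample is what makes such filters cheap enough for the clock budgets reported on the Arduino.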

  19. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    PubMed

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise, and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, given as the amount of memory allocated and the execution time (number of clock cycles), was analyzed on the low-cost Arduino Genuino platform. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows, and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants indicated their intention to use this software because it was perceived as useful and very easy to use. The performance of the library shows that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception of this library is that it is easy to use and intuitive.

  20. The NifTK software platform for image-guided interventions: platform overview and NiftyLink messaging.

    PubMed

    Clarkson, Matthew J; Zombori, Gergely; Thompson, Steve; Totz, Johannes; Song, Yi; Espak, Miklos; Johnsen, Stian; Hawkes, David; Ourselin, Sébastien

    2015-03-01

    To perform research in image-guided interventions, researchers need a wide variety of software components, and assembling these components into a flexible and reliable system can be a challenging task. In this paper, the NifTK software platform is presented. A key focus has been high-performance streaming of stereo laparoscopic video data, ultrasound data and tracking data simultaneously. A new messaging library called NiftyLink is introduced that uses the OpenIGTLink protocol and provides the user with easy-to-use asynchronous two-way messaging, high reliability and comprehensive error reporting. A small suite of applications called NiftyGuide has been developed, containing lightweight applications for grabbing data, currently from position trackers and ultrasound scanners. These applications use NiftyLink to stream data into NiftyIGI, a workstation-based application built on top of MITK for visualisation and user interaction. Design decisions, performance characteristics and initial applications are described in detail. NiftyLink was tested for latency when transmitting images, tracking data, and interleaved imaging and tracking data. NiftyLink can transmit tracking data at 1,024 frames per second (fps) with a latency of 0.31 milliseconds, and 512 KB images with a latency of 6.06 milliseconds at 32 fps. NiftyIGI was tested receiving stereo high-definition laparoscopic video at 30 fps, tracking data from 4 rigid bodies at 20-30 fps, and ultrasound data at 20 fps, with rendering refresh rates between 2 and 20 Hz and no loss of user interaction. These packages form part of the NifTK platform and have proven to be successful in a variety of image-guided surgery projects. Code and documentation for the NifTK platform are available from http://www.niftk.org. NiftyLink is provided open-source under a BSD license and is available from http://github.com/NifTK/NiftyLink. The code for this paper is tagged IJCARS-2014.
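
    Latency figures like those above come from a simple calculation: each message carries a send timestamp, and latency is receive time minus send time, summarized over many messages. The stand-alone sketch below shows that calculation on synthetic timestamps; it is not NiftyLink code, and the numbers are invented.

```python
# Compute (min, mean, max) one-way latency from paired timestamps, the
# reduction behind reported messaging-latency benchmarks. Synthetic data.

def latency_stats(send_times, recv_times):
    """Return (min, mean, max) latency in the input time units (seconds)."""
    lat = [r - s for s, r in zip(send_times, recv_times)]
    return min(lat), sum(lat) / len(lat), max(lat)

send = [0.000, 0.001, 0.002, 0.003]
recv = [0.0003, 0.0014, 0.0023, 0.0034]
lo, mean, hi = latency_stats(send, recv)
```

    In practice the two clocks must be the same (loopback) or synchronized, which is why such benchmarks are typically run on a single machine.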

  1. 3D Numerical simulation of bed morphological responses to complex in-streamstructures

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Liu, X.

    2017-12-01

    In-stream structures are widely used in stream restoration for both hydraulic and ecological purposes. The geometries of the structures are usually complex and irregular by design, so as to provide nature-like physical habitat. The aim of this study is to develop a numerical model to accurately predict the bed-load transport and the morphological changes caused by complex in-stream structures. The model is developed on the OpenFOAM platform. In the hydrodynamics part, it utilizes different turbulence models to capture detailed turbulence information near the in-stream structures. An immersed boundary method (IBM) is efficiently implemented in the model to describe the movable bed and the rigid solid bodies of the in-stream structures. With IBM, the difficulty of mesh generation on complex geometry is greatly alleviated, and the bed surface deformation can be coupled into the flow system. The morphodynamics model is first validated on simple structures, such as the scour morphology of a log-vane structure. It is then applied to a more complex structure, an engineered log jam (ELJ), which consists of multiple logs piled together. The numerical results, including turbulence flow information and bed morphological responses, are evaluated against experimental measurements under the same flow conditions.
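
    The bed-update step at the heart of any such morphodynamics model is the Exner equation, dz/dt = -(1/(1-p)) dq_s/dx, relating bed elevation change to the divergence of bed-load flux. A minimal 1-D sketch follows; the discretization choice and parameter values are illustrative, not the paper's 3-D OpenFOAM implementation.

```python
# One explicit 1-D Exner step: where bed-load flux q_s increases downstream,
# the bed erodes; where it decreases, the bed aggrades. Illustrative only.
import numpy as np

def exner_step(z, q_s, dx, dt, porosity=0.4):
    """Advance bed elevation z given bed-load flux q_s at the same nodes."""
    dqdx = np.zeros_like(z)
    dqdx[1:] = (q_s[1:] - q_s[:-1]) / dx       # upwind (flow left-to-right)
    return z - dt * dqdx / (1.0 - porosity)

z = np.zeros(50)                      # initially flat bed
q_s = np.linspace(1e-4, 2e-4, 50)     # flux increasing downstream -> erosion
z_new = exner_step(z, q_s, dx=0.1, dt=1.0)
```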

  2. Investigation of particle lateral migration in sample-sheath flow of viscoelastic fluid and Newtonian fluid.

    PubMed

    Yuan, Dan; Zhang, Jun; Yan, Sheng; Peng, Gangrou; Zhao, Qianbin; Alici, Gursel; Du, Hejun; Li, Weihua

    2016-08-01

    In this work, particle lateral migration in sample-sheath flow of viscoelastic fluid and Newtonian fluid was experimentally investigated. The 4.8-μm micro-particles were dispersed in a polyethylene oxide (PEO) viscoelastic solution, and then the solution was injected into a straight rectangular channel with a deionised (DI) water Newtonian sheath flow. Micro-particles suspended in PEO solution migrated laterally to a DI water stream, but migration in the opposite direction, from a DI water stream to a PEO solution stream or from one DI water stream to another, could not be achieved. The lateral migration of particles depends on the viscoelastic properties of the sample fluids. Furthermore, the effects of channel length, flow rate, and PEO concentration were studied. By using viscoelastic sample flow and Newtonian sheath flow, selective particle lateral migration can be achieved in a simple straight channel, without any external force fields. This particle lateral migration technique could potentially be used for solution exchange, such as automated cell staining and washing in microfluidic platforms, and has numerous potential biomedical applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua, and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground, and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expedited transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geolocation (attitude and ephemeris) data provided daily, whereas NRT products use predicted geolocation provided by the instrument Global Positioning System (GPS) or an approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thereby reducing processing times.
Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.

  4. Methods of producing alkylated hydrocarbons from an in situ heat treatment process liquid

    DOEpatents

    Roes, Augustinus Wilhelmus Maria [Houston, TX; Mo, Weijian [Sugar Land, TX; Muylle, Michel Serge Marie [Houston, TX; Mandema, Remco Hugo [Houston, TX; Nair, Vijay [Katy, TX

    2009-09-01

    A method for producing alkylated hydrocarbons is disclosed. Formation fluid is produced from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. The liquid stream is fractionated to produce at least a second gas stream including hydrocarbons having a carbon number of at least 3. The first gas stream and the second gas stream are introduced into an alkylation unit to produce alkylated hydrocarbons. At least a portion of the olefins in the first gas stream enhance alkylation.

  5. Potential for real-time understanding of coupled hydrologic and biogeochemical processes in stream ecosystems: Future integration of telemetered data with process models for glacial meltwater streams

    NASA Astrophysics Data System (ADS)

    McKnight, Diane M.; Cozzetto, Karen; Cullis, James D. S.; Gooseff, Michael N.; Jaros, Christopher; Koch, Joshua C.; Lyons, W. Berry; Neupauer, Roseanna; Wlostowski, Adam

    2015-08-01

    While continuous monitoring of streamflow and temperature has been common for some time, there is great potential to expand continuous monitoring to include water quality parameters such as nutrients, turbidity, oxygen, and dissolved organic material. In many systems, distinguishing between watershed and stream ecosystem controls can be challenging. The usefulness of such monitoring can be enhanced by the application of quantitative models to interpret observed patterns in real time. Examples are discussed primarily from the glacial meltwater streams of the McMurdo Dry Valleys, Antarctica. Although the Dry Valley landscape is barren of plants, many streams harbor thriving cyanobacterial mats. Whereas a daily cycle of streamflow is controlled by the surface energy balance on the glaciers and the temporal pattern of solar exposure, the daily signal for biogeochemical processes controlling water quality is generated along the stream. These features result in an excellent outdoor laboratory for investigating fundamental ecosystem processes and for the development and validation of process-based models. As part of the McMurdo Dry Valleys Long-Term Ecological Research project, we have conducted field experiments and developed coupled biogeochemical transport models for the role of hyporheic exchange in controlling weathering reactions, microbial nitrogen cycling, and stream temperature regulation. We have adapted modeling approaches from sediment transport to understand mobilization of stream biomass with increasing flows. These models help to elucidate the role of in-stream processes in systems where watershed processes also contribute to observed patterns, and may serve as a test case for applying real-time stream ecosystem models.
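
    The hyporheic-exchange formulation behind such coupled transport models (cf. the classic transient-storage model) has the main channel exchange solute with a storage zone at a first-order rate. The sketch below isolates just the exchange terms, omitting advection and dispersion; all parameter values are illustrative.

```python
# Transient-storage exchange in isolation: channel concentration c and
# storage-zone concentration cs relax toward each other at rate alpha.
# With equal cross-sectional areas, total mass is conserved exactly.

def exchange_step(c, cs, alpha, area_ratio, dt):
    """One explicit step; area_ratio = A / A_s (channel to storage)."""
    dc = alpha * (cs - c)
    dcs = alpha * area_ratio * (c - cs)
    return c + dt * dc, cs + dt * dcs

c, cs = 1.0, 0.0          # solute pulse in the channel, storage zone clean
for _ in range(1000):
    c, cs = exchange_step(c, cs, alpha=0.01, area_ratio=1.0, dt=0.5)
```

    In the full models, this exchange term is added to an advection-dispersion equation for the channel, and the storage-zone residence time is what couples hydrology to the weathering and nitrogen-cycling reactions.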

  6. Application of the Hydroecological Integrity Assessment Process for Missouri Streams

    USGS Publications Warehouse

    Kennen, Jonathan G.; Henriksen, James A.; Heasley, John; Cade, Brian S.; Terrell, James W.

    2009-01-01

    Natural flow regime concepts and theories have established the justification for maintaining or restoring the range of natural hydrologic variability so that physicochemical processes, native biodiversity, and the evolutionary potential of aquatic and riparian assemblages can be sustained. A synthesis of recent research advances in hydroecology, coupled with stream classification using hydroecologically relevant indices, has produced the Hydroecological Integrity Assessment Process (HIP). HIP consists of (1) a regional classification of streams into hydrologic stream types based on flow data from long-term gaging-station records for relatively unmodified streams, (2) an identification of stream-type-specific indices that address 11 subcomponents of the flow regime, (3) an ability to establish environmental flow standards, (4) an evaluation of hydrologic alteration, and (5) a capacity to conduct alternative analyses. The process starts with the identification of a hydrologic baseline (reference condition) for selected locations, uses flow data from a stream-gage network, and proceeds to classify streams into hydrologic stream types. Concurrently, the analysis identifies a set of non-redundant and ecologically relevant hydrologic indices for 11 subcomponents of flow for each stream type. Furthermore, regional hydrologic models for synthesizing flow conditions across a region and the development of flow-ecology response relations for each stream type can be added to further enhance the process. The application of HIP to Missouri streams identified five stream types: (1) intermittent, (2) perennial runoff-flashy, (3) perennial runoff-moderate baseflow, (4) perennial groundwater-stable, and (5) perennial groundwater-super stable.
Two Missouri-specific computer software programs were developed: (1) a Missouri Hydrologic Assessment Tool (MOHAT) which is used to establish a hydrologic baseline, provide options for setting environmental flow standards, and compare past and proposed hydrologic alterations; and (2) a Missouri Stream Classification Tool (MOSCT) designed for placing previously unclassified streams into one of the five pre-defined stream types.
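
    Hydrologic indices of the kind HIP derives from gage records are simple statistics of a daily flow series. The sketch below computes three representative examples on a synthetic record; the specific indices HIP selects per stream type differ, so treat these as illustrations only.

```python
# Three example hydrologic indices computed from a daily flow record:
# mean daily flow, 7-day low flow, and the Richards-Baker flashiness
# index (sum of day-to-day changes over total flow). Synthetic record.

def mean_flow(q):
    return sum(q) / len(q)

def seven_day_low(q):
    """Minimum 7-day moving-average flow over the record."""
    return min(sum(q[i:i + 7]) / 7 for i in range(len(q) - 6))

def rb_flashiness(q):
    """Richards-Baker index: 0 for constant flow, larger for flashy streams."""
    return sum(abs(a - b) for a, b in zip(q[1:], q[:-1])) / sum(q)

record = [5, 5, 6, 9, 14, 8, 6, 5, 5, 5]
```

    Classifying a stream as, say, "perennial runoff-flashy" versus "perennial groundwater-stable" comes down to where indices like the flashiness value fall relative to the regional stream-type clusters.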

  7. Stream dynamics: An overview for land managers

    Treesearch

    Burchard H. Heede

    1980-01-01

    Concepts of stream dynamics are demonstrated through discussion of processes and process indicators; theory is included only where helpful to explain concepts. Present knowledge allows only qualitative prediction of stream behavior. However, such predictions show how management actions will affect the stream and its environment.

  8. Leaf litter processing in West Virginia mountain streams: effects of temperature and stream chemistry

    Treesearch

    Jacquelyn M. Rowe; William B. Perry; Sue A. Perry

    1996-01-01

    Climate change has the potential to alter detrital processing in headwater streams, which receive the majority of their nutrient input as terrestrial leaf litter. Early placement of experimental leaf packs in streams, one month prior to most abscission, was used as an experimental manipulation to increase stream temperature during leaf pack breakdown. We studied leaf...

  9. Riparian communities associated with pacific northwest headwater streams: assemblages, processes, and uniqueness.

    Treesearch

    John S. Richardson; Robert J. Naiman; Frederick J. Swanson; David E. Hibbs

    2005-01-01

    Riparian areas of large streams provide important habitat to many species and control many instream processes - but is the same true for the margins of small streams? This review considers riparian areas alongside small streams in forested, mountainous areas of the Pacific Northwest and asks if there are fundamental ecological differences from larger streams and from...

  10. Functional Process Zones Characterizing Aquatic Insect Communities in Streams of the Brazilian Cerrado.

    PubMed

    Godoy, B S; Simião-Ferreira, J; Lodi, S; Oliveira, L G

    2016-04-01

    Stream ecology studies seek to understand ecological dynamics in lotic systems. The characterization of streams into Functional Process Zones (FPZs) is currently debated in stream ecology because aquatic communities respond to the functional processes of river segments. We therefore tested whether different functional process zones host different numbers of genera and different trophic structures, using the aquatic insect communities of Neotropical streams. We also assessed whether physical and chemical variables can complement the FPZ approach in modelling communities of aquatic insects in Cerrado streams. This study was conducted in 101 streams or rivers in the central region of the state of Goiás, Brazil. We grouped the streams into six FPZs associated with the size of the river system, the presence of riparian forest, and riverbed heterogeneity. We used Bayesian models to compare the number of genera and the relative frequency of feeding groups between FPZs. Streams classified in different FPZs had different numbers of genera, and the largest and best-preserved rivers had an average of four additional genera. Trophic structure exhibited low variability among FPZs, with little difference in either the number of genera or abundance. Using functional process zones in Cerrado streams yielded good results for Ephemeroptera, Plecoptera, and Trichoptera communities. Thus, species distribution and community structure in the river basin reflect functional processes and not necessarily the position of the community along the longitudinal dimension of the lotic system.
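
    A Bayesian comparison of genus richness between zones can be sketched with the simplest suitable model: genus counts per stream as Poisson draws with a conjugate Gamma prior, giving each FPZ a closed-form posterior for its mean richness. The counts and prior below are invented for illustration; the paper's actual models are not specified in the abstract.

```python
# Conjugate Gamma-Poisson posterior for mean genus richness per FPZ.
# Gamma(a0, b0) prior on the Poisson rate; counts are synthetic.

def gamma_poisson_posterior(counts, a0=1.0, b0=0.1):
    """Return posterior (shape, rate) and posterior mean for the rate."""
    a = a0 + sum(counts)
    b = b0 + len(counts)
    return a, b, a / b

fpz_counts = {
    "large_preserved": [14, 16, 15, 17],   # genera per sampled stream
    "small_impacted": [10, 11, 9, 12],
}
post = {z: gamma_poisson_posterior(c) for z, c in fpz_counts.items()}
diff = post["large_preserved"][2] - post["small_impacted"][2]
```

    With these invented counts, the posterior mean difference between zones comes out near the "four additional genera" scale reported; a full analysis would compare whole posterior distributions, not just their means.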

  11. A Study of Cloud Radiative Forcing and Feedback

    NASA Technical Reports Server (NTRS)

    Ramanathan, Veerabhadran

    2000-01-01

The main objective of the grant proposal was to participate in the CERES (Clouds and the Earth's Radiant Energy System) satellite experiment and perform interdisciplinary investigation of NASA's Earth Observing System (EOS). During the grant period, massive amounts of scientific data from diverse platforms have been accessed, processed, and archived for continuing use; several software packages have been developed for integrating different data streams for scientific evaluation; and the planned extensive validation studies have been completed, culminating in the development of important algorithms presently used in the operational production of data from CERES. Contributions to the interdisciplinary science investigations have been significantly greater than originally envisioned. The results of these studies have appeared in several refereed journals and conference proceedings. They are listed at the end of this report.

  12. The chemistry of iron, aluminum, and dissolved organic material in three acidic, metal-enriched, mountain streams, as controlled by watershed and in-stream processes

    USGS Publications Warehouse

    McKnight, Diane M.; Bencala, Kenneth E.

    1990-01-01

    Several studies were conducted in three acidic, metal-enriched, mountain streams, and the results are discussed together in this paper to provide a synthesis of watershed and in-stream processes controlling Fe, Al, and DOC (dissolved organic carbon) concentrations. One of the streams, the Snake River, is naturally acidic; the other two, Peru Creek and St. Kevin Gulch, receive acid mine drainage. Analysis of stream water chemistry data for the acidic headwaters of the Snake River shows that some trace metal solutes (Al, Mn, Zn) are correlated with major ions, indicating that watershed processes control their concentrations. Once in the stream, biogeochemical processes can control transport if they occur over time scales comparable to those for hydrologic transport. Examples of the following in-stream reactions are presented: (1) photoreduction and dissolution of hydrous iron oxides in response to an experimental decrease in stream pH, (2) precipitation of Al at three stream confluences, and (3) sorption of dissolved organic material by hydrous iron and aluminum oxides in a stream confluence. The extent of these reactions is evaluated using conservative tracers and a transport model that includes storage in the substream zone.

  13. Apparatus for the liquefaction of a gas and methods relating to same

    DOEpatents

    Turner, Terry D [Idaho Falls, ID; Wilding, Bruce M [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID

    2009-12-29

    Apparatuses and methods are provided for producing liquefied gas, such as liquefied natural gas. In one embodiment, a liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream may sequentially pass through a compressor and an expander. The process stream may also pass through a compressor. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. A portion of the liquid gas may be used for additional cooling. Gas produced within the system may be recompressed for reintroduction into a receiving line.

  14. Revealing the dual streams of speech processing.

    PubMed

    Fridriksson, Julius; Yourganov, Grigori; Bonilha, Leonardo; Basilakos, Alexandra; Den Ouden, Dirk-Bart; Rorden, Christopher

    2016-12-27

    Several dual-route models of human speech processing have been proposed, suggesting a large-scale anatomical division between cortical regions that support motor-phonological vs. lexical-semantic aspects of speech processing. However, to date, there is no complete agreement on which areas subserve each route or on the nature of the interactions across routes that enables human speech processing. Relying on an extensive behavioral and neuroimaging assessment of a large sample of stroke survivors, we took a data-driven approach, applying principal components analysis to lesion-symptom mapping to identify brain regions crucial for performance on clusters of behavioral tasks, without a priori separation into task types. Distinct anatomical boundaries were revealed between a dorsal frontoparietal stream and a ventral temporal-frontal stream associated with separate components. Collapsing over the tasks primarily supported by these streams, we characterize the dorsal stream as a form-to-articulation pathway and the ventral stream as a form-to-meaning pathway. This characterization of the division in the data reflects both the overlap between tasks supported by the two streams and the observation that phonological production tasks are preferentially supported by the dorsal stream and lexical-semantic comprehension tasks by the ventral stream. As such, our findings show a division between two processing routes that underlie human speech processing and provide an empirical foundation for studying potential computational differences that distinguish between the two routes.

  15. Hamming and Accumulator Codes Concatenated with MPSK or QAM

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel

    2009-01-01

    In a proposed coding-and-modulation scheme, a high-rate binary data stream would be processed as follows: 1. The input bit stream would be demultiplexed into multiple bit streams. 2. The multiple bit streams would be processed simultaneously into a high-rate outer Hamming code that would comprise multiple short constituent Hamming codes - a distinct constituent Hamming code for each stream. 3. The streams would be interleaved. The interleaver would have a block structure that would facilitate parallelization for high-speed decoding. 4. The interleaved streams would be further processed simultaneously into an inner two-state, rate-1 accumulator code that would comprise multiple constituent accumulator codes - a distinct accumulator code for each stream. 5. The resulting bit streams would be mapped into symbols to be transmitted by use of a higher-order modulation - for example, M-ary phase-shift keying (MPSK) or quadrature amplitude modulation (QAM). The novelty of the scheme lies in the concatenation of the multiple-constituent Hamming and accumulator codes and the corresponding parallel architectures of the encoder and decoder circuitry (see figure) needed to process the multiple bit streams simultaneously. As in other parallel-processing schemes, one advantage is that the overall data rate could be much greater than the data rate of each encoder and decoder stream; hence, the encoder and decoder could handle data at an overall rate beyond the capability of the individual encoder and decoder circuits.
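    The five encoding steps above can be sketched compactly. The following Python sketch is an illustration only: the number of streams, the choice of a (7,4) Hamming code, the simple row/column interleaver, and the Gray-mapped QPSK constellation are assumptions made for demonstration, not parameters of the proposed scheme.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a (7,4) Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def accumulate(bits):
    """Two-state, rate-1 accumulator: y[k] = x[k] XOR y[k-1]."""
    out, state = [], 0
    for b in bits:
        state ^= b
        out.append(state)
    return out

def encode(bits, n_streams=2):
    # 1. Demultiplex the input round-robin into n_streams bit streams.
    streams = [bits[i::n_streams] for i in range(n_streams)]
    # 2. Hamming-encode each stream (pad to a multiple of 4 data bits).
    coded = []
    for s in streams:
        s = s + [0] * (-len(s) % 4)
        cw = []
        for i in range(0, len(s), 4):
            cw += hamming74_encode(s[i:i + 4])
        coded.append(cw)
    # 3. Block-interleave across the streams (row/column interleaver).
    interleaved = [coded[j][i]
                   for i in range(len(coded[0]))
                   for j in range(n_streams)]
    # 4. Inner rate-1 accumulator code.
    acc = accumulate(interleaved)
    # 5. Map bit pairs onto a Gray-mapped QPSK constellation.
    qpsk = {(0, 0): 1 + 1j, (0, 1): -1 + 1j,
            (1, 1): -1 - 1j, (1, 0): 1 - 1j}
    return [qpsk[(acc[i], acc[i + 1])] for i in range(0, len(acc) - 1, 2)]

symbols = encode([1, 0, 1, 1, 0, 0, 1, 0])
```

    In a real implementation the per-stream encoders in steps 2 and 4 would run in parallel hardware; the sketch runs them sequentially only for readability.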

  16. Optimized heat exchange in a CO2 de-sublimation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Larry; Terrien, Paul; Tessier, Pascal

    The present invention is a process for removing carbon dioxide from a compressed gas stream including cooling the compressed gas in a first heat exchanger, introducing the cooled gas into a de-sublimating heat exchanger, thereby producing a first solid carbon dioxide stream and a first carbon dioxide poor gas stream, expanding the carbon dioxide poor gas stream, thereby producing a second solid carbon dioxide stream and a second carbon dioxide poor gas stream, combining the first solid carbon dioxide stream and the second solid carbon dioxide stream, thereby producing a combined solid carbon dioxide stream, and indirectly exchanging heat between the combined solid carbon dioxide stream and the compressed gas in the first heat exchanger.

  17. PLUS: open-source toolkit for ultrasound-guided intervention systems.

    PubMed

    Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor

    2014-10-01

    A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform - PLUS: Public software Library for Ultrasound - to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and aspires to become a widely used translational research prototyping platform. PLUS is freely available as open-source software under the BSD license and can be downloaded from http://www.plustoolkit.org.

  18. Structures linking physical and biological processes in headwater streams of the Maybeso watershed, Southeast Alaska

    Treesearch

    Mason D. Bryant; Takashi Gomi; Jack J. Piccolo

    2007-01-01

    We focus on headwater streams originating in the mountainous terrain of northern temperate rain forests. These streams rapidly descend from gradients greater than 20% to less than 5% in U-shaped glacial valleys. We use a set of studies on headwater streams in southeast Alaska to define headwater stream catchments, link physical and biological processes, and describe...

  19. Future of Hydroinformatics: Towards Open, Integrated and Interactive Online Platforms

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-12-01

    Hydroinformatics is a domain of science and technology dealing with the management of information in the field of hydrology (IWA, 2011). There is a need for innovative solutions to the challenges of open information, integration, and communication on the Internet. This presentation provides an overview of the trends and challenges in the future of hydroinformatics, and demonstrates an information system, the Iowa Flood Information System (IFIS), developed in light of these challenges. The IFIS is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, short-term and seasonal flood forecasts, flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return-period values, and to flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices.
The IFIS includes a rainfall-runoff forecast model that provides a five-day flood risk estimate for more than 1000 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tooling and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage.

  20. Rich media streaming for just-in-time training of first responders

    NASA Astrophysics Data System (ADS)

    Bandera, Cesar; Marsico, Michael

    2005-05-01

    The diversity of first responders and of asymmetric threats precludes the effectiveness of any single training syllabus. Just-in-time training (JITT) addresses this variability, but requires training content to be quickly tailored to the subject (the threat), the learner (the responder), and the infrastructure (the C2 chain from DHS to the responder's equipment). We present a distributed system for personalized just-in-time training of first responders. The authoring and delivery of interactive rich media and simulations, and the integration of JITT with C2 centers, are demonstrated. Live and archived video, imagery, 2-D and 3-D models, and simulations are autonomously (1) aggregated from object-oriented databases into SCORM-compliant objects, (2) tailored to the individual learner's training history, preferences, connectivity and computing platform (from workstations to wireless PDAs), (3) conveyed as secure and reliable MPEG-4 compliant streams with data rights management, and (4) rendered as interactive high-definition rich media that promotes knowledge retention and the refinement of learner skills without the need of special hardware. We review the object-oriented implications of SCORM and the higher level profiles of the MPEG-4 standard, and show how JITT can be integrated into - and improve the ROI of - existing training infrastructures, including COTS content authoring tools, LMS/CMS, man-in-the-loop simulators, and legacy content. Lastly, we compare the audiovisual quality of different streaming platforms under varying connectivity conditions.

  1. Treatment of gas from an in situ conversion process

    DOEpatents

    Diaz, Zaida [Katy, TX; Del Paggio, Alan Anthony [Spring, TX; Nair, Vijay [Katy, TX; Roes, Augustinus Wilhelmus Maria [Houston, TX

    2011-12-06

    A method of producing methane is described. The method includes providing formation fluid from a subsurface in situ conversion process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. At least the olefins in the first gas stream are contacted with a hydrogen source in the presence of one or more catalysts and steam to produce a second gas stream. The second gas stream is contacted with a hydrogen source in the presence of one or more additional catalysts to produce a third gas stream. The third gas stream includes methane.

  2. Aqueous stream characterization from biomass fast pyrolysis and catalytic fast pyrolysis

    DOE PAGES

    Black, Brenna A.; Michener, William E.; Ramirez, Kelsey J.; ...

    2016-09-05

    Here, biomass pyrolysis offers a promising means to rapidly depolymerize lignocellulosic biomass for subsequent catalytic upgrading to renewable fuels. Substantial efforts are currently ongoing to optimize pyrolysis processes including various fast pyrolysis and catalytic fast pyrolysis schemes. In all cases, complex aqueous streams are generated containing solubilized organic compounds that are not converted to target fuels or chemicals and are often slated for wastewater treatment, in turn creating an economic burden on the biorefinery. Valorization of the species in these aqueous streams, however, offers significant potential for substantially improving the economics and sustainability of thermochemical biorefineries. To that end, here we provide a thorough characterization of the aqueous streams from four pilot-scale pyrolysis processes: namely, from fast pyrolysis, fast pyrolysis with downstream fractionation, in situ catalytic fast pyrolysis, and ex situ catalytic fast pyrolysis. These configurations and processes represent characteristic pyrolysis processes undergoing intense development currently. Using a comprehensive suite of aqueous-compatible analytical techniques, we quantitatively characterize between 12 g kg-1 of organic carbon of a highly aqueous catalytic fast pyrolysis stream and up to 315 g kg-1 of organic carbon present in the fast pyrolysis aqueous streams. In all cases, the analysis ranges between 75 and 100% of mass closure. The composition and stream properties closely match the nature of pyrolysis processes, with high contents of carbohydrate-derived compounds in the fast pyrolysis aqueous phase, high acid content in nearly all streams, and mostly recalcitrant phenolics in the heavily deoxygenated ex situ catalytic fast pyrolysis stream.
Overall, this work provides a detailed compositional analysis of aqueous streams from leading thermochemical processes -- analyses that are critical for subsequent development of selective valorization strategies for these waste streams.

  4. Geosfear? - Overcoming System Boundaries by Open-source Based Monitoring of Spatio-temporal Processes

    NASA Astrophysics Data System (ADS)

    Brandt, T.; Schima, R.; Goblirsch, T.; Paschen, M.; Francyk, B.; Bumberger, J.; Zacharias, S.; Dietrich, P.; Rubin, Y.; Rinke, K.; Fleckenstein, J. H.; Schmidt, C.; Vieweg, M.

    2016-12-01

    The impact of global change, intensive agriculture, and complex interactions between humans and the environment shows different effects at different scales. However, the desire for a better understanding of ecosystems and process dynamics in nature accentuates the need for observing these processes at higher temporal and spatial resolutions. Especially with regard to the process dynamics and heterogeneity of rivers and catchment areas, comprehensive monitoring of the ongoing processes and effects remains a challenge. What we need are monitoring systems that can collect diverse data across different environmental compartments and scales. Today, open-source electronics and innovative sensors and sensor components offer a promising approach for investigating new possibilities of mobile data acquisition to improve our understanding of the geosphere. To start with, we have designed and implemented a multi-operable, embedded Linux platform for fast integration of different sensors within a single infrastructure. In addition, a GPS module in combination with a GSM transmitter ensures the synchronization and geo-referencing of all data, no matter how old-fashioned the sensors are. To this end, initial field experiments were conducted at a 3rd-order stream in the Central German Lowland. Here, we linked in-stream DOC inputs with subsurface metabolism by coupling miniaturized DOC sensor probes with a modified vertical oxygen profiler in situ. From meteorological observations to water quality and subsurface conditions, the overarching goal is the detection of interlinked process dynamics across highly reactive biogeochemical interfaces. Overall, the field experiments demonstrated the feasibility of this emerging technology and its potential as a cutting-edge strategy based on a holistic and integrated process. We are now only a few steps away from realizing adaptive and event-triggered observations close to real time.
Environmental monitoring and data evaluation have never been more exciting.
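    The synchronization and geo-referencing idea described above can be sketched minimally: every reading, from any sensor, is stamped with the latest GPS fix before transmission. The class and field names below are hypothetical illustrations, not the platform's actual software.

```python
import json
import time

class GeoTagger:
    """Attaches the most recent GPS fix to readings from arbitrary sensors."""

    def __init__(self):
        self.fix = {"lat": None, "lon": None, "gps_time": None}

    def update_fix(self, lat, lon, gps_time):
        # Called whenever the GPS module delivers a new fix.
        self.fix = {"lat": lat, "lon": lon, "gps_time": gps_time}

    def tag(self, sensor_id, value, unit):
        # Wrap one reading in a transport-ready, geo-referenced record,
        # regardless of how old-fashioned the sensor producing it is.
        record = {"sensor": sensor_id, "value": value, "unit": unit,
                  "received": time.time(), **self.fix}
        return json.dumps(record)

tagger = GeoTagger()
tagger.update_fix(lat=51.35, lon=12.43, gps_time=1472031600.0)
packet = tagger.tag("doc_probe_1", 4.2, "mg/L")
```

    In this pattern, the GSM uplink only ever carries self-describing, geo-referenced records, so heterogeneous sensors share a single data stream.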

  5. Remote stereoscopic video play platform for naked eyes based on the Android system

    NASA Astrophysics Data System (ADS)

    Jia, Changxin; Sang, Xinzhu; Liu, Jing; Cheng, Mingsheng

    2014-11-01

    As quality of life has improved significantly, traditional 2D video technology can no longer meet the demand for better video quality, which has driven the rapid development of 3D video technology. At the same time, people want to watch 3D video on portable devices. To achieve this, we set up a remote stereoscopic video play platform. The platform consists of a server and clients. The server is used for transmission of video in different formats, and the client is responsible for receiving remote video for subsequent decoding and pixel restructuring. We utilize and extend Live555 as the video transmission server. Live555 is a cross-platform open-source project that provides streaming-media solutions such as the RTSP protocol and supports transmission of multiple video formats. At the receiving end, we use our laboratory's own player. This Android player, which has all the basic functions of an ordinary player and can play normal 2D video, serves as the base structure for further development; RTSP is implemented in this structure for communication. To achieve stereoscopic display, we perform pixel rearrangement in the player's decoding part. The decoding part is native code called through the JNI interface, so video frames can be extracted more efficiently. The video formats we process are left-right, top-bottom, and nine-grid. The design and development employ a number of key technologies from Android application development, including wireless transmission, pixel restructuring, and JNI calls. After updates and optimizations, the video player can play remote 3D video well anytime and anywhere, meeting users' requirements.
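    As a rough illustration of the pixel-rearrangement step, the sketch below re-weaves a side-by-side (left-right) frame into a row-interleaved layout. Both the input format and the row-interleaved target (a layout many naked-eye autostereoscopic panels expect) are illustrative assumptions, not necessarily the player's actual formats, and the real work happens in native code, not Python.

```python
import numpy as np

def side_by_side_to_row_interleaved(frame):
    """frame: (H, W, 3) array with the left view in columns [0, W/2)."""
    h, w, _ = frame.shape
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    out = np.empty_like(left)
    out[0::2] = left[0::2]   # even rows taken from the left view
    out[1::2] = right[1::2]  # odd rows taken from the right view
    return out

# Tiny synthetic frame: left half white, right half black.
frame = np.zeros((4, 8, 3), dtype=np.uint8)
frame[:, :4] = 255
woven = side_by_side_to_row_interleaved(frame)
```

    The same slicing pattern generalizes to top-bottom input (split on rows instead of columns); the nine-grid format needs a viewpoint-selection step first.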

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Britannia Field is 130 miles northeast of Aberdeen. It underlies a separate oil field, Alba, which was discovered by Chevron in 1984 and has been on stream since 1994. Britannia's reserves of gas and condensate are held in Cretaceous sandstone at a depth of approximately 13,000 ft. When Britannia reaches full production, it has the potential (at 740 MMcf/D gas) to supply 8% of the total U.K. gas demand. Britannia's reserves are being developed through a single drilling, production, and accommodation platform at the east end of the field. The platform has 36 well slots and is supported on an eight-legged jacket in 459-ft-deep water. A subsea well center with 14 well slots will be 9 miles west of the platform. The paper discusses field development, field management, and performance to date.

  7. "Can You Hear Me, Hanoi?" Compensatory Mechanisms Employed in Synchronous Net-Based English Language Learning

    ERIC Educational Resources Information Center

    Cunningham, Una; Fagersten, Kristy Beers; Holmsten, Elin

    2010-01-01

    At Dalarna University, Sweden, modes of communication are offered at many points of Kenning's continuum with a web-based learning platform, including asynchronous document exchange and collaborative writing tools, e-mail, recorded lectures in various formats, live streamed lectures with the possibility of text questions to the lecturer in real…

  8. Quantitative measurement of stream respiration using the resazurin-resorufin system

    NASA Astrophysics Data System (ADS)

    Gonzalez Pinzon, R. A.; Acker, S.; Haggerty, R.; Myrold, D.

    2011-12-01

    After three decades of active research in hydrology and stream ecology, the relationship between stream solute transport, metabolism and nutrient dynamics is still unresolved. These knowledge gaps obscure the function of stream ecosystems and how they interact with other landscape processes. To date, measuring rates of stream metabolism is accomplished with techniques that have vast uncertainties and are not spatially representative. These limitations mask the role of metabolism in nutrient processing. Clearly, more robust techniques are needed to develop mechanistic relationships that will ultimately improve our fundamental understanding of in-stream processes and how streams interact with other ecosystems. We investigated the "metabolic window of detection" of the Resazurin (Raz)-Resorufin (Rru) system (Haggerty et al., 2008, 2009). Although previous results have shown that the transformation of Raz to Rru is strongly correlated with respiration, a quantitative relationship between them is needed. We investigated this relationship using batch experiments with pure cultures (aerobic and anaerobic) and flow-through columns with incubated sediments from four different streams. The results suggest that the Raz-Rru system is a suitable approach that will enable hydrologists and stream ecologists to measure in situ and in vivo respiration at different scales, thus opening a reliable alternative to investigate how solute transport and stream metabolism control nutrient processing.
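    One common way such a quantitative relationship is framed, sketched below, is to treat the Raz-to-Rru transformation as a first-order reaction over the reach travel time, so that a rate coefficient (a proxy for microbial respiration) can be recovered from upstream and downstream Raz concentrations. This is an illustrative model with made-up numbers, not the calibrated relationship the authors derive.

```python
import math

def raz_rate_coefficient(c_up, c_down, travel_time_h):
    """First-order Raz transformation rate k (1/h) from reach-scale
    upstream/downstream concentrations: c_down = c_up * exp(-k * tau)."""
    return math.log(c_up / c_down) / travel_time_h

# Hypothetical injection: 100 ug/L Raz upstream, 60 ug/L after a
# 2-hour travel time through the reach.
k = raz_rate_coefficient(c_up=100.0, c_down=60.0, travel_time_h=2.0)
fraction_transformed = 1 - math.exp(-k * 2.0)
```

    Under this first-order assumption the fraction transformed equals 1 - c_down/c_up, and k scales with the respiration activity of the reach.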

  9. Modeling nutrient retention at the watershed scale: Does small stream research apply to the whole river network?

    NASA Astrophysics Data System (ADS)

    Aguilera, Rosana; Marcé, Rafael; Sabater, Sergi

    2013-06-01

    Nutrients are conveyed from terrestrial and upstream sources through drainage networks. Streams and rivers help regulate the material exported downstream by means of transformation, storage, and removal of nutrients. It has recently been suggested that the efficiency of process rates relative to available nutrient concentration in streams eventually declines, following efficiency loss (EL) dynamics. However, most of these predictions are based at the reach scale in pristine streams and fail to describe the role of entire river networks. Models provide the means to study nutrient cycling from the stream-network perspective by upscaling the key mechanisms occurring at the reach scale to the watershed. We applied a hybrid process-based and statistical model (SPARROW, Spatially Referenced Regression on Watershed Attributes) as a heuristic approach to describe in-stream nutrient processes in a highly impaired, high-stream-order watershed (the Llobregat River Basin, NE Spain). The in-stream decay specifications of the model were modified to include a partial saturation effect in uptake efficiency (expressed as a power law) and better capture biological nutrient retention in river systems under high anthropogenic stress. The stream decay coefficients were statistically significant in both the nitrate and phosphate models, indicating the potential role of in-stream processing in limiting nutrient export. However, the EL concept did not reliably describe the patterns of nutrient uptake efficiency over the concentration gradient and streamflow values found in the Llobregat River basin, casting doubt on its complete applicability to nutrient retention processes in stream networks comprising highly impaired rivers.
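    The modified decay specification described above can be sketched as follows: instead of a fixed first-order rate, uptake efficiency is allowed to decline with concentration as a power law (the partial saturation / efficiency loss idea). The coefficient values below are hypothetical, not the fitted SPARROW parameters.

```python
import math

def fraction_retained_first_order(k, tau):
    """Classic first-order in-stream decay over travel time tau:
    the retained fraction is independent of concentration."""
    return 1 - math.exp(-k * tau)

def uptake_efficiency_power_law(conc, a=0.1, b=0.6):
    """Uptake efficiency relative to concentration: U/C = a * C**(b - 1).
    With 0 < b < 1, efficiency falls as concentration rises (EL dynamics);
    b = 1 recovers the concentration-independent first-order case."""
    return a * conc ** (b - 1)

# Efficiency at low vs. high nutrient concentration (arbitrary units):
eff_low = uptake_efficiency_power_law(0.1)
eff_high = uptake_efficiency_power_law(10.0)
```

    The contrast between the two functions is the crux: a first-order model retains the same fraction of load regardless of loading, while the power-law form lets heavily loaded, impaired reaches retain proportionally less.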

  10. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open-source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and gives developers improved feedback and the means to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  11. High-precision GPS autonomous platforms for sea ice dynamics and physical oceanography

    NASA Astrophysics Data System (ADS)

    Elosegui, P.; Wilkinson, J.; Olsson, M.; Rodwell, S.; James, A.; Hagan, B.; Hwang, B.; Forsberg, R.; Gerdes, R.; Johannessen, J.; Wadhams, P.; Nettles, M.; Padman, L.

    2012-12-01

    Project "Arctic Ocean sea ice and ocean circulation using satellite methods" (SATICE) is the first high-rate, high-precision, continuous GPS positioning experiment on sea ice in the Arctic Ocean. The SATICE systems collect continuous, dual-frequency carrier-phase GPS data while drifting on sea ice. Additional geophysical measurements include ocean water pressure, ocean surface salinity, atmospheric pressure, snow depth, air-ice-ocean temperature profiles, photographic imagery, and others, enabling determination of sea ice drift, freeboard, weather, ice mass balance, and sea-level height. Relatively large volumes of data from each buoy are streamed over a satellite link to a central computer on the Internet in near real time, where they are processed to estimate the time-varying buoy positions. The SATICE system obtains continuous GPS data at sub-minute intervals with a positioning precision of a few centimetres in all three dimensions. Although monitoring of sea ice motion goes back to the early days of satellite observations, these autonomous platforms deliver a level of spatio-temporal detail that has never been seen before, especially in the vertical axis. These high-resolution data allow us to address new polar science questions and challenge our present understanding of both sea ice dynamics and Arctic oceanography. We will describe the technology behind this new autonomous platform, which could also be adapted to other applications requiring high-resolution positioning with sustained operations and observations in the polar marine environment, and present results pertaining to sea ice dynamics and physical oceanography.

  12. A world of opportunities with nanopore sequencing.

    PubMed

    Leggett, Richard M; Clark, Matthew D

    2017-11-28

    Oxford Nanopore Technologies' MinION sequencer was launched in pre-release form in 2014 and represents an exciting new sequencing paradigm. The device offers multi-kilobase reads and a streamed mode of operation that allows processing of reads as they are generated. Crucially, it is an extremely compact device that is powered from the USB port of a laptop computer, enabling it to be taken out of the lab and facilitating previously impossible in-field sequencing experiments. Many of the initial publications concerning the platform focused on provision of tools to access and analyse the new sequence formats and then demonstrating the assembly of microbial genomes. More recently, as throughput and accuracy have increased, it has been possible to begin work involving more complex genomes and metagenomes. With the release of the high-throughput GridION X5 and PromethION platforms, the sequencing of large genomes will become more cost-efficient and will enable the leveraging of extremely long (>100 kb) reads for resolution of complex genomic structures. This review provides a brief overview of nanopore sequencing technology, describes the growing range of nanopore bioinformatics tools, and highlights some of the most influential publications that have emerged over the last 2 years. Finally, we look to the future and the potential the platform has to disrupt work in human, microbiome, and plant genomics. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  13. JIP: Java image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Wang, Dongyan; Lin, Bo; Zhang, Jun

    1998-12-01

    In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation, and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources of a central server. As an extended Java applet, JIP allows users to select local image files on their computers or to specify any image on the Internet using a URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can be easily applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, physics, or to other areas such as employee training and charged software consumption.

  14. Cavity-induced microstreaming for simultaneous on-chip pumping and size-based separation of cells and particles.

    PubMed

    Patel, Maulik V; Nanayakkara, Imaly A; Simon, Melinda G; Lee, Abraham P

    2014-10-07

    We present a microfluidic platform for simultaneous on-chip pumping and size-based separation of cells and particles without external fluidic control systems required for most existing platforms. The device utilizes an array of acoustically actuated air/liquid interfaces generated using dead-end side channels termed Lateral Cavity Acoustic Transducers (LCATs). The oscillating interfaces generate local streaming flow while the angle of the LCATs relative to the main channel generates a global bulk flow from the inlet to the outlet. The interaction of these two competing velocity fields (i.e. global bulk velocity vs. local streaming velocity) is responsible for the observed separation. It is shown that the separation of 5 μm and 10 μm polystyrene beads is dependent on the ratio of these two competing velocity fields. The experimental and simulation results suggest that particle trajectories based only on Stokes drag force cannot fully explain the separation behavior and that the impact of additional forces due to the oscillating flow field must be considered to determine the trajectory of the beads and ultimately the separation behavior of the device. To demonstrate an application of this separation platform with cellular components, smaller red blood cells (7.5 ± 0.8 μm) are separated from larger K562 cells (16.3 ± 2.0 μm) with viabilities comparable to those of controls based on a trypan blue exclusion assay.

  15. Conversion of stranded waste-stream carbon and nutrients into value-added products via metabolically coupled binary heterotroph-photoautotroph system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohutskyi, Pavlo; Kucek, Leo A.; Hill, Eric

    Metabolic flexibility and robustness of phototroph-heterotroph co-cultures provide a flexible binary engineering platform for a variety of biotechnological and environmental applications. Here, we metabolically coupled the heterotrophic bacterium Bacillus subtilis with the astaxanthin-producing alga Haematococcus pluvialis and successfully applied this binary co-culture to convert a starch-rich waste stream into valuable astaxanthin-rich biomass. Importantly, the implemented system required less mass transfer of CO2 and O2 due to in-situ exchange between heterotroph and phototroph, which can contribute to reduced energy consumption for wastewater treatment. In addition, the maximum reduction in chemical oxygen demand, total nitrogen and phosphorus reached 65%, 55% and 30%, respectively. A preliminary economic analysis indicated that biomass with 0.8% astaxanthin content may generate annual revenues of $3.2M (baseline scenario) from treatment of wastewater (1,090 m3/day) from a potato processing plant. Moreover, the revenues may increase up to $18.2M for an optimized scenario with an astaxanthin content in algae of 2%. This work demonstrates a successful proof-of-principle for conversion of waste carbon and nutrients into targeted value-added products through metabolic coupling of heterotrophic and phototrophic organisms. Utilization of heterotrophic-algal binary cultures opens new perspectives for designing highly efficient production processes for feedstock biomass, as well as for utilizing a variety of organic agricultural, chemical, or municipal wastes.

  16. Vapor-fed microfluidic hydrogen generator.

    PubMed

    Modestino, M A; Dumortier, M; Hosseini Hashemi, S M; Haussener, S; Moser, C; Psaltis, D

    2015-05-21

    Water-splitting devices that operate with humid air feeds are an attractive alternative for hydrogen production, as the required water input can be obtained directly from ambient air. This article presents a novel proof-of-concept microfluidic platform that makes use of polymeric ion conductor (Nafion®) thin films to absorb water from air and perform the electrochemical water-splitting process. Modelling and experimental tools are used to demonstrate that these microstructured devices can achieve the delicate balance between water, gas, and ionic transport processes required for vapor-fed devices to operate continuously and at steady state, at current densities above 3 mA cm⁻². The results presented here show that factors such as the thickness of the Nafion films covering the electrodes, convection of air streams, and water content of the ionomer can significantly affect device performance. The insights presented in this work provide important guidelines for the material requirements and device designs that can be used to create practical electrochemical hydrogen generators that work directly under ambient air.
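
    The reported current density maps to a hydrogen production rate through Faraday's law (two electrons per H₂ molecule). The sketch below is illustrative only; the 1 cm² electrode area is an assumed value, not taken from the article.

```python
FARADAY = 96485.0  # Faraday constant, C/mol

def h2_rate_mol_per_s(current_density_a_cm2, area_cm2):
    """Faraday's law: n_H2 = I / (2*F), two electrons per H2 molecule."""
    current = current_density_a_cm2 * area_cm2  # total current, A
    return current / (2.0 * FARADAY)

# 3 mA/cm^2 (as reported) over a hypothetical 1 cm^2 microdevice.
rate = h2_rate_mol_per_s(3e-3, 1.0)
print(f"H2 production: {rate:.2e} mol/s ({rate * 3600 * 1e6:.2f} umol/h)")
```

    This back-of-envelope conversion shows why vapor-fed operation at a few mA cm⁻² is feasible with only atmospheric humidity as the water source.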

  17. Validation of Contamination Control in Rapid Transfer Port Chambers for Pharmaceutical Manufacturing Processes

    PubMed Central

    Hu, Shih-Cheng; Shiue, Angus; Liu, Han-Yang; Chiu, Rong-Ben

    2016-01-01

    There is worldwide concern with regard to the adverse effects of drug usage. However, contaminants can gain entry into a drug manufacturing process stream from several sources, such as personnel, poor facility design, incoming ventilation air, and machinery and other production equipment. In this validation study, we aimed to determine the impact of, and evaluate, contamination control in the preparation areas of the rapid transfer port (RTP) chamber during pharmaceutical manufacturing processes. The RTP chamber is normally tested for airflow velocity, particle counts, pressure decay of leakage, and sterility. The air flow balance of the RTP chamber is affected by the airflow quantity and the height above the platform. It is relatively easy to evaluate the RTP chamber's leakage by pressure decay, where the system is charged with air, closed, and the decay of pressure is measured over a time period. We determined the vaporized H₂O₂ concentration sufficient for complete decontamination. The RTP chamber's performance improves safety and can be fully tested in an ISO Class 5 environment. PMID:27845748

  18. Validation of Contamination Control in Rapid Transfer Port Chambers for Pharmaceutical Manufacturing Processes.

    PubMed

    Hu, Shih-Cheng; Shiue, Angus; Liu, Han-Yang; Chiu, Rong-Ben

    2016-11-12

    There is worldwide concern with regard to the adverse effects of drug usage. However, contaminants can gain entry into a drug manufacturing process stream from several sources, such as personnel, poor facility design, incoming ventilation air, and machinery and other production equipment. In this validation study, we aimed to determine the impact of, and evaluate, contamination control in the preparation areas of the rapid transfer port (RTP) chamber during pharmaceutical manufacturing processes. The RTP chamber is normally tested for airflow velocity, particle counts, pressure decay of leakage, and sterility. The air flow balance of the RTP chamber is affected by the airflow quantity and the height above the platform. It is relatively easy to evaluate the RTP chamber's leakage by pressure decay, where the system is charged with air, closed, and the decay of pressure is measured over a time period. We determined the vaporized H₂O₂ concentration sufficient for complete decontamination. The RTP chamber's performance improves safety and can be fully tested in an ISO Class 5 environment.
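
    The pressure-decay leak test described in these records reduces to a simple isothermal calculation, Q = V·ΔP/Δt. The chamber volume and pressure readings below are invented for illustration and do not come from the study.

```python
def leak_rate_pa_m3_per_s(volume_m3, p_start_pa, p_end_pa, elapsed_s):
    """Isothermal leak rate Q = V * dP / dt, in Pa*m^3/s."""
    return volume_m3 * (p_start_pa - p_end_pa) / elapsed_s

# Hypothetical RTP chamber: 50 L charged to 2 kPa above ambient,
# with the overpressure decaying by 100 Pa over 10 minutes.
q = leak_rate_pa_m3_per_s(0.050, 2000.0, 1900.0, 600.0)
print(f"leak rate: {q:.2e} Pa*m^3/s")
```

    A pass/fail criterion would then compare this rate against the chamber's acceptance limit.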

  19. Pervaporation process and use in treating waste stream from glycol dehydrator

    DOEpatents

    Kaschemekat, Jurgen; Baker, Richard W.

    1994-01-01

    Pervaporation processes and apparatus with few moving parts. Ideally, only one pump is used to provide essentially all of the motive power and driving force needed. The process is particularly useful for handling small streams with flow rates less than about 700 gpd. Specifically, the process can be used to treat waste streams from glycol dehydrator regeneration units.

  20. Current and potential uses of bioactive molecules from marine processing waste.

    PubMed

    Suleria, Hafiz Ansar Rasul; Masci, Paul; Gobe, Glenda; Osborne, Simone

    2016-03-15

    Food industries produce huge amounts of processing waste that are often disposed of incurring expenses and impacting upon the environment. For these and other reasons, food processing waste streams, in particular marine processing waste streams, are gaining popularity amongst pharmaceutical, cosmetic and nutraceutical industries as sources of bioactive molecules. In the last 30 years, there has been a gradual increase in processed marine products with a concomitant increase in waste streams that include viscera, heads, skins, fins, bones, trimmings and shellfish waste. In 2010, these waste streams equated to approximately 24 million tonnes of mostly unused resources. Marine processing waste streams not only represent an abundant resource, they are also enriched with structurally diverse molecules that possess a broad panel of bioactivities including anti-oxidant, anti-coagulant, anti-thrombotic, anti-cancer and immune-stimulatory activities. Retrieval and characterisation of bioactive molecules from marine processing waste also contributes valuable information to the vast field of marine natural product discovery. This review summarises the current use of bioactive molecules from marine processing waste in different products and industries. Moreover, this review summarises new research into processing waste streams and the potential for adoption by industries in the creation of new products containing marine processing waste bioactives. © 2015 Society of Chemical Industry.

  1. Flow directionality, mountain barriers and functional traits determine diatom metacommunity structuring of high mountain streams.

    PubMed

    Dong, Xiaoyu; Li, Bin; He, Fengzhi; Gu, Yuan; Sun, Meiqin; Zhang, Haomiao; Tan, Lu; Xiao, Wen; Liu, Shuoran; Cai, Qinghua

    2016-04-19

    Stream metacommunities are structured by a combination of local (environmental filtering) and regional (dispersal) processes. The unique characteristics of high mountain streams could potentially determine metacommunity structuring, which is currently poorly understood. Aiming to understand how these characteristics influence metacommunity structuring, we explored the relative importance of local environmental conditions and various dispersal processes, including geographical (overland), topographical (across mountain barriers) and network (along flow direction) pathways, in shaping benthic diatom communities. From a trait perspective, diatoms were categorized into high-profile, low-profile and motile guilds to examine the roles of functional traits. Our results indicated that both environmental filtering and dispersal processes influenced metacommunity structuring, with dispersal contributing more than environmental processes. Among the three pathways, stream corridors were the primary pathway. Deconstructive analysis suggested different responses to environmental and spatial factors for each of the three ecological guilds. However, regardless of traits, dispersal among streams was limited by mountain barriers, while dispersal along streams was promoted by rushing flow in high mountain streams. Our results highlighted that directional processes had prevailing effects on metacommunity structuring in high mountain streams. Flow directionality, mountain barriers and ecological guilds contributed to a better understanding of the roles that mountains play in structuring metacommunities.

  2. Chromium: A Stress-Processing Framework for Interactive Rendering on Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, G.; Houston, M.; Ng, Y.-R.

    2002-01-11

    We describe Chromium, a system for manipulating streams of graphics API commands on clusters of workstations. Chromium's stream filters can be arranged to create sort-first and sort-last parallel graphics architectures that, in many cases, support the same applications while using only commodity graphics accelerators. In addition, these stream filters can be extended programmatically, allowing the user to customize the stream transformations performed by nodes in a cluster. Because our stream processing mechanism is completely general, any cluster-parallel rendering algorithm can be either implemented on top of or embedded in Chromium. In this paper, we give examples of real-world applications that use Chromium to achieve good scalability on clusters of workstations, and describe other potential uses of this stream processing technology. By completely abstracting the underlying graphics architecture, network topology, and API command processing semantics, we allow a variety of applications to run in different environments.
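
    The stream-filter idea behind Chromium can be illustrated with a toy pipeline; this is a generic sketch, not Chromium's actual SPU interface, and the command names and payloads are invented for illustration.

```python
# Toy graphics-command stream: each command is a (name, payload) tuple.
# Filters are generators that consume a stream and yield a (possibly
# transformed) stream, so they compose like chained stream processors.

def scale_filter(stream, factor):
    """Rewrite vertex commands, scaling their coordinates."""
    for name, payload in stream:
        if name == "vertex":
            payload = tuple(c * factor for c in payload)
        yield name, payload

def tee_filter(stream, log):
    """Pass commands through unchanged while recording them (a sort-last
    architecture would instead duplicate the stream to several sinks)."""
    for cmd in stream:
        log.append(cmd)
        yield cmd

commands = [("begin", "triangles"), ("vertex", (1.0, 2.0)), ("end", None)]
log = []
result = list(tee_filter(scale_filter(iter(commands), 2.0), log))
print(result)
```

    Because each filter only sees a stream in and a stream out, filters can be rearranged or extended without the application knowing about the cluster topology, which is the key abstraction the paper describes.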

  3. Mass, energy and material balances of SRF production process. Part 1: SRF produced from commercial and industrial waste.

    PubMed

    Nasrullah, Muhammad; Vainikka, Pasi; Hannula, Janne; Hurme, Markku; Kärki, Janne

    2014-08-01

    This paper presents the mass, energy and material balances of a solid recovered fuel (SRF) production process in which SRF is produced from commercial and industrial waste (C&IW) through mechanical treatment (MT). In this work, the various material streams produced in the SRF production process are analyzed for their proximate and ultimate composition, and based on this analysis and the composition of the process streams, mass, energy and material balances are established. Here, the mass balance describes the overall mass flow of input waste material into the various output streams, whereas the material balance describes the mass flow of the components of the input waste stream (such as paper and cardboard, wood, plastic (soft), plastic (hard), textile and rubber) into the various output streams of the SRF production process. A commercial-scale experimental campaign was conducted on an MT waste sorting plant to produce SRF from C&IW. All the process streams (input and output) produced in this MT plant were sampled and treated according to the CEN standard methods for SRF: EN 15442 and EN 15443. The mass balance of the SRF production process showed that of the total C&IW input to the MT waste sorting plant, 62% was recovered in the form of SRF, 4% as ferrous metal, 1% as non-ferrous metal, 21% was sorted out as reject material, 11.6% as fine fraction, and 0.4% as heavy fraction. The energy flow balance showed that of the total input energy content of the C&IW, 75% was recovered in the form of SRF, 20% belonged to the reject material stream, and the remaining 5% to the fine fraction and heavy fraction streams. In the material balances, the mass fractions of plastic (soft), plastic (hard), paper and cardboard, and wood recovered in the SRF stream were 88%, 70%, 72% and 60%, respectively, of their input masses to the MT plant. A high mass fraction of plastic (PVC), rubber material and non-combustibles (such as stone/rock and glass particles) was found in the reject material stream. Copyright © 2014 Elsevier Ltd. All rights reserved.
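
    The reported output fractions can be checked for closure with a short bookkeeping script; the stream names and percentages are those given in the abstract, while the script itself is only an illustrative sketch.

```python
# Mass balance closure check for the SRF production process.
# Output-stream mass fractions (% of input C&IW) as reported above.
output_fractions = {
    "SRF": 62.0,
    "ferrous metal": 4.0,
    "non-ferrous metal": 1.0,
    "reject material": 21.0,
    "fine fraction": 11.6,
    "heavy fraction": 0.4,
}

total = sum(output_fractions.values())
print(f"Mass balance closure: {total:.1f}% of input accounted for")

# Component recovery into the SRF stream (% of each component's input mass).
srf_recovery = {"plastic (soft)": 88, "plastic (hard)": 70,
                "paper and cardboard": 72, "wood": 60}
for component, pct in srf_recovery.items():
    print(f"{pct}% of input {component} reports to the SRF stream")
```

    The output fractions sum to 100%, so the reported mass balance closes exactly.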

  4. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... each treatment process. (b) Control options: Group 1 wastewater streams for Table 9 compounds. The... section. (c) Control options: Group 1 wastewater streams for Table 8 compounds. The owner or operator...) Residuals. For each residual removed from a Group 1 wastewater stream, the owner or operator shall control...

  5. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... each treatment process. (b) Control options: Group 1 wastewater streams for Table 9 compounds. The... section. (c) Control options: Group 1 wastewater streams for Table 8 compounds. The owner or operator...) Residuals. For each residual removed from a Group 1 wastewater stream, the owner or operator shall control...

  6. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... each treatment process. (b) Control options: Group 1 wastewater streams for Table 9 compounds. The... section. (c) Control options: Group 1 wastewater streams for Table 8 compounds. The owner or operator...) Residuals. For each residual removed from a Group 1 wastewater stream, the owner or operator shall control...

  7. The sagittarius tidal stream and the shape of the galactic stellar halo

    NASA Astrophysics Data System (ADS)

    Newby, Matthew T.

    The stellar halo that surrounds our Galaxy contains clues to understanding galaxy formation, cosmology, stellar evolution, and the nature of dark matter. Gravitationally disrupted dwarf galaxies form tidal streams, which roughly trace orbits through the Galactic halo. The Sagittarius (Sgr) dwarf tidal debris is the most dominant of these streams, and its properties place important constraints on the distribution of mass (including dark matter) in the Galaxy. Stars not associated with substructures form the "smooth" component of the stellar halo, the origin of which is still under investigation. Characterizing halo substructures such as the Sgr stream and the smooth halo provides valuable information on the formation history and evolution of our galaxy, and places constraints on cosmological models. This thesis is primarily concerned with characterizing the 3-dimensional stellar densities of the Sgr tidal debris system and the smooth stellar halo, using data from the Sloan Digital Sky Survey (SDSS). F turnoff stars are used to infer distances, as they are relatively bright, numerous, and distributed about a single intrinsic brightness (magnitude). The inherent spread in brightnesses of these stars is overcome through the use of the recently developed technique of statistical photometric parallax, in which the bulk properties of a stellar population are used to create a probability distribution for a given star's distance. This was used to build a spatial density model for the smooth stellar halo and tidal streams. The free parameters in this model are then fit to SDSS data with a maximum likelihood technique, and the parameters are optimized by advanced computational methods. Several computing platforms are used in this study, including the RPI SUR Bluegene and the MilkyWay@home volunteer computing project. Fits to the Sgr stream in 18 SDSS data stripes were performed, and a continuous density profile is found for the major Sgr stream. The stellar halo is found to be strongly oblate (flattening parameter q=0.53). A catalog of stars consistent with this density profile is produced as a template for matching future disruption models. The results of this analysis favor a description of the Sgr debris system that includes more than one dwarf galaxy progenitor, with the major streams above and below the Galactic disk being separate substructures. Preliminary results for the minor tidal stream characterizations are presented and discussed. Additionally, a more robust characterization of halo turnoff star brightnesses is performed, and it is found that increasing color errors with distance result in a previously unaccounted-for incompleteness in star counts as the SDSS magnitude limit is approached. These corrections are currently being implemented on MilkyWay@home.
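
    The statistical photometric parallax technique described above treats each star's absolute magnitude as a draw from the population's distribution. A minimal Monte Carlo sketch, assuming an illustrative F turnoff population with absolute magnitude M ≈ 4.2 ± 0.6 mag (example values, not the thesis's fitted ones):

```python
import random
import statistics

def distance_pc(m_apparent, M_absolute):
    """Distance from the distance modulus: m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((m_apparent - M_absolute) / 5.0 + 1.0)

def distance_samples(m_apparent, M_mean, M_sigma, n=10_000, seed=42):
    """Monte Carlo distance distribution for one star, drawing its
    absolute magnitude from the population's Gaussian spread."""
    rng = random.Random(seed)
    return [distance_pc(m_apparent, rng.gauss(M_mean, M_sigma))
            for _ in range(n)]

# Hypothetical F turnoff star observed at apparent magnitude g = 19.5.
samples = distance_samples(19.5, M_mean=4.2, M_sigma=0.6)
print(f"median distance: {statistics.median(samples) / 1000:.1f} kpc")
```

    Aggregating such per-star distance distributions over a whole population is what allows the density model's parameters to be fit by maximum likelihood despite the spread in turnoff-star brightnesses.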

  8. Increased functional connectivity in the ventral and dorsal streams during retrieval of novel words in professional musicians.

    PubMed

    Dittinger, Eva; Valizadeh, Seyed Abolfazl; Jäncke, Lutz; Besson, Mireille; Elmer, Stefan

    2018-02-01

    Current models of speech and language processing postulate the involvement of two parallel processing streams (the dual stream model): a ventral stream involved in mapping sensory and phonological representations onto lexical and conceptual representations and a dorsal stream contributing to sound-to-motor mapping, articulation, and to how verbal information is encoded and manipulated in memory. Based on previous evidence showing that music training has an influence on language processing, cognitive functions, and word learning, we examined EEG-based intracranial functional connectivity in the ventral and dorsal streams while musicians and nonmusicians learned the meaning of novel words through picture-word associations. In accordance with the dual stream model, word learning was generally associated with increased beta functional connectivity in the ventral stream compared to the dorsal stream. In addition, in the linguistically most demanding "semantic task," musicians outperformed nonmusicians, and this behavioral advantage was accompanied by increased left-hemispheric theta connectivity in both streams. Moreover, theta coherence in the left dorsal pathway was positively correlated with the number of years of music training. These results provide evidence for a complex interplay within a network of brain regions involved in semantic processing and verbal memory functions, and suggest that intensive music training can modify its functional architecture leading to advantages in novel word learning. © 2017 Wiley Periodicals, Inc.

  9. Complex Catchment Processes that Control Stream Nitrogen and Organic Matter Concentrations in a Northeastern USA Upland Catchment

    NASA Astrophysics Data System (ADS)

    Sebestyen, S. D.; Shanley, J. B.; Pellerin, B.; Saraceno, J.; Aiken, G. R.; Boyer, E. W.; Doctor, D. H.; Kendall, C.

    2009-05-01

    There is a need to understand the coupled biogeochemical and hydrological processes that control stream hydrochemistry in upland forested catchments. At watershed 9 (W-9) of the Sleepers River Research Watershed in the northeastern USA, we use high-frequency sampling, environmental tracers, end-member mixing analysis, and stream reach mass balances to understand the dynamic factors that affect the forms and concentrations of nitrogen and organic matter in streamflow. We found that rates of stream nitrate processing changed during autumn baseflow and that up to 70% of nitrate inputs to a stream reach were retained. At the same time, the stream reach was a net source of the dissolved organic carbon (DOC) and dissolved organic nitrogen (DON) fractions of dissolved organic matter (DOM). The in-stream nitrate loss and DOM gains are examples of hot moments of biogeochemical transformations during autumn, when deciduous litter fall increases DOM availability. As hydrological flowpaths changed during rainfall events, the sources and transformations of nitrate and DOM differed from baseflow. For example, during storm flow we measured direct inputs of unprocessed atmospheric nitrate to streams that were as large as 30% of the stream nitrate loading. At the same time, stream DOM composition shifted to reflect inputs of reactive organic matter from surficial upland soils. The transport of atmospheric nitrate and reactive DOM to streams underscores the importance of quantifying source variation during short-duration stormflow events. Building upon these findings, we present a conceptual model of interacting ecosystem processes that control the flow of water and nutrients to streams in a temperate upland catchment.

  10. System for processing an encrypted instruction stream in hardware

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griswold, Richard L.; Nickless, William K.; Conrad, Ryan C.

    A system and method of processing an encrypted instruction stream in hardware is disclosed. Main memory stores the encrypted instruction stream and unencrypted data. A central processing unit (CPU) is operatively coupled to the main memory. A decryptor is operatively coupled to the main memory and located within the CPU. The decryptor decrypts the encrypted instruction stream upon receipt of an instruction fetch signal from a CPU core. Unencrypted data is passed through to the CPU core without decryption upon receipt of a data fetch signal.
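
    The fetch-path behaviour described, decrypting on instruction fetch while passing data through untouched, can be sketched in software. The single-byte XOR keystream below is a stand-in toy cipher for illustration only, not the patented hardware design.

```python
# Toy model of the fetch path: memory holds encrypted instructions and
# plain data; the "decryptor" acts only when the CPU core signals an
# instruction fetch.

KEY = 0x5A  # toy single-byte XOR key (illustrative only)

def encrypt(byte):
    return byte ^ KEY

memory = {
    0x00: encrypt(0x90),  # an encrypted instruction byte
    0x10: 0x2A,           # an unencrypted data byte
}

def fetch(addr, is_instruction_fetch):
    word = memory[addr]
    # Decrypt only on instruction fetch; data passes through untouched.
    return word ^ KEY if is_instruction_fetch else word

print(hex(fetch(0x00, True)))   # instruction, decrypted by the fetch path
print(hex(fetch(0x10, False)))  # data, passed through unchanged
```

    The key point the patent abstract makes is that the decision is driven by the fetch-type signal from the CPU core, so software never handles plaintext instructions in main memory.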

  11. Downstream processing of biopharmaceutical proteins produced in plants: the pros and cons of flocculants.

    PubMed

    Buyel, Johannes Felix; Fischer, Rainer

    2014-01-01

    All biological platforms for the manufacture of biopharmaceutical proteins produce an initially turbid extract that must be clarified to avoid fouling sensitive media such as chromatography resins. Clarification is more challenging if the feed stream contains large amounts of dispersed particles, because these rapidly clog the filter media typically used to remove suspended solids. Charged polymers (flocculants) can increase the apparent size of the dispersed particles by aggregation, facilitating the separation of solids and liquids, and thus reducing process costs. However, many different factors can affect the behavior of flocculants, including the pH and conductivity of the medium, the size and charge distribution of the particulates, and the charge density and molecular mass of the polymer. Importantly, these properties can also affect the recovery of the target protein and the overall safety profile of the process. We therefore used a design of experiments approach to establish reliable predictive models that characterize the impact of flocculants during the downstream processing of biopharmaceutical proteins. We highlight strategies for the selection of flocculants during process optimization. These strategies will contribute to the quality by design aspects of process development and facilitate the development of safe and efficient downstream processes for plant-derived pharmaceutical proteins.

  12. North Atlantic Ocean OSSE system development: Nature Run evaluation and application to hurricane interaction with the Gulf Stream

    NASA Astrophysics Data System (ADS)

    Kourafalou, Vassiliki H.; Androulidakis, Yannis S.; Halliwell, George R.; Kang, HeeSook; Mehari, Michael M.; Le Hénaff, Matthieu; Atlas, Robert; Lumpkin, Rick

    2016-11-01

    A high resolution, free-running model has been developed for the hurricane region of the North Atlantic Ocean. The model is evaluated with a variety of observations to ensure that it adequately represents both the ocean climatology and variability over this region, with a focus on processes relevant to hurricane-ocean interactions. As such, it can be used as the "Nature Run" (NR) model within the framework of Observing System Simulation Experiments (OSSEs), designed specifically to improve the ocean component of coupled ocean-atmosphere hurricane forecast models. The OSSE methodology provides quantitative assessment of the impact of specific observations on the skill of forecast models and enables the comprehensive design of future observational platforms and the optimization of existing ones. Ocean OSSEs require a state-of-the-art, high-resolution free-running model simulation that represents the true ocean (the NR). This study concentrates on the development and data-based evaluation of the NR model component, which leads to a reliable model simulation that has a dual purpose: (a) to provide the basis for future hurricane-related OSSEs; (b) to explore process-oriented studies of hurricane-ocean interactions. A specific example is presented, where the impact of Hurricane Bill (2009) on the eastward extension and transport of the Gulf Stream is analyzed. The hurricane-induced cold wake is shown in both the NR simulation and observations. Interaction of storm-forced currents with the Gulf Stream produced a temporary large reduction in eastward transport downstream from Cape Hatteras and had a marked influence on frontal displacement in the upper ocean. The kinetic energy due to ageostrophic currents showed a significant increase as the storm passed, and then decreased to pre-storm levels within 8 days after the hurricane advanced further north. This is a unique result of direct hurricane impact on a western boundary current, with possible implications for the ocean feedback on hurricane evolution.

  13. Posterior Parietal Cortex Drives Inferotemporal Activations During Three-Dimensional Object Vision.

    PubMed

    Van Dromme, Ilse C; Premereur, Elsie; Verhoef, Bram-Ernst; Vanduffel, Wim; Janssen, Peter

    2016-04-01

    The primate visual system consists of a ventral stream, specialized for object recognition, and a dorsal visual stream, which is crucial for spatial vision and actions. However, little is known about the interactions and information flow between these two streams. We investigated these interactions within the network processing three-dimensional (3D) object information, comprising both the dorsal and ventral stream. Reversible inactivation of the macaque caudal intraparietal area (CIP) during functional magnetic resonance imaging (fMRI) reduced fMRI activations in posterior parietal cortex in the dorsal stream and, surprisingly, also in the inferotemporal cortex (ITC) in the ventral visual stream. Moreover, CIP inactivation caused a perceptual deficit in a depth-structure categorization task. CIP-microstimulation during fMRI further suggests that CIP projects via posterior parietal areas to the ITC in the ventral stream. To our knowledge, these results provide the first causal evidence for the flow of visual 3D information from the dorsal stream to the ventral stream, and identify CIP as a key area for depth-structure processing. Thus, combining reversible inactivation and electrical microstimulation during fMRI provides a detailed view of the functional interactions between the two visual processing streams.

  14. Posterior Parietal Cortex Drives Inferotemporal Activations During Three-Dimensional Object Vision

    PubMed Central

    Van Dromme, Ilse C.; Premereur, Elsie; Verhoef, Bram-Ernst; Vanduffel, Wim; Janssen, Peter

    2016-01-01

    The primate visual system consists of a ventral stream, specialized for object recognition, and a dorsal visual stream, which is crucial for spatial vision and actions. However, little is known about the interactions and information flow between these two streams. We investigated these interactions within the network processing three-dimensional (3D) object information, comprising both the dorsal and ventral stream. Reversible inactivation of the macaque caudal intraparietal area (CIP) during functional magnetic resonance imaging (fMRI) reduced fMRI activations in posterior parietal cortex in the dorsal stream and, surprisingly, also in the inferotemporal cortex (ITC) in the ventral visual stream. Moreover, CIP inactivation caused a perceptual deficit in a depth-structure categorization task. CIP-microstimulation during fMRI further suggests that CIP projects via posterior parietal areas to the ITC in the ventral stream. To our knowledge, these results provide the first causal evidence for the flow of visual 3D information from the dorsal stream to the ventral stream, and identify CIP as a key area for depth-structure processing. Thus, combining reversible inactivation and electrical microstimulation during fMRI provides a detailed view of the functional interactions between the two visual processing streams. PMID:27082854

  15. Sperm Scoring Using Multi-Spectral Flow Imaging and FISH-IS Final Report CRADA No. TC02088.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchetti, F.; Morrissey, P. J.

    This was to be a collaborative effort between The Regents of the University of California, Lawrence Livermore National Laboratory (LLNL) and Amnis Corporation, to develop an automated system for scoring sperm interphase cells for the presence of chromosomal abnormalities using fluorescence in situ hybridization and the Amnis ImageStream technology platform.

  16. Technology and Career Preparation: Using Virtual Interview Recordings (VIRs) in an Apparel, Design, and Textiles (ADT) Professional Seminar Course

    ERIC Educational Resources Information Center

    Eike, Rachel J.; Rowell, Amy; Mihuta, Tiffani

    2016-01-01

    The purpose of this study was to identify key virtual-recorded interview (VIR) skills that are essential to Apparel, Design, and Textile (ADT) student performance. The virtual, computer-recording interview platform, InterviewStream, was used as the data collection instrument in this qualitative, exploratory case study. Virtual interviews have been…

  17. GPUs: An Emerging Platform for General-Purpose Computation

    DTIC Science & Technology

    2007-08-01

programming; real-time cinematic quality graphics Peak stream (26) License required (limited time no-cost evaluation program) Commercially...folding.stanford.edu (accessed 30 March 2007). 2. Fan, Z.; Qiu, F.; Kaufman, A.; Yoakum-Stover, S. GPU Cluster for High Performance Computing. ACM/IEEE...accessed 30 March 2007). 8. Goodnight, N.; Wang, R.; Humphreys, G. Computation on Programmable Graphics Hardware. IEEE Computer Graphics and

  18. An Incremental Life-cycle Assurance Strategy for Critical System Certification

    DTIC Science & Technology

    2014-11-04

    for Safe Aircraft Operation Embedded software systems introduce a new class of problems not addressed by traditional system modeling & analysis...Platform Runtime Architecture Application Software Embedded SW System Engineer Data Stream Characteristics Latency jitter affects control behavior...do system level failures still occur despite fault tolerance techniques being deployed in systems ? Embedded software system as major source of

  19. Integrated Environment for Development and Assurance

    DTIC Science & Technology

    2015-01-26

    Jan 26, 2015 © 2015 Carnegie Mellon University We Rely on Software for Safe Aircraft Operation Embedded software systems introduce a new class of...eveloper Compute Platform Runtime Architecture Application Software Embedded SW System Engineer Data Stream Characteristics Latency jitter affects...Why do system level failures still occur despite fault tolerance techniques being deployed in systems ? Embedded software system as major source of

  20. The Global Climate Dashboard: a Software Interface to Stream Comprehensive Climate Data

    NASA Astrophysics Data System (ADS)

    Gardiner, N.; Phillips, M.; NOAA Climate Portal Dashboard

    2011-12-01

The Global Climate Dashboard is an integral component of NOAA's web portal to climate data, services, and value-added content for decision-makers, teachers, and the science-attentive public (www.climate.gov). The dashboard provides a rapid view of observational data that demonstrate climate change and variability, as well as outputs from the Climate Model Intercomparison Project version 3, which was built to support the Intergovernmental Panel on Climate Change fourth assessment. The data shown in the dashboard therefore span a range of climate science disciplines with applications that serve audiences with diverse needs. The dashboard is designed with reusable software components that allow it to be implemented incrementally on a wide range of platforms including desktops, tablet devices, and mobile phones. The underlying software components support live streaming of data and provide a way of encapsulating graph styles and other presentation details into a device-independent standard format that results in a common visual look and feel across all platforms. Here we describe the pedagogical objectives, technical implementation, and deployment of the dashboard through climate.gov and partner web sites, and describe plans to develop a mobile application using the same framework.

  1. Modelling and Forecasting of Rice Yield in support of Crop Insurance

    NASA Astrophysics Data System (ADS)

    Weerts, A.; van Verseveld, W.; Trambauer, P.; de Vries, S.; Conijn, S.; van Valkengoed, E.; Hoekman, D.; Hengsdijk, H.; Schrevel, A.

    2016-12-01

The Government of Indonesia has embarked on a policy to bring crop insurance to all of Indonesia's farmers. To support the Indonesian government, the G4INDO project (www.g4indo.org) is developing an integrated platform for judging and handling insurance claims. The platform brings together remote sensed data (both visible and radar) and hydrologic and crop modelling and forecasting to improve predictions in one forecasting platform (i.e. Delft-FEWS, Werner et al., 2013). The hydrological model and the crop model (LINTUL) are coupled on a time-stepping basis in the OpenStreams framework (see https://github.com/openstreams/wflow) and deployed in a Delft-FEWS forecasting platform to support seasonal forecasting of water availability and crop yield. First we will present the general idea of the project and the integrated platform (including Sentinel 1 & 2 data), followed by first (reforecast) results of the coupled models for predicting water availability and crop yield in the Brantas catchment in Java, Indonesia. Werner, M., Schellekens, J., Gijsbers, P., Van Dijk, M., Van den Akker, O. and Heynert, K., 2013. The Delft-FEWS flow forecasting system, Environmental Modelling & Software; 40:65-77. DOI: 10.1016/j.envsoft.2012.07.010.
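The time-stepping coupling described above can be sketched as a loop in which the hydrological model advances first and the crop model then consumes the updated water state. This is a minimal toy sketch, not the actual wflow/LINTUL API; the class names, state variables, and growth rules are hypothetical stand-ins:

```python
# Minimal sketch of hydrology-crop coupling on a shared daily time step.
# HydrologyModel and CropModel are hypothetical stand-ins, not wflow/LINTUL.

class HydrologyModel:
    """Toy bucket model: rainfall in, soil-moisture state out."""
    def __init__(self):
        self.soil_moisture = 0.5  # fraction of field capacity (assumed start)

    def step(self, rainfall_mm):
        # Add rain (capped at saturation), then lose a fixed drainage fraction.
        self.soil_moisture = min(1.0, self.soil_moisture + rainfall_mm / 100.0) * 0.95
        return self.soil_moisture

class CropModel:
    """Toy crop model: biomass accumulates when water is not limiting."""
    def __init__(self):
        self.biomass = 0.0

    def step(self, soil_moisture):
        # Daily growth scaled by water availability (0..1).
        self.biomass += 1.0 * soil_moisture
        return self.biomass

def run_coupled(rainfall_series):
    hydro, crop = HydrologyModel(), CropModel()
    for rain in rainfall_series:   # one iteration per daily time step
        sm = hydro.step(rain)      # hydrology advances first ...
        crop.step(sm)              # ... crop then sees the updated moisture
    return crop.biomass

yield_proxy = run_coupled([5.0, 0.0, 12.0, 3.0])  # hypothetical rainfall, mm/day
```

The design choice mirrored here is the one the abstract describes: the two models exchange state every time step rather than running sequentially over the whole season.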

  2. Flexible, secure agent development framework

    DOEpatents

Goldsmith, Steven Y. [Rochester, MN]

    2009-04-07

    While an agent generator is generating an intelligent agent, it can also evaluate the data processing platform on which it is executing, in order to assess a risk factor associated with operation of the agent generator on the data processing platform. The agent generator can retrieve from a location external to the data processing platform an open site that is configurable by the user, and load the open site into an agent substrate, thereby creating a development agent with code development capabilities. While an intelligent agent is executing a functional program on a data processing platform, it can also evaluate the data processing platform to assess a risk factor associated with performing the data processing function on the data processing platform.

  3. Comparison of drinking water treatment process streams for optimal bacteriological water quality.

    PubMed

    Ho, Lionel; Braun, Kalan; Fabris, Rolando; Hoefel, Daniel; Morran, Jim; Monis, Paul; Drikas, Mary

    2012-08-01

Four pilot-scale treatment process streams (Stream 1 - Conventional treatment (coagulation/flocculation/dual media filtration); Stream 2 - Magnetic ion exchange (MIEX)/Conventional treatment; Stream 3 - MIEX/Conventional treatment/granular activated carbon (GAC) filtration; Stream 4 - Microfiltration/nanofiltration) were commissioned to compare their effectiveness in producing high quality potable water prior to disinfection. Despite receiving highly variable source water quality throughout the investigation, each stream consistently reduced colour and turbidity to below Australian Drinking Water Guideline levels, with the exception of Stream 1, which was difficult to manage due to the reactive nature of coagulation control. Of particular interest was the bacteriological quality of the treated waters, where flow cytometry was shown to be the superior monitoring tool in comparison to the traditional heterotrophic plate count method. Based on removal of total and active bacteria, the treatment process streams were ranked in the order: Stream 4 (average log removal of 2.7) > Stream 2 (average log removal of 2.3) > Stream 3 (average log removal of 1.5) > Stream 1 (average log removal of 1.0). The lower removals in Stream 3 were attributed to bacteria detaching from the GAC filter. Bacterial community analysis revealed that the treatments affected the bacteria present, with the communities in streams incorporating conventional treatment clustering with each other, while the community composition of Stream 4 was very different from those of Streams 1, 2 and 3. MIEX treatment was shown to enhance removal of bacteria due to more efficient flocculation, which was validated through the novel application of the photometric dispersion analyser. Copyright © 2012 Elsevier Ltd. All rights reserved.
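The log-removal values used to rank the streams follow directly from influent and effluent cell counts. A minimal illustration (the counts below are made-up, not from the study):

```python
import math

def log_removal(influent_count, effluent_count):
    """Log10 reduction in bacterial counts across a treatment stream."""
    return math.log10(influent_count / effluent_count)

# Hypothetical counts (cells/mL) before and after treatment:
removal = log_removal(1_000_000, 2_000)     # ~2.7-log removal, Stream 4's average

# A 2.7-log removal corresponds to removing ~99.8% of cells.
removed_fraction = 1.0 - 10 ** -2.7
```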

  4. Development of the Hydroecological Integrity Assessment Process for Determining Environmental Flows for New Jersey Streams

    USGS Publications Warehouse

    Kennen, Jonathan G.; Henriksen, James A.; Nieswand, Steven P.

    2007-01-01

The natural flow regime paradigm and parallel stream ecological concepts and theories have established the benefits of maintaining or restoring the full range of natural hydrologic variation for physicochemical processes, biodiversity, and the evolutionary potential of aquatic and riparian communities. A synthesis of recent advances in hydroecological research coupled with stream classification has resulted in a new process to determine environmental flows and assess hydrologic alteration. This process has national and international applicability. It allows classification of streams into hydrologic stream classes and identification of a set of non-redundant and ecologically relevant hydrologic indices for 10 critical sub-components of flow. Three computer programs have been developed for implementing the Hydroecological Integrity Assessment Process (HIP): (1) the Hydrologic Indices Tool (HIT), which calculates 171 ecologically relevant hydrologic indices on the basis of daily-flow and peak-flow stream-gage data; (2) the New Jersey Hydrologic Assessment Tool (NJHAT), which can be used to establish a hydrologic baseline period, provide options for setting baseline environmental-flow standards, and compare past and proposed streamflow alterations; and (3) the New Jersey Stream Classification Tool (NJSCT), designed for placing unclassified streams into pre-defined stream classes. Biological and multivariate response models including principal-component, cluster, and discriminant-function analyses aided in the development of software and implementation of the HIP for New Jersey. A pilot effort is currently underway by the New Jersey Department of Environmental Protection in which the HIP is being used to evaluate the effects of past and proposed surface-water use, ground-water extraction, and land-use changes on stream ecosystems while determining the most effective way to integrate the process into ongoing regulatory programs.
Ultimately, this scientifically defensible process will help to quantify the effects of anthropogenic changes and development on hydrologic variability and help planners and resource managers balance current and future water requirements with ecological needs.
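An ecologically relevant hydrologic index of the kind HIT computes can be as simple as a statistic over the daily-flow record. A sketch of one such index, the coefficient of variation of daily flows, a common flow-variability measure (the discharge data are invented, and this is not claimed to be one of HIT's 171 specific indices):

```python
import statistics

def flow_cv(daily_flows):
    """Coefficient of variation of daily flows: a simple example of a
    flow-variability index computed from a daily stream-gage record."""
    return statistics.pstdev(daily_flows) / statistics.mean(daily_flows)

# Hypothetical daily discharge record (m^3/s); a flashier stream scores higher.
cv = flow_cv([10.0, 12.0, 9.0, 30.0, 8.0, 11.0])
```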

  5. Machine vision system for automated detection of stained pistachio nuts

    NASA Astrophysics Data System (ADS)

    Pearson, Tom C.

    1995-01-01

A machine vision system was developed to separate stained pistachio nuts, which comprise about 5% of the California crop, from unstained nuts. The system may be used to reduce the labor involved with manual grading or to remove aflatoxin-contaminated product from low grade process streams. The system was tested on two different pistachio process streams: the bi-chromatic color sorter reject stream and the small nut shelling stock stream. The system had a minimum overall error rate of 14% for the bi-chromatic sorter reject stream and 15% for the small shelling stock stream.

  6. Ammonia Monitor

    NASA Technical Reports Server (NTRS)

    Sauer, Richard L. (Inventor); Akse, James R. (Inventor); Thompson, John O. (Inventor); Atwater, James E. (Inventor)

    1999-01-01

Ammonia monitor and method of use are disclosed. A continuous, real-time determination of the concentration of ammonia in an aqueous process stream is possible over a wide dynamic range of concentrations. No reagents are required because pH is controlled by an in-line solid-phase base. Ammonia is selectively transported across a membrane from the process stream to an analytical stream under pH control. The specific electrical conductance of the analytical stream is measured and used to determine the concentration of ammonia.
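The final step, turning the measured conductance into an ammonia concentration, amounts to inverting a calibration curve. A sketch assuming a linear calibration; the slope and intercept below are illustrative constants, not the instrument's actual calibration:

```python
def ammonia_mg_per_l(conductance_uS_cm, slope=8.0, intercept=1.5):
    """Invert an assumed linear calibration: specific conductance (uS/cm)
    of the analytical stream -> ammonia concentration (mg/L).
    slope and intercept are hypothetical calibration constants."""
    return (conductance_uS_cm - intercept) / slope

c = ammonia_mg_per_l(41.5)  # 5.0 mg/L under these assumed constants
```

In practice the calibration would be fitted from standards spanning the monitor's wide dynamic range, and might not be linear over all of it.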

  7. Analogue modelling of the influence of ice shelf collapse on the flow of ice sheets grounded below sea-level

    NASA Astrophysics Data System (ADS)

    Corti, Giacomo; Zeoli, Antonio

    2016-04-01

The sudden breakup of ice shelves is expected to result in significant acceleration of inland glaciers, a process related to the removal of the buttressing effect exerted by the ice shelf on the tributary glaciers. This effect has been tested in previous analogue models, which however applied to ice sheets grounded above sea level (e.g., East Antarctic Ice Sheet; Antarctic Peninsula and the Larsen Ice Shelf). In this work we expand these previous results by performing small-scale laboratory models that analyse the influence of ice shelf collapse on the flow of ice streams draining an ice sheet grounded below sea level (e.g., the West Antarctic Ice Sheet). The analogue models, with dimensions (width, length, thickness) of 120 x 70 x 1.5 cm, were performed at the Tectonic Modelling Laboratory of CNR-IGG of Florence, Italy, using Polydimethylsiloxane (PDMS) as an analogue for the flowing ice. This transparent, Newtonian silicone has been shown to approximate well the rheology of natural ice. The silicone was allowed to flow into a water reservoir simulating natural conditions in which ice streams flow into the sea, terminating in extensive ice shelves which act as a buttress for their glaciers and slow their flow. The geometric scaling ratio was 10^-5, such that 1 cm in the models simulated 1 km in nature; velocities of PDMS (a few mm per hour) simulated natural velocities of 100-1000 m/year. Instability of glacier flow was induced by manually removing a basal silicone platform (floating on water) exerting backstresses on the flowing analogue glacier: the simple set-up adopted in the experiments isolates the effect of the removal of the buttressing effect that the floating platform exerts on the flowing glaciers, thus offering insights into the influence of this parameter on the flow perturbations resulting from a collapse event.
The experimental results showed a significant increase in glacier velocity close to its outlet following ice shelf breakup, a process similar to what was observed in previous models. This transient effect did not significantly propagate upstream towards the inner parts of the ice sheet, and rapidly decayed with time. The process was also accompanied by significant ice thinning. Model results suggest that the ice sheet is almost unaffected by flow perturbations induced by ice shelf collapse, unless other processes (e.g., grounding line instability induced by warm water penetration) are involved.
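The scaling quoted above (1 cm in the model for 1 km in nature, and mm/h standing for hundreds of m/yr) also fixes the model-to-nature time ratio. A quick arithmetic check, picking one model velocity and one natural velocity from within the stated ranges (3 mm/h and 500 m/yr are illustrative choices, not values from the paper):

```python
# Scaling check for the analogue model: 1 cm <-> 1 km gives a length ratio of
# 10^-5; combined with the velocity scales it implies the time ratio.
LENGTH_RATIO = 1e-5
MM_PER_M, HOURS_PER_YEAR = 1000.0, 8760.0

def implied_time_ratio(v_model_mm_per_h, v_nature_m_per_yr):
    """Model-to-nature time ratio implied by the length and velocity scales."""
    v_model_m_per_yr = v_model_mm_per_h / MM_PER_M * HOURS_PER_YEAR
    return LENGTH_RATIO / (v_model_m_per_yr / v_nature_m_per_yr)

# 3 mm/h in the model standing for 500 m/yr in nature (both within the ranges
# stated in the abstract): about one nature year elapses per ~1.7 model hours.
t_ratio = implied_time_ratio(3.0, 500.0)
model_hours_per_nature_year = t_ratio * HOURS_PER_YEAR
```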

  8. Address-event-based platform for bioinspired spiking systems

    NASA Astrophysics Data System (ADS)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA, and serial links, making the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event-based network communication and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. 
An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA, allowing the platform to implement event-based algorithms that interact with the AER system, such as control algorithms, network connectivity, and USB support. The LVDS transceiver allows a bandwidth of up to 1.32 Gbps, around 66 mega-events per second (Mevps).
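The two link figures quoted above imply a fixed word size per event on the LVDS link, assuming each event is transmitted as one fixed-width word (an assumption for this check, not a detail stated in the abstract):

```python
# Event word size implied by the quoted link figures.
LINK_BANDWIDTH_BPS = 1.32e9   # LVDS transceiver, 1.32 Gbps
EVENT_RATE_EPS = 66e6         # ~66 mega-events per second (Mevps)

bits_per_event = LINK_BANDWIDTH_BPS / EVENT_RATE_EPS  # 20 bits per event word
```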

  9. Mass, energy and material balances of SRF production process. Part 2: SRF produced from construction and demolition waste.

    PubMed

    Nasrullah, Muhammad; Vainikka, Pasi; Hannula, Janne; Hurme, Markku; Kärki, Janne

    2014-11-01

In this work, the fraction of construction and demolition waste (C&D waste) that is complicated and not economically feasible to sort out for recycling purposes is used to produce solid recovered fuel (SRF) through mechanical treatment (MT). The paper presents the mass, energy and material balances of this SRF production process. All the process streams (input and output) produced in the MT waste sorting plant to produce SRF from C&D waste are sampled and treated according to CEN standard methods for SRF. Proximate and ultimate analysis of these streams is performed and their composition is determined. Based on this analysis and the composition of the process streams, their mass, energy and material balances are established for the SRF production process. Mass balance here refers to the overall mass flow of the input waste material stream into the various output streams, and material balance to the mass flow of the components of the input waste material stream (such as paper and cardboard, wood, plastic (soft), plastic (hard), textile and rubber) into the various output streams of the SRF production process. The results from the mass balance of the SRF production process showed that of the total input C&D waste material to the MT waste sorting plant, 44% was recovered in the form of SRF, 5% as ferrous metal, 1% as non-ferrous metal, 28% was sorted out as fine fraction, 18% as reject material and 4% as heavy fraction. The energy balance of this SRF production process showed that of the total input energy content of C&D waste material to the MT waste sorting plant, 74% was recovered in the form of SRF, 16% belonged to the reject material and the remaining 10% to the streams of fine fraction and heavy fraction. From the material balances of this process, the mass fractions of plastic (soft), paper and cardboard, wood and plastic (hard) recovered in the SRF stream were 84%, 82%, 72% and 68%, respectively, of their input masses to the MT plant. 
A high mass fraction of plastic (PVC) and rubber material was found in the reject material stream. Streams of heavy fraction and fine fraction mainly contained non-combustible material (such as stone/rock, sand particles and gypsum material). Copyright © 2014 Elsevier Ltd. All rights reserved.
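The balance figures reported above can be checked for closure, i.e. that the output shares sum back to the input total. A small sketch using the paper's own mass and energy shares:

```python
# Output streams of the MT sorting plant as shares (%) of the input C&D waste,
# taken from the text.
mass_shares = {
    "SRF": 44, "ferrous metal": 5, "non-ferrous metal": 1,
    "fine fraction": 28, "reject material": 18, "heavy fraction": 4,
}
energy_shares = {"SRF": 74, "reject material": 16, "fine + heavy fraction": 10}

def closes(shares, total=100):
    """A balance closes when the output shares sum back to the input total."""
    return sum(shares.values()) == total
```

Both balances close: 44+5+1+28+18+4 = 100 for mass, and 74+16+10 = 100 for energy.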

  10. On the organization of the perisylvian cortex: Insights from the electrophysiology of language. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by M.A. Arbib

    NASA Astrophysics Data System (ADS)

    Brouwer, Harm; Crocker, Matthew W.

    2016-03-01

The Mirror System Hypothesis (MSH) on the evolution of the language-ready brain draws upon the parallel dorsal-ventral stream architecture for vision [1]. The dorsal "how" stream provides a mapping of parietally-mediated affordances onto the motor system (supporting preshape), whereas the ventral "what" stream engages in object recognition and visual scene analysis (supporting pantomime and verbal description). Arbib attempts to integrate this MSH perspective with a recent conceptual dorsal-ventral stream model of auditory language comprehension [5] (henceforth, the B&S model). In the B&S model, the dorsal stream engages in time-dependent combinatorial processing, which subserves syntactic structuring and linkage to action, whereas the ventral stream performs time-independent unification of conceptual schemata. These streams are integrated in the left Inferior Frontal Gyrus (lIFG), which is assumed to subserve cognitive control, and no linguistic processing functions. Arbib criticizes the B&S model on two grounds: (i) the time-independence of the semantic processing in the ventral stream (by arguing that semantic processing is just as time-dependent as syntactic processing), and (ii) the absence of linguistic processing in the lIFG (reconciling syntactic and semantic representations is very much linguistic processing proper). Here, we provide further support for these two points of criticism on the basis of insights from the electrophysiology of language. In the course of our argument, we also sketch the contours of an alternative model that may prove better suited for integration with the MSH.

  11. Efficient gas-separation process to upgrade dilute methane stream for use as fuel

    DOEpatents

Wijmans, Johannes G [Menlo Park, CA]; Merkel, Timothy C [Menlo Park, CA]; Lin, Haiqing [Mountain View, CA]; Thompson, Scott [Brecksville, OH]; Daniels, Ramin [San Jose, CA]

    2012-03-06

    A membrane-based gas separation process for treating gas streams that contain methane in low concentrations. The invention involves flowing the stream to be treated across the feed side of a membrane and flowing a sweep gas stream, usually air, across the permeate side. Carbon dioxide permeates the membrane preferentially and is picked up in the sweep air stream on the permeate side; oxygen permeates in the other direction and is picked up in the methane-containing stream. The resulting residue stream is enriched in methane as well as oxygen and has an EMC value enabling it to be either flared or combusted by mixing with ordinary air.

  12. Platinum recovery from industrial process streams by halophilic bacteria: Influence of salt species and platinum speciation.

    PubMed

    Maes, Synthia; Claus, Mathias; Verbeken, Kim; Wallaert, Elien; De Smet, Rebecca; Vanhaecke, Frank; Boon, Nico; Hennebel, Tom

    2016-11-15

The increased use and criticality of platinum asks for the development of effective low-cost strategies for metal recovery from process and waste streams. Although biotechnological processes can be applied for the valorization of diluted aqueous industrial streams, investigations considering real stream conditions (e.g., high salt levels, acidic pH, metal speciation) are lacking. This study investigated the recovery of platinum by a halophilic microbial community in the presence of increased salt concentrations (10-80 g L-1), different salt matrices (phosphate salts, sea salts and NH4Cl) and a refinery process stream. The halophiles were able to recover 79-99% of the Pt at 10-80 g L-1 salts and at pH 2.3. Transmission electron microscopy suggested a positive correlation between intracellular Pt cluster size and elevated salt concentrations. Furthermore, the halophiles recovered 46-95% of the Pt-amine complex [Pt(NH3)4]2+ from a process stream after the addition of an alternative Pt source (K2PtCl4, 0.1-1.0 g L-1 Pt). Repeated Pt-tetraamine recovery (from an industrial process stream) was obtained after concomitant addition of fresh biomass and harvesting of Pt-saturated biomass. This study demonstrates how aqueous Pt streams can be transformed into Pt-rich biomass, which would be an interesting feed for a precious metals refinery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Screening of veterinary drug residues in food by LC-MS/MS. Background and challenges.

    PubMed

    Delatour, Thierry; Racault, Lucie; Bessaire, Thomas; Desmarchelier, Aurélien

    2018-04-01

Regulatory agencies and government authorities have established maximum residue limits (MRL) in various food matrices of animal origin to support governments and food operators in the monitoring of veterinary drug residues in the food chain, and ultimately on the consumer's plate. Today, about 200 veterinary drug residues from several families, mainly with antibiotic, antiparasitic or anti-inflammatory activities, are regulated in a variety of food matrices such as milk, meat or egg. This article provides a review of the regulatory framework in milk and muscle, including data from the Codex Alimentarius, Europe, the U.S.A., Canada and China for about 220 veterinary drugs. The article also provides a comprehensive overview of the challenge for food control, and emphasizes the pivotal role of liquid chromatography-mass spectrometry (LC-MS), either in tandem with quadrupoles (LC-MS/MS) or high resolution MS (LC-HRMS), for ensuring adequate consumer protection at an affordable cost. The capability of a streamlined LC-MS/MS platform for screening 152 veterinary drug residues in a broad range of raw materials and finished products is highlighted from a production-line perspective. The rationale for a suite of four methods intended to achieve appropriate performance in terms of scope and sensitivity is presented. Overall, the platform encompasses one stream for the determination of 105 compounds in a run (based on acidic QuEChERS-like), plus two streams for 23 β-lactams (alkaline QuEChERS-like) and 10 tetracyclines (low-temperature partitioning), respectively, and a dedicated stream for 14 aminoglycosides (molecularly-imprinted polymer).
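The four-stream design can be summarized in data form, and the per-stream compound counts do sum to the platform's stated 152-compound scope:

```python
# Per-stream compound counts for the four-method platform, from the text.
streams = {
    "acidic QuEChERS-like": 105,
    "alkaline QuEChERS-like (beta-lactams)": 23,
    "low-temperature partitioning (tetracyclines)": 10,
    "molecularly imprinted polymer (aminoglycosides)": 14,
}

total_scope = sum(streams.values())  # 152 veterinary drug residues in total
```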

  14. Two stroke engine exhaust emissions separator

    DOEpatents

    Turner, Terry D.; Wilding, Bruce M.; McKellar, Michael G.; Raterman, Kevin T.

    2003-04-22

    A separator for substantially resolving at least one component of a process stream, such as from the exhaust of an internal combustion engine. The separator includes a body defining a chamber therein. A nozzle housing is located proximate the chamber. An exhaust inlet is in communication with the nozzle housing and the chamber. A nozzle assembly is positioned in the nozzle housing and includes a nozzle moveable within and relative to the nozzle housing. The nozzle includes at least one passage formed therethrough such that a process stream entering the exhaust inlet connection passes through the passage formed in the nozzle and imparts a substantially rotational flow to the process stream as it enters the chamber. A positioning member is configured to position the nozzle relative to the nozzle housing in response to changes in process stream pressure thereby adjusting flowrate of said process stream entering into the chamber.

  15. Two stroke engine exhaust emissions separator

    DOEpatents

    Turner, Terry D.; Wilding, Bruce M.; McKellar, Michael G.; Raterman, Kevin T.

    2002-01-01

    A separator for substantially resolving at least one component of a process stream, such as from the exhaust of an internal combustion engine. The separator includes a body defining a chamber therein. A nozzle housing is located proximate the chamber. An exhaust inlet is in communication with the nozzle housing and the chamber. A nozzle assembly is positioned in the nozzle housing and includes a nozzle moveable within and relative to the nozzle housing. The nozzle includes at least one passage formed therethrough such that a process stream entering the exhaust inlet connection passes through the passage formed in the nozzle, which imparts a substantially rotational flow to the process stream as it enters the chamber. A positioning member is configured to position the nozzle relative to the nozzle housing in response to changes in process stream pressure to adjust flowrate of said process stream entering into the chamber.

  16. DWPF RECYCLE EVAPORATOR FLOWSHEET EVALUATION (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M

    2005-04-30

    The Defense Waste Processing Facility (DWPF) converts the high level waste slurries stored at the Savannah River Site into borosilicate glass for long-term storage. The vitrification process results in the generation of approximately five gallons of dilute recycle streams for each gallon of waste slurry vitrified. This dilute recycle stream is currently transferred to the H-area Tank Farm and amounts to approximately 1,400,000 gallons of effluent per year. Process changes to incorporate salt waste could increase the amount of effluent to approximately 2,900,000 gallons per year. The recycle consists of two major streams and four smaller streams. The first major recycle stream is condensate from the Chemical Process Cell (CPC), and is collected in the Slurry Mix Evaporator Condensate Tank (SMECT). The second major recycle stream is the melter offgas, which is collected in the Off Gas Condensate Tank (OGCT). The four smaller streams are the sample flushes, sump flushes, decon solution, and High Efficiency Mist Eliminator (HEME) dissolution solution. These streams are collected in the Decontamination Waste Treatment Tank (DWTT) or the Recycle Collection Tank (RCT). All recycle streams are currently combined in the RCT and treated with sodium nitrite and sodium hydroxide prior to transfer to the tank farm. Tank Farm space limitations and previous outages in the 2H Evaporator system due to deposition of sodium alumino-silicates have led to evaluation of alternative methods of dealing with the DWPF recycle. One option identified for processing the recycle was a dedicated evaporator to concentrate the recycle stream, allowing the solids to be recycled to the DWPF Sludge Receipt and Adjustment Tank (SRAT) and the condensate from this evaporation process to be sent to and treated in the Effluent Treatment Plant (ETP). In order to meet process objectives, the recycle stream must be concentrated to 1/30th of the feed volume during the evaporation process.
The concentrated stream must be pumpable to the DWPF SRAT vessel and should not precipitate solids, to avoid fouling the evaporator vessel and heat transfer coils. The evaporation process must not generate excessive foam and must have a high Decontamination Factor (DF) for many species in the evaporator feed to allow the condensate to be transferred to the ETP. An initial scoping study, completed in 2001 to evaluate the feasibility of the evaporator, concluded that the concentration objectives could be met. This initial study was based on initial estimates of recycle concentration and relied solely on OLI modeling of the evaporation process. The Savannah River National Laboratory (SRNL) has completed additional studies using simulated recycle streams and OLI® simulations. Based on this work, the proposed flowsheet for the recycle evaporator was evaluated for feasibility, evaporator design considerations, and impact on the DWPF process. This work was in accordance with guidance from DWPF-E and was performed in accordance with the Technical Task and Quality Assurance Plan.

  17. High rates of organic carbon processing in the hyporheic zone of intermittent streams.

    PubMed

    Burrows, Ryan M; Rutlidge, Helen; Bond, Nick R; Eberhard, Stefan M; Auhl, Alexandra; Andersen, Martin S; Valdez, Dominic G; Kennard, Mark J

    2017-10-16

    Organic carbon cycling is a fundamental process that underpins energy transfer through the biosphere. However, little is known about the rates of particulate organic carbon processing in the hyporheic zone of intermittent streams, which is often the only wetted environment remaining when surface flows cease. We used leaf litter and cotton decomposition assays, as well as rates of microbial respiration, to quantify rates of organic carbon processing in surface and hyporheic environments of intermittent and perennial streams under a range of substrate saturation conditions. Leaf litter processing was 48% greater, and cotton processing 124% greater, in the hyporheic zone compared to surface environments when calculated over multiple substrate saturation conditions. Processing was also greater in more saturated surface environments (i.e. pools). Further, rates of microbial respiration on incubated substrates in the hyporheic zone were similar to, or greater than, rates in surface environments. Our results highlight that intermittent streams are important locations for particulate organic carbon processing and that the hyporheic zone sustains this fundamental process even without surface flow. Not accounting for carbon processing in the hyporheic zone of intermittent streams may lead to an underestimation of its local ecological significance and collective contribution to landscape carbon processes.

  18. Optical Flow in a Smart Sensor Based on Hybrid Analog-Digital Architecture

    PubMed Central

    Guzmán, Pablo; Díaz, Javier; Agís, Rodrigo; Ros, Eduardo

    2010-01-01

    The purpose of this study is to develop a motion sensor (delivering optical flow estimations) using a platform that includes the sensor itself, focal plane processing resources, and co-processing resources on a general purpose embedded processor, all implemented on a single device as a SoC (System-on-a-Chip). Optical flow is the 2-D projection onto the camera plane of the 3-D motion present in the observed scene. This motion representation is well known and widely applied in the scientific community to solve a wide variety of problems. Most applications based on motion estimation must run in real-time, and this restriction must be taken into account. In this paper, we show an efficient approach to estimating the motion velocity vectors with an architecture based on a focal plane processor combined on-chip with a 32-bit NIOS II processor. Our approach relies on a simplification of the original optical flow model and its efficient implementation in a platform that combines an analog (focal-plane) and a digital (NIOS II) processor. The system is fully functional and is organized in different stages, where the early processing (focal plane) stage mainly pre-processes the input image stream to reduce the computational cost of the post-processing (NIOS II) stage. We present the employed co-design techniques and analyze this novel architecture. We evaluate the system's performance and accuracy with respect to the different approaches described in the literature. We also discuss the advantages of the proposed approach as well as the degree of efficiency that can be obtained from the focal plane processing capabilities of the system. The final outcome is a low cost smart sensor for optical flow computation with real-time performance and reduced power consumption that can be used in very diverse application domains. PMID:22319283
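The paper's simplified optical flow model is not reproduced here, but the basic idea of estimating a motion vector between consecutive frames can be illustrated with a naive block-matching sketch in pure Python. This is only a conceptual toy under stated assumptions (a single global shift, small frames); the focal-plane/NIOS II implementation in the paper works very differently:

```python
def block_motion(prev, curr, max_shift=2):
    """Estimate a single global motion vector between two grayscale
    frames (lists of lists) by exhaustive block matching: return the
    (dx, dy) shift minimizing the mean absolute difference."""
    h, w = len(prev), len(prev[0])
    best, best_mad = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            sad, count = 0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        sad += abs(curr[yy][xx] - prev[y][x])
                        count += 1
            if count and sad / count < best_mad:
                best_mad = sad / count
                best = (dx, dy)
    return best
```

For example, a frame pair in which a bright pixel moves one position to the right yields the vector (1, 0). Real-time systems avoid this exhaustive search; here it simply makes the notion of a motion velocity vector concrete.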

  19. Where’s Waldo? How perceptual, cognitive, and emotional brain processes cooperate during learning to categorize and find desired objects in a cluttered scene

    PubMed Central

    Chang, Hung-Cheng; Grossberg, Stephen; Cao, Yongqiang

    2014-01-01

    The Where’s Waldo problem concerns how individuals can rapidly learn to search a scene to detect, attend, recognize, and look at a valued target object in it. This article develops the ARTSCAN Search neural model to clarify how brain mechanisms across the What and Where cortical streams are coordinated to solve the Where’s Waldo problem. The What stream learns positionally-invariant object representations, whereas the Where stream controls positionally-selective spatial and action representations. The model overcomes deficiencies of these computationally complementary properties through What and Where stream interactions. Where stream processes of spatial attention and predictive eye movement control modulate What stream processes whereby multiple view- and positionally-specific object categories are learned and associatively linked to view- and positionally-invariant object categories through bottom-up and attentive top-down interactions. Gain fields control the coordinate transformations that enable spatial attention and predictive eye movements to carry out this role. What stream cognitive-emotional learning processes enable the focusing of motivated attention upon the invariant object categories of desired objects. What stream cognitive names or motivational drives can prime a view- and positionally-invariant object category of a desired target object. A volitional signal can convert these primes into top-down activations that can, in turn, prime What stream view- and positionally-specific categories. When it also receives bottom-up activation from a target, such a positionally-specific category can cause an attentional shift in the Where stream to the positional representation of the target, and an eye movement can then be elicited to foveate it. These processes describe interactions among brain regions that include visual cortex, parietal cortex, inferotemporal cortex, prefrontal cortex (PFC), amygdala, basal ganglia (BG), and superior colliculus (SC). 
PMID:24987339

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Rui; Praggastis, Brenda L.; Smith, William P.

    While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems, including C-SPARQL, use a sliding window and rely on data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance", which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
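The eviction idea described above, keeping both arrival order and source-assigned expiration timestamps and evicting expired or soonest-to-expire items first, can be sketched in a few lines of Python. This is an illustrative toy: the class and method names are invented here and are not taken from C-SPARQL or the paper's prototypes, and "semantic importance" is omitted:

```python
import heapq
import time

class ExpiringStreamCache:
    """Toy cache for stream triples that evicts by expiration timestamp
    rather than purely by arrival order (a sliding-window approximation)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._heap = []   # entries: (expiration_ts, seq, triple)
        self._seq = 0     # tie-breaker preserving arrival order

    def insert(self, triple, expiration_ts, now=None):
        now = time.time() if now is None else now
        self._expire(now)
        heapq.heappush(self._heap, (expiration_ts, self._seq, triple))
        self._seq += 1
        # Over capacity: evict the item closest to expiring first.
        while len(self._heap) > self.capacity:
            heapq.heappop(self._heap)

    def _expire(self, now):
        # Drop everything whose source-assigned lifetime has elapsed.
        while self._heap and self._heap[0][0] <= now:
            heapq.heappop(self._heap)

    def contents(self, now=None):
        now = time.time() if now is None else now
        self._expire(now)
        return [t for _, _, t in self._heap]
```

A real stream reasoner would evict under memory pressure and could use a relevance score such as semantic importance as a secondary key; here capacity alone triggers eviction.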

  1. Geospatial Data Stream Processing in Python Using FOSS4G Components

    NASA Astrophysics Data System (ADS)

    McFerren, G.; van Zyl, T.

    2016-06-01

    One viewpoint of current and future IT systems holds that there is an increase in the scale and velocity at which data are acquired and analysed from heterogeneous, dynamic sources. In the earth observation and geoinformatics domains, this process is driven by the increase in the number and types of devices that report location and the proliferation of assorted sensors, from satellite constellations to oceanic buoy arrays. Many of these data will be encountered as self-contained messages on data streams - continuous, infinite flows of data. Spatial analytics over data streams concerns the search for spatial and spatio-temporal relationships within and amongst data "on the move". In spatial databases, queries can access a store of data to unpack spatial relationships; this is not the case on streams, where spatial relationships need to be established with the incomplete data available. Methods for spatially-based indexing, filtering, joining and transforming of streaming data need to be established and implemented in software components. This article describes the usage patterns and performance metrics of a number of well known FOSS4G Python software libraries within the data stream processing paradigm. In particular, we consider the RTree library for spatial indexing, the Shapely library for geometric processing and transformation, and the PyProj library for projection and geodesic calculations over streams of geospatial data. We introduce a message oriented Python-based geospatial data streaming framework called Swordfish, which provides data stream processing primitives, functions, transports and a common data model for describing messages, based on the Open Geospatial Consortium Observations and Measurements (O&M) and Unidata Common Data Model (CDM) standards. We illustrate how the geospatial software components are integrated with the Swordfish framework.
Furthermore, we describe the tight temporal constraints under which geospatial functionality can be invoked when processing high velocity, potentially infinite geospatial data streams. The article discusses the performance of these libraries under simulated streaming loads (size, complexity and volume of messages) and how they can be deployed and utilised with Swordfish under real load scenarios, illustrated by a set of Vessel Automatic Identification System (AIS) use cases. We conclude that the described software libraries are able to perform adequately under geospatial data stream processing scenarios - many real application use cases will be handled sufficiently by the software.
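To illustrate the stream-processing pattern the article benchmarks - indexing and spatially filtering messages as they arrive - here is a self-contained sketch using only the standard library. A uniform grid stands in for the RTree spatial index, and plain bounding-box tests stand in for Shapely geometry predicates; all names are hypothetical and not part of Swordfish:

```python
from collections import defaultdict

class GridIndex:
    """Minimal uniform-grid spatial index standing in for an R-tree.
    A production stream processor would use the rtree/Shapely libraries."""

    def __init__(self, cell_size=1.0):
        self.cell_size = cell_size
        self.cells = defaultdict(list)   # (cx, cy) -> [(id, x, y)]

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, obj_id, x, y):
        self.cells[self._cell(x, y)].append((obj_id, x, y))

    def query_bbox(self, minx, miny, maxx, maxy):
        """Return ids of indexed points inside the bounding box."""
        cx0, cy0 = self._cell(minx, miny)
        cx1, cy1 = self._cell(maxx, maxy)
        hits = []
        for cx in range(cx0, cx1 + 1):
            for cy in range(cy0, cy1 + 1):
                for obj_id, x, y in self.cells[(cx, cy)]:
                    if minx <= x <= maxx and miny <= y <= maxy:
                        hits.append(obj_id)
        return hits

def filter_stream(messages, bbox, index):
    """Consume an iterable of (id, x, y) messages, index each one,
    and yield the ids falling inside the area of interest."""
    minx, miny, maxx, maxy = bbox
    for obj_id, x, y in messages:
        index.insert(obj_id, x, y)
        if minx <= x <= maxx and miny <= y <= maxy:
            yield obj_id
```

Fed a stream of AIS-like vessel position messages, `filter_stream` emits only vessels inside the area of interest while the index accumulates state for later spatial joins.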

  2. The Role of Riparian Vegetation in Protecting and Improving Chemical Water Quality in Streams

    Treesearch

    Michael G. Dosskey; Philippe Vidon; Noel P. Gurwick; Craig J. Allan; Tim P. Duval; Richard Lowrance

    2010-01-01

    We review the research literature and summarize the major processes by which riparian vegetation influences chemical water quality in streams, as well as how these processes vary among vegetation types, and discuss how these processes respond to removal and restoration of riparian vegetation and thereby determine the timing and level of response in stream water quality...

  3. Rich internet application system for patient-centric healthcare data management using handheld devices.

    PubMed

    Constantinescu, L; Pradana, R; Kim, J; Gong, P; Fulham, Michael; Feng, D

    2009-01-01

    Rich Internet Applications (RIAs) constitute an emerging software platform that blurs the line between web service and native application and is a powerful tool for handheld device deployment. By democratizing health data management and widening its availability, this software platform has the potential to revolutionize telemedicine, clinical practice, medical education and information distribution, particularly in rural areas, and to make patient-centric medical computing a reality. In this paper, we propose a telemedicine application that leverages the ability of a mobile RIA platform to transcode, organise and present textual and multimedia data sourced from medical database software. We adopted a web-based approach to communicate, in real-time, with an established hospital information system via a custom RIA. The proposed solution allows communication between handheld devices and a hospital information system for media streaming, with support for real-time encryption, on any RIA-enabled platform. We demonstrate our prototype's ability to securely and rapidly access, without installation requirements, medical data ranging from simple textual records to multi-slice PET-CT images and maximum intensity projections (MIPs).

  4. New metrics for evaluating channel networks extracted in grid digital elevation models

    NASA Astrophysics Data System (ADS)

    Orlandini, S.; Moretti, G.

    2017-12-01

    Channel networks are critical components of drainage basins and delta regions. Despite the important role played by these systems in hydrology and geomorphology, there are at present no well-defined methods to evaluate numerically how far apart two complex channel networks are geometrically. The present study introduces new metrics for numerically evaluating channel networks extracted in grid digital elevation models with respect to a reference channel network (see the figure below). Streams of the evaluated network (EN) are delineated as in the Horton ordering system and examined through a priority climbing algorithm based on the triple index (ID1,ID2,ID3), where ID1 is a stream identifier that increases as the elevation of the lower end of the stream increases, ID2 is the ID1 of the draining stream, and ID3 is the ID1 of the corresponding stream in the reference network (RN). Streams of the RN are identified by the double index (ID1,ID2). Streams of the EN are processed in order of increasing ID1 (plots a-l in the figure below). For each processed stream of the EN, the closest stream of the RN is sought by considering all the streams of the RN sharing the same ID2; this ID2 in the RN is equal to the ID3 of the stream draining the processed stream in the EN, i.e. the stream whose ID1 equals the ID2 of the processed stream. The mean stream planar distance (MSPD) and the mean stream elevation drop (MSED) are computed as the mean distance and drop, respectively, between corresponding streams. The MSPD is shown to be useful for evaluating slope direction methods and thresholds for channel initiation, whereas the MSED indicates the ability of grid coarsening strategies to retain the profiles of observed channels.
The developed metrics fill a gap in the existing literature by allowing hydrologists and geomorphologists to compare descriptions of a fixed physical system obtained by using different terrain analysis methods, or different physical systems described by using the same methods.
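As a rough illustration of how such a metric reduces to code once streams have been matched (here via a precomputed correspondence such as the ID3 linkage), the following sketch computes a simplified MSPD over vertex lists. This is not the published algorithm: the actual method operates on grid-delineated streams and also derives the MSED from elevation drops:

```python
import math

def point_dist(p, q):
    """Planar Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def stream_distance(evaluated, reference):
    """Simplified planar distance between two streams (vertex lists):
    mean distance from each vertex of the evaluated stream to its
    nearest vertex on the reference stream."""
    return sum(min(point_dist(p, q) for q in reference)
               for p in evaluated) / len(evaluated)

def mspd(matched_pairs):
    """Mean stream planar distance over already-matched
    (evaluated_stream, reference_stream) pairs."""
    return sum(stream_distance(e, r)
               for e, r in matched_pairs) / len(matched_pairs)
```

For two parallel streams one unit apart the per-stream distance is 1.0, and the MSPD averages such values over all matched stream pairs in the network.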

  5. Recent (circa 1998 to 2011) channel-migration rates of selected streams in Indiana

    USGS Publications Warehouse

    Robinson, Bret A.

    2013-01-01

    An investigation was completed to document recent (circa 1998 to 2011) channel-migration rates at 970 meander bends along 38 of the largest streams in Indiana. Data collection was completed by using the Google Earth™ platform and, for each selected site, identifying two images with capture dates separated by multiple years. Within each image, the position of the meander-bend cutbank was measured relative to a fixed local landscape feature visible in both images, and an average channel-migration rate was calculated at the point of maximum cutbank displacement. From these data it was determined that 65 percent of the measured sites have recently been migrating at a rate less than 1 ft/yr, 75 percent of the sites have been migrating at a rate less than 10 ft/yr, and while some sites are migrating in excess of 20 ft/yr, these occurrences are rare. In addition, it is shown that recent channel-migration activity is not evenly distributed across Indiana. For the stream reaches studied, far northern and much of far southern Indiana are drained by streams that recently have been relatively stationary. At the same time, this study shows that most of the largest streams in west-central Indiana and many of the largest streams in east-central Indiana have shown significant channel-migration activity during the recent past. It is anticipated that these results will support several fluvial-erosion-hazard mitigation activities currently being undertaken in Indiana.

  6. VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans

    NASA Astrophysics Data System (ADS)

    Wang, Song; Gupta, Chetan; Mehta, Abhay

    There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
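The general mechanics of executing a DAG query plan - ordering operators topologically and pushing each tuple through the whole plan before admitting the next, the memory-frugal intuition behind "Chain"-style pipelining - can be sketched as follows. This toy handles only stateless map/filter operators and is not the VPipe algorithm itself; all names are invented for illustration:

```python
from collections import defaultdict, deque

def topological_order(nodes, edges):
    """Kahn's algorithm: one valid processing order for DAG operators."""
    indeg = {n: 0 for n in nodes}
    for _, dst in edges:
        indeg[dst] += 1
    ready = deque(n for n in nodes if indeg[n] == 0)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for src, dst in edges:
            if src == n:
                indeg[dst] -= 1
                if indeg[dst] == 0:
                    ready.append(dst)
    return order

def run_dag(tuples, nodes, edges, operators):
    """Push each input tuple through the whole DAG before admitting the
    next one, so at most one tuple's intermediates are buffered at a time."""
    order = topological_order(nodes, edges)
    sources = [n for n in nodes if all(dst != n for _, dst in edges)]
    outputs = []
    for t in tuples:
        buf = defaultdict(list)            # per-operator input queue
        for s in sources:
            buf[s].append(t)
        for n in order:
            results = []
            for x in buf[n]:
                y = operators[n](x)        # stateless map/filter operator
                if y is not None:
                    results.append(y)
            downstream = [dst for src, dst in edges if src == n]
            if not downstream:
                outputs.extend(results)    # sink operator emits results
            for d in downstream:
                buf[d].extend(results)
    return outputs
```

Per-tuple draining keeps intermediate queues short, which is the resource-constraint concern that motivates chain-style scheduling; real engines must additionally handle stateful joins and operator sharing.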

  7. Investigating category- and shape-selective neural processing in ventral and dorsal visual stream under interocular suppression.

    PubMed

    Ludwig, Karin; Kathmann, Norbert; Sterzer, Philipp; Hesselmann, Guido

    2015-01-01

    Recent behavioral and neuroimaging studies using continuous flash suppression (CFS) have suggested that action-related processing in the dorsal visual stream might be independent of perceptual awareness, in line with the "vision-for-perception" versus "vision-for-action" distinction of the influential dual-stream theory. It remains controversial if evidence suggesting exclusive dorsal stream processing of tool stimuli under CFS can be explained by their elongated shape alone or by action-relevant category representations in dorsal visual cortex. To approach this question, we investigated category- and shape-selective functional magnetic resonance imaging-blood-oxygen level-dependent responses in both visual streams using images of faces and tools. Multivariate pattern analysis showed enhanced decoding of elongated relative to non-elongated tools, both in the ventral and dorsal visual stream. The second aim of our study was to investigate whether the depth of interocular suppression might differentially affect processing in dorsal and ventral areas. However, parametric modulation of suppression depth by varying the CFS mask contrast did not yield any evidence for differential modulation of category-selective activity. Together, our data provide evidence for shape-selective processing under CFS in both dorsal and ventral stream areas and, therefore, do not support the notion that dorsal "vision-for-action" processing is exclusively preserved under interocular suppression. © 2014 Wiley Periodicals, Inc.

  8. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations that manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations for streaming model transformations by integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows the identification of relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
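The core CEP ingredient, recognizing an ordered sequence of event types within a time window over a stream, can be illustrated with a small automaton-style matcher. This sketch is a generic illustration of sequence detection, not Viatra's engine or its DSL:

```python
def match_sequence(stream, pattern, within):
    """Toy complex-event detector: report every completion of `pattern`
    (a list of event types) occurring in order within `within` time units.
    Events are (timestamp, type) pairs arriving in timestamp order."""
    partial = []          # each entry: (start_ts, next_pattern_index)
    matches = []          # completed matches: (start_ts, end_ts)
    for ts, etype in stream:
        # Drop partial matches whose time window has elapsed.
        partial = [(t0, i) for t0, i in partial if ts - t0 <= within]
        advanced = []
        for t0, i in partial:
            if etype == pattern[i]:
                if i + 1 == len(pattern):
                    matches.append((t0, ts))     # pattern completed
                else:
                    advanced.append((t0, i + 1)) # advance the automaton
            else:
                advanced.append((t0, i))         # unrelated event: keep waiting
        partial = advanced
        if etype == pattern[0]:
            if len(pattern) == 1:
                matches.append((ts, ts))
            else:
                partial.append((ts, 1))          # open a new partial match
    return matches
```

A reactive transformation engine would fire a rule for each reported match; in the gesture-recognition case study, for instance, the pattern would be a sequence of model change events derived from sensor input.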

  9. A Stream Morphology Classification for Eco-hydraulic Purposes Based on Geospatial Data: a Solute Transport Application Case

    NASA Astrophysics Data System (ADS)

    Jiménez Jaramillo, M. A.; Camacho Botero, L. A.; Vélez Upegui, J. I.

    2010-12-01

    Variation in stream morphology along a basin drainage network leads to different hydraulic patterns and sediment transport processes. Moreover, solute transport processes along streams, and stream habitats for fisheries and microorganisms, rely on stream corridor structure, including elements such as bed forms, channel patterns, riparian vegetation, and the floodplain. In this work, solute transport simulation and stream habitat identification are carried out at the basin scale. A reach-scale morphological classification system based on channel slope and specific stream power was implemented using digital elevation models and hydraulic geometry relationships. Although the morphological framework allows identification of cascade, step-pool, plane bed and pool-riffle morphologies along the drainage network, it does not yet account for floodplain configuration or bed-form identification within those channel types. Hence, as a first application case, in order to obtain parsimonious three-dimensional characterizations of drainage channels, the morphological framework has been updated by including topographical floodplain delimitation through Multi-resolution Valley Bottom Flatness Index assessment, and a stochastic bed-form representation of the step-pool morphology. Model outcomes were tested in relation to in-stream water storage for different flow conditions and representative travel times according to the Aggregated Dead Zone (ADZ) model conceptualization of solute transport processes.

  10. Sewage treatment method

    DOEpatents

    Fassbender, Alex G.

    1995-01-01

    The invention greatly reduces the amount of ammonia in sewage plant effluent. The process of the invention has three main steps. The first step is dewatering without first digesting, thereby producing a first ammonia-containing stream having a low concentration of ammonia, and a second solids-containing stream. The second step is sending the second solids-containing stream through a means for separating the solids from the liquid and producing an aqueous stream containing a high concentration of ammonia. The third step is removal of ammonia from the aqueous stream using a hydrothermal process.

  11. UV meteor observation from a space platform

    NASA Astrophysics Data System (ADS)

    Scarsi, P.

    2004-07-01

    The paper reports on the evaluation of the meteor light curve in the 300-400 nm UV band produced by meteoroids and space debris interacting with the Earth atmosphere; the aim is to assess the visibility of the phenomenon by a near-Earth space platform and to estimate the capability for measuring the solid-body influx on the Earth from outer space. The simulations have been conceived on the basis of general processes only, without introducing a priori observational inputs: the calibration with real data can be made in orbit by validation with "characterized" meteor streams. Computations are made for different values of the entry velocity (12 to 72 km/s) and angle of impact of the meteoroid when entering the atmosphere, with initial-mass values ranging from 10^-12 kg to the kg size, encompassing the transition from micrometeorites (m < 10^-9-10^-8 kg) to the "ablation" regime typical of larger masses. The data are presented in units of UV magnitude to facilitate direct comparison with the common literature in the field. The results concern observations of the atmosphere up to M_UV = 18 from a height of 400 km above the Earth surface (the average for the International Space Station--ISS), with reference to the mission "Extreme Universe Space Observatory--EUSO", designed as an external payload for the module "Columbus" of the European Space Agency. Meteors represent for EUSO an observable as a slow UV phenomenon with seconds-to-minutes characteristic time duration, to be compared to the fast phenomenon typical of the Extensive Air Showers (EAS) induced by energetic cosmic radiation, ranging from microseconds to milliseconds. Continuous wide-angle observation by EUSO, with its high inclination orbit and sensitivity reaching M_UV = 18, will allow the in-depth exploration of the meteor "sporadic" component and the isolation of the contribution of minor "streams".

  12. Probabilistic and Evolutionary Early Warning System: concepts, performances, and case-studies

    NASA Astrophysics Data System (ADS)

    Zollo, A.; Emolo, A.; Colombelli, S.; Elia, L.; Festa, G.; Martino, C.; Picozzi, M.

    2013-12-01

    PRESTo (PRobabilistic and Evolutionary early warning SysTem) is a software platform for Earthquake Early Warning that integrates algorithms for real-time earthquake location, magnitude estimation and damage assessment into a highly configurable and easily portable package. In its regional configuration, the software processes, in real-time, the 3-component acceleration data streams coming from seismic stations for P-wave arrival detection and, when a sufficiently large event is occurring, promptly performs event detection and location, magnitude estimation and peak ground-motion prediction at target sites. The regional approach has been integrated with a threshold-based early warning method that allows, in the very first seconds after a moderate-to-large earthquake, the identification of the most Probable Damaged Zone, starting from the real-time measurement, at near-source stations located at increasing distances from the earthquake epicenter, of the peak displacement (Pd) and predominant period of P-waves (τc) over a few-second-long window after the P-wave arrival. Thus, each recording site independently provides an evolutionary alert level, according to the Pd and τc it measured, through a decisional table. Since 2009, PRESTo has been under continuous real-time testing using data streaming from the Irpinia Seismic Network (Southern Italy) and has produced a bulletin of some hundreds of low-magnitude events, including all the M≥2.5 earthquakes that occurred in that period in Irpinia. Recently, PRESTo has also been implemented at the accelerometric and broad-band networks in South Korea and in Romania, and off-line tested in the Iberian Peninsula, in Turkey, in Israel, and in Japan. The feasibility of an Early Warning System at national scale is currently under testing by studying the performances of the PRESTo platform for the Italian Accelerometric Network.
Moreover, PRESTo is under experimentation to provide alerts in a high school located near Naples, about 100 km from the Irpinia region.
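The per-station decisional table can be illustrated with a minimal sketch: two threshold tests, on Pd and τc, combine into one of four alert levels. The threshold values below are placeholders for illustration only, not the calibrated values used by PRESTo:

```python
def alert_level(pd_m, tau_c_s, pd_threshold=0.01, tau_threshold=0.6):
    """Combine two threshold tests into an alert level 0-3.

    pd_m       : peak displacement Pd (metres) over the P-wave window
    tau_c_s    : predominant period tau_c (seconds)
    thresholds : placeholder values, NOT the calibrated PRESTo ones

    Pd correlates with shaking severity at the site and tau_c with event
    magnitude, so exceeding both suggests a large event that is also close.
    """
    pd_exceeded = pd_m >= pd_threshold
    tau_exceeded = tau_c_s >= tau_threshold
    return int(pd_exceeded) + 2 * int(tau_exceeded)
```

Because both Pd and τc are re-measured as the P-wave window grows, each station's alert level is evolutionary: it can rise from 0 toward 3 in the first seconds after the P-wave arrival.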

  13. Point-of-View Recording Devices for Intraoperative Neurosurgical Video Capture.

    PubMed

    Porras, Jose L; Khalid, Syed; Root, Brandon K; Khan, Imad S; Singer, Robert J

    2016-01-01

    The ability to record and stream neurosurgery is an unprecedented opportunity to further research, medical education, and quality improvement. Here, we appraise the ease of implementation of existing point-of-view devices when capturing and sharing procedures from the neurosurgical operating room and detail their potential utility in this context. Our neurosurgical team tested and critically evaluated features of the Google Glass and Panasonic HX-A500 cameras, including ergonomics, media quality, and media sharing in both the operating theater and the angiography suite. Existing devices boast several features that facilitate live recording and streaming of neurosurgical procedures. Given that their primary application is not intended for the surgical environment, we identified a number of concrete, yet improvable, limitations. The present study suggests that neurosurgical video capture and live streaming represents an opportunity to contribute to research, education, and quality improvement. Despite this promise, shortcomings render existing devices impractical for serious consideration. We describe the features that future recording platforms should possess to improve upon existing technology.

  14. Streaming weekly soap opera video episodes to smartphones in a randomized controlled trial to reduce HIV risk in young urban African American/black women.

    PubMed

    Jones, Rachel; Lacroix, Lorraine J

    2012-07-01

    Love, Sex, and Choices is a 12-episode soap opera video series created as an intervention to reduce HIV sex risk. The effect on women's HIV risk behavior was evaluated in a randomized controlled trial of 238 high-risk, predominantly African American young adult women in the urban Northeast. To facilitate on-demand access and privacy, the episodes were streamed to study-provided smartphones. Here, we discuss the development of a mobile platform to deliver the 12 weekly video episodes or weekly written HIV risk-reduction messages to smartphones, including the technical requirements, development, and evaluation. The popularity of the smartphone and the use of the Internet for multimedia offer a new channel to address health disparities in traditionally underserved populations. This is the first study to report on streaming a serialized video-based intervention to a smartphone. The approach described here may provide useful insights in assessing the advantages and disadvantages of smartphones for implementing a video-based intervention.

  15. Metal-coated microfluidic channels: An approach to eliminate streaming potential effects in nano biosensors.

    PubMed

    Lee, Jieun; Wipf, Mathias; Mu, Luye; Adams, Chris; Hannant, Jennifer; Reed, Mark A

    2017-01-15

    We report a method to suppress streaming potential using an Ag-coated microfluidic channel on a p-type silicon nanowire (SiNW) array measured by a multiplexed electrical readout. The metal layer sets a constant electrical potential along the microfluidic channel for a given reference electrode voltage, regardless of the flow velocity. Without the Ag layer, the magnitude and sign of the surface potential change on the SiNW depend on the flow velocity, the width of the microfluidic channel, and the device's location inside the microfluidic channel with respect to the reference electrode. Noise analysis of the SiNW array with and without the Ag coating in the fluidic channel shows that noise frequency peaks, resulting from the operation of a piezoelectric micropump, are eliminated using the Ag layer with two reference electrodes located at the inlet and outlet. This strategy presents a simple platform to eliminate the streaming potential and can become a powerful tool for nanoscale potentiometric biosensors. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Direct Characterization of Comets and Asteroids via Cosmic Dust Analysis from the Deep Space Gateway

    NASA Technical Reports Server (NTRS)

    Fries, M.; Fisher, K.

    2018-01-01

    The Deep Space Gateway (DSG) may provide a platform for direct sampling of a large number of comets and asteroids, through employment of an instrument for characterizing dust from these bodies. Every year, the Earth traverses through debris streams of dust and small particles from comets and asteroids in Earth-crossing orbits, generating short-lived outbursts of meteor activity commonly known as "meteor showers" (Figure 1). The material in each debris stream originates from a distinct parent body, many of which have been identified. By sampling this material, it is possible to quantitatively analyze the composition of a dozen or more comets and asteroids (See Figure 2, following page) without leaving cislunar space.

  17. Macrophyte presence is an indicator of enhanced denitrification and nitrification in sediments of a temperate restored agricultural stream

    EPA Science Inventory

    Stream macrophytes are often removed with their sediments to deepen stream channels, stabilize channel banks, or provide habitat for target species. These sediments may support enhanced nitrogen processing. To evaluate sediment nitrogen processing, identify seasonal patterns, and...

  18. Serial and Parallel Processing in the Primate Auditory Cortex Revisited

    PubMed Central

    Recanzone, Gregg H.; Cohen, Yale E.

    2009-01-01

    Over a decade ago it was proposed that the primate auditory cortex is organized in a serial and parallel manner in which there is a dorsal stream processing spatial information and a ventral stream processing non-spatial information. This organization is similar to the “what”/“where” processing of the primate visual cortex. This review will examine several key studies, primarily electrophysiological, that have tested this hypothesis. We also review several human imaging studies that have attempted to define these processing streams in the human auditory cortex. While there is good evidence that spatial information is processed along a particular series of cortical areas, the support for a non-spatial processing stream is not as strong. Why this should be the case and how to better test this hypothesis is also discussed. PMID:19686779

  19. Towards Guided Underwater Survey Using Light Visual Odometry

    NASA Astrophysics Data System (ADS)

    Nawaf, M. M.; Drap, P.; Royer, J. P.; Merad, D.; Saccone, M.

    2017-02-01

    A lightweight, distributed visual odometry method adapted to an embedded hardware platform is proposed, with the aim of guiding underwater surveys in real time. We rely on an image stream captured using a portable stereo rig attached to the embedded system. The captured images are analyzed on the fly to assess image quality in terms of sharpness and lightness, so that immediate actions can be taken accordingly. Images are then transferred over the network to another processing unit that computes the odometry. Building on a standard ego-motion estimation approach, we speed up point matching between image quadruplets using a low-level matching scheme that relies on the fast Harris operator and template matching invariant to illumination changes. We benefit from having the light source attached to the hardware platform to estimate an a priori rough depth belief following the law of light divergence over distance. The rough depth is used to limit the point-correspondence search zone, as it depends linearly on disparity. A stochastic relative bundle adjustment is applied to minimize re-projection errors. The evaluation of the proposed method demonstrates the gain in computation time with respect to other approaches that use more sophisticated feature descriptors. The built system opens promising areas for further development and integration of embedded computer vision techniques.
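
    The Harris operator mentioned above scores each pixel by the eigenstructure of the local gradient covariance. A minimal NumPy sketch of the response (simple box-filter window, no Gaussian weighting or non-maximum suppression, unlike a production detector):

```python
import numpy as np

def harris_response(img: np.ndarray, k: float = 0.04) -> np.ndarray:
    """Harris corner response R = det(M) - k * trace(M)^2 per pixel,
    where M is the locally summed structure tensor of image gradients."""
    img = img.astype(float)
    # Image gradients via central differences (axis 0 = rows, axis 1 = cols).
    Iy, Ix = np.gradient(img)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):
        # 3x3 box filter to accumulate the structure tensor locally.
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

# A bright square on a dark background: corners score highest.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
```

    Corners of the square score positive, edges negative, and flat regions near zero, which is exactly the property a matcher exploits to pick stable, repeatable points for template matching.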

  20. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data

    PubMed Central

    Venkat, A.; Christensen, C.; Gyulassy, A.; Summa, B.; Federer, F.; Angelucci, A.; Pascucci, V.

    2017-01-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache-oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data. PMID:28638896
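
    The hierarchical cache-oblivious IDX layout is built on a hierarchical Z-order traversal of the samples; the bit-interleaving Morton index that underlies this idea can be sketched as follows (illustrative only, not the actual IDX HZ-order encoding):

```python
def morton2d(x: int, y: int) -> int:
    """Interleave the bits of x and y into a single Z-order index.

    Points close in 2D tend to stay close in the 1D index, which is the
    locality property hierarchical layouts like IDX build upon.
    """
    z = 0
    for i in range(32):  # supports coordinates up to 2^32 - 1
        z |= (x >> i & 1) << (2 * i) | (y >> i & 1) << (2 * i + 1)
    return z

# Visiting a 4x4 grid in Morton order yields the familiar "Z" pattern.
order = sorted(((morton2d(x, y), (x, y))
                for x in range(4) for y in range(4)))
print([p for _, p in order][:4])  # -> [(0, 0), (1, 0), (0, 1), (1, 1)]
```

    Storing samples in such an order keeps spatially adjacent data adjacent on disk at every scale, which is what makes coarse-to-fine interactive exploration possible without tuning for any particular cache size.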

  1. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data.

    PubMed

    Venkat, A; Christensen, C; Gyulassy, A; Summa, B; Federer, F; Angelucci, A; Pascucci, V

    2016-08-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache-oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data.

  2. EFFECTS OF HYDROLOGY ON NITROGEN PROCESSING IN A RESTORED URBAN STREAM

    EPA Science Inventory

    In 2001, EPA undertook an intensive research effort to evaluate the impact of stream restoration on water quality at a degraded stream in an urban watershed. An essential piece of this comprehensive study was to characterize, measure and quantify stream ground water/ stream wate...

  3. Experimental reductions in stream flow alter litter processing and consumer subsidies in headwater streams

    Treesearch

    Robert M. Northington; Jackson R. Webster

    2017-01-01

    Summary: Forested headwater streams are connected to their surrounding catchments by a reliance on terrestrial subsidies. Changes in precipitation patterns and stream flow represent a potential disruption in stream ecosystem function, as the delivery of terrestrial detritus to aquatic consumers and...

  4. 40 CFR 63.146 - Process wastewater provisions-reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... wastewater provisions—reporting. (a) For each waste management unit, treatment process, or control device... for Group 2 wastewater streams. This paragraph does not apply to Group 2 wastewater streams that are used to comply with § 63.138(g). For Group 2 wastewater streams, the owner or operator shall include...

  5. 40 CFR 63.146 - Process wastewater provisions-reporting.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... wastewater provisions—reporting. (a) For each waste management unit, treatment process, or control device... for Group 2 wastewater streams. This paragraph does not apply to Group 2 wastewater streams that are used to comply with § 63.138(g). For Group 2 wastewater streams, the owner or operator shall include...

  6. 40 CFR 63.146 - Process wastewater provisions-reporting.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... wastewater provisions—reporting. (a) For each waste management unit, treatment process, or control device... for Group 2 wastewater streams. This paragraph does not apply to Group 2 wastewater streams that are used to comply with § 63.138(g). For Group 2 wastewater streams, the owner or operator shall include...

  7. 40 CFR 63.146 - Process wastewater provisions-reporting.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... wastewater provisions—reporting. (a) For each waste management unit, treatment process, or control device... for Group 2 wastewater streams. This paragraph does not apply to Group 2 wastewater streams that are used to comply with § 63.138(g). For Group 2 wastewater streams, the owner or operator shall include...

  8. Application of Emerging Open-source Embedded Systems for Enabling Low-cost Wireless Mini-observatory Nodes in the Coastal Zone

    NASA Astrophysics Data System (ADS)

    Glazer, B. T.

    2016-02-01

    Here, we describe the development of novel, low-cost, open-source instrumentation to enable wireless data transfer from biogeochemical sensors in the coastal zone. The platform is centered on the Beaglebone Black single-board computer. Process inquiry in the environmental sciences suffers from undersampling; enabling sustained and unattended data collection typically involves expensive instrumentation and infrastructure deployed as cabled observatories, with little flexibility in deployment location following initial installation. The high cost of commercially available or custom electronic packages has not only limited the number of sensor-node sites that can be targeted by reasonably well-funded academic researchers, but has also entirely prohibited widespread engagement with K-12, public non-profit, and 'citizen scientist' STEM audiences. The new platform under development represents a balanced blend of research-grade sensors and low-cost, open-source electronics that are easily assembled. Custom, robust, open-source code that remains customizable for specific node configurations can match a specific deployment's measurement needs, depending on the scientific research priorities. We have demonstrated prototype capabilities and versatility through lab testing and field deployments of multiple sensor nodes with multiple sensor inputs, all of which stream near-real-time data over wireless RF links to a shore-based base station. On shore, first-pass QA/QC data processing takes place and near-real-time plots are made available on the World Wide Web. Specifically, we have worked closely with an environmental and cultural management and restoration non-profit organization, and with middle and high school science classes, engaging their interest in applying STEM to local watershed processes. Ultimately, continued successful development of this pilot project can lead to a coastal oceanographic analogue of the popular Weather Underground personal weather station model.

  9. Continuous-flow free acid monitoring method and system

    DOEpatents

    Strain, J.E.; Ross, H.H.

    1980-01-11

    A free acid monitoring method and apparatus is provided for continuously measuring the excess acid present in a process stream. The disclosed monitoring system and method is based on the relationship of the partial pressure ratio of water and acid in equilibrium with an acid solution at constant temperature. A portion of the process stream is pumped into and flows through the monitor under the influence of gravity and back to the process stream. A continuously flowing sample is vaporized at a constant temperature and the vapor is subsequently condensed. Conductivity measurements of the condensate produce a nonlinear response function from which the free acid molarity of the sample process stream is determined.
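
    In practice, a nonlinear but monotone response function like the one described above can be inverted numerically from a calibration table. A minimal sketch with made-up calibration points (not values from the patent):

```python
import numpy as np

# Hypothetical calibration: condensate conductivity measured for known
# free-acid molarities at the monitor's operating temperature.
# All values are illustrative only.
molarity_cal = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 8.0])        # mol/L
conductivity_cal = np.array([0.8, 3.5, 6.2, 10.1, 14.8, 18.9])  # mS/cm

def free_acid_molarity(conductivity_mS_cm: float) -> float:
    """Invert the monotone nonlinear response function by piecewise-linear
    interpolation of the calibration table (conductivity -> molarity)."""
    return float(np.interp(conductivity_mS_cm,
                           conductivity_cal, molarity_cal))

print(free_acid_molarity(6.2))  # -> 1.0
```

    Because the response is monotone in this sketch, the interpolation is a well-defined inverse; a real monitor would also correct for temperature drift and recalibrate periodically.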

  10. Continuous-flow free acid monitoring method and system

    DOEpatents

    Strain, James E.; Ross, Harley H.

    1981-01-01

    A free acid monitoring method and apparatus is provided for continuously measuring the excess acid present in a process stream. The disclosed monitoring system and method is based on the relationship of the partial pressure ratio of water and acid in equilibrium with an acid solution at constant temperature. A portion of the process stream is pumped into and flows through the monitor under the influence of gravity and back to the process stream. A continuously flowing sample is vaporized at a constant temperature and the vapor is subsequently condensed. Conductivity measurements of the condensate produce a nonlinear response function from which the free acid molarity of the sample process stream is determined.

  11. 4DCAPTURE: a general purpose software package for capturing and analyzing two- and three-dimensional motion data acquired from video sequences

    NASA Astrophysics Data System (ADS)

    Walton, James S.; Hodgson, Peter; Hallamasek, Karen; Palmer, Jake

    2003-07-01

    4DVideo is creating a general purpose capability for capturing and analyzing kinematic data from video sequences in near real-time. The core element of this capability is a software package designed for the PC platform. The software ("4DCapture") is designed to capture and manipulate customized AVI files that can contain a variety of synchronized data streams -- including audio, video, centroid locations -- and signals acquired from more traditional sources (such as accelerometers and strain gauges.) The code includes simultaneous capture or playback of multiple video streams, and linear editing of the images (together with the ancillary data embedded in the files). Corresponding landmarks seen from two or more views are matched automatically, and photogrammetric algorithms permit multiple landmarks to be tracked in two- and three-dimensions -- with or without lens calibrations. Trajectory data can be processed within the main application or they can be exported to a spreadsheet where they can be processed or passed along to a more sophisticated, stand-alone, data analysis application. Previous attempts to develop such applications for high-speed imaging have been limited in their scope, or by the complexity of the application itself. 4DVideo has devised a friendly ("FlowStack") user interface that assists the end-user to capture and treat image sequences in a natural progression. 4DCapture employs the AVI 2.0 standard and DirectX technology which effectively eliminates the file size limitations found in older applications. In early tests, 4DVideo has streamed three RS-170 video sources to disk for more than an hour without loss of data. At this time, the software can acquire video sequences in three ways: (1) directly, from up to three hard-wired cameras supplying RS-170 (monochrome) signals; (2) directly, from a single camera or video recorder supplying an NTSC (color) signal; and (3) by importing existing video streams in the AVI 1.0 or AVI 2.0 formats.
The latter is particularly useful for high-speed applications where the raw images are often captured and stored by the camera before being downloaded. Provision has been made to synchronize data acquired from any combination of these video sources using audio and visual "tags." Additional "front-ends," designed for digital cameras, are anticipated.

  12. Controlled temperature expansion in oxygen production by molten alkali metal salts

    DOEpatents

    Erickson, Donald C.

    1985-06-04

    A continuous process is set forth for the production of oxygen from an oxygen-containing gas stream, such as air, by contacting a feed gas stream with a molten solution of an oxygen acceptor to oxidize the acceptor and cyclically regenerating the oxidized acceptor by releasing oxygen from it. The oxygen-depleted gas stream from the contact zone is treated sequentially to: temperature reduction by heat exchange against the feed stream, so as to condense out entrained oxygen acceptor for recycle to the process; combustion of the gas stream with fuel to elevate its temperature; and expansion of the combusted high-temperature gas stream in a turbine to recover power.

  13. Controlled temperature expansion in oxygen production by molten alkali metal salts

    DOEpatents

    Erickson, D.C.

    1985-06-04

    A continuous process is set forth for the production of oxygen from an oxygen-containing gas stream, such as air, by contacting a feed gas stream with a molten solution of an oxygen acceptor to oxidize the acceptor and cyclically regenerating the oxidized acceptor by releasing oxygen from it. The oxygen-depleted gas stream from the contact zone is treated sequentially to: temperature reduction by heat exchange against the feed stream, so as to condense out entrained oxygen acceptor for recycle to the process; combustion of the gas stream with fuel to elevate its temperature; and expansion of the combusted high-temperature gas stream in a turbine to recover power. 1 fig.

  14. Methanation of gas streams containing carbon monoxide and hydrogen

    DOEpatents

    Frost, Albert C.

    1983-01-01

    Carbon monoxide-containing gas streams having a relatively high concentration of hydrogen are pretreated so as to remove the hydrogen in a recoverable form for use in the second step of a cyclic, essentially two-step process for the production of methane. The thus-treated streams are then passed over a catalyst to deposit a surface layer of active surface carbon thereon essentially without the formation of inactive coke. This active carbon is reacted with said hydrogen removed from the feed gas stream to form methane. The utilization of the CO in the feed gas stream is appreciably increased, enhancing the overall process for the production of relatively pure, low-cost methane from CO-containing waste gas streams.

  15. Method for removing undesired particles from gas streams

    DOEpatents

    Durham, M.D.; Schlager, R.J.; Ebner, T.G.; Stewart, R.M.; Hyatt, D.E.; Bustard, C.J.; Sjostrom, S.

    1998-11-10

    The present invention discloses a process for removing undesired particles from a gas stream including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an agglomerate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may then be atomized and injected into the gas stream. The composition may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired particle collection efficiency. 11 figs.

  16. Method and apparatus for decreased undesired particle emissions in gas streams

    DOEpatents

    Durham, M.D.; Schlager, R.J.; Ebner, T.G.; Stewart, R.M.; Bustard, C.J.

    1999-04-13

    The present invention discloses a process for removing undesired particles from a gas stream including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an agglomerate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may then be atomized and injected into the gas stream. The composition may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired particle collection efficiency. 5 figs.

  17. Method and apparatus for decreased undesired particle emissions in gas streams

    DOEpatents

    Durham, Michael Dean; Schlager, Richard John; Ebner, Timothy George; Stewart, Robin Michele; Bustard, Cynthia Jean

    1999-01-01

    The present invention discloses a process for removing undesired particles from a gas stream including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an agglomerate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may then be atomized and injected into the gas stream. The composition may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired particle collection efficiency.

  18. Method for removing undesired particles from gas streams

    DOEpatents

    Durham, Michael Dean; Schlager, Richard John; Ebner, Timothy George; Stewart, Robin Michele; Hyatt, David E.; Bustard, Cynthia Jean; Sjostrom, Sharon

    1998-01-01

    The present invention discloses a process for removing undesired particles from a gas stream including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an agglomerate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may then be atomized and injected into the gas stream. The composition may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired particle collection efficiency.

  19. Process and system for removing impurities from a gas

    DOEpatents

    Henningsen, Gunnar; Knowlton, Teddy Merrill; Findlay, John George; Schlather, Jerry Neal; Turk, Brian S

    2014-04-15

    A fluidized reactor system for removing impurities from a gas and an associated process are provided. The system includes a fluidized absorber for contacting a feed gas with a sorbent stream to reduce the impurity content of the feed gas; a fluidized solids regenerator for contacting an impurity loaded sorbent stream with a regeneration gas to reduce the impurity content of the sorbent stream; a first non-mechanical gas seal forming solids transfer device adapted to receive an impurity loaded sorbent stream from the absorber and transport the impurity loaded sorbent stream to the regenerator at a controllable flow rate in response to an aeration gas; and a second non-mechanical gas seal forming solids transfer device adapted to receive a sorbent stream of reduced impurity content from the regenerator and transfer the sorbent stream of reduced impurity content to the absorber without changing the flow rate of the sorbent stream.

  20. Tracking Training-Related Plasticity by Combining fMRI and DTI: The Right Hemisphere Ventral Stream Mediates Musical Syntax Processing.

    PubMed

    Oechslin, Mathias S; Gschwind, Markus; James, Clara E

    2018-04-01

    Neuroimaging studies have evidenced the involvement of right prefrontal regions in musical syntax processing, a functional homolog of left-hemispheric syntax processing in language, but the underlying white matter connectivity has so far remained unexplored. In the current experiment, we investigated the underlying pathway architecture in subjects with three levels of musical expertise. Employing diffusion tensor imaging tractography, departing from seeds from our previous functional magnetic resonance imaging study on musical syntax processing in the same participants, we identified a pathway in the right ventral stream that connects the middle temporal lobe with the inferior frontal cortex via the extreme capsule and corresponds to the left-hemisphere ventral stream classically attributed to syntax processing in language comprehension. Additional morphometric consistency analyses allowed dissociating the tract core from more dispersed fiber portions. Musical expertise was related to higher tract consistency of the right ventral stream pathway. Specifically, tract consistency in this pathway predicted sensitivity to musical syntax violations. We conclude that enduring musical practice sculpts ventral stream architecture. Our results suggest that training-related pathway plasticity facilitates information transfer in the right-hemisphere ventral stream, supporting an improved sound-to-meaning mapping in music.

  1. Data Streams: An Overview and Scientific Applications

    NASA Astrophysics Data System (ADS)

    Aggarwal, Charu C.

    In recent years, advances in hardware technology have facilitated the ability to collect data continuously. Simple transactions of everyday life, such as using a credit card, a phone, or browsing the web, lead to automated data storage. Similarly, advances in information technology have led to large flows of data across IP networks. In many cases, these large volumes of data can be mined for interesting and relevant information in a wide variety of applications. When the volume of the underlying data is very large, a number of computational and mining challenges arise: with increasing volume of the data, it is no longer possible to process the data efficiently by using multiple passes. Rather, one can process a data item at most once. This leads to constraints on the implementation of the underlying algorithms. Therefore, stream mining algorithms typically need to be designed so that they work with one pass over the data. In most cases, there is an inherent temporal component to the stream mining process, because the data may evolve over time. This behavior of data streams is referred to as temporal locality. Therefore, a straightforward adaptation of one-pass mining algorithms may not be an effective solution to the task. Stream mining algorithms need to be carefully designed with a clear focus on the evolution of the underlying data. Another important characteristic of data streams is that they are often mined in a distributed fashion, and the individual processors may have limited processing power and memory. Examples of such cases include sensor networks, in which it may be desirable to perform in-network processing of data streams with limited processing and memory [1, 2]. This chapter will provide an overview of the key challenges in stream mining algorithms which arise from the unique setup in which these problems are encountered. This chapter is organized as follows.
In the next section, we will discuss the generic challenges that stream mining poses to a variety of data management and data mining problems. The next section also deals with several issues which arise in the context of data stream management. In Sect. 3, we discuss several mining algorithms on the data stream model. Section 4 discusses various scientific applications of data streams. Section 5 discusses the research directions and conclusions.
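
    The one-pass constraint described above is well illustrated by reservoir sampling, a classic streaming algorithm that maintains a uniform random sample while touching each item exactly once and using only O(k) memory:

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform random sample of size k from a stream in one pass.

    Each item is examined exactly once and memory stays O(k): the
    defining constraints of the streaming model.
    """
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)  # fill the reservoir first
        else:
            # Item i survives with probability k / (i + 1), which keeps
            # the sample uniform over everything seen so far.
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(10_000), 5)
print(len(sample))  # -> 5
```

    Note that the technique handles temporal locality only in the sense of remaining uniform over the whole stream; variants with decaying weights are used when recent items should dominate.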

  2. Role of submerged vegetation in the retention processes of three plant protection products in flow-through stream mesocosms.

    PubMed

    Stang, Christoph; Wieczorek, Matthias Valentin; Noss, Christian; Lorke, Andreas; Scherr, Frank; Goerlitz, Gerhard; Schulz, Ralf

    2014-07-01

    Quantitative information on the processes leading to the retention of plant protection products (PPPs) in surface waters is not available, particularly for flow-through systems. The influence of aquatic vegetation on the hydraulic- and sorption-mediated mitigation processes of three PPPs (triflumuron, pencycuron, and penflufen; logKOW 3.3-4.9) in 45-m slow-flowing stream mesocosms was investigated. Peak reductions were 35-38% in an unvegetated stream mesocosm, 60-62% in a sparsely vegetated stream mesocosm (13% coverage with Elodea nuttallii), and in a similar range of 57-69% in a densely vegetated stream mesocosm (100% coverage). Between 89% and 93% of the measured total peak reductions in the sparsely vegetated stream can be explained by an increase of vegetation-induced dispersion (estimated with the one-dimensional solute transport model OTIS), while 7-11% of the peak reduction can be attributed to sorption processes. However, dispersion contributed only 59-71% of the peak reductions in the densely vegetated stream mesocosm, where 29% to 41% of the total peak reductions can be attributed to sorption processes. In the densely vegetated stream, 8-27% of the applied PPPs, depending on the logKOW values of the compounds, were temporarily retained by macrophytes. Increasing PPP recoveries in the aqueous phase were accompanied by a decrease of PPP concentrations in macrophytes indicating kinetic desorption over time. This is the first study to provide quantitative data on how the interaction of dispersion and sorption, driven by aquatic macrophytes, influences the mitigation of PPP concentrations in flowing vegetated stream systems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Process for the physical segregation of minerals

    DOEpatents

    Yingling, Jon C.; Ganguli, Rajive

    2004-01-06

    With highly heterogeneous groups or streams of minerals, physical segregation using online quality measurements is an economically important first stage of the mineral beneficiation process. Segregation enables high quality fractions of the stream to bypass processing, such as cleaning operations, thereby reducing the associated costs and avoiding the yield losses inherent in any downstream separation process. The present invention includes various methods for reliably segregating a mineral stream into at least one fraction meeting desired quality specifications while at the same time maximizing yield of that fraction.
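
    One simple way to realize this idea is a greedy blend: sort incoming batches by measured impurity and accumulate the bypass fraction while its blended average still meets the quality specification. This is only a sketch of the general approach, not the patented method; all names and numbers are illustrative:

```python
def segregate(batches, max_impurity):
    """Greedily build the largest bypass fraction whose blended impurity
    stays within spec. batches: list of (mass, impurity_pct)."""
    selected, mass_sum, impurity_sum = [], 0.0, 0.0
    for mass, imp in sorted(batches, key=lambda b: b[1]):
        new_mass = mass_sum + mass
        new_imp = impurity_sum + mass * imp
        if new_imp / new_mass <= max_impurity:
            selected.append((mass, imp))
            mass_sum, impurity_sum = new_mass, new_imp
        else:
            break  # remaining batches are even higher in impurity
    return selected, (impurity_sum / mass_sum if mass_sum else 0.0)

# Four 10-tonne batches with online impurity readings (illustrative).
batches = [(10, 5.0), (10, 9.0), (10, 14.0), (10, 30.0)]
sel, blend = segregate(batches, max_impurity=10.0)
```

    Note that blending lets a batch above spec (14%) still bypass cleaning because the running average remains within the limit, which is exactly how segregation preserves yield relative to batch-by-batch rejection.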

  4. Analytic Strategies of Streaming Data for eHealth.

    PubMed

    Yoon, Sunmoo

    2016-01-01

    New analytic strategies for streaming big data from wearable devices and social media are emerging in eHealth. Finding meaningful patterns in big data is challenging because researchers have difficulty processing large volumes of streaming data with traditional processing applications. This introductory 180-minute tutorial offers hands-on instruction in analytics (e.g., topic modeling, social network analysis) of streaming data. The tutorial aims to provide practical strategies for reducing dimensionality, using examples from big data, and will highlight strategies for incorporating domain experts and a comprehensive approach to streaming social media data.
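
    A common dimensionality-reduction tactic for streaming text of this kind is the hashing trick: tokens are projected into a fixed-size vector so memory stays bounded no matter how much data arrives. A minimal sketch (the tokens and dimension are illustrative; the tutorial's actual methods are not specified here):

```python
import hashlib

def hash_features(tokens, dim=64):
    """Project a token stream into a fixed low-dimensional count vector
    (the 'hashing trick'), so memory stays bounded as data streams in."""
    vec = [0] * dim
    for tok in tokens:
        # Stable hash -> bucket index; collisions are tolerated by design.
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        vec[h % dim] += 1
    return vec

v = hash_features("flu fever cough flu".split(), dim=16)
```

    Because the output size is fixed, downstream models (topic models, classifiers) can be updated incrementally without storing a growing vocabulary.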

  5. Nitrogen processing by grazers in a headwater stream: riparian connections

    DOE PAGES

    Hill, Walter R.; Griffiths, Natalie A.

    2016-10-19

    Primary consumers play important roles in the cycling of nutrients in headwater streams, storing assimilated nutrients in growing tissue and recycling them through excretion. Though environmental conditions in most headwater streams and their surrounding terrestrial ecosystems vary considerably over the course of a year, relatively little is known about the effects of seasonality on consumer nutrient recycling in these streams. Here, we measured nitrogen accumulated through growth and excreted by the grazing snail Elimia clavaeformis (Pleuroceridae) over the course of 12 months in Walker Branch, identifying close connections between in-stream nitrogen processing and seasonal changes in the surrounding forest.

  6. Method and apparatus for separation of heavy and tritiated water

    DOEpatents

    Lee, Myung W.

    2001-01-01

    The present invention is a bi-thermal membrane process for separating and recovering hydrogen isotopes from a fluid containing hydrogen isotopes, such as water and hydrogen gas. The process in accordance with the present invention provides counter-current cold and hot streams of the fluid separated with a thermally insulating and chemically transparent proton exchange membrane (PEM). The two streams exchange hydrogen isotopes through the membrane: the heavier isotopes migrate into the cold stream, while the lighter isotopes migrate into the hot stream. The heavy and light isotopes are continuously withdrawn from the cold and hot streams respectively.

  7. A multistream model of visual word recognition.

    PubMed

    Allen, Philip A; Smith, Albert F; Lien, Mei-Ching; Kaut, Kevin P; Canfield, Angie

    2009-02-01

    Four experiments are reported that test a multistream model of visual word recognition, which associates letter-level and word-level processing channels with three known visual processing streams isolated in macaque monkeys: the magno-dominated (MD) stream, the interblob-dominated (ID) stream, and the blob-dominated (BD) stream (Van Essen & Anderson, 1995). We show that mixing the color of adjacent letters of words does not result in facilitation of response times or error rates when the spatial-frequency pattern of a whole word is familiar. However, facilitation does occur when the spatial-frequency pattern of a whole word is not familiar. This pattern of results is not due to different luminance levels across the different-colored stimuli and the background because isoluminant displays were used. Also, the mixed-case, mixed-hue facilitation occurred when different display distances were used (Experiments 2 and 3), so this suggests that image normalization can adjust independently of object size differences. Finally, we show that this effect persists in both spaced and unspaced conditions (Experiment 4), suggesting that inappropriate letter grouping by hue cannot account for these results. These data support a model of visual word recognition in which lower spatial frequencies are processed first in the more rapid MD stream. The slower ID and BD streams may process some lower spatial frequency information in addition to processing higher spatial frequency information, but these channels tend to lose the processing race to recognition unless the letter string is unfamiliar to the MD stream, as with mixed-case presentation.

  8. Tampa Bay as a model estuary for examining the impact of human activities on biogeochemical processes: an introduction

    USGS Publications Warehouse

    Swarzenski, Peter W.; Baskaran, Mark; Henderson, Carl S.; Yates, Kim

    2007-01-01

    Tampa Bay is a shallow, Y-shaped coastal embayment that is located along the center of the Florida Platform – an expansive accumulation of Cretaceous–Tertiary shallow-water carbonates and evaporites that were periodically exposed during glacio–eustatic sea level fluctuations. As a consequence, extensive karstification likely had a controlling impact on the geologic evolution of Tampa Bay. Despite its large areal extent (∼1000 km²), Tampa Bay is relatively shallow (mean depth = 4 m) and its watershed (6700 km²) is among the smallest in the Gulf of Mexico. About 85% of all freshwater inflow (mean = 63 m3 s-1) to the bay is carried by four principal tributaries (Orlando et al., 1993). Groundwater makes up an important component of baseflow of these coastal streams and may also be important in delivering nutrients and other constituents to the bay proper by submarine groundwater discharge.

  9. KSC-2009-3556

    NASA Image and Video Library

    2009-06-05

    CAPE CANAVERAL, Fla. – At the Astrotech payload processing facility in Titusville, Fla., workers prepare to move the platform on which the encapsulated GOES-O satellite sits in preparation for moving GOES-O to Cape Canaveral Air Force Station's Launch Complex 37 pad where it will be mated with the United Launch Alliance Delta IV expendable launch vehicle. The GOES-O satellite is targeted to launch no earlier than June 26. The latest Geostationary Operational Environmental Satellite, GOES-O was developed by NASA for the National Oceanic and Atmospheric Administration, or NOAA. The GOES satellites continuously provide observations of 60 percent of the Earth including the continental United States, providing weather monitoring and forecast operations as well as a continuous and reliable stream of environmental information and severe weather warnings. Once in orbit, GOES-O will be designated GOES-14, and NASA will provide on-orbit checkout and then transfer operational responsibility to NOAA. Photo credit: NASA/Jack Pfaller

  10. KSC-2009-3557

    NASA Image and Video Library

    2009-06-05

    CAPE CANAVERAL, Fla. – At the Astrotech payload processing facility in Titusville, Fla., workers move the platform on which the encapsulated GOES-O satellite sits in preparation for moving GOES-O to Cape Canaveral Air Force Station's Launch Complex 37 pad where it will be mated with the United Launch Alliance Delta IV expendable launch vehicle. The GOES-O satellite is targeted to launch no earlier than June 26. The latest Geostationary Operational Environmental Satellite, GOES-O was developed by NASA for the National Oceanic and Atmospheric Administration, or NOAA. The GOES satellites continuously provide observations of 60 percent of the Earth including the continental United States, providing weather monitoring and forecast operations as well as a continuous and reliable stream of environmental information and severe weather warnings. Once in orbit, GOES-O will be designated GOES-14, and NASA will provide on-orbit checkout and then transfer operational responsibility to NOAA. Photo credit: NASA/Jack Pfaller

  11. KSC-2009-3554

    NASA Image and Video Library

    2009-06-05

    CAPE CANAVERAL, Fla. – At the Astrotech payload processing facility in Titusville, Fla., access platforms are being removed from around the encapsulated GOES-O satellite in preparation for moving GOES-O to Cape Canaveral Air Force Station's Launch Complex 37 pad where it will be mated with the United Launch Alliance Delta IV expendable launch vehicle. The GOES-O satellite is targeted to launch no earlier than June 26. The latest Geostationary Operational Environmental Satellite, GOES-O was developed by NASA for the National Oceanic and Atmospheric Administration, or NOAA. The GOES satellites continuously provide observations of 60 percent of the Earth including the continental United States, providing weather monitoring and forecast operations as well as a continuous and reliable stream of environmental information and severe weather warnings. Once in orbit, GOES-O will be designated GOES-14, and NASA will provide on-orbit checkout and then transfer operational responsibility to NOAA. Photo credit: NASA/Jack Pfaller

  12. KSC-2009-3555

    NASA Image and Video Library

    2009-06-05

    CAPE CANAVERAL, Fla. – At the Astrotech payload processing facility in Titusville, Fla., access platforms are being removed from around the encapsulated GOES-O satellite in preparation for moving GOES-O to Cape Canaveral Air Force Station's Launch Complex 37 pad where it will be mated with the United Launch Alliance Delta IV expendable launch vehicle. The GOES-O satellite is targeted to launch no earlier than June 26. The latest Geostationary Operational Environmental Satellite, GOES-O was developed by NASA for the National Oceanic and Atmospheric Administration, or NOAA. The GOES satellites continuously provide observations of 60 percent of the Earth including the continental United States, providing weather monitoring and forecast operations as well as a continuous and reliable stream of environmental information and severe weather warnings. Once in orbit, GOES-O will be designated GOES-14, and NASA will provide on-orbit checkout and then transfer operational responsibility to NOAA. Photo credit: NASA/Jack Pfaller

  13. KSC-2009-3558

    NASA Image and Video Library

    2009-06-05

    CAPE CANAVERAL, Fla. – At the Astrotech payload processing facility in Titusville, Fla., workers move the platform on which the encapsulated GOES-O satellite sits in preparation for moving GOES-O to Cape Canaveral Air Force Station's Launch Complex 37 pad where it will be mated with the United Launch Alliance Delta IV expendable launch vehicle. The GOES-O satellite is targeted to launch no earlier than June 26. The latest Geostationary Operational Environmental Satellite, GOES-O was developed by NASA for the National Oceanic and Atmospheric Administration, or NOAA. The GOES satellites continuously provide observations of 60 percent of the Earth including the continental United States, providing weather monitoring and forecast operations as well as a continuous and reliable stream of environmental information and severe weather warnings. Once in orbit, GOES-O will be designated GOES-14, and NASA will provide on-orbit checkout and then transfer operational responsibility to NOAA. Photo credit: NASA/Jack Pfaller

  14. The HEC RAS model of regulated stream for purposes of flood risk reduction

    NASA Astrophysics Data System (ADS)

    Fijko, Rastislav; Zeleňáková, Martina

    2016-06-01

    This work highlights the modeling of water flow in open channels using the 1D mathematical model HEC-RAS for the area of interest, Lopuchov village in eastern Slovakia. We created a digital model from a geodetic survey, which was used to show the inundation area in ArcGIS software. We describe the modeling methodology, with emphasis on the collection of data and their relevance for determining boundary conditions in a 3D model of the study area on a GIS platform. BIM objects can be exported to the defined model of the area. The obtained results were used to simulate flooding and give clearly and distinctly defined inundation areas, which we used in the cost-benefit analysis. We used the developed model to estimate potential damages in flood-vulnerable areas.

  15. Dust-penetrating (DUSPEN) see-through lidar for helicopter situational awareness in DVE

    NASA Astrophysics Data System (ADS)

    Murray, James T.; Seely, Jason; Plath, Jeff; Gotfredson, Eric; Engel, John; Ryder, Bill; Van Lieu, Neil; Goodwin, Ron; Wagner, Tyler; Fetzer, Greg; Kridler, Nick; Melancon, Chris; Panici, Ken; Mitchell, Anthony

    2013-10-01

    Areté Associates recently developed and flight tested a next-generation low-latency near real-time dust-penetrating (DUSPEN) imaging lidar system. These tests were accomplished for Naval Air Warfare Center (NAWC) Aircraft Division (AD) 4.5.6 (EO/IR Sensor Division) under the Office of Naval Research (ONR) Future Naval Capability (FNC) Helicopter Low-Level Operations (HELO) Product 2 program. Areté's DUSPEN system captures full lidar waveforms and uses sophisticated real-time detection and filtering algorithms to discriminate hard target returns from dust and other obscurants. Down-stream 3D image processing methods are used to enhance pilot visualization of threat objects and ground features during severe DVE conditions. This paper presents results from these recent flight tests in full brown-out conditions at Yuma Proving Grounds (YPG) from a CH-53E Super Stallion helicopter platform.

  16. Dust-Penetrating (DUSPEN) "see-through" lidar for helicopter situational awareness in DVE

    NASA Astrophysics Data System (ADS)

    Murray, James T.; Seely, Jason; Plath, Jeff; Gotfredson, Eric; Engel, John; Ryder, Bill; Van Lieu, Neil; Goodwin, Ron; Wagner, Tyler; Fetzer, Greg; Kridler, Nick; Melancon, Chris; Panici, Ken; Mitchell, Anthony

    2013-05-01

    Areté Associates recently developed and flight tested a next-generation low-latency near real-time dust-penetrating (DUSPEN) imaging lidar system. These tests were accomplished for Naval Air Warfare Center (NAWC) Aircraft Division (AD) 4.5.6 (EO/IR Sensor Division) under the Office of Naval Research (ONR) Future Naval Capability (FNC) Helicopter Low-Level Operations (HELO) Product 2 program. Areté's DUSPEN system captures full lidar waveforms and uses sophisticated real-time detection and filtering algorithms to discriminate hard target returns from dust and other obscurants. Down-stream 3D image processing methods are used to enhance pilot visualization of threat objects and ground features during severe DVE conditions. This paper presents results from these recent flight tests in full brown-out conditions at Yuma Proving Grounds (YPG) from a CH-53E Super Stallion helicopter platform.

  17. Interconnect Performance Evaluation of SGI Altix 3700 BX2, Cray X1, Cray Opteron Cluster, and Dell PowerEdge

    NASA Technical Reports Server (NTRS)

    Fatoohi, Rod; Saini, Subbash; Ciotti, Robert

    2006-01-01

    We study the performance of inter-process communication on four high-speed multiprocessor systems using a set of communication benchmarks. The goal is to identify limiting factors and bottlenecks in the interconnects of these systems as well as to compare them. We measured network bandwidth using different numbers of communicating processors and communication patterns, such as point-to-point communication, collective communication, and dense communication patterns. The four platforms are: a 512-processor SGI Altix 3700 BX2 shared-memory machine with 3.2 GB/s links; a 64-processor (single-streaming) Cray X1 shared-memory machine with 32 1.6 GB/s links; a 128-processor Cray Opteron cluster using a Myrinet network; and a 1280-node Dell PowerEdge cluster with an InfiniBand network. Our results show the impact of network bandwidth and topology on the overall performance of each interconnect.
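
    The point-to-point measurements described here typically follow a ping-pong pattern: a fixed-size message is bounced between two endpoints and bandwidth is derived from the round-trip time. A loopback sketch of that pattern (a local socket pair standing in for a real interconnect, so the numbers mean nothing about the machines above):

```python
import socket
import threading
import time

def measure_bandwidth(msg_bytes=1 << 20, rounds=8):
    """Rough point-to-point bandwidth estimate via a ping-pong over a
    local socket pair. Purely illustrative of the benchmark pattern."""
    a, b = socket.socketpair()
    payload = b"x" * msg_bytes

    def echo():
        # Receive a full message, then echo one back, each round.
        for _ in range(rounds):
            seen = 0
            while seen < msg_bytes:
                seen += len(b.recv(1 << 16))
            b.sendall(payload)

    t = threading.Thread(target=echo)
    t.start()
    start = time.perf_counter()
    for _ in range(rounds):
        a.sendall(payload)
        seen = 0
        while seen < msg_bytes:
            seen += len(a.recv(1 << 16))
    elapsed = time.perf_counter() - start
    t.join()
    a.close(); b.close()
    # Each round moves msg_bytes in each direction.
    return 2 * rounds * msg_bytes / elapsed / 1e9  # GB/s

gbps = measure_bandwidth()
```

    Real interconnect benchmarks (e.g., MPI-based ones) vary the message size and the number of communicating pairs to expose the bandwidth and topology effects the abstract reports.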

  18. COLA: Optimizing Stream Processing Applications via Graph Partitioning

    NASA Astrophysics Data System (ADS)

    Khandekar, Rohit; Hildrum, Kirsten; Parekh, Sujay; Rajan, Deepak; Wolf, Joel; Wu, Kun-Lung; Andrade, Henrique; Gedik, Buğra

    In this paper, we describe an optimization scheme for fusing compile-time operators into reasonably-sized run-time software units called processing elements (PEs). Such PEs are the basic deployable units in System S, a highly scalable distributed stream processing middleware system. Finding a high quality fusion significantly benefits the performance of streaming jobs. In order to maximize throughput, our solution approach attempts to minimize the processing cost associated with inter-PE stream traffic while simultaneously balancing load across the processing hosts. Our algorithm computes a hierarchical partitioning of the operator graph based on a minimum-ratio cut subroutine. We also incorporate several fusion constraints in order to support real-world System S jobs. We experimentally compare our algorithm with several other reasonable alternative schemes, highlighting the effectiveness of our approach.
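
    The flavor of the optimization can be seen in a toy two-way placement: operators carry CPU costs (node weights) and streams carry traffic (edge weights); the goal is balanced load with little inter-host traffic. COLA's actual algorithm uses hierarchical minimum-ratio-cut partitioning; the greedy baseline below is only a sketch with invented operator names:

```python
def two_way_partition(node_cost, edges):
    """Toy operator placement: assign each operator (heaviest first) to
    the lighter of two hosts, then report the load split and the
    inter-host stream traffic (the 'cut')."""
    load = [0.0, 0.0]
    side = {}
    for op in sorted(node_cost, key=node_cost.get, reverse=True):
        s = 0 if load[0] <= load[1] else 1
        side[op] = s
        load[s] += node_cost[op]
    cut = sum(w for u, v, w in edges if side[u] != side[v])
    return side, load, cut

# A small operator graph: (cost per operator) and (u, v, traffic) edges.
costs = {"src": 4, "parse": 3, "join": 5, "sink": 2}
links = [("src", "parse", 10), ("parse", "join", 6), ("join", "sink", 8)]
placement, load, cut = two_way_partition(costs, links)
```

    Greedy placement balances load but ignores traffic; a ratio-cut-based fusion, as in the paper, would instead keep heavily communicating operators inside the same processing element.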

  19. Data Reduction of Laser Ablation Split-Stream (LASS) Analyses Using Newly Developed Features Within Iolite: With Applications to Lu-Hf + U-Pb in Detrital Zircon and Sm-Nd +U-Pb in Igneous Monazite

    NASA Astrophysics Data System (ADS)

    Fisher, Christopher M.; Paton, Chad; Pearson, D. Graham; Sarkar, Chiranjeeb; Luo, Yan; Tersmette, Daniel B.; Chacko, Thomas

    2017-12-01

    A robust platform to view and integrate multiple data sets collected simultaneously is required to realize the utility and potential of the Laser Ablation Split-Stream (LASS) method. This capability, until now, has been unavailable, and practitioners have had to laboriously process each data set separately, making it challenging to take full advantage of the benefits of LASS. We describe a new program for handling multiple mass spectrometric data sets collected simultaneously, designed specifically for the LASS technique, by which a laser aerosol is split into two or more separate "streams" to be measured on separate mass spectrometers. New features within Iolite (https://iolite-software.com) enable loading, synchronizing, viewing, and reducing two or more data sets acquired simultaneously, as multiple DRSs (data reduction schemes) can be run concurrently. While this version of Iolite accommodates any combination of simultaneously collected mass spectrometer data, we demonstrate its utility using case studies in which the U-Pb and Lu-Hf isotope compositions of zircon, and the U-Pb and Sm-Nd isotope compositions of monazite, were analyzed simultaneously in crystals showing complex isotopic zonation. These studies demonstrate the importance of being able to view and integrate simultaneously acquired data sets, especially for samples with complicated zoning and decoupled isotope systematics, in order to extract accurate and geologically meaningful isotopic and compositional data. This contribution provides instructions and examples for handling simultaneously collected laser ablation data. An instructional video is also provided. The updated Iolite software will help to fully develop the applications of both LASS and multi-instrument mass spectrometric measurement capabilities.
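
    The core synchronization step, resampling one instrument's signal onto the other's timestamps so the two can be integrated together, can be sketched with linear interpolation. This is a generic illustration, not Iolite's implementation; all values are invented:

```python
from bisect import bisect_left

def align(times_a, vals_a, times_b, vals_b):
    """Resample stream B onto stream A's timestamps by linear
    interpolation, so two simultaneously acquired signals line up."""
    out = []
    for t in times_a:
        i = bisect_left(times_b, t)
        if i == 0:
            out.append(vals_b[0])        # before B starts: hold first value
        elif i == len(times_b):
            out.append(vals_b[-1])       # after B ends: hold last value
        else:
            t0, t1 = times_b[i - 1], times_b[i]
            w = (t - t0) / (t1 - t0)
            out.append(vals_b[i - 1] * (1 - w) + vals_b[i] * w)
    return list(zip(times_a, vals_a, out))

# Two toy acquisitions at different rates, merged onto A's clock.
merged = align([0.0, 1.0, 2.0], [5, 6, 7], [0.0, 2.0], [10.0, 30.0])
```

    Once both signals share a time base, integration windows chosen on one data set can be applied directly to the other, which is what makes simultaneous reduction of split-stream data practical.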

  20. Source Apportionment of Suspended Sediment Sources using 137Cs and 210Pbxs

    NASA Astrophysics Data System (ADS)

    Lamba, J.; Karthikeyan, K.; Thompson, A.

    2017-12-01

    A study was conducted in the Pleasant Valley Watershed (50 km²) in South Central Wisconsin to better understand sediment transport processes using the sediment fingerprinting technique. Previous studies conducted in this watershed showed that resuspension of fine sediment deposited on the stream bed is an important source of suspended sediment. To better understand the role of fine sediment deposited on the stream bed, the fallout radionuclides 137Cs and 210Pbxs were used to determine the relative contributions to suspended sediment from in-stream (stream bank and stream bed) and upland sediment sources. Suspended sediment samples were collected during the crop growing season. Potential sources of suspended sediment considered in this study included cropland, pasture, and in-stream sources (stream bed and stream bank). Suspended sediment sources were determined at the subwatershed level. Results of this study showed that in-stream sediment sources are important sources of suspended sediment. Future research should be conducted to better understand the role of legacy sediment in watershed-level sediment transport processes.
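
    With two tracers and three sources, apportionment reduces to a small linear system: the source fractions must sum to one and reproduce the mixture's 137Cs and 210Pbxs signatures. A pure-Python sketch using Cramer's rule, with invented tracer activities rather than the study's field data:

```python
def unmix(sources, mixture):
    """Two-tracer, three-source mixing model: solve f1+f2+f3 = 1 plus the
    137Cs and 210Pbxs balance equations for the source fractions.
    sources: {name: (cs, pb)}; mixture: (cs, pb)."""
    names = list(sources)
    (a1, b1), (a2, b2), (a3, b3) = (sources[n] for n in names)
    m = [[1, 1, 1], [a1, a2, a3], [b1, b2, b3]]
    rhs = [1.0, mixture[0], mixture[1]]

    def det(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    d = det(m)
    fracs = {}
    for j, n in enumerate(names):
        mj = [row[:] for row in m]   # Cramer's rule: swap in the RHS column
        for i in range(3):
            mj[i][j] = rhs[i]
        fracs[n] = det(mj) / d
    return fracs

# Illustrative tracer activities (Bq/kg), not field data.
f = unmix({"cropland": (12.0, 40.0), "pasture": (20.0, 80.0),
           "in-stream": (2.0, 10.0)}, mixture=(10.0, 40.0))
```

    In practice fingerprinting studies use many samples and uncertainty propagation rather than a single exact solve, but the mass-balance structure is the same.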

  1. A spatial characterization of the Sagittarius dwarf galaxy tidal tails

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newby, Matthew; Cole, Nathan; Newberg, Heidi Jo

    2013-06-01

    We measure the spatial density of F turnoff stars in the Sagittarius dwarf tidal stream, from Sloan Digital Sky Survey data, using statistical photometric parallax. We find a set of continuous, consistent parameters that describe the leading Sgr stream's position, direction, and width for 15 stripes in the north Galactic cap, and three stripes in the south Galactic cap. We produce a catalog of stars that has the density characteristics of the dominant leading Sgr tidal stream that can be compared with simulations. We find that the width of the leading (north) tidal tail is consistent with recent triaxial and axisymmetric halo model simulations. The density along the stream is roughly consistent with common disruption models in the north, but possibly not in the south. We explore the possibility that one or more of the dominant Sgr streams has been misidentified, and that one or more of the "bifurcated" pieces is the real Sgr tidal tail, but we do not reach definite conclusions. If two dwarf progenitors are assumed, fits to the planes of the dominant and "bifurcated" tidal tails favor an association of the Sgr dwarf spheroidal galaxy with the dominant southern stream and the "bifurcated" stream in the north. In the north Galactic cap, the best-fit Hernquist density profile for the smooth component of the stellar halo is oblate, with a flattening parameter q = 0.53, and a scale length of r₀ = 6.73. The southern data for both the tidal debris and the smooth component of the stellar halo do not match the model fits to the north, although the stellar halo is still overwhelmingly oblate. Finally, we verify that we can reproduce the parameter fits on the asynchronous MilkyWay@home volunteer computing platform.
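
    An oblate (flattened) Hernquist halo model of the kind fit here replaces the spherical radius with an ellipsoidal one. A sketch using the abstract's q = 0.53 and r₀ = 6.73 (the normalization rho0 and the test points are arbitrary):

```python
import math

def hernquist_density(x, y, z, q=0.53, r0=6.73, rho0=1.0):
    """Oblate Hernquist stellar-halo density: the spherical radius is
    replaced by an ellipsoidal radius with flattening q; rho0 is an
    arbitrary normalization."""
    r = math.sqrt(x * x + y * y + (z / q) ** 2)
    return rho0 / ((r / r0) * (1.0 + r / r0) ** 3)

# Flattening: a point on the z-axis sits at larger effective radius,
# hence lower density, than a point at the same distance in the plane.
in_plane = hernquist_density(5.0, 0.0, 0.0)
on_axis = hernquist_density(0.0, 0.0, 5.0)
```

    With q < 1 the model density falls off faster perpendicular to the plane, which is what "oblate" means for the smooth halo component in the fit.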

  2. Stream-related preferences of inputs to the superior colliculus from areas of dorsal and ventral streams of mouse visual cortex.

    PubMed

    Wang, Quanxin; Burkhalter, Andreas

    2013-01-23

    Previous studies of intracortical connections in mouse visual cortex have revealed two subnetworks that resemble the dorsal and ventral streams in primates. Although calcium imaging studies have shown that many areas of the ventral stream have high spatial acuity whereas areas of the dorsal stream are highly sensitive for transient visual stimuli, there are some functional inconsistencies that challenge a simple grouping into "what/perception" and "where/action" streams known in primates. The superior colliculus (SC) is a major center for processing of multimodal sensory information and the motor control of orienting the eyes, head, and body. Visual processing is performed in superficial layers, whereas premotor activity is generated in deep layers of the SC. Because the SC is known to receive input from visual cortex, we asked whether the projections from 10 visual areas of the dorsal and ventral streams terminate in differential depth profiles within the SC. We found that inputs from primary visual cortex are by far the strongest. Projections from the ventral stream were substantially weaker, whereas the sparsest input originated from areas of the dorsal stream. Importantly, we found that ventral stream inputs terminated in superficial layers, whereas dorsal stream inputs tended to be patchy and either projected equally to superficial and deep layers or strongly preferred deep layers. The results suggest that the anatomically defined ventral and dorsal streams contain areas that belong to distinct functional systems, specialized for the processing of visual information and visually guided action, respectively.

  3. 40 CFR 65.149 - Boilers and process heaters.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... stream is not introduced as or with the primary fuel, a temperature monitoring device in the fire box...-throughput transfer racks, as applicable, shall meet the requirements of this section. (2) The vent stream... thermal units per hour) or greater. (ii) A boiler or process heater into which the vent stream is...

  4. 40 CFR 63.1082 - What definitions do I need to know?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... includes direct-contact cooling water. Spent caustic waste stream means the continuously flowing process... compounds from process streams, typically cracked gas. The spent caustic waste stream does not include spent..., and the C4 butadiene storage equipment; and spent wash water from the C4 crude butadiene carbonyl wash...

  5. 40 CFR 63.1082 - What definitions do I need to know?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... includes direct-contact cooling water. Spent caustic waste stream means the continuously flowing process... compounds from process streams, typically cracked gas. The spent caustic waste stream does not include spent..., and the C4 butadiene storage equipment; and spent wash water from the C4 crude butadiene carbonyl wash...

  6. 40 CFR 63.1082 - What definitions do I need to know?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... includes direct-contact cooling water. Spent caustic waste stream means the continuously flowing process... compounds from process streams, typically cracked gas. The spent caustic waste stream does not include spent..., and the C4 butadiene storage equipment; and spent wash water from the C4 crude butadiene carbonyl wash...

  7. 40 CFR 63.1082 - What definitions do I need to know?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... includes direct-contact cooling water. Spent caustic waste stream means the continuously flowing process... compounds from process streams, typically cracked gas. The spent caustic waste stream does not include spent..., and the C4 butadiene storage equipment; and spent wash water from the C4 crude butadiene carbonyl wash...

  8. M-Stream Deficits and Reading-Related Visual Processes in Developmental Dyslexia

    ERIC Educational Resources Information Center

    Boden, Catherine; Giaschi, Deborah

    2007-01-01

    Some visual processing deficits in developmental dyslexia have been attributed to abnormalities in the subcortical M stream and/or the cortical dorsal stream of the visual pathways. The nature of the relationship between these visual deficits and reading is unknown. The purpose of the present article was to characterize reading-related perceptual…

  9. Streaming data analytics via message passing with application to graph algorithms

    DOE PAGES

    Plimpton, Steven J.; Shead, Tim

    2014-05-06

    The need to process streaming data, which arrives continuously at high-volume in real-time, arises in a variety of contexts including data produced by experiments, collections of environmental or network sensors, and running simulations. Streaming data can also be formulated as queries or transactions which operate on a large dynamic data store, e.g. a distributed database. We describe a lightweight, portable framework named PHISH which enables a set of independent processes to compute on a stream of data in a distributed-memory parallel manner. Datums are routed between processes in patterns defined by the application. PHISH can run on top of either message-passing via MPI or sockets via ZMQ. The former means streaming computations can be run on any parallel machine which supports MPI; the latter allows them to run on a heterogeneous, geographically dispersed network of machines. We illustrate how PHISH can support streaming MapReduce operations, and describe streaming versions of three algorithms for large, sparse graph analytics: triangle enumeration, subgraph isomorphism matching, and connected component finding. Lastly, we provide benchmark timings for MPI versus socket performance of several kernel operations useful in streaming algorithms.
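
    The triangle-enumeration kernel mentioned above has a natural streaming form: as each undirected edge arrives, the triangles it closes are exactly the common neighbors already seen. A single-process sketch of that idea (PHISH would distribute the adjacency state across processes; this toy keeps it in one dict):

```python
from collections import defaultdict

def stream_triangles(edge_stream):
    """Count triangles incrementally as undirected edges arrive. Each new
    edge (u, v) closes one triangle per neighbor common to u and v."""
    adj = defaultdict(set)
    triangles = 0
    for u, v in edge_stream:
        if v in adj[u]:
            continue  # ignore duplicate edges
        triangles += len(adj[u] & adj[v])
        adj[u].add(v)
        adj[v].add(u)
    return triangles

# A 4-clique arriving edge by edge contains 4 triangles.
edges = [(1, 2), (1, 3), (2, 3), (1, 4), (2, 4), (3, 4)]
n = stream_triangles(edges)
```

    The per-edge work depends only on the degrees of the two endpoints, which is what makes the kernel amenable to high-volume streams.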

  10. Method and apparatus for producing oxygen and nitrogen and membrane therefor

    DOEpatents

    Roman, I.C.; Baker, R.W.

    1985-09-17

    Process and apparatus for the separation and purification of oxygen and nitrogen as well as a novel membrane useful therein are disclosed. The process utilizes novel facilitated transport membranes to selectively transport oxygen from one gaseous stream to another, leaving nitrogen as a byproduct. In the method, an oxygen carrier capable of reversibly binding molecular oxygen is dissolved in a polar organic membrane which separates a gaseous feed stream such as atmospheric air and a gaseous product stream. The feed stream is maintained at a sufficiently high oxygen pressure to keep the oxygen carrier in its oxygenated form at the interface of the feed stream with the membrane, while the product stream is maintained at a sufficiently low oxygen pressure to keep the carrier in its deoxygenated form at the interface of the product stream with the membrane. In an alternate mode of operation, the feed stream is maintained at a sufficiently low temperature and high oxygen pressure to keep the oxygen carrier in its oxygenated form at the interface of the feed stream with the membrane and the product stream is maintained at a sufficiently high temperature to keep the carrier in its deoxygenated form at the interface of the product stream with the membrane. Under such conditions, the carrier acts as a shuttle, picking up oxygen at the feed side of the membrane, diffusing across the membrane as the oxygenated complex, releasing oxygen to the product stream, and then diffusing back to the feed side to repeat the process. Exceptionally and unexpectedly high O₂/N₂ selectivity, on the order of 10 to 30, is obtained, as well as exceptionally high oxygen permeability, on the order of 6 to 15 × 10⁻⁸ cm³-cm/cm²-sec-cmHg, as well as a long membrane life in excess of 3 months, making the process commercially feasible. 2 figs.

  11. Method and apparatus for producing oxygen and nitrogen and membrane therefor

    DOEpatents

    Roman, Ian C.; Baker, Richard W.

    1985-01-01

    Process and apparatus for the separation and purification of oxygen and nitrogen as well as a novel membrane useful therein are disclosed. The process utilizes novel facilitated transport membranes to selectively transport oxygen from one gaseous stream to another, leaving nitrogen as a byproduct. In the method, an oxygen carrier capable of reversibly binding molecular oxygen is dissolved in a polar organic membrane which separates a gaseous feed stream such as atmospheric air and a gaseous product stream. The feed stream is maintained at a sufficiently high oxygen pressure to keep the oxygen carrier in its oxygenated form at the interface of the feed stream with the membrane, while the product stream is maintained at a sufficiently low oxygen pressure to keep the carrier in its deoxygenated form at the interface of the product stream with the membrane. In an alternate mode of operation, the feed stream is maintained at a sufficiently low temperature and high oxygen pressure to keep the oxygen carrier in its oxygenated form at the interface of the feed stream with the membrane and the product stream is maintained at a sufficiently high temperature to keep the carrier in its deoxygenated form at the interface of the product stream with the membrane. Under such conditions, the carrier acts as a shuttle, picking up oxygen at the feed side of the membrane, diffusing across the membrane as the oxygenated complex, releasing oxygen to the product stream, and then diffusing back to the feed side to repeat the process. Exceptionally and unexpectedly high O₂/N₂ selectivity, on the order of 10 to 30, is obtained, as well as exceptionally high oxygen permeability, on the order of 6 to 15 × 10⁻⁸ cm³-cm/cm²-sec-cmHg, as well as a long membrane life in excess of 3 months, making the process commercially feasible.

  12. 40 CFR 63.11970 - What are my initial compliance requirements for process wastewater?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements for process wastewater? 63.11970 Section 63.11970 Protection of Environment ENVIRONMENTAL... What are my initial compliance requirements for process wastewater? (a) Demonstration of initial compliance for process wastewater streams that must be treated. For each process wastewater stream that must...

  13. 40 CFR 63.11970 - What are my initial compliance requirements for process wastewater?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements for process wastewater? 63.11970 Section 63.11970 Protection of Environment ENVIRONMENTAL... What are my initial compliance requirements for process wastewater? (a) Demonstration of initial compliance for process wastewater streams that must be treated. For each process wastewater stream that must...

  14. 40 CFR 63.11970 - What are my initial compliance requirements for process wastewater?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements for process wastewater? 63.11970 Section 63.11970 Protection of Environment ENVIRONMENTAL... What are my initial compliance requirements for process wastewater? (a) Demonstration of initial compliance for process wastewater streams that must be treated. For each process wastewater stream that must...

  15. Visualisation methods for large provenance collections in data-intensive collaborative platforms

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Filgueira, Rosa; Atkinson, Malcolm; Gemuend, Andre

    2016-04-01

    This work investigates improving the methods of visually representing provenance information in the context of modern data-driven scientific research. It explores scenarios where data-intensive workflow systems serve communities of researchers within collaborative environments, supporting the sharing of data and methods, and offering a variety of computation facilities, including HPC, HTC and Cloud. It focuses on the exploration of big-data visualisation techniques aimed at producing comprehensive and interactive views on top of large and heterogeneous provenance data. The same approach is applicable to control-flow and data-flow workflows or to combinations of the two. This flexibility is achieved using the W3C-PROV recommendation as a reference model, especially its workflow-oriented profiles such as D-PROV (Missier et al. 2013). Our implementation is based on the provenance records produced by the dispel4py data-intensive processing library (Filgueira et al. 2015). dispel4py is an open-source Python framework for describing abstract stream-based workflows for distributed data-intensive applications, developed during the VERCE project. dispel4py enables scientists to develop their scientific methods and applications on their laptop and then run them at scale on a wide range of e-Infrastructures (Cloud, Cluster, etc.) without making changes. Users can therefore focus on designing their workflows at an abstract level, describing actions, input and output streams, and how they are connected. The dispel4py system then maps these descriptions to enactment platforms such as MPI, Storm and multiprocessing. It provides a mechanism which allows users to determine the provenance information to be collected and to analyse it at runtime. For this work we consider alternative visualisation methods for provenance data, from infinite lists and localised interactive graphs to radial views.
The latter technique has been positively explored in many fields, from text data visualisation to genomics and social network analysis. Its adoption for provenance has been presented in the literature (Borkin et al. 2013) in the context of parent-child relationships across processes, constructed from control-flow information. Computer graphics research has focused on the advantages of this radial distribution of interlinked information and on ways to improve the visual efficiency and tunability of such representations, such as the Hierarchical Edge Bundles visualisation method (Holten et al. 2006), which aims at reducing the visual clutter of highly connected structures via the generation of bundles. Our approach explores the potential of combining these methods. It serves environments where the size of the provenance collection, coupled with the diversity of the infrastructures and the domain metadata, makes the extrapolation of usage trends extremely challenging. Applications of such visualisation systems can engage groups of scientists, data providers and computational engineers by serving visual snapshots that highlight relationships between an item and its connected processes. We will present examples of comprehensive views on the distribution of processing and data transfers during a workflow's execution in HPC, as well as cross-workflow interactions and internal dynamics, the latter in the context of faceted searches over ranges of domain metadata values. These are obtained from the analysis of real provenance data generated by the processing of seismic traces performed through the VERCE platform.
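
The abstract workflow model described above (processing elements connected by input and output streams) can be sketched in plain Python. This is a hypothetical illustration of the general pattern only, not the dispel4py API; the class and method names are invented for this example.

```python
# Hypothetical sketch of an abstract stream-based workflow (NOT the
# dispel4py API): nodes transform items and push results downstream.

class ProcessingElement:
    """One node in an abstract workflow: transforms its input stream."""
    def __init__(self, fn):
        self.fn = fn
        self.downstream = []

    def connect(self, other):
        """Wire this element's output stream to another element's input."""
        self.downstream.append(other)
        return other  # allow chained connect() calls

    def process(self, item):
        result = self.fn(item)
        if result is None:          # None means "drop this item"
            return
        for pe in self.downstream:
            pe.process(result)

class Sink(ProcessingElement):
    """Terminal element that just collects whatever arrives."""
    def __init__(self):
        super().__init__(lambda x: x)
        self.results = []

    def process(self, item):
        self.results.append(item)

# Build an abstract pipeline: filter -> scale -> collect.
filt = ProcessingElement(lambda x: x if x >= 0 else None)
scale = ProcessingElement(lambda x: x * 2)
sink = Sink()
filt.connect(scale).connect(sink)

for sample in [3, -1, 5]:           # feed the input stream
    filt.process(sample)

print(sink.results)  # [6, 10]
```

In a real system such as dispel4py, this abstract description would then be mapped onto an enactment platform (MPI, Storm, multiprocessing) rather than executed by direct method calls.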

  16. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  17. Method for processing aqueous wastes

    DOEpatents

    Pickett, John B.; Martin, Hollis L.; Langton, Christine A.; Harley, Willie W.

    1993-01-01

    A method for treating waste water such as that from an industrial processing facility comprising the separation of the waste water into a dilute waste stream and a concentrated waste stream. The concentrated waste stream is treated chemically to enhance precipitation and then allowed to separate into a sludge and a supernate. The supernate is skimmed or filtered from the sludge and blended with the dilute waste stream to form a second dilute waste stream. The sludge remaining is mixed with cementitious material, rinsed to dissolve soluble components, then pressed to remove excess water and dissolved solids before being allowed to cure. The dilute waste stream is also chemically treated to decompose carbonate complexes and metal ions and then mixed with cationic polymer to cause the precipitated solids to flocculate. Filtration of the flocculant removes sufficient solids to allow the waste water to be discharged to the surface of a stream. The filtered material is added to the sludge of the concentrated waste stream. The method is also applicable to the treatment and removal of soluble uranium from aqueous streams, such that the treated stream may be used as a potable water supply.

  18. Integration and segregation in auditory scene analysis

    NASA Astrophysics Data System (ADS)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing in which the auditory scene is rapidly organized into distinct streams and the integration of sequential elements into perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  19. Processing reafferent and exafferent visual information for action and perception.

    PubMed

    Reichenbach, Alexandra; Diedrichsen, Jörn

    2015-01-01

    A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously with the reaching task, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than that of the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.

  20. Geomorphic variation in riparian tree mortality and stream coarse woody debris recruitment from record flooding in a coastal plain stream

    Treesearch

    Brian J. Palik; Stephen W. Golladay; P. Charles Goebel; Brad W. Taylor

    1998-01-01

    Large floods are an important process controlling the structure and function of stream ecosystems. One of the ways floods affect streams is through the recruitment of coarse woody debris from stream-side forests. Stream valley geomorphology may mediate this interaction by altering flood velocity, depth, and duration. Little research has examined how floods and...

  1. A new neural framework for visuospatial processing.

    PubMed

    Kravitz, Dwight J; Saleem, Kadharbatcha S; Baker, Chris I; Mishkin, Mortimer

    2011-04-01

    The division of cortical visual processing into distinct dorsal and ventral streams is a key framework that has guided visual neuroscience. The characterization of the ventral stream as a 'What' pathway is relatively uncontroversial, but the nature of dorsal stream processing is less clear. Originally proposed as mediating spatial perception ('Where'), more recent accounts suggest it primarily serves non-conscious visually guided action ('How'). Here, we identify three pathways emerging from the dorsal stream that consist of projections to the prefrontal and premotor cortices, and a major projection to the medial temporal lobe that courses both directly and indirectly through the posterior cingulate and retrosplenial cortices. These three pathways support both conscious and non-conscious visuospatial processing, including spatial working memory, visually guided action and navigation, respectively.

  2. The ecology and biogeochemistry of stream biofilms.

    PubMed

    Battin, Tom J; Besemer, Katharina; Bengtsson, Mia M; Romani, Anna M; Packmann, Aaron I

    2016-04-01

    Streams and rivers form dense networks, shape the Earth's surface and, in their sediments, provide an immensely large surface area for microbial growth. Biofilms dominate microbial life in streams and rivers, drive crucial ecosystem processes and contribute substantially to global biogeochemical fluxes. In turn, water flow and related deliveries of nutrients and organic matter to biofilms constitute major constraints on microbial life. In this Review, we describe the ecology and biogeochemistry of stream biofilms and highlight the influence of physical and ecological processes on their structure and function. Recent advances in the study of biofilm ecology may pave the way towards a mechanistic understanding of the effects of climate and environmental change on stream biofilms and the biogeochemistry of stream ecosystems.

  3. Cooling and solidification of heavy hydrocarbon liquid streams

    DOEpatents

    Antieri, Salvatore J.; Comolli, Alfred G.

    1983-01-01

    A process and apparatus for cooling and solidifying a stream of heavy hydrocarbon material normally boiling above about 850 °F, such as vacuum bottoms material from a coal liquefaction process. The hydrocarbon stream is dropped into a liquid bath, preferably water, which contains a screw conveyor device, and the stream is rapidly cooled, solidified and broken therein to form discrete elongated particles. The solid extrudates or prills are then dried separately to remove substantially all surface moisture, and passed to further usage.

  4. Electrophysiological Evidence for Ventral Stream Deficits in Schizophrenia Patients

    PubMed Central

    Plomp, Gijs; Roinishvili, Maya; Chkonia, Eka; Kapanadze, George; Kereselidze, Maia; Brand, Andreas; Herzog, Michael H.

    2013-01-01

    Schizophrenic patients suffer from many deficits including visual, attentional, and cognitive ones. Visual deficits are of particular interest because they are at the fore-end of information processing and can provide clear examples of interactions between sensory, perceptual, and higher cognitive functions. Visual deficits in schizophrenic patients are often attributed to impairments in the dorsal (where) rather than the ventral (what) stream of visual processing. We used a visual-masking paradigm in which patients and matched controls discriminated small vernier offsets. We analyzed the evoked electroencephalography (EEG) responses and applied distributed electrical source imaging techniques to estimate activity differences between conditions and groups throughout the brain. Compared with controls, patients showed strongly reduced discrimination accuracy, confirming previous work. The behavioral deficits corresponded to pronounced decreases in the evoked EEG response at around 200 ms after stimulus onset. At this latency, patients showed decreased activity for targets in left parietal cortex (dorsal stream), but the decrease was most pronounced in lateral occipital cortex (in the ventral stream). These deficiencies occurred at latencies that reflect object processing and fine shape discriminations. We relate the reduced ventral stream activity to deficient top-down processing of target stimuli and provide a framework for relating the commonly observed dorsal stream deficiencies with the currently observed ventral stream deficiencies. PMID:22258884

  5. Electrophysiological evidence for ventral stream deficits in schizophrenia patients.

    PubMed

    Plomp, Gijs; Roinishvili, Maya; Chkonia, Eka; Kapanadze, George; Kereselidze, Maia; Brand, Andreas; Herzog, Michael H

    2013-05-01

    Schizophrenic patients suffer from many deficits including visual, attentional, and cognitive ones. Visual deficits are of particular interest because they are at the fore-end of information processing and can provide clear examples of interactions between sensory, perceptual, and higher cognitive functions. Visual deficits in schizophrenic patients are often attributed to impairments in the dorsal (where) rather than the ventral (what) stream of visual processing. We used a visual-masking paradigm in which patients and matched controls discriminated small vernier offsets. We analyzed the evoked electroencephalography (EEG) responses and applied distributed electrical source imaging techniques to estimate activity differences between conditions and groups throughout the brain. Compared with controls, patients showed strongly reduced discrimination accuracy, confirming previous work. The behavioral deficits corresponded to pronounced decreases in the evoked EEG response at around 200 ms after stimulus onset. At this latency, patients showed decreased activity for targets in left parietal cortex (dorsal stream), but the decrease was most pronounced in lateral occipital cortex (in the ventral stream). These deficiencies occurred at latencies that reflect object processing and fine shape discriminations. We relate the reduced ventral stream activity to deficient top-down processing of target stimuli and provide a framework for relating the commonly observed dorsal stream deficiencies with the currently observed ventral stream deficiencies.

  6. 40 CFR 63.11940 - What continuous monitoring requirements must I meet for control devices required to install CPMS...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... consistent with the manufacturer's recommendations within 15 days or by the next time any process vent stream... the manufacturer's recommendations within 15 days or by the next time any process vent stream is...) Determine gas stream flow using the design blower capacity, with appropriate adjustments for pressure drop...

  7. Carbon dioxide removal process

    DOEpatents

    Baker, Richard W.; Da Costa, Andre R.; Lokhandwala, Kaaeid A.

    2003-11-18

    A process and apparatus for separating carbon dioxide from gas, especially natural gas, that also contains C3+ hydrocarbons. The invention uses two or three membrane separation steps, optionally in conjunction with cooling/condensation under pressure, to yield a lighter, sweeter product natural gas stream, and/or a carbon dioxide stream of reinjection quality, and/or a natural gas liquids (NGL) stream.

  8. Interaction of Substrate and Nutrient Availability on wood Biofilm Processes in Streams

    Treesearch

    Jennifer L. Tank; J.R. Webster

    1998-01-01

    We examined the effect of decomposing leaf litter and dissolved inorganic nutrients on the heterotrophic biofilm of submerged wood in streams with and without leaves. Leaf litter was excluded from one headwater stream in August 1993 at Coweeta Hydrologic Laboratory in the southern Appalachian Mountains. We compared microbial processes on wood in the litter-excluded...

  9. Rapid Production of Internally Structured Colloids by Flash Nanoprecipitation of Block Copolymer Blends.

    PubMed

    Grundy, Lorena S; Lee, Victoria E; Li, Nannan; Sosa, Chris; Mulhearn, William D; Liu, Rui; Register, Richard A; Nikoubashman, Arash; Prud'homme, Robert K; Panagiotopoulos, Athanassios Z; Priestley, Rodney D

    2018-05-08

    Colloids with internally structured geometries have shown great promise in applications ranging from biosensors to optics to drug delivery, where the internal particle structure is paramount to performance. The growing demand for such nanomaterials necessitates the development of a scalable processing platform for their production. Flash nanoprecipitation (FNP), a rapid and inherently scalable colloid precipitation technology, is used to prepare internally structured colloids from blends of block copolymers and homopolymers. As revealed by a combination of experiments and simulations, colloids prepared from different molecular weight diblock copolymers adopt either an ordered lamellar morphology consisting of concentric shells or a disordered lamellar morphology when chain dynamics are sufficiently slow to prevent defect annealing during solvent exchange. Blends of homopolymer and block copolymer in the feed stream generate more complex internally structured colloids, such as those with hierarchically structured Janus and patchy morphologies, due to additional phase separation and kinetic trapping effects. The ability of the FNP process to generate such a wide range of morphologies using a simple and scalable setup provides a pathway to manufacturing internally structured colloids on an industrial scale.

  10. Liquid additives for particulate emissions control

    DOEpatents

    Durham, Michael Dean; Schlager, Richard John; Ebner, Timothy George; Stewart, Robin Michele; Hyatt, David E.; Bustard, Cynthia Jean; Sjostrom, Sharon

    1999-01-01

    The present invention discloses a process for removing undesired particles from a gas stream including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an aggregate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may then be atomized and injected into the gas stream. The composition may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired particle collection efficiency.

  11. Continuous online Fourier transform infrared (FT-IR) spectrometry analysis of hydrogen chloride (HCl), carbon dioxide (CO2), and water (H2O) in nitrogen-rich and ethylene-rich streams.

    PubMed

    Stephenson, Serena; Pollard, Maria; Boit, Kipchirchir

    2013-09-01

    The prevalence of optical spectroscopy techniques being applied to the online analysis of continuous processes has increased in the past couple of decades. The ability to continuously "watch" changing stream compositions as operating conditions change has proven invaluable to pilot and world-scale manufacturing in the chemical and petrochemical industries. Presented here is an application requiring continuous monitoring of parts per million (ppm) by weight levels of hydrogen chloride (HCl), water (H2O), and carbon dioxide (CO2) in two gas-phase streams, one nitrogen-rich and one ethylene-rich. Because ethylene has strong mid-infrared (IR) absorption, building an IR method capable of quantifying HCl, H2O, and CO2 posed some challenges. A long-path (5.11 m) Fourier transform infrared (FT-IR) spectrometer was used in the mid-infrared region between 1800 and 5000 cm⁻¹, with a 1 cm⁻¹ resolution and a 10 s spectral update time. Sample cell temperature and pressure were controlled and measured to minimize measurement variability. Models using a modified classical least squares method were developed and validated first in the laboratory and then using the process stream. Analytical models and process sampling conditions were adjusted to minimize interference of ethylene in the ethylene-rich stream. The predictive capabilities of the measurements were ±0.5 ppm for CO2 in either stream; ±1.1 and ±1.3 ppm for H2O in the nitrogen-rich and ethylene-rich streams, respectively; and ±1.0 and ±2.4 ppm for HCl in the nitrogen-rich and ethylene-rich streams, respectively. Continuous operation of the instrument in the process stream was demonstrated using an automated stream-switching sample system set to 10 min intervals. Response time for all components of interest was sufficient to acquire representative stream composition data. This setup provides useful insight into the process for troubleshooting and optimizing plant operating conditions.
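
As a rough illustration of the classical least squares (CLS) family of methods the authors modified, a measured absorbance spectrum can be modeled as a linear combination of pure-component reference spectra (Beer-Lambert behavior), with concentrations recovered by least squares. All spectra and concentration values below are synthetic stand-ins, not the paper's calibration data.

```python
# Classical least squares (CLS) spectral quantification sketch:
# measured spectrum ≈ concentrations @ pure-component spectra (matrix K),
# so concentrations are recovered by solving K.T @ c = measured.
import numpy as np

rng = np.random.default_rng(0)
n_wavenumbers = 200

# Synthetic pure-component reference spectra (rows: stand-ins for
# HCl, H2O, CO2 absorbance profiles).
K = rng.random((3, n_wavenumbers))

true_conc = np.array([2.0, 5.0, 1.5])            # synthetic "ppm" levels
measured = true_conc @ K + rng.normal(0, 0.01, n_wavenumbers)  # + noise

# Least-squares estimate of the concentrations from the mixed spectrum.
c_hat, *_ = np.linalg.lstsq(K.T, measured, rcond=None)
print(np.round(c_hat, 2))
```

In practice the reference spectra come from calibration measurements, and modifications (baseline terms, interferent spectra such as ethylene) are added to the model matrix to handle real process streams.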

  12. Instrumental methods of analysis of sulfur compounds in synfuel process streams. Quarterly technical progress report, July-September 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, J.; Talbott, J.

    1984-01-01

    Task 1. Methods development for the speciation of the polysulfides. Work on this task was completed in December 1983 and reported accordingly in DOE/PC/40783-T13. Task 2. Methods development for the speciation of dithionite and polythionates. Work on Task 2 was completed in June 1984 and has been reported accordingly in DOE/PC/40783-T15. Task 3. Total accounting of the sulfur balance in representative samples of synfuel process streams. A systematic and critical comparison of results, obtained in the analysis of sulfur moieties in representative samples of coal conversion process streams, revealed the following general trends. (a) In specimens of high pH (9-10) and low redox potential (-0.3 to -0.4 volt versus NHE), sulfidic and polysulfidic sulfur moieties predominate. (b) In process streams of lower pH and more positive redox potential, higher oxidation states of sulfur (notably sulfate) account for most of the total sulfur present. (c) Oxidative wastewater treatment by the PETC stripping process converts lower oxidation states of sulfur into thiosulfate and sulfate. In this context, remarkable similarities were observed between liquefaction and gasification process streams. However, the thiocyanate present in samples from the Grand Forks gasifier was impervious to the PETC stripping process. (d) Total sulfur contaminant levels in coal conversion process stream wastewater samples are determined more by the abundance of sulfur in the coal used as starting material than by the nature of the conversion process (liquefaction or gasification). 13 references.

  13. Measurement of Near-Surface Salinity, Temperature and Directional Wave Spectra using a Novel Wave-Following, Lagrangian Surface Contact Buoy

    NASA Astrophysics Data System (ADS)

    Boyle, J. P.

    2016-02-01

    Results from a surface contact drifter buoy which measures near-surface conductivity (~10 cm depth), sea state characteristics and near-surface water temperature (~2 cm depth) are described. This light (<750 g), wave-following discus buoy platform has a hull diameter of 25 cm and a thickness of approximately 3 cm. The buoy is designed to allow for capsize events, but remains top up because it is ballasted for self-righting. It has a small above-surface profile and low windage, resulting in near-Lagrangian drift characteristics. It is autonomous, with low power requirements and solar-panel battery recharging. Onboard sensors include an inductive toroidal conductivity probe for salinity measurement, a nine-degrees-of-freedom motion package for derivation of directional wave spectra, and a thermocouple for water temperature measurement. Data retrieval for expendable, ocean-going operation uses an onboard Argos transmitter. Scientific results as well as data processing algorithms are presented from laboratory and field experiments which support qualification of buoy platform measurements. These include sensor calibration experiments, longer-term dock-side biofouling experiments during 2013-2014 and a series of short-duration ocean deployments in the Gulf Stream in 2014. In addition, a treatment method will be described which appears to minimize the effects of biofouling on the inductive conductivity probe in coastal surface waters. Due to its low cost and ease of deployment, scores, perhaps hundreds, of these novel instruments could be deployed from ships or aircraft during process studies or to provide surface validation for satellite-based measurements, particularly in high-precipitation regions.

  14. Post-processing of a low-flow forecasting system in the Thur basin (Switzerland)

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Joerg-Hess, Stefanie; Bernhard, Luzi; Zappa, Massimiliano

    2015-04-01

    Low flows and droughts are natural hazards with potentially severe impacts and economic loss or damage in a number of environmental and socio-economic sectors. Because droughts develop slowly, there is time to prepare for and pre-empt some of these impacts. Real-time information and forecasting of a drought situation can therefore be an effective component of drought management. Although Switzerland has traditionally been more concerned with problems related to floods, in recent years some unprecedented low-flow situations have been experienced. Driven by the climate change debate, a drought information platform has been developed to guide water resources management during situations where water resources drop below critical low-flow levels, characterised by the indices duration (time between onset and offset), severity (cumulative water deficit) and magnitude (severity/duration). However, to gain maximum benefit from such an information system, it is essential to remove the bias from the meteorological forecast, to derive optimal estimates of the initial conditions, and to post-process the stream-flow forecasts. Quantile mapping methods for pre-processing the meteorological forecasts, and improved data assimilation methods for snow measurements, which account for much of the seasonal stream-flow predictability for the majority of the basins in Switzerland, have been tested previously. The objective of this study is the testing of post-processing methods in order to remove bias and dispersion errors and to derive the predictive uncertainty of a calibrated low-flow forecast system. To this end, various stream-flow error correction methods with different degrees of complexity have been applied and combined with the Hydrological Uncertainty Processor (HUP) in order to minimise the differences between observations and model predictions and to derive posterior probabilities.
The complexity of the analysed error correction methods ranges from simple AR(1) models to methods including wavelet transformations and support vector machines. These methods have been combined with forecasts driven by Numerical Weather Prediction (NWP) systems with different temporal and spatial resolutions, lead times and numbers of ensemble members, covering short- to medium- to extended-range forecasts (COSMO-LEPS, 10-15 days, monthly and seasonal ENS) as well as climatological forecasts. Additionally, the suitability of various skill scores and efficiency measures for low-flow predictions will be tested. Amongst others, the novel 2AFC (two-alternative forced choice) score and the quantile skill score and its decompositions will be applied to evaluate the probabilistic forecasts and the effects of post-processing. First results of the performance of the low-flow predictions of the hydrological model PREVAH initialised with different NWPs will be shown.
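
The simplest scheme in the range described, an AR(1) model on forecast errors, can be sketched as follows: the next error is predicted as phi times the last observed error, and the raw forecast is corrected by subtracting that prediction. The flow values and the coefficient fit below are synthetic illustrations, not the study's data or exact procedure.

```python
# AR(1) error correction sketch for a stream-flow forecast (synthetic data).
import numpy as np

obs = np.array([10.0, 9.5, 9.0, 8.8, 8.5, 8.4])    # observed low flows (m3/s)
fcst = np.array([10.6, 10.0, 9.6, 9.3, 9.1, 8.9])   # biased raw forecasts

err = fcst - obs
# Fit phi by least squares on lag-1 error pairs: err[t] ≈ phi * err[t-1].
phi = np.dot(err[:-1], err[1:]) / np.dot(err[:-1], err[:-1])

# Correct the next raw forecast by subtracting the predicted error.
next_raw = 8.8
corrected = next_raw - phi * err[-1]
print(round(phi, 3), round(corrected, 3))
```

Because the raw forecasts here are persistently high, phi is close to 1 and the correction pulls the next forecast down toward the observations; the HUP step described in the abstract would then turn such corrected point forecasts into posterior probability distributions.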

  15. Alpha-band rhythm modulation under the condition of subliminal face presentation: MEG study.

    PubMed

    Sakuraba, Satoshi; Kobayashi, Hana; Sakai, Shinya; Yokosawa, Koichi

    2013-01-01

    The human brain has two streams for processing visual information: a dorsal stream and a ventral stream. The negative potential N170, or its magnetic counterpart M170, is known as the face-specific signal originating from the ventral stream. It is possible to present a visual image unconsciously by using continuous flash suppression (CFS), a visual masking technique based on binocular rivalry. In this work, magnetoencephalograms were recorded during presentation of three types of invisible images: face images, which are processed by the ventral stream; tool images, which could be processed by the dorsal stream; and a blank image. Alpha-band activities detected by sensors that are sensitive to M170 were compared. The alpha-band rhythm was suppressed more during presentation of face images than during presentation of the blank image (p=.028). The suppression remained for about 1 s after the presentations ended. However, no significant difference was observed between tool images and the other images. These results suggest that the alpha-band rhythm can be modulated even by unconscious visual images.

  16. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
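The key-partitioned, incremental evaluation style the StreamQRE abstract emphasizes can be illustrated with a small sketch. This is not the StreamQRE API (that system is a Java library with its own combinators); the Python class below only mirrors the O(1)-per-item, bounded-memory style of evaluation the paper describes:

```python
from collections import defaultdict


class KeyedRunningMean:
    """Illustrative sketch (not the StreamQRE API): maintain a per-key
    running mean incrementally, with O(1) work per item and memory
    proportional to the number of distinct keys."""

    def __init__(self):
        self._count = defaultdict(int)
        self._sum = defaultdict(float)

    def update(self, key, value):
        # One item from the stream: constant-time state update.
        self._count[key] += 1
        self._sum[key] += value

    def mean(self, key):
        return self._sum[key] / self._count[key]
```

A stream of (sensor_id, reading) pairs, for example, can be folded through `update` as items arrive, with summaries queryable at any time.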

  17. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821

  18. Remembering the Important Things: Semantic Importance in Stream Reasoning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Rui; Greaves, Mark T.; Smith, William P.

Reasoning and querying over data streams rely on the ability to deliver a sequence of stream snapshots to the processing algorithms. These snapshots are typically provided using windows as views into streams and associated window management strategies. Generally, the goal of any window management strategy is to preserve the most important data in the current window and preferentially evict the rest, so that the retained data can continue to be exploited. A simple timestamp-based strategy is first-in-first-out (FIFO), in which items are replaced in strict order of arrival. All timestamp-based strategies implicitly assume that a temporal ordering reliably reflects importance to the processing task at hand, and thus that window management using timestamps will maximize the ability of the processing algorithms to deliver accurate interpretations of the stream. In this work, we explore a general notion of semantic importance that can be used for window management for streams of RDF data using semantically-aware processing algorithms like deduction or semantic query. Semantic importance exploits the information carried in RDF and surrounding ontologies for ranking window data in terms of its likely contribution to the processing algorithms. We explore the general semantic categories of query contribution, provenance, and trustworthiness, as well as the contribution of domain-specific ontologies. We describe how these categories behave using several concrete examples. Finally, we consider how a stream window management strategy based on semantic importance could improve overall processing performance, especially as available window sizes decrease.
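The contrast between FIFO eviction and importance-based window management can be sketched as follows. The scoring function here is hypothetical, standing in for the RDF-derived importance measures (query contribution, provenance, trustworthiness) the record describes:

```python
import heapq


class ImportanceWindow:
    """Bounded window that evicts the least-important item (per a
    caller-supplied score) instead of the oldest, in contrast to FIFO.
    The score function is a hypothetical stand-in for semantic importance."""

    def __init__(self, capacity, score):
        self.capacity = capacity
        self.score = score
        self._heap = []  # min-heap of (importance, arrival_seq, item)
        self._seq = 0    # arrival counter breaks ties deterministically

    def add(self, item):
        heapq.heappush(self._heap, (self.score(item), self._seq, item))
        self._seq += 1
        if len(self._heap) > self.capacity:
            heapq.heappop(self._heap)  # evict the lowest-importance item

    def items(self):
        return [entry[2] for entry in self._heap]
```

With FIFO, a capacity-2 window seeing 5, 1, 9 would retain 1 and 9; scoring by importance retains 5 and 9 regardless of arrival order.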

  19. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    PubMed

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
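A toy version of per-sample QRS detection illustrates the shape of such a streaming computation. This is a much-simplified stand-in for the detector evaluated in the case study, with illustrative parameter values:

```python
def detect_qrs(samples, fs, threshold=0.6, refractory_s=0.2):
    """Toy streaming QRS detector: flag a beat when the amplitude crosses
    a fixed threshold, then ignore further crossings for a refractory
    period (real detectors use adaptive thresholds and band-pass
    filtering; this sketch only shows the per-sample streaming structure).

    samples: iterable of ECG amplitudes; fs: sampling rate in Hz.
    Returns the sample indices of detected beats."""
    refractory = int(refractory_s * fs)
    beats, last = [], -refractory
    for i, x in enumerate(samples):
        if x > threshold and i - last >= refractory:
            beats.append(i)
            last = i
    return beats
```

Because the loop keeps only constant state (`last`), it maps directly onto a low-latency stream operator of the kind the paper targets.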

  20. The paradox of cooling streams in a warming world: regional climate trends do not parallel variable local trends in stream temperature in the Pacific continental United States

    Treesearch

    Ivan Arismendi; Sherri L. Johnson; Jason B. Dunham; Roy Haggerty

    2012-01-01

    Temperature is a fundamentally important driver of ecosystem processes in streams. Recent warming of terrestrial climates around the globe has motivated concern about consequent increases in stream temperature. More specifically, observed trends of increasing air temperature and declining stream flow are widely believed to result in corresponding increases in stream...

  1. High temperature methods for forming oxidizer fuel

    DOEpatents

    Bravo, Jose Luis [Houston, TX

    2011-01-11

    A method of treating a formation fluid includes providing formation fluid from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes carbon dioxide, hydrogen sulfide, hydrocarbons, hydrogen or mixtures thereof. Molecular oxygen is separated from air to form a molecular oxygen stream comprising molecular oxygen. The first gas stream is combined with the molecular oxygen stream to form a combined stream comprising molecular oxygen and the first gas stream. The combined stream is provided to one or more downhole burners.

  2. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation can escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, poor ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.

  3. Integration and segregation in auditory streaming

    NASA Astrophysics Data System (ADS)

    Almonte, Felix; Jirsa, Viktor K.; Large, Edward W.; Tuller, Betty

    2005-12-01

    We aim to capture the perceptual dynamics of auditory streaming using a neurally inspired model of auditory processing. Traditional approaches view streaming as a competition of streams, realized within a tonotopically organized neural network. In contrast, we view streaming to be a dynamic integration process which resides at locations other than the sensory specific neural subsystems. This process finds its realization in the synchronization of neural ensembles or in the existence of informational convergence zones. Our approach uses two interacting dynamical systems, in which the first system responds to incoming acoustic stimuli and transforms them into a spatiotemporal neural field dynamics. The second system is a classification system coupled to the neural field and evolves to a stationary state. These states are identified with a single perceptual stream or multiple streams. Several results in human perception are modelled including temporal coherence and fission boundaries [L.P.A.S. van Noorden, Temporal coherence in the perception of tone sequences, Ph.D. Thesis, Eindhoven University of Technology, The Netherlands, 1975], and crossing of motions [A.S. Bregman, Auditory Scene Analysis: The Perceptual Organization of Sound, MIT Press, 1990]. Our model predicts phenomena such as the existence of two streams with the same pitch, which cannot be explained by the traditional stream competition models. An experimental study is performed to provide proof of existence of this phenomenon. The model elucidates possible mechanisms that may underlie perceptual phenomena.

  4. A novel rotating experimental platform in a superconducting magnet.

    PubMed

    Chen, Da; Cao, Hui-Ling; Ye, Ya-Jing; Dong, Chen; Liu, Yong-Ming; Shang, Peng; Yin, Da-Chuan

    2016-08-01

    This paper introduces a novel platform designed to be used in a strong static magnetic field (in a superconducting magnet). The platform is a sample holder that rotates in the strong magnetic field. Any samples placed in the platform will rotate due to the rotation of the sample holder. With this platform, a number of experiments such as material processing, culture of biological systems, chemical reactions, or other processes can be carried out. In this report, we present some preliminary experiments (protein crystallization, cell culture, and seed germination) conducted using this platform. The experimental results showed that the platform can affect the processes, indicating that it provides a novel environment that has not been investigated before and that the effects of such an environment on many different physical, chemical, or biological processes can be potentially useful for applications in many fields.

  5. Towards an Integrated Flood Preparedness and Response: Centralized Data Access, Analysis, and Visualization

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2014-12-01

Recent advances in internet and cyberinfrastructure technologies have provided the capability to understand hydrological and meteorological systems at the space and time scales that are critical for accurate understanding and prediction of flooding, and for emergency preparedness. A novel example of a cyberinfrastructure platform for flood preparedness and response is the Iowa Flood Center's Iowa Flood Information System (IFIS). IFIS is a one-stop web platform to access community-based flood conditions, forecasts, visualizations, inundation maps and flood-related data, information, and applications. An enormous volume of real-time observational data from a variety of sensors and remote sensing resources (radars, rain gauges, stream sensors, etc.) and complex flood inundation models are staged in a user-friendly map environment that is accessible to the general public. IFIS has developed into a very successful tool used by agencies, decision-makers, and the general public throughout Iowa to better understand their local watershed and their personal and community flood risk, and to monitor local stream and river levels. IFIS helps communities make better-informed decisions on the occurrence of floods, and alerts communities in advance to help minimize flood damage. IFIS is widely used by the general public in Iowa and the Midwest region, with over 120,000 unique users, and has become a main source of information for many newspapers and TV stations in Iowa. IFIS has features for the general public to improve emergency preparedness, and for decision makers to support emergency response and recovery efforts. IFIS is also a great platform for educators and local authorities to educate students and the public on flooding through games, an easy-to-use interactive environment, and a data-rich system.

  6. Process for the displacement of cyanide ions from metal-cyanide complexes

    DOEpatents

    Smith, Barbara F.; Robinson, Thomas W.

    1997-01-01

    The present invention relates to water-soluble polymers and the use of such water-soluble polymers in a process for the displacement of the cyanide ions from the metal ions within metal-cyanide complexes. The process waste streams can include metal-cyanide containing electroplating waste streams, mining leach waste streams, mineral processing waste streams, and related metal-cyanide containing waste streams. The metal ions of interest are metals that give very strong complexes with cyanide, mostly iron, nickel, and copper. The physical separation of the water-soluble polymer-metal complex from the cyanide ions can be accomplished through the use of ultrafiltration. Once the metal-cyanide complex is disrupted, the freed cyanide ions can be recovered for reuse or destroyed using available oxidative processes rendering the cyanide nonhazardous. The metal ions are released from the polymer, using dilute acid, metal ion oxidation state adjustment, or competing chelating agents, and collected and recovered or disposed of by appropriate waste management techniques. The water-soluble polymer can then be recycled. Preferred water-soluble polymers include polyethyleneimine and polyethyleneimine having a catechol or hydroxamate group.

  7. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.

  8. Distribution of model uncertainty across multiple data streams

    NASA Astrophysics Data System (ADS)

    Wutzler, Thomas

    2014-05-01

When confronting biogeochemical models with a diversity of observational data streams, we are faced with the problem of weighting the data streams. Without weighting or multiple blocked cost functions, model uncertainty is allocated to the sparse data streams, and possible bias in processes that are strongly constrained is exported to processes that are constrained only by sparse data streams. In this study we propose an approach that aims at making model uncertainty a factor of observation uncertainty that is constant over all data streams. Further, we propose an implementation based on Markov chain Monte Carlo sampling combined with simulated annealing that is able to determine this variance factor. The method is exemplified both with very simple models and artificial data, and with an inversion of the DALEC ecosystem carbon model against multiple observations of the Howland forest. We argue that the presented approach can mitigate, and perhaps resolve, the problem of bias export to sparse data streams.
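The weighting problem can be made concrete with a per-stream variance-scaled cost. In the sketch below (not the authors' implementation), `sigma` stands in for the per-stream variance factor the abstract proposes to estimate:

```python
def weighted_cost(streams):
    """Multi-stream cost: each stream contributes its mean squared misfit
    scaled by 1/sigma^2, so a dense stream cannot dominate a sparse one
    simply by having more observations.

    streams: list of (residuals, sigma) pairs, one per data stream,
    where residuals are model-minus-observation misfits."""
    total = 0.0
    for residuals, sigma in streams:
        total += sum(r * r for r in residuals) / (len(residuals) * sigma ** 2)
    return total
```

Dividing by the stream length normalizes each stream's contribution; dividing by sigma squared lets an estimated variance factor down-weight noisy streams, which is the role the proposed MCMC-plus-annealing scheme plays.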

  9. Apparatus and method for two-stage oxidation of wastes

    DOEpatents

    Fleischman, Scott D.

    1995-01-01

    An apparatus and method for oxidizing wastes in a two-stage process. The apparatus includes an oxidation device, a gas-liquid contacting column and an electrocell. In the first stage of the process, wastes are heated in the presence of air to partially oxidize the wastes. The heated wastes produce an off-gas stream containing oxidizable materials. In the second stage, the off-gas stream is cooled and flowed through the contacting column, where the off-gas stream is contacted with an aqueous acid stream containing an oxidizing agent having at least two positive valence states. At least a portion of the oxidizable materials are transferred to the acid stream and destroyed by the oxidizing agent. During oxidation, the valence of the oxidizing agent is decreased from its higher state to its lower state. The acid stream is flowed to the electrocell, where an electric current is applied to the stream to restore the oxidizing agent to its higher valence state. The regenerated acid stream is recycled to the contacting column.

  10. Method for separating water soluble organics from a process stream by aqueous biphasic extraction

    DOEpatents

    Chaiko, David J.; Mego, William A.

    1999-01-01

    A method for separating water-miscible organic species from a process stream by aqueous biphasic extraction is provided. An aqueous biphase system is generated by contacting a process stream comprised of water, salt, and organic species with an aqueous polymer solution. The organic species transfer from the salt-rich phase to the polymer-rich phase, and the phases are separated. Next, the polymer is recovered from the loaded polymer phase by selectively extracting the polymer into an organic phase at an elevated temperature, while the organic species remain in a substantially salt-free aqueous solution. Alternatively, the polymer is recovered from the loaded polymer by a temperature induced phase separation (cloud point extraction), whereby the polymer and the organic species separate into two distinct solutions. The method for separating water-miscible organic species is applicable to the treatment of industrial wastewater streams, including the extraction and recovery of complexed metal ions from salt solutions, organic contaminants from mineral processing streams, and colorants from spent dye baths.

  11. Defining the cortical visual systems: "what", "where", and "how"

    NASA Technical Reports Server (NTRS)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  12. A new neural framework for visuospatial processing

    PubMed Central

    Kravitz, Dwight J.; Saleem, Kadharbatcha S.; Baker, Chris I.; Mishkin, Mortimer

    2012-01-01

    The division of cortical visual processing into distinct dorsal and ventral streams is a key framework that has guided visual neuroscience. The characterization of the ventral stream as a ‘What’ pathway is relatively uncontroversial, but the nature of dorsal stream processing is less clear. Originally proposed as mediating spatial perception (‘Where’), more recent accounts suggest it primarily serves non-conscious visually guided action (‘How’). Here, we identify three pathways emerging from the dorsal stream that consist of projections to the prefrontal and premotor cortices, and a major projection to the medial temporal lobe that courses both directly and indirectly through the posterior cingulate and retrosplenial cortices. These three pathways support both conscious and non-conscious visuospatial processing, including spatial working memory, visually guided action and navigation, respectively. PMID:21415848

  13. Developing an Environmental Decision Support System for Stream Management: the STREAMES Experience

    NASA Astrophysics Data System (ADS)

    Riera, J.; Argerich, A.; Comas, J.; Llorens, E.; Martí, E.; Godé, L.; Pargament, D.; Puig, M.; Sabater, F.

    2005-05-01

Transferring research knowledge to stream managers is crucial for scientifically sound management. Environmental decision support systems (EDSS) are advocated as an effective means to accomplish this. STREAMES (STream REAch Management: an Expert System) is a decision-tree-based EDSS prototype developed within the context of a European project as a tool to assist water managers in the diagnosis of problems, detection of causes, and selection of management strategies for coping with stream degradation issues related mostly to excess nutrient availability. STREAMES was developed by a team of scientists, water managers, and experts in knowledge engineering. Although the tool focuses on management at the stream-reach scale, it also incorporates a mass-balance catchment nutrient emission model and a simple GIS module. We will briefly present the prototype and share our experience in its development. Emphasis will be placed on the process of knowledge acquisition, the design process, the pitfalls and benefits of the communication between scientists and managers, and the potential for future development of STREAMES, particularly in the context of the EU Water Framework Directive.

  14. Stream Processors

    NASA Astrophysics Data System (ADS)

    Erez, Mattan; Dally, William J.

Stream processors, like other multicore architectures, partition their functional units and storage into multiple processing elements. In contrast to typical architectures, which contain symmetric general-purpose cores and a cache hierarchy, stream processors have a significantly leaner design. Stream processors are specifically designed for the stream execution model, in which applications have large amounts of explicit parallel computation, structured and predictable control, and memory accesses that can be performed at a coarse granularity. Applications in the streaming model are expressed in a gather-compute-scatter form, yielding programs with explicit control over transferring data to and from on-chip memory. Relying on these characteristics, which are common to many media processing and scientific computing applications, stream architectures redefine the boundary between software and hardware responsibilities, with software bearing much of the complexity required to manage concurrency, locality, and latency tolerance. Thus, stream processors have minimal control hardware, consisting of fetching medium- and coarse-grained instructions and executing them directly on the many ALUs. Moreover, the on-chip storage hierarchy of stream processors is under explicit software control, as is all communication, eliminating the need for complex reactive hardware mechanisms.
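The gather-compute-scatter form described in this record can be sketched as follows. This is an illustrative model of the execution pattern, not actual stream-processor code:

```python
def gather_compute_scatter(memory, gather_idx, kernel, scatter_idx):
    """Model of the gather-compute-scatter stream pattern:
    gather operands from (off-chip) memory at coarse granularity,
    run a data-parallel kernel on the local working set,
    then scatter results back to their destinations."""
    # Gather: assemble the kernel's working set into fast local storage.
    local = [memory[i] for i in gather_idx]
    # Compute: apply the kernel independently to each element
    # (this is the explicit data parallelism the model exposes).
    results = [kernel(x) for x in local]
    # Scatter: write results back to their destination locations.
    for i, r in zip(scatter_idx, results):
        memory[i] = r
    return memory
```

Because the gather and scatter index sets are known up front, data movement can be scheduled in bulk by software, which is exactly the locality and latency-tolerance control the architecture hands to the compiler.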

  15. Method of synchronizing independent functional unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Changhoan

A system for synchronizing parallel processing of a plurality of functional processing units (FPUs): a first FPU and a first program counter control timing of a first stream of program instructions issued to the first FPU by advancement of the first program counter; a second FPU and a second program counter control timing of a second stream of program instructions issued to the second FPU by advancement of the second program counter; the first FPU is in communication with the second FPU to synchronize the issuance of the first stream of program instructions with the second stream of program instructions, and the second FPU is in communication with the first FPU to synchronize the issuance of the second stream of program instructions with the first stream of program instructions.

  16. Method of synchronizing independent functional unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Changhoan

    2017-05-16

A system for synchronizing parallel processing of a plurality of functional processing units (FPUs): a first FPU and a first program counter control timing of a first stream of program instructions issued to the first FPU by advancement of the first program counter; a second FPU and a second program counter control timing of a second stream of program instructions issued to the second FPU by advancement of the second program counter; the first FPU is in communication with the second FPU to synchronize the issuance of the first stream of program instructions with the second stream of program instructions, and the second FPU is in communication with the first FPU to synchronize the issuance of the second stream of program instructions with the first stream of program instructions.

  17. Method of synchronizing independent functional unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Changhoan

    2017-02-14

A system for synchronizing parallel processing of a plurality of functional processing units (FPUs): a first FPU and a first program counter control timing of a first stream of program instructions issued to the first FPU by advancement of the first program counter; a second FPU and a second program counter control timing of a second stream of program instructions issued to the second FPU by advancement of the second program counter; the first FPU is in communication with the second FPU to synchronize the issuance of the first stream of program instructions with the second stream of program instructions, and the second FPU is in communication with the first FPU to synchronize the issuance of the second stream of program instructions with the first stream of program instructions.

  18. Evaluation of Topographic wetness index and catchment characteristics on spatially and temporally variable streams across an elevation gradient

    NASA Astrophysics Data System (ADS)

    Martin, C.

    2017-12-01

Topography can be used to delineate streams and quantify the topographic control on the hydrological processes of a watershed, because geomorphologic processes have shaped the topography and streams of a catchment over time. The Topographic Wetness Index (TWI) is a common index used for delineating stream networks by predicting the location of saturation-excess overland flow, but it is also used for other physical attributes of a watershed such as soil moisture, groundwater level, and vegetation patterns. This study evaluates how well TWI works across an elevation gradient and examines the relationships between the active drainage networks of four headwater watersheds at various elevations in the Colorado Front Range and their topography, geology, climate, soils, elevation, and vegetation, in an attempt to determine the controls on streamflow location and duration. The results suggest that streams prefer to flow along a path of least resistance, including faults and permeable lithology. Permeable lithologies created more stream-network connectivity during higher flows but dried up during lower flows. Streams flowing over impermeable lithologies had longer flow durations. Upslope soil hydraulic conductivity played a role in stream location: soils with low hydraulic conductivity had longer flow durations than soils with higher hydraulic conductivity. Finally, TWI thresholds ranged from 5.95 to 10.3 due to changes in stream length and to factors such as geology and soil. TWI had low accuracy for the lowest-elevation site due to the greatest change in stream length. In conclusion, structural geology, upslope soil texture, and the permeability of the underlying lithology influenced where the stream was flowing and for how long. Elevation determines climate, which influences the hydrologic processes occurring at the watersheds and therefore affects the duration and timing of streams at different elevations.
TWI is an adequate tool for delineating streams because the results suggest topography has a primary control on stream locations, but because intermittent streams change throughout the year, an algorithm needs to be created to correspond to snowmelt and rain events. Geology and soil indices also need to be considered in addition to topography to produce the most accurate derived stream network.
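The record does not state the TWI formula; the standard definition is TWI = ln(a / tan β), where a is the specific upslope contributing area and β is the local slope angle:

```python
import math


def twi(upslope_area, slope_rad):
    """Topographic Wetness Index, TWI = ln(a / tan(beta)), where
    upslope_area is the specific upslope contributing area (m^2 per unit
    contour width) and slope_rad is the local slope angle in radians."""
    return math.log(upslope_area / math.tan(slope_rad))
```

Larger contributing areas and gentler slopes both raise TWI, which is why flat, convergent terrain exceeds the stream-initiation thresholds (5.95 to 10.3 in this study) first.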

  19. Leaf breakdown in streams differing in catchment land use

    USGS Publications Warehouse

    Paul, M.J.; Meyer, J.L.; Couch, C.A.

    2006-01-01

    1. The impact of changes in land use on stream ecosystem function is poorly understood. We studied leaf breakdown, a fundamental process of stream ecosystems, in streams that represent a range of catchment land use in the Piedmont physiographic province of the south-eastern United States. 2. We placed bags of chalk maple (Acer barbatum) leaves in similar-sized streams in 12 catchments of differing dominant land use: four forested, three agricultural, two suburban and three urban catchments. We measured leaf mass, invertebrate abundance and fungal biomass in leaf bags over time. 3. Leaves decayed significantly faster in agricultural (0.0465 day-1) and urban (0.0474 day-1) streams than in suburban (0.0173 day-1) and forested (0.0100 day-1) streams. Additionally, breakdown rates in the agricultural and urban streams were among the fastest reported for deciduous leaves in any stream. Nutrient concentrations in agricultural streams were significantly higher than in any other land-use type. Fungal biomass associated with leaves was significantly lower in urban streams, while shredder abundance in leaf bags was significantly higher in forested and agricultural streams than in suburban and urban streams. Storm runoff was significantly higher in urban and suburban catchments, which had higher impervious surface cover than forested or agricultural catchments. 4. We propose that the processes accelerating leaf breakdown in agricultural and urban streams were not the same: faster breakdown in agricultural streams was due to increased biological activity as a result of nutrient enrichment, whereas faster breakdown in urban streams was a result of physical fragmentation caused by higher storm runoff. © 2006 The Authors.
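Breakdown rates like those reported are first-order decay coefficients, M(t) = M0·e^(−kt), so a half-life of ln(2)/k makes the land-use differences concrete. A quick computation using the k values from the abstract:

```python
import math

# First-order decay of leaf mass, M(t) = M0 * exp(-k t),
# using the breakdown rates k (day^-1) reported in the abstract.
rates = {
    "forested": 0.0100,
    "suburban": 0.0173,
    "agricultural": 0.0465,
    "urban": 0.0474,
}

for land_use, k in rates.items():
    half_life = math.log(2) / k  # days until 50% of leaf mass is lost
    print(f"{land_use:12s} k={k:.4f}/day  t1/2={half_life:5.1f} days")
```

For the urban rate this gives a half-life of about 14.6 days, versus about 69.3 days for the forested rate, so leaves in urban streams lose half their mass nearly five times faster.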

  20. Method for processing aqueous wastes

    DOEpatents

    Pickett, J.B.; Martin, H.L.; Langton, C.A.; Harley, W.W.

    1993-12-28

    A method is presented for treating waste water, such as that from an industrial processing facility, comprising the separation of the waste water into a dilute waste stream and a concentrated waste stream. The concentrated waste stream is treated chemically to enhance precipitation and then allowed to separate into a sludge and a supernate. The supernate is skimmed or filtered from the sludge and blended with the dilute waste stream to form a second dilute waste stream. The remaining sludge is mixed with cementitious material, rinsed to dissolve soluble components, then pressed to remove excess water and dissolved solids before being allowed to cure. The dilute waste stream is also chemically treated to decompose carbonate complexes and metal ions and then mixed with a cationic polymer to cause the precipitated solids to flocculate. Filtration of the floc removes sufficient solids to allow the waste water to be discharged to the surface of a stream. The filtered material is added to the sludge of the concentrated waste stream. The method is also applicable to the treatment and removal of soluble uranium from aqueous streams, such that the treated stream may be used as a potable water supply. 4 figures.

  1. The Stream Table in Physical Geography Instruction.

    ERIC Educational Resources Information Center

    Wikle, Thomas A.; Lightfoot, Dale R.

    1997-01-01

    Outlines a number of activities to be conducted with a stream table (large wooden box filled with sediment and designed for water to pass through) in class. Activities illustrate such fluvial processes as stream meandering, erosion, transportation, and deposition. Includes a diagram for constructing a stream table. (MJP)

  2. THE EMERGING USE OF LIDAR AS A TOOL FOR ASSESSING WATERSHED MORPHOLOGY

    EPA Science Inventory

    Stream channel morphology is an integral component of the stream fluvial process and is inherently related to the stability of stream aquatic ecology. Numerous studies have shown that changes in stream channel geometry are related to changes in biotic integrity. In urbanizing la...

  3. Ecoregions and stream morphology in eastern Oklahoma

    USGS Publications Warehouse

    Splinter, D.K.; Dauwalter, D.C.; Marston, R.A.; Fisher, W.L.

    2010-01-01

    Broad-scale variables (i.e., geology, topography, climate, land use, vegetation, and soils) influence channel morphology. How and to what extent the longitudinal pattern of channel morphology is influenced by broad-scale variables is important to fluvial geomorphologists and stream ecologists. In the last couple of decades, there has been an increase in the amount of interdisciplinary research between fluvial geomorphologists and stream ecologists. In a historical context, fluvial geomorphologists are more apt to use physiographic regions to distinguish broad-scale variables, while stream ecologists are more apt to use the concept of an ecosystem to address the broad-scale variables that influence stream habitat. For this reason, we designed a study using ecoregions, which use physical and biological variables to understand how landscapes influence channel processes. Ecoregions are delineated by similarities in geology, climate, soils, land use, and potential natural vegetation. In the fluvial system, stream form and function are dictated by processes observed throughout the fluvial hierarchy. Recognizing that stream form and function should differ by ecoregion, a study was designed to evaluate how the characteristics of stream channels differed longitudinally among three ecoregions in eastern Oklahoma, USA: Boston Mountains, Ozark Highlands, and Ouachita Mountains. Channel morphology of 149 stream reaches was surveyed in 1st- through 4th-order streams, and the effects of drainage area and ecoregion on channel morphology were evaluated using multiple regressions. Differences existed (P ≤ 0.05) among ecoregions for particle size, bankfull width, and width/depth ratio. No differences existed among ecoregions for gradient or sinuosity. Particle size was smallest in the Ozark Highlands and largest in the Ouachita Mountains. Bankfull width was larger in the Ozark Highlands than in the Boston Mountains and Ouachita Mountains in larger streams. 
Width/depth ratios of the Boston Mountains and Ozark Highlands were not statistically different. Significant differences existed, however, between the Boston Mountains and Ozark Highlands when compared individually to the Ouachita Mountains. We found that ecoregions afforded a good spatial structure that can help in understanding longitudinal trends in stream morphology surveyed at the reach scale. The hierarchy of the fluvial system begins within a broad, relatively homogeneous setting that imparts control on processes that affect stream function. Ecoregions provide an adequate regional division to begin a large-scale geomorphic study of processes in stream channels. © 2010 Elsevier B.V.

  4. Recovery of olefin monomers

    DOEpatents

    Golden, Timothy Christoph; Weist, Jr., Edward Landis; Johnson, Charles Henry

    2004-03-16

    In a process for the production of a polyolefin, an olefin monomer is polymerised to said polyolefin and residual monomer is recovered. A gas stream comprising the monomer and nitrogen is subjected to a PSA process in which said monomer is adsorbed on a periodically regenerated silica gel or alumina adsorbent to recover a purified gas stream containing said olefin, and a nitrogen-rich stream containing no less than 99% nitrogen and no less than 50% of the nitrogen content of the gas feed to the PSA process.

  5. Symmetric Stream Cipher using Triple Transposition Key Method and Base64 Algorithm for Security Improvement

    NASA Astrophysics Data System (ADS)

    Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur

    2017-12-01

    Symmetric cryptography algorithms are known to have many weaknesses in the encryption process compared with asymmetric algorithms. A symmetric stream cipher is an algorithm that works by an XOR operation between plaintext and key. To improve the security of the symmetric stream cipher algorithm, an improvement was made using a Triple Transposition Key, developed from the Transposition Cipher, together with the Base64 algorithm for the final encryption step; experiments show that the resulting ciphertext is sufficiently random.
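The abstract does not detail the Triple Transposition Key itself, but the underlying scheme it builds on (an XOR stream cipher with Base64 applied as the final encoding step) can be sketched. In this sketch a naively repeated key stands in for the paper's transposition-derived keystream:

```python
import base64
from itertools import cycle

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Core of a symmetric stream cipher: XOR each plaintext byte with a
    # keystream byte. Repeating the raw key like this is insecure; the
    # paper instead derives the keystream via a Triple Transposition Key.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def encrypt(plaintext: bytes, key: bytes) -> str:
    # Base64 is the final encoding step, as in the proposed scheme.
    return base64.b64encode(xor_stream(plaintext, key)).decode("ascii")

def decrypt(ciphertext: str, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption reuses the same keystream.
    return xor_stream(base64.b64decode(ciphertext), key)

ct = encrypt(b"stream cipher demo", b"secret")
assert decrypt(ct, b"secret") == b"stream cipher demo"
```

Because XOR is self-inverse, the only difference between encryption and decryption is the Base64 encode/decode wrapping.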

  6. Material and system for catalytic reduction of nitrogen oxide in an exhaust stream of a combustion process

    DOEpatents

    Gardner, Timothy J.; Lott, Stephen E.; Lockwood, Steven J.; McLaughlin, Linda I.

    1998-01-01

    A catalytic material of activated hydrous metal oxide doped with platinum, palladium, or a combination of these, and optionally containing an alkali or alkaline earth metal, that is effective for NOₓ reduction in an oxidizing exhaust stream from a combustion process is disclosed. A device for reduction of nitrogen oxides in an exhaust stream, particularly an automotive exhaust stream, the device having a substrate coated with the activated noble-metal-doped hydrous metal oxide of the invention, is also provided.

  7. Green Cloud on the Horizon

    NASA Astrophysics Data System (ADS)

    Ali, Mufajjul

    This paper proposes a Green Cloud model for mobile Cloud computing. The proposed model leverages the current trends of IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service), and looks at a new paradigm called "Network as a Service" (NaaS). The Green Cloud model proposes various revenue-generating streams and services for Telcos, with CaaS (Cloud as a Service) for the near future.

  8. Dispersal constraints for stream invertebrates: setting realistic timescales for biodiversity restoration.

    PubMed

    Parkyn, Stephanie M; Smith, Brian J

    2011-09-01

    Biodiversity goals are becoming increasingly important in stream restoration. Typical models of stream restoration are based on the assumption that if habitat is restored then species will return and ecological processes will re-establish. However, a range of constraints at different scales can affect restoration success. Much of the research in stream restoration ecology has focused on habitat constraints, namely the in-stream and riparian conditions required to restore biota. Dispersal constraints are also integral to determining the timescales, trajectory and potential endpoints of a restored ecosystem. Dispersal is both a means of organism recolonization of restored sites and a vital ecological process that maintains viable populations. We review knowledge of dispersal pathways and explore the factors influencing stream invertebrate dispersal. From empirical and modeling studies of restoration in warm-temperate zones of New Zealand, we make predictions about the timescales of stream ecological restoration under differing levels of dispersal constraints. This process of constraints identification and timescale prediction is proposed as a practical step for resource managers to prioritize and appropriately monitor restoration sites and highlights that in some instances, natural recolonization and achievement of biodiversity goals may not occur.

  9. Dispersal Constraints for Stream Invertebrates: Setting Realistic Timescales for Biodiversity Restoration

    NASA Astrophysics Data System (ADS)

    Parkyn, Stephanie M.; Smith, Brian J.

    2011-09-01

    Biodiversity goals are becoming increasingly important in stream restoration. Typical models of stream restoration are based on the assumption that if habitat is restored then species will return and ecological processes will re-establish. However, a range of constraints at different scales can affect restoration success. Much of the research in stream restoration ecology has focused on habitat constraints, namely the in-stream and riparian conditions required to restore biota. Dispersal constraints are also integral to determining the timescales, trajectory and potential endpoints of a restored ecosystem. Dispersal is both a means of organism recolonization of restored sites and a vital ecological process that maintains viable populations. We review knowledge of dispersal pathways and explore the factors influencing stream invertebrate dispersal. From empirical and modeling studies of restoration in warm-temperate zones of New Zealand, we make predictions about the timescales of stream ecological restoration under differing levels of dispersal constraints. This process of constraints identification and timescale prediction is proposed as a practical step for resource managers to prioritize and appropriately monitor restoration sites and highlights that in some instances, natural recolonization and achievement of biodiversity goals may not occur.

  10. Does Value Stream Mapping affect the structure, process, and outcome quality in care facilities? A systematic review.

    PubMed

    Nowak, Marina; Pfaff, Holger; Karbach, Ute

    2017-08-24

    Quality improvement within health and social care facilities is needed and has to be evidence-based and patient-centered. Value Stream Mapping, a method of Lean management, aims to increase the patients' value and quality of care through visualization and quantification of the care process. The aim of this research is to examine the effectiveness of Value Stream Mapping on structure, process, and outcome quality in care facilities. A systematic review was conducted. PubMed, EBSCOhost (including Business Source Complete, Academic Search Complete, PsycINFO, PSYNDEX, and SocINDEX with Full Text), Web of Knowledge, and EMBASE ScienceDirect were searched in February 2016. All peer-reviewed papers evaluating Value Stream Mapping and published in English or German from January 2000 were included. For data synthesis, all study results were categorized into Donabedian's model of structure, process, and outcome quality. To assess and interpret the effectiveness of Value Stream Mapping, the frequencies of the statistically examined results were considered. Of the 903 articles retrieved, 22 studies fulfilled the inclusion criteria; of these, 11 studies were used to answer the research question. Value Stream Mapping has positive effects on the time dimension of process and outcome quality. It seems to reduce non-value-added time (e.g., waiting time) and length of stay. All study designs were before-and-after studies without controls, and methodologically sophisticated studies are missing. For a final conclusion about Value Stream Mapping's effectiveness, more research with improved methodology is needed. Despite this lack of evidence, Value Stream Mapping has the potential to improve quality of care on the time dimension. The contextual influence has to be investigated to make conclusions about the relationship between different quality domains when applying Value Stream Mapping. 
However, when using this review's conclusions, the limitation of including heterogeneous and potentially biased results has to be considered.

  11. NOₓ sensor and process for detecting NOₓ

    DOEpatents

    Dalla Betta, Ralph A.; Sheridan, David R.; Reed, Daniel L.

    1994-01-01

    This invention is a process for detecting low levels of nitrogen oxides (NOₓ) in a flowing gas stream (typically an exhaust gas stream) and a catalytic NOₓ sensor which may be used in that process.

  12. F3D Image Processing and Analysis for Many - and Multi-core Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    F3D is written in OpenCL, so it achieves platform-portable parallelism on modern multi-core CPUs and many-core GPUs. The interface and mechanisms to access the F3D core are written in Java as a plugin for Fiji/ImageJ to deliver several key image-processing algorithms necessary to remove artifacts from micro-tomography data. The algorithms consist of data-parallel-aware filters that efficiently utilize resources, work on out-of-core datasets, and scale efficiently across multiple accelerators. Optimizing for data-parallel filters, streaming of out-of-core datasets, and efficient resource, memory, and data management over complex execution sequences of filters greatly expedites any scientific workflow with image-processing requirements. F3D performs several different types of 3D image-processing operations, such as non-linear filtering using bilateral filtering, median filtering, and/or morphological operators (MM). F3D gray-level MM operators are one-pass, constant-time methods that can perform morphological transformations with a line structuring element oriented in discrete directions. Additionally, MM operators can be applied to gray-scale images and consist of two parts: (a) a reference shape, or structuring element, which is translated over the image, and (b) a mechanism, or operation, that defines the comparisons to be performed between the image and the structuring element. This tool provides a critical component within many complex pipelines, such as those for performing automated segmentation of image stacks. F3D is a descendant of Quant-CT, another software package we developed in the past; the two modules are to be integrated in a future version. Further details were reported in: D.M. Ushizima, T. Perciano, H. Krishnan, B. Loring, H. Bale, D. Parkinson, and J. Sethian. Structure recognition from high-resolution images of ceramic composites. IEEE International Conference on Big Data, October 2014.

  13. Liquid additives for particulate emissions control

    DOEpatents

    Durham, M.D.; Schlager, R.J.; Ebner, T.G.; Stewart, R.M.; Hyatt, D.E.; Bustard, C.J.; Sjostrom, S.

    1999-01-05

    The present invention discloses a process for removing undesired particles from a gas stream, including the steps of contacting a composition containing an adhesive with the gas stream; collecting the undesired particles and adhesive on a collection surface to form an agglomerate comprising the adhesive and undesired particles on the collection surface; and removing the agglomerate from the collection zone. The composition may be atomized and injected into the gas stream, and may include a liquid that vaporizes in the gas stream. After the liquid vaporizes, adhesive particles are entrained in the gas stream. The process may be applied to electrostatic precipitators and filtration systems to improve undesired-particle collection efficiency. 11 figs.

  14. Processes for washing a spent ion exchange bed and for treating biomass-derived pyrolysis oil, and apparatuses for treating biomass-derived pyrolysis oil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baird, Lance Awender; Brandvold, Timothy A.

    Processes and apparatuses for washing a spent ion exchange bed and for treating biomass-derived pyrolysis oil are provided herein. An exemplary process for washing a spent ion exchange bed employed in purification of biomass-derived pyrolysis oil includes the step of providing an ion-depleted pyrolysis oil stream having an original oxygen content. The ion-depleted pyrolysis oil stream is partially hydrotreated to reduce its oxygen content, thereby producing a partially hydrotreated pyrolysis oil stream having a residual oxygen content that is less than the original oxygen content. At least a portion of the partially hydrotreated pyrolysis oil stream is passed through the spent ion exchange bed. Water is passed through the spent ion exchange bed after passing at least the portion of the partially hydrotreated pyrolysis oil stream therethrough.

  15. Segment scheduling method for reducing 360° video streaming latency

    NASA Astrophysics Data System (ADS)

    Gudumasu, Srinivas; Asbun, Eduardo; He, Yong; Ye, Yan

    2017-09-01

    360° video is an emerging new format in the media industry enabled by the growing availability of virtual reality devices. It provides the viewer a new sense of presence and immersion. Compared to conventional rectilinear video (2D or 3D), 360° video poses a new and difficult set of engineering challenges on video processing and delivery. Enabling a comfortable and immersive user experience requires very high video quality and very low latency, while the large video file size poses a challenge to delivering 360° video with high quality at scale. Conventionally, 360° video represented in equirectangular or other projection formats can be encoded as a single standards-compliant bitstream using existing video codecs such as H.264/AVC or H.265/HEVC. Such a method usually needs very high bandwidth to provide an immersive user experience, and at the client side much of that bandwidth, and of the computational power used to decode the video, is wasted because the user only watches a small portion (i.e., viewport) of the entire picture. Viewport-dependent 360° video processing and delivery approaches spend more bandwidth on the viewport than on non-viewports and are therefore able to reduce the overall transmission bandwidth. This paper proposes a dual-buffer segment scheduling algorithm for viewport-adaptive streaming methods to reduce latency when switching between high-quality viewports in 360° video streaming. The approach decouples the scheduling of viewport segments and non-viewport segments to ensure the viewport segment requested matches the latest user head orientation. A base-layer buffer stores all lower-quality segments, and a viewport buffer stores high-quality viewport segments corresponding to the viewer's most recent head orientation. The scheduling scheme determines viewport request times based on the buffer status and the head orientation. 
This paper also discusses how to deploy the proposed scheduling design for various viewport-adaptive video streaming methods. The proposed dual-buffer segment scheduling method is implemented in an end-to-end, tile-based, viewport-adaptive 360° video streaming platform, where the entire 360° video is divided into a number of tiles, and each tile is independently encoded into multiple quality-level representations. The client requests different quality-level representations of each tile based on the viewer's head orientation and the available bandwidth, and then composes all tiles together for rendering. The simulation results verify that the proposed dual-buffer segment scheduling algorithm reduces viewport switch latency and utilizes available bandwidth more efficiently. As a result, a more consistent, immersive 360° video viewing experience can be presented to the user.
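The dual-buffer idea described above can be sketched as follows; all class and method names here are illustrative assumptions, not from the paper:

```python
from collections import deque

class DualBufferScheduler:
    """Sketch of the dual-buffer scheduling idea: low-quality segments
    covering the full 360-degree frame fill a base buffer, while
    high-quality viewport segments are requested as late as possible,
    so the quality upgrade matches the latest head orientation."""

    def __init__(self, viewport_buffer_target=2):
        self.base_buffer = deque()      # all-tile, low-quality segments
        self.viewport_buffer = deque()  # high-quality viewport segments
        self.target = viewport_buffer_target

    def next_request(self, head_orientation):
        # Keep the viewport buffer shallow: a short buffer means the
        # high-quality tiles requested reflect the most recent gaze.
        if len(self.viewport_buffer) < self.target:
            return ("viewport", head_orientation)
        return ("base", None)

    def on_segment(self, kind, segment):
        # Route a downloaded segment into the matching buffer.
        (self.viewport_buffer if kind == "viewport"
         else self.base_buffer).append(segment)

sched = DualBufferScheduler()
kind, _ = sched.next_request(head_orientation=(90, 0))
print(kind)  # "viewport" is requested first, since its buffer is empty
```

Decoupling the two request streams is what lets the base buffer stay deep (for stall resistance) while the viewport buffer stays shallow (for low switch latency).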

  16. Prospects, recent advancements and challenges of different wastewater streams for microalgal cultivation.

    PubMed

    Guldhe, Abhishek; Kumari, Sheena; Ramanna, Luveshan; Ramsundar, Prathana; Singh, Poonam; Rawat, Ismail; Bux, Faizal

    2017-12-01

    Microalgae are recognized as one of the most powerful biotechnology platforms for many value-added products, including biofuels, bioactive compounds, and animal and aquaculture feed. However, large-scale production of microalgal biomass poses challenges due to the large amounts of water and nutrients required for cultivation. Using wastewater for microalgal cultivation has emerged as a potentially cost-effective strategy for large-scale microalgal biomass production. This approach also offers an efficient means to remove nutrients and metals from wastewater, making wastewater treatment sustainable and energy-efficient. Therefore, much research has been conducted in recent years on utilizing various wastewater streams for microalgae cultivation. This review identifies and discusses the opportunities and challenges of different wastewater streams for microalgal cultivation. Many alternative routes for microalgal cultivation have been proposed to tackle some of the challenges that occur during microalgal cultivation in wastewater, such as nutrient deficiency, substrate inhibition, and toxicity. The scope and challenges of microalgal biomass grown on wastewater for various applications are also discussed, along with the biorefinery approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. SIMULATION COASTAL PLAIN STREAM FISH COMMUNITY RESPONSE TO NONPOINT SOURCE POLLUTION USING LINKED HYDROLOGIC-ECOLOGICAL MODELS

    EPA Science Inventory

    Nonpoint source pollution is the primary stress in many streams. Characteristic declines in stream fish communities are recognized in streams influenced by nonpoint source pollution, but the processes by which these declines occur are not well understood. Here, predicted time s...

  18. Orbiter processing facility: Access platforms Kennedy Space Center, Florida, from challenge to achievement

    NASA Technical Reports Server (NTRS)

    Haratunian, M.

    1985-01-01

    A system of access platforms and equipment within the space shuttle orbiter processing facility at Kennedy Space Center is described. The design challenges of the platforms, including clearance envelopes, load criteria, and movement, are discussed. Various applications of moveable platforms are considered.

  19. Continuous pCO2 time series from Ocean Networks Canada cabled observatories at the northeast Pacific shelf edge and in the sub-tidal Arctic

    NASA Astrophysics Data System (ADS)

    Juniper, S. Kim; Sastri, Akash; Mihaly, Steven; Duke, Patrick; Else, Brent; Thomas, Helmuth; Miller, Lisa

    2017-04-01

    Marine pCO2 sensor technology has progressed to the point where months-long time series from remotely-deployed pCO2 sensors can be used to document seasonal and higher frequency variability in pCO2 and its relationship to oceanographic processes. Ocean Networks Canada recently deployed pCO2 sensors on two cabled platforms: a bottom-moored (400 m depth), vertical profiler at the edge of the northeast Pacific continental shelf off Vancouver Island, Canada, and a subtidal seafloor platform in the Canadian High Arctic (69° N) at Cambridge Bay, Nunavut. Both platforms streamed continuous data to a shore-based archive from Pro-Oceanus pCO2 sensors and other oceanographic instruments. The vertical profiler time series revealed substantial intrusions of corrosive (high CO2/low O2), saltier, colder water masses during the summertime upwelling season and during winter-time reversals of along-slope currents. Step-wise profiles during the downcast provided the most reliable pCO2 data, permitting the sensor to equilibrate to the broad range of pCO2 concentrations encountered over the 400 metre depth interval. The Arctic pCO2 sensor was deployed in August 2015. Reversing seasonal trends in pCO2 and dissolved oxygen values can be related to the changing balance of photosynthesis and respiration under sea ice, as influenced by irradiance. Correlation of pCO2 and dissolved oxygen sensor data and the collection of calibration samples have permitted evaluation of sensor performance in relation to operational conditions encountered in vertical profiling and lengthy exposure to subzero seawater.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pallesen, T.R.; Braestrup, M.W.; Jorgensen, O.

    Development of Danish North Sea hydrocarbon resources includes the 17-km Rolf pipeline installed in 1985. The pipeline consists of an insulated 8-in. two-phase-flow product line with a 3-in. piggyback gas-lift line. A practical solution to the design of this insulated pipeline, including the small-diameter piggyback injection line, was a corrosion coating of fusion-bonded epoxy (FBE) and a polyethylene (PE) sleeve pipe. The insulation design prevents hydrate formation under the most conservative flow regime during gas-lift production. Also, the required minimum flow rate during the initial natural-lift period is well below the value anticipated at the initiation of gas lift. The weight-coating design ensures stability on the seabed during the summer months only; thus trenching was required during the same installation season. Installation of insulated flowlines serving marginal fields is a significant feature of North Sea hydrocarbon development projects. The Skjold field is connected to Gorm by a 6-in. two-phase-flow line. The 11-km line was installed in 1982 as the first insulated pipeline in the North Sea. The Rolf field, located 17 km west of Gorm, went on stream Jan. 2. The development includes an unmanned wellhead platform and an insulated, two-phase-flow pipeline to the Gorm E riser platform. After separation on the Gorm C process platform, the oil and condensate are transported to shore through the 20-in. oil pipeline, and the natural gas is piped to Tyra for transmission through the 30-in. gas pipeline. Oil production at Rolf is assisted by the injection of lift gas, transported from Gorm through a 3-in. pipeline installed piggyback on the insulated 8-in. product line. The seabed is smooth and sandy, the water depth varying between 33.7 m (110.5 ft) at Rolf and 39.1 m (128 ft) at Gorm.

  1. Face and location processing in children with early unilateral brain injury.

    PubMed

    Paul, Brianna; Appelbaum, Mark; Carapetian, Stephanie; Hesselink, John; Nass, Ruth; Trauner, Doris; Stiles, Joan

    2014-07-01

    Human visuospatial functions are commonly divided into those dependent on the ventral visual stream (ventral occipitotemporal regions), which allows for processing the 'what' of an object, and the dorsal visual stream (dorsal occipitoparietal regions), which allows for processing 'where' an object is in space. Information about the development of each of the two streams has been accumulating, but very little is known about the effects of injury, particularly very early injury, on this developmental process. Using a set of computerized dorsal and ventral stream tasks matched for stimuli, required response, and difficulty (for typically-developing individuals), we sought to compare the differential effects of injury to the two systems by examining performance in individuals with perinatal brain injury (PBI), who present with selective deficits in visuospatial processing from a young age. Thirty participants (mean=15.1 years) with early unilateral brain injury (15 right hemisphere PBI, 15 left hemisphere PBI) and 16 matched controls participated. On our tasks children with PBI performed more poorly than controls (lower accuracy and longer response times), and this was particularly prominent for the ventral stream task. Lateralization of PBI was also a factor, as the dorsal stream task did not seem to be associated with lateralized deficits, with both PBI groups showing only subtle decrements in performance, while the ventral stream task elicited deficits from RPBI children that do not appear to improve with age. Our findings suggest that early injury results in lesion-specific visuospatial deficits that persist into adolescence. 
Further, as the stimuli used in our ventral stream task were faces, our findings are consistent with what is known about the neural systems for face processing, namely, that they are established relatively early, follow a comparatively rapid developmental trajectory (conferring a vulnerability to early insult), and are biased toward the right hemisphere. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Partial oxidation process for producing a stream of hot purified gas

    DOEpatents

    Leininger, Thomas F.; Robin, Allen M.; Wolfenbarger, James K.; Suggitt, Robert M.

    1995-01-01

    A partial oxidation process for the production of a stream of hot clean gas substantially free from particulate matter, ammonia, alkali metal compounds, halides and sulfur-containing gas for use as synthesis gas, reducing gas, or fuel gas. A hydrocarbonaceous fuel comprising a solid carbonaceous fuel with or without liquid hydrocarbonaceous fuel or gaseous hydrocarbon fuel, wherein said hydrocarbonaceous fuel contains halides, alkali metal compounds, sulfur, nitrogen and inorganic ash containing components, is reacted in a gasifier by partial oxidation to produce a hot raw gas stream comprising H₂, CO, CO₂, H₂O, CH₄, NH₃, HCl, HF, H₂S, COS, N₂, Ar, particulate matter, vapor phase alkali metal compounds, and molten slag. The hot raw gas stream from the gasifier is split into two streams which are separately deslagged, cleaned and recombined. Ammonia in the gas mixture is catalytically disproportionated into N₂ and H₂. The ammonia-free gas stream is then cooled and halides in the gas stream are reacted with a supplementary alkali metal compound to remove HCl and HF. Alkali metal halides, vaporized alkali metal compounds and residual fine particulate matter are removed from the gas stream by further cooling and filtering. The sulfur-containing gases in the process gas stream are then reacted at high temperature with a regenerable sulfur-reactive mixed metal oxide sulfur sorbent material to produce a sulfided sorbent material which is then separated from the hot clean purified gas stream having a temperature of at least 1000 °F.

  3. Partial oxidation process for producing a stream of hot purified gas

    DOEpatents

    Leininger, T.F.; Robin, A.M.; Wolfenbarger, J.K.; Suggitt, R.M.

    1995-03-28

    A partial oxidation process is described for the production of a stream of hot clean gas substantially free from particulate matter, ammonia, alkali metal compounds, halides and sulfur-containing gas for use as synthesis gas, reducing gas, or fuel gas. A hydrocarbonaceous fuel comprising a solid carbonaceous fuel with or without liquid hydrocarbonaceous fuel or gaseous hydrocarbon fuel, wherein said hydrocarbonaceous fuel contains halides, alkali metal compounds, sulfur, nitrogen and inorganic ash-containing components, is reacted in a gasifier by partial oxidation to produce a hot raw gas stream comprising H2, CO, CO2, H2O, CH4, NH3, HCl, HF, H2S, COS, N2, Ar, particulate matter, vapor-phase alkali metal compounds, and molten slag. The hot raw gas stream from the gasifier is split into two streams which are separately deslagged, cleaned and recombined. Ammonia in the gas mixture is catalytically disproportionated into N2 and H2. The ammonia-free gas stream is then cooled and halides in the gas stream are reacted with a supplementary alkali metal compound to remove HCl and HF. Alkali metal halides, vaporized alkali metal compounds and residual fine particulate matter are removed from the gas stream by further cooling and filtering. The sulfur-containing gases in the process gas stream are then reacted at high temperature with a regenerable sulfur-reactive mixed metal oxide sulfur sorbent material to produce a sulfided sorbent material which is then separated from the hot clean purified gas stream having a temperature of at least 1000 °F. 1 figure.

  4. SITHON: An Airborne Fire Detection System Compliant with Operational Tactical Requirements

    PubMed Central

    Kontoes, Charalabos; Keramitsoglou, Iphigenia; Sifakis, Nicolaos; Konstantinidis, Pavlos

    2009-01-01

    In response to the urgent need of fire managers for timely information on fire location and extent, the SITHON system was developed. SITHON is a fully digital thermal imaging system, integrating INS/GPS and a digital camera, designed to provide timely positioned and projected thermal images and video data streams rapidly integrated into the GIS operated by Crisis Control Centres. This article presents in detail the hardware and software components of SITHON, and demonstrates the first encouraging results of test flights over the Sithonia Peninsula in Northern Greece. It is envisaged that the SITHON system will soon be operated onboard various airborne platforms, including fire brigade airplanes and helicopters as well as UAV platforms owned and operated by the Greek Air Forces. PMID:22399963

  5. Sources, transformations, and hydrological processes that control stream nitrate and dissolved organic matter concentrations during snowmelt in an upland forest

    Treesearch

    Stephen D. Sebestyen; Elizabeth W. Boyer; James B. Shanley; Carol Kendall; Daniel H. Doctor; George R. Aiken; Nobuhito Ohte

    2008-01-01

    We explored catchment processes that control stream nutrient concentrations at an upland forest in northeastern Vermont, USA, where inputs of nitrogen via atmospheric deposition are among the highest in the nation and affect ecosystem functioning. We traced sources of water, nitrate, and dissolved organic matter (DOM) using stream water samples collected at high...

  6. Electrolytic trapping of iodine from process gas streams

    DOEpatents

    Horner, Donald E.; Mailen, James C.; Posey, Franz A.

    1977-01-25

    A method for removing molecular, inorganic, and organic forms of iodine from process gas streams comprises the electrolytic oxidation of iodine in the presence of cobalt-III ions. The gas stream is passed through the anode compartment of a partitioned electrolytic cell having a nitric acid anolyte containing a catalytic amount of cobalt to cause the oxidation of effluent iodine species to aqueous soluble species.

  7. Two Visual Pathways in Primates Based on Sampling of Space: Exploitation and Exploration of Visual Information.

    PubMed

    Sheth, Bhavin R; Young, Ryan

    2016-01-01

    Evidence is strong that the visual pathway is segregated into two distinct streams: ventral and dorsal. Two proposals theorize that the pathways are segregated in function: the ventral stream processes information about object identity, whereas the dorsal stream, according to one model, processes information about object location and, according to the other, is responsible for executing movements under visual control. The models are influential; however, recent experimental evidence challenges them: the ventral stream is not solely responsible for object recognition and, conversely, its function is not strictly limited to object vision; the dorsal stream is not responsible by itself for spatial vision or visuomotor control and, conversely, its function extends beyond vision or visuomotor control. In their place, we suggest a robust dichotomy consisting of a ventral stream selectively sampling high-resolution/focal spaces, and a dorsal stream sampling nearly all of space with reduced foveal bias. The proposal hews closely to the theme of embodied cognition: function arises as a consequence of an extant sensory underpinning. A continuous, not sharp, segregation based on function emerges, and carries with it an undercurrent of an exploitation-exploration dichotomy. Under this interpretation, cells of the ventral stream, which individually have more punctate receptive fields that generally include the fovea or parafovea, provide detailed information about object shapes and features and lead to the systematic exploitation of said information; cells of the dorsal stream, which individually have large receptive fields, contribute to visuospatial perception, and provide information about the presence/absence of salient objects and their locations for novel exploration and subsequent exploitation by the ventral stream or, under certain conditions, the dorsal stream. We leverage the dichotomy to unify neuropsychological cases under a common umbrella, account for the increased prevalence of multisensory integration in the dorsal stream under a Bayesian framework, predict conditions under which object recognition utilizes the ventral or dorsal stream, and explain why cells of the dorsal stream drive sensorimotor control and motion processing and have poorer feature selectivity. Finally, the model speculates on a dynamic interaction between the two streams that underscores a unified, seamless perception. Existing theories are subsumed under our proposal.

  8. Methods of hydrotreating a liquid stream to remove clogging compounds

    DOEpatents

    Minderhoud, Johannes Kornelis [Amsterdam, NL; Nelson, Richard Gene [Katy, TX; Roes, Augustinus Wilhelmus Maria [Houston, TX; Ryan, Robert Charles [Houston, TX; Nair, Vijay [Katy, TX

    2009-09-22

    A method includes producing formation fluid from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a gas stream. At least a portion of the liquid stream is provided to a hydrotreating unit. At least a portion of selected in situ heat treatment clogging compositions in the liquid stream are removed to produce a hydrotreated liquid stream by hydrotreating at least a portion of the liquid stream at conditions sufficient to remove the selected in situ heat treatment clogging compositions.

  9. Perception of shapes targeting local and global processes in autism spectrum disorders.

    PubMed

    Grinter, Emma J; Maybery, Murray T; Pellicano, Elizabeth; Badcock, Johanna C; Badcock, David R

    2010-06-01

    Several researchers have found evidence for impaired global processing in the dorsal visual stream in individuals with autism spectrum disorders (ASDs). However, support for a similar pattern of visual processing in the ventral visual stream is less consistent. Critical to resolving the inconsistency is the assessment of local and global form processing ability. Within the visual domain, radial frequency (RF) patterns (shapes formed by sinusoidally varying the radius of a circle to add a certain number of 'bumps') can be used to examine local and global form perception. Typically developing children and children with an ASD discriminated between circles and RF patterns that are processed either locally (RF24) or globally (RF3). Children with an ASD required greater shape deformation to identify RF3 shapes compared to typically developing children, consistent with difficulty in global processing in the ventral stream. No group difference was observed for RF24 shapes, suggesting intact local ventral-stream processing. These outcomes support the position that a deficit in global visual processing is present in ASDs, consistent with the notion of Weak Central Coherence.
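    An RF pattern as defined in the abstract is a circle whose radius is sinusoidally modulated a fixed number of times per revolution. A minimal sketch of generating such stimuli (parameter values are illustrative, not the study's actual stimuli):

```python
import math

def rf_pattern(n_points, frequency, amplitude, radius=1.0):
    """Generate (x, y) points of a radial frequency (RF) pattern:
    a circle whose radius is sinusoidally modulated `frequency`
    times per revolution. amplitude=0 yields a plain circle."""
    points = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        r = radius * (1 + amplitude * math.sin(frequency * theta))
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# An RF3 shape (3 bumps, globally processed) vs. RF24 (24 bumps, locally processed)
rf3 = rf_pattern(360, frequency=3, amplitude=0.05)
rf24 = rf_pattern(360, frequency=24, amplitude=0.05)
```

    The `amplitude` parameter corresponds to the "shape deformation" manipulated in the discrimination task: thresholds are the smallest amplitude at which the pattern is distinguishable from a circle.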

  10. Ambient groundwater flow diminishes nitrate processing in the hyporheic zone of streams

    NASA Astrophysics Data System (ADS)

    Azizian, Morvarid; Boano, Fulvio; Cook, Perran L. M.; Detwiler, Russell L.; Rippy, Megan A.; Grant, Stanley B.

    2017-05-01

    Modeling and experimental studies demonstrate that ambient groundwater reduces hyporheic exchange, but the implications of this observation for stream N-cycling are not yet clear. Here we utilize a simple process-based model (the Pumping and Streamline Segregation or PASS model) to evaluate N-cycling over two scales of hyporheic exchange (fluvial ripples and riffle-pool sequences), ten ambient groundwater and stream flow scenarios (five gaining and losing conditions and two stream discharges), and three biogeochemical settings (identified based on a principal component analysis of previously published measurements in streams throughout the United States). Model-data comparisons indicate that our model provides realistic estimates for direct denitrification of stream nitrate, but overpredicts nitrification and coupled nitrification-denitrification. Riffle-pool sequences are responsible for most of the N-processing, despite the fact that fluvial ripples generate 3-11 times more hyporheic exchange flux. Across all scenarios, hyporheic exchange flux and the Damköhler Number emerge as primary controls on stream N-cycling; the former regulates trafficking of nutrients and oxygen across the sediment-water interface, while the latter quantifies the relative rates of organic carbon mineralization and advective transport in streambed sediments. Vertical groundwater flux modulates both of these master variables in ways that tend to diminish stream N-cycling. Thus, anthropogenic perturbations of ambient groundwater flows (e.g., by urbanization, agricultural activities, groundwater mining, and/or climate change) may compromise some of the key ecosystem services provided by streams.
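    The Damköhler Number invoked above is, in its generic first-order form, the ratio of the advective transport timescale to the reaction timescale. The PASS model's exact formulation is not given in the abstract, so the sketch below uses the textbook definition with hypothetical parameter values:

```python
def damkohler(reaction_rate_constant, residence_time):
    """Dimensionless Damköhler number for a first-order reaction:
    Da = k * tau, the ratio of the hyporheic residence time (tau)
    to the reaction timescale (1/k, here organic carbon mineralization).
    Da >> 1: reaction-dominated (solutes fully processed in the bed);
    Da << 1: transport-dominated (solutes flushed before reacting)."""
    return reaction_rate_constant * residence_time

# Hypothetical values: k = 0.5 per hour, 6-hour hyporheic residence time
da = damkohler(0.5, 6.0)
```

    Under this reading, ambient groundwater that shortens hyporheic residence times lowers Da, shifting the streambed toward the transport-dominated regime and diminishing N-processing.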

  11. Challenges and Opportunities of Long-Term Continuous Stream Metabolism Measurements at the National Ecological Observatory Network

    NASA Astrophysics Data System (ADS)

    Goodman, K. J.; Lunch, C. K.; Baxter, C.; Hall, R.; Holtgrieve, G. W.; Roberts, B. J.; Marcarelli, A. M.; Tank, J. L.

    2013-12-01

    Recent advances in dissolved oxygen sensing and modeling have made continuous measurements of whole-stream metabolism relatively easy to make, allowing ecologists to quantify and evaluate stream ecosystem health at expanded temporal and spatial scales. Long-term monitoring of continuous stream metabolism will enable a better understanding of the integrated and complex effects of anthropogenic change (e.g., land-use, climate, atmospheric deposition, invasive species, etc.) on stream ecosystem function. In addition to their value in the particular streams measured, information derived from long-term data will improve the ability to extrapolate from shorter-term data. The need to better understand drivers and responses of whole-stream metabolism brings difficulties in interpreting the results. Long-term trends will encompass physical changes in stream morphology and flow regime (e.g., variable flow conditions and changes in channel structure) combined with changes in biota. Additionally, long-term data sets will require an organized database structure, careful quantification of errors and uncertainties, as well as propagation of error as a result of the calculation of metabolism metrics. Parsing of continuous data and the choice of modeling approaches can also have a large influence on results and on error estimation. The two main modeling challenges include 1) obtaining unbiased, low-error daily estimates of gross primary production (GPP) and ecosystem respiration (ER), and 2) interpreting GPP and ER measurements over extended time periods. The National Ecological Observatory Network (NEON), in partnership with academic and government scientists, has begun to tackle several of these challenges as it prepares for the collection and calculation of 30 years of continuous whole-stream metabolism data.
NEON is a national-scale research platform that will use consistent procedures and protocols to standardize measurements across the United States, providing long-term, high-quality, open-access data from a connected network to address large-scale change. NEON infrastructure will support 36 aquatic sites across 19 ecoclimatic domains. Sites include core sites, which remain for 30 years, and relocatable sites, which move to capture regional gradients. NEON will measure continuous whole-stream metabolism in conjunction with aquatic, terrestrial and airborne observations, allowing researchers to link stream ecosystem function with landscape and climatic drivers encompassing short to long time periods (i.e., decades).
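    The daily GPP and ER estimates discussed above are commonly derived from a diel dissolved-oxygen mass balance, dO/dt = GPP(t) - ER + K(Osat - O). As an illustration of that balance (the generic single-station method, not NEON's production algorithm; all parameter values here are hypothetical), a simple forward model:

```python
import math

def simulate_do(gpp_daily, er_daily, k, o2_sat, o2_init, dt_hr=1.0):
    """Forward model of the single-station diel oxygen method:
    dO/dt = GPP(t) - ER + K*(Osat - O).
    GPP is spread over a 12-h photoperiod (6:00-18:00) as a half-sine
    light curve; ER is constant. Rates in g O2 m^-3 d^-1, K in 1/d."""
    o2 = [o2_init]
    for step in range(int(24 / dt_hr)):
        hour = step * dt_hr
        if 6 <= hour < 18:
            light = math.sin(math.pi * (hour - 6) / 12)
            gpp = gpp_daily * light * math.pi / 24  # normalized so daily sum = gpp_daily
        else:
            gpp = 0.0
        er = er_daily / 24
        reaeration = k / 24 * (o2_sat - o2[-1])
        o2.append(o2[-1] + (gpp - er + reaeration) * dt_hr)
    return o2

# Hypothetical productive stream: GPP 8, ER 6 g O2 m^-3 d^-1, K = 5 d^-1
profile = simulate_do(gpp_daily=8.0, er_daily=6.0, k=5.0, o2_sat=9.0, o2_init=8.5)
```

    Inverse estimation, which is where the modeling challenges listed above arise, fits GPP, ER, and K so that this predicted curve matches the observed diel oxygen record.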

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M. S.; Miller, D. H.; Fowley, M. D.

    The Savannah River National Laboratory (SRNL) was tasked to support validation of the Defense Waste Processing Facility (DWPF) melter offgas flammability model for the nitric-glycolic (NG) flowsheet. The work supports Deliverable 4 of the DWPF & Saltstone Facility Engineering Technical Task Request (TTR)1 and is supplemental to the Cold Cap Evaluation Furnace (CEF) testing conducted in 2014.2 The Slurry-fed Melt Rate Furnace (SMRF) was selected for the supplemental testing as it requires significantly fewer resources than the CEF and could provide a tool for more rapid analysis of melter feeds in the future. The SMRF platform has been used previously to evaluate melt rate behavior of DWPF glasses, but was modified to accommodate analysis of the offgas stream. Additionally, the Melt Rate Furnace (MRF) and Quartz Melt Rate Furnace (QMRF) were utilized for evaluations. MRF data was used exclusively for melt behavior observations and REDuction/OXidation (REDOX) prediction comparisons and will be briefly discussed in conjunction with its support of the SMRF testing. The QMRF was operated similarly to the SMRF for the same TTR task, but will be discussed in a separate future report. The overall objectives of the SMRF testing were to: 1) evaluate the efficacy of the SMRF as a platform for steady-state melter testing with continuous feeding and offgas analysis; and 2) generate supplemental melter offgas flammability data to support the melter offgas flammability modelling effort for DWPF implementation of the NG flowsheet.

  13. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    NASA Astrophysics Data System (ADS)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that effectively and accurately collect and analyze data to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually-controlled devices. Although steps are taken to reduce error, the data collected in this manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high throughput and high accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors and on-board computers to acquire data and compute plant vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point cloud data QA/QC and near real-time analysis where results are streamed to enterprise databases for additional statistical analysis and product advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots we can analyze in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.

  14. Designing Extensible Data Management for Ocean Observatories, Platforms, and Devices

    NASA Astrophysics Data System (ADS)

    Graybeal, J.; Gomes, K.; McCann, M.; Schlining, B.; Schramm, R.; Wilkin, D.

    2002-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) has been collecting science data for 15 years from all kinds of oceanographic instruments and systems, and is building a next-generation observing system, the MBARI Ocean Observing System (MOOS). To meet the data management requirements of the MOOS, the Institute began developing a flexible, extensible data management solution, the Shore Side Data System (SSDS). This data management system must address a wide variety of oceanographic instruments and data sources, including instruments and platforms of the future. Our data management solution will address all elements of the data management challenge, from ingest (including suitable pre-definition of metadata) through to access and visualization. Key to its success will be ease of use, and automatic incorporation of new data streams and data sets. The data will be of many different forms, and come from many different types of instruments. Instruments will be designed for fixed locations (as with moorings), changing locations (drifters and AUVs), and cruise-based sampling. Data from airplanes, satellites, models, and external archives must also be considered. Providing an architecture which allows data from these varied sources to be automatically archived and processed, yet readily accessed, is only possible with the best practices in metadata definition, software design, and re-use of third-party components. The current status of SSDS development will be presented, including lessons learned from our science users and from previous data management designs.

  15. Accelerating DNA analysis applications on GPU clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumeo, Antonino; Villa, Oreste

    DNA analysis is an emerging application of high-performance bioinformatics. Modern sequencing machinery can provide, in a few hours, large input streams of data that need to be matched against exponentially growing databases of known fragments. The ability to recognize these patterns effectively and quickly may allow extending the scale and the reach of the investigations performed by biology scientists. Aho-Corasick is an exact, multiple-pattern matching algorithm often at the base of this application. High-performance systems are a promising platform to accelerate this algorithm, which is computationally intensive but also inherently parallel. Nowadays, high-performance systems also include heterogeneous processing elements, such as Graphics Processing Units (GPUs), to further accelerate parallel algorithms. Unfortunately, the Aho-Corasick algorithm exhibits large performance variability, depending on the size of the input streams, the number of patterns to search, and the number of matches, and poses significant challenges for current high-performance software and hardware implementations. An adequate mapping of the algorithm on the target architecture, coping with the limits of the underlying hardware, is required to reach the desired high throughputs. Load balancing also plays a crucial role when considering the limited bandwidth among the nodes of these systems. In this paper we present an efficient implementation of the Aho-Corasick algorithm for high-performance clusters accelerated with GPUs. We discuss how we partitioned and adapted the algorithm to fit the Tesla C1060 GPU and then present an MPI-based implementation for a heterogeneous high-performance cluster. We compare this implementation to MPI and MPI-with-pthreads implementations for a homogeneous cluster of x86 processors, discussing the stability vs. the performance and the scaling of the solutions, taking into consideration aspects such as the bandwidth among the different nodes.
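    The Aho-Corasick algorithm at the core of this record (and the next) builds a trie of the patterns augmented with BFS-computed failure links, so that all patterns are matched in a single pass over the input stream. A minimal single-threaded reference sketch, not the GPU/MPI implementation evaluated in the paper:

```python
from collections import deque

def build_automaton(patterns):
    """Build the Aho-Corasick automaton: a trie of the patterns plus
    failure links computed breadth-first from the root."""
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:
        state = 0
        for ch in p:
            if ch not in goto[state]:
                goto.append({}); fail.append(0); out.append(set())
                goto[state][ch] = len(goto) - 1
            state = goto[state][ch]
        out[state].add(p)
    queue = deque(goto[0].values())
    while queue:
        s = queue.popleft()
        for ch, nxt in goto[s].items():
            queue.append(nxt)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[nxt] = goto[f].get(ch, 0)
            out[nxt] |= out[fail[nxt]]  # inherit matches ending at the fallback state
    return goto, fail, out

def search(text, automaton):
    """Scan `text` once, reporting (end_index, pattern) for every match."""
    goto, fail, out = automaton
    state, matches = 0, []
    for i, ch in enumerate(text):
        while state and ch not in goto[state]:
            state = fail[state]
        state = goto[state].get(ch, 0)
        for p in out[state]:
            matches.append((i, p))
    return matches

ac = build_automaton(["ACGT", "CGT", "GTT"])
hits = search("ACGTT", ac)
```

    The single pass is what makes the algorithm attractive for streaming inputs; the performance variability the paper discusses comes from the match-reporting inner loop and from how the state-transition table is laid out on the target hardware.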

  16. Hardware Architectures for Data-Intensive Computing Problems: A Case Study for String Matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumeo, Antonino; Villa, Oreste; Chavarría-Miranda, Daniel

    DNA analysis is an emerging application of high-performance bioinformatics. Modern sequencing machinery can provide, in a few hours, large input streams of data, which need to be matched against exponentially growing databases of known fragments. The ability to recognize these patterns effectively and quickly may allow extending the scale and the reach of the investigations performed by biology scientists. Aho-Corasick is an exact, multiple-pattern matching algorithm often at the base of this application. High-performance systems are a promising platform to accelerate this algorithm, which is computationally intensive but also inherently parallel. Nowadays, high-performance systems also include heterogeneous processing elements, such as Graphics Processing Units (GPUs), to further accelerate parallel algorithms. Unfortunately, the Aho-Corasick algorithm exhibits large performance variability, depending on the size of the input streams, the number of patterns to search, and the number of matches, and poses significant challenges for current high-performance software and hardware implementations. An adequate mapping of the algorithm on the target architecture, coping with the limits of the underlying hardware, is required to reach the desired high throughputs. In this paper, we discuss the implementation of the Aho-Corasick algorithm for GPU-accelerated high-performance systems. We present an optimized implementation of Aho-Corasick for GPUs and discuss its tradeoffs on the Tesla T10 and the new Tesla T20 (codename Fermi) GPUs. We then integrate the optimized GPU code, respectively, in an MPI-based and in a pthreads-based load balancer to enable execution of the algorithm on clusters and large shared-memory multiprocessors (SMPs) accelerated with multiple GPUs.

  17. Data Flow for the TERRA-REF project

    NASA Astrophysics Data System (ADS)

    Kooper, R.; Burnette, M.; Maloney, J.; LeBauer, D.

    2017-12-01

    The Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF) program aims to identify crop traits that are best suited to producing high-energy sustainable biofuels and match those plant characteristics to their genes to speed the plant breeding process. One tool used to achieve this goal is a high-throughput phenotyping robot outfitted with sensors and cameras to monitor the growth of 1.25 acres of sorghum. Data types range from hyperspectral imaging to 3D reconstructions and thermal profiles, all at 1mm resolution. This system produces thousands of daily measurements with high spatiotemporal resolution. The team at NCSA processes, annotates, organizes and stores the massive amounts of data produced by this system - up to 5 TB per day. Data from the sensors is streamed to a local gantry-cache server. The standardized sensor raw data stream is automatically and securely delivered to NCSA using Globus Connect service. Once files have been successfully received by the Globus endpoint, the files are removed from the gantry-cache server. As each dataset arrives or is created the Clowder system automatically triggers different software tools to analyze each file, extract information, and convert files to a common format. Other tools can be triggered to run after all required data is uploaded. For example, a stitched image of the entire field is created after all images of the field become available. Some of these tools were developed by external collaborators based on predictive models and algorithms, others were developed as part of other projects and could be leveraged by the TERRA project. Data will be stored for the lifetime of the project and is estimated to reach 10 PB over 3 years. The Clowder system, BETY and other systems will allow users to easily find data by browsing or searching the extracted information.

  18. Acceleration of atmospheric Cherenkov telescope signal processing to real-time speed with the Auto-Pipe design system

    NASA Astrophysics Data System (ADS)

    Tyson, Eric J.; Buckley, James; Franklin, Mark A.; Chamberlain, Roger D.

    2008-10-01

    The imaging atmospheric Cherenkov technique for high-energy gamma-ray astronomy is emerging as an important new technique for studying the high energy universe. Current experiments have data rates of ≈20 TB/year and duty cycles of about 10%. In the future, more sensitive experiments may produce up to 1000 TB/year. The data analysis task for these experiments requires keeping up with this data rate in close to real-time. Such data analysis is a classic example of a streaming application with very high performance requirements. This class of application often benefits greatly from the use of non-traditional approaches to computation, including special purpose hardware (FPGAs and ASICs) or sophisticated parallel processing techniques. However, designing, debugging, and deploying to these architectures is difficult, and thus they are not widely used by the astrophysics community. This paper presents the Auto-Pipe design toolset that has been developed to address many of the difficulties in taking advantage of complex streaming computer architectures for such applications. Auto-Pipe incorporates a high-level coordination language, functional and performance simulation tools, and the ability to deploy applications to sophisticated architectures. Using the Auto-Pipe toolset, we have implemented the front-end portion of an imaging Cherenkov data analysis application, suitable for real-time or offline analysis. The application operates on data from the VERITAS experiment, and shows how Auto-Pipe can greatly ease performance optimization and application deployment on a wide variety of platforms. We demonstrate a performance improvement over a traditional software approach of 32x using an FPGA solution and 3.6x using a multiprocessor-based solution.
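    The front-end analysis described above is a classic streaming pipeline: independent stages connected by channels, each stage consuming and producing a stream of events. Auto-Pipe's own coordination language is not reproduced here; the following is a generic, hypothetical sketch of the stage-and-queue structure only, with toy stage functions standing in for the real calibration and cleaning steps:

```python
import queue
import threading

def stage(fn, q_in, q_out):
    """One pipeline stage: pull items from q_in, apply fn, push to q_out.
    A None sentinel shuts the stage down and is forwarded downstream."""
    while True:
        item = q_in.get()
        if item is None:
            if q_out is not None:
                q_out.put(None)
            return
        if q_out is not None:
            q_out.put(fn(item))

def run_pipeline(stages, items):
    """Chain the given functions into a streaming pipeline, one thread
    per stage, and collect the final outputs in arrival order."""
    queues = [queue.Queue() for _ in range(len(stages) + 1)]
    threads = [threading.Thread(target=stage, args=(fn, queues[i], queues[i + 1]))
               for i, fn in enumerate(stages)]
    for t in threads:
        t.start()
    for item in items:
        queues[0].put(item)
    queues[0].put(None)
    out = []
    while True:
        r = queues[-1].get()
        if r is None:
            break
        out.append(r)
    for t in threads:
        t.join()
    return out

# Toy event chain: "calibrate" -> threshold-clean -> summarize per event
result = run_pipeline([lambda ev: [2 * x for x in ev],
                       lambda ev: [x for x in ev if x > 3],
                       sum],
                      [[1, 2, 3], [0, 1, 0], [5, 5]])
```

    In a toolset like Auto-Pipe, the same dataflow graph can be mapped onto FPGAs or multiprocessors instead of threads, which is where the reported speedups come from.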

  19. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds, and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., via RESTful APIs). The Call for Participation will be issued in December and responses are due in mid-January 2018.

  20. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2016-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds, and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., via RESTful APIs). The Call for Participation will be issued in December and responses are due in mid-January 2018.

  1. The "Vsoil Platform" : a tool to integrate the various physical, chemical and biological processes contributing to the soil functioning at the local scale.

    NASA Astrophysics Data System (ADS)

    Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Moitrier, Nicolas; Balesdent, Jérome; bruckler, Laurent; Moitrier, Nathalie; Nouguier, Cédric; Richard, Guy

    2014-05-01

    Models describing the soil functioning are valuable tools for addressing challenging issues related to agricultural production, soil protection or biogeochemical cycles. Coupling models that address different scientific fields is actually required in order to develop numerical tools able to simulate the complex interactions and feedbacks occurring within a soil profile in interaction with climate and human activities. We present here a component-based modelling platform named "VSoil", which aims at designing, developing, implementing and coupling numerical representations of biogeochemical and physical processes in soil, from the aggregate to the profile scale. The platform consists of four software components: i) Vsoil_Processes, dedicated to the conceptual description of processes and of their inputs and outputs; ii) Vsoil_Modules, devoted to the development of numerical representations of elementary processes as modules; iii) Vsoil_Models, which permits the coupling of modules to create models; and iv) Vsoil_Player, for running the model and the primary analysis of results. The platform is designed to be a collaborative tool, helping scientists to share not only their models, but also the scientific knowledge on which the models are built. The platform is based on the idea that processes of any kind can be described and characterized by their inputs (state variables required) and their outputs. The links between the processes are automatically detected by the platform software. For any process, several numerical representations (modules) can be developed and made available to platform users. When developing modules, the platform takes care of many aspects of the development task so that the user can focus on numerical calculations. Fortran2008 and C++ are the supported languages, and existing code can be easily incorporated into platform modules. 
Building a model from available modules simply requires selecting the processes being accounted for and, for each process, a module. During this task, the platform displays available modules and checks the compatibility between the modules. The model (main program) is automatically created when compatible modules have been selected for all the processes. A GUI is automatically generated to help the user provide parameters and initial conditions. Numerical results can be immediately visualized, archived and exported. The platform also provides facilities to carry out sensitivity analysis. Parameter estimation and links with databases are being developed. The platform can be freely downloaded from the web site (http://www.inra.fr/sol_virtuel/) with a set of processes, variables, modules and models. However, it is designed so that any user can add their own components. These add-ons can be shared with co-workers by means of an e-mail-based export/import mechanism. The add-ons can also be made available to the whole community of platform users when developers request it. A filtering tool is available to explore the content of the platform (processes, variables, modules, models).
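
    The automatic link detection described above — matching each module's required inputs against the outputs produced by the other selected modules — can be sketched as follows. This is a hypothetical illustration of the idea only; the class and variable names are invented, and VSoil's actual implementation is in Fortran2008/C++.

```python
# Toy sketch of input/output matching for module compatibility checking.
# Names ("Module", "water_flow", etc.) are illustrative, not VSoil's API.

class Module:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = set(inputs)    # state variables the module requires
        self.outputs = set(outputs)  # state variables the module produces

def check_compatibility(modules):
    """Return, per module, the inputs no selected module produces.

    An empty result means the selection is compatible and a model
    (main program) could be generated from it.
    """
    produced = set().union(*(m.outputs for m in modules))
    return {m.name: m.inputs - produced
            for m in modules if m.inputs - produced}

water = Module("water_flow", inputs={"rainfall"}, outputs={"soil_moisture"})
carbon = Module("c_mineralization",
                inputs={"soil_moisture", "temperature"},
                outputs={"co2_flux"})

print(check_compatibility([water, carbon]))
```

    Here the check reports that "rainfall" and "temperature" are still unprovided, so a climate/forcing module would have to be added before the model is complete.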

  2. Tracer-based characterization of hyporheic exchange and benthic biolayers in streams

    NASA Astrophysics Data System (ADS)

    Knapp, Julia L. A.; González-Pinzón, Ricardo; Drummond, Jennifer D.; Larsen, Laurel G.; Cirpka, Olaf A.; Harvey, Judson W.

    2017-02-01

    Shallow benthic biolayers at the top of the streambed are believed to be places of enhanced biogeochemical turnover within the hyporheic zone. They can be investigated by reactive stream tracer tests with tracer recordings in the streambed and in the stream channel. Common in-stream measurements of such reactive tracers cannot localize where the processing primarily takes place, whereas isolated vertical depth profiles of solutes within the hyporheic zone are usually not representative of the entire stream. We present results of a tracer test where we injected the conservative tracer bromide together with the reactive tracer resazurin into a third-order stream and combined the recording of in-stream breakthrough curves with multidepth sampling of the hyporheic zone at several locations. The transformation of resazurin was used as an indicator of metabolism, and high-reactivity zones were identified from depth profiles. The results from our subsurface analysis indicate that the potential for tracer transformation (i.e., the reaction rate constant) varied with depth in the hyporheic zone. This highlights the importance of the benthic biolayer, which we found to be on average 2 cm thick in this study, ranging from one third to one half of the full depth of the hyporheic zone. The reach-scale approach integrated the effects of processes along the reach length, isolating hyporheic processes relevant for whole-stream chemistry and estimating effective reaction rates.
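
    The quantity estimated here — a first-order transformation rate constant for the reactive tracer — can be recovered from concentration-versus-travel-time data by a log-linear fit. The sketch below uses synthetic values as a stand-in for the resazurin measurements; the travel times and rate are illustrative, and the study's actual analysis is more involved.

```python
# Illustrative first-order rate estimation from reactive-tracer data.
# Synthetic concentrations stand in for resazurin measurements; k and
# the travel times are assumptions, not values from the study.
import numpy as np

t = np.array([0.5, 1.0, 2.0, 4.0])       # travel times in hours (assumed)
k_true = 0.8                             # per hour (assumed)
c = np.exp(-k_true * t)                  # ideal first-order decay C/C0

# Slope of ln(C) vs t gives -k for first-order kinetics
k_fit = -np.polyfit(t, np.log(c), 1)[0]
print(round(k_fit, 3))
```

    Fitting such a k separately for samplers at each depth is one simple way to expose the depth dependence of reactivity that delineates the benthic biolayer.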

  3. Tracer-based characterization of hyporheic exchange and benthic biolayers in streams

    USGS Publications Warehouse

    Knapp, Julia L.A.; González-Pinzón, Ricardo; Drummond, Jennifer D.; Larsen, Laurel G.; Cirpka, Olaf A.; Harvey, Judson W.

    2017-01-01

    Shallow benthic biolayers at the top of the streambed are believed to be places of enhanced biogeochemical turnover within the hyporheic zone. They can be investigated by reactive stream tracer tests with tracer recordings in the streambed and in the stream channel. Common in-stream measurements of such reactive tracers cannot localize where the processing primarily takes place, whereas isolated vertical depth profiles of solutes within the hyporheic zone are usually not representative of the entire stream. We present results of a tracer test where we injected the conservative tracer bromide together with the reactive tracer resazurin into a third-order stream and combined the recording of in-stream breakthrough curves with multidepth sampling of the hyporheic zone at several locations. The transformation of resazurin was used as an indicator of metabolism, and high-reactivity zones were identified from depth profiles. The results from our subsurface analysis indicate that the potential for tracer transformation (i.e., the reaction rate constant) varied with depth in the hyporheic zone. This highlights the importance of the benthic biolayer, which we found to be on average 2 cm thick in this study, ranging from one third to one half of the full depth of the hyporheic zone. The reach-scale approach integrated the effects of processes along the reach length, isolating hyporheic processes relevant for whole-stream chemistry and estimating effective reaction rates.

  4. Adaptations to vision-for-action in primate brain evolution: Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    NASA Astrophysics Data System (ADS)

    Hecht, Erin

    2016-03-01

    As Arbib [1] notes, the two-streams hypothesis [5] has provided a powerful explanatory framework for understanding visual processing. The inferotemporal ventral stream recognizes objects and agents - "what" one is seeing. The dorsal "how" or "where" stream through parietal cortex processes motion, spatial location, and visuo-proprioceptive relationships - "vision for action." Hickok and Poeppel's [3] extension of this model to the auditory system raises the question of deeper, multi- or supra-sensory themes in dorsal vs. ventral processing. Petrides and Pandya [10] postulate that the evolution of language may have been influenced by the fact that the dorsal stream terminates in posterior Broca's area (BA44) while the ventral stream terminates in anterior Broca's area (BA45). In an intriguing potential parallel, a recent ALE meta-analysis of 54 fMRI studies found that semantic processing is located more anteriorly and superiorly than syntactic processing in Broca's area [13]. But clearly, macaques do not have language, nor other likely pre- or co-adaptations to language, such as complex imitation and tool use. What changed in the brain that enabled these functions to evolve?

  5. New methods for modeling stream temperature using high resolution LiDAR, solar radiation analysis and flow accumulated values to predict stream temperature

    EPA Science Inventory

    In-stream temperature directly affects a variety of biotic organisms, communities and processes. Changes in stream temperature can render formerly suitable habitat unsuitable for aquatic organisms, particularly native cold-water species that are not able to adjust. In order to...

  6. Radar meteor orbital structure of Southern Hemisphere cometary dust streams

    NASA Technical Reports Server (NTRS)

    Baggaley, W. Jack; Taylor, Andrew D.

    1992-01-01

    The Christchurch, New Zealand meteor orbit radar (AMOR), with its high precision and sensitivity, permits studies of the orbital fine structure of cometary streams. PC-generated graphics are presented of data on some Southern Hemisphere streams. Such data can be related to the formation phase and subsequent dynamical processes of dust streams.

  7. An evaluation of underwater epoxies to permanently install temperature sensors in mountain streams

    Treesearch

    Daniel J. Isaak; Dona L. Horan

    2011-01-01

    Stream temperature regimes are of fundamental importance in understanding the patterns and processes in aquatic ecosystems, and inexpensive digital sensors provide accurate and repeated measurements of temperature. Most temperature measurements in mountain streams are made only during summer months because of logistical constraints associated with stream access and...

  8. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of In Situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. - into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data is being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that it can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. 
We'll also delve into how we configured access to the near real time ocean observations in accordance with the Climate and Forecast (CF) metadata conventions describing the various 'feature types' associated with particular in situ observation types, or discrete sampling geometries (DSG). Wrapping up, we'll discuss some of the ways this data source is already being used.
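
    Machine-to-machine access of the kind described above typically takes the form of an ERDDAP "tabledap" request: comma-separated variables followed by &-joined constraints, with the output format chosen by file extension. The sketch below only composes such a URL; the server address and dataset ID are placeholders, not the OSMC's actual endpoints, and real requests should percent-encode the constraint characters.

```python
# Sketch of composing an ERDDAP tabledap CSV request. Server and dataset
# ID are placeholders (assumptions); only the query shape follows
# ERDDAP's documented convention: <server>/tabledap/<id>.csv?vars&cons

def tabledap_url(server, dataset_id, variables, constraints):
    """Build a tabledap request URL for CSV output."""
    query = ",".join(variables) + "".join("&" + c for c in constraints)
    return f"{server}/tabledap/{dataset_id}.csv?{query}"

url = tabledap_url(
    "https://example-erddap.example.org/erddap",  # placeholder server
    "osmc_realtime",                              # hypothetical dataset ID
    ["platform_code", "time", "latitude", "longitude", "sst"],
    ["time>=2013-07-01T00:00:00Z", "sst!=NaN"],
)
print(url)
```

    Swapping `.csv` for `.nc`, `.kml`, or `.htmlTable` in the same request is how ERDDAP serves the multiple download formats mentioned above.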

  9. Multiplatform sampling (ship, aircraft, and satellite) of a Gulf Stream warm core ring

    NASA Technical Reports Server (NTRS)

    Smith, Raymond C.; Brown, Otis B.; Hoge, Frank E.; Baker, Karen S.; Evans, Robert H.

    1987-01-01

    This paper demonstrates how contemporaneous buoy, ship, aircraft, and satellite (i.e., multiplatform) remote sensing and sampling strategies can meet the need to measure distributions of physical and biological properties of the ocean synoptically over large areas and long time periods. A mapping of sea surface temperature and chlorophyll fields in a Gulf Stream warm core ring using the multiplatform approach is described. Sampling capabilities of each sensing system are discussed as background for the data collected by means of these three dissimilar methods. Commensurate space/time sample sets from each sensing system are compared, and their relative accuracies in space and time are determined. The three-dimensional composite maps derived from the data set provide a synoptic perspective unobtainable from single platforms alone.

  10. Use of once-through treat gas to remove the heat of reaction in solvent hydrogenation processes

    DOEpatents

    Nizamoff, Alan J.

    1980-01-01

    In a coal liquefaction process wherein feed coal is contacted with molecular hydrogen and a hydrogen-donor solvent in a liquefaction zone to form coal liquids and vapors and coal liquids in the solvent boiling range are thereafter hydrogenated to produce recycle solvent and liquid products, the improvement which comprises separating the effluent from the liquefaction zone into a hot vapor stream and a liquid stream; cooling the entire hot vapor stream sufficiently to condense vaporized liquid hydrocarbons; separating condensed liquid hydrocarbons from the cooled vapor; fractionating the liquid stream to produce coal liquids in the solvent boiling range; dividing the cooled vapor into at least two streams; passing the cooled vapors from one of the streams, the coal liquids in the solvent boiling range, and makeup hydrogen to a solvent hydrogenation zone; catalytically hydrogenating the coal liquids in the solvent boiling range; and quenching the hydrogenation zone with cooled vapors from the other cooled vapor stream.

  11. Modeling the economics of landfilling organic processing waste streams

    NASA Astrophysics Data System (ADS)

    Rosentrater, Kurt A.

    2005-11-01

    As manufacturing industries become more cognizant of the ecological effects that their firms have on the surrounding environment, their waste streams are increasingly becoming viewed not only as materials in need of disposal, but also as resources that can be reused, recycled, or reprocessed into valuable products. Within the food processing sector are many examples of various liquid, sludge, and solid biological and organic waste streams that require remediation. Alternative disposal methods for food and other bio-organic manufacturing waste streams are increasingly being investigated. Direct shipping, blending, extrusion, pelleting, and drying are commonly used to produce finished human food, animal feed, industrial products, and components ready for further manufacture. Landfilling, the traditional approach to waste remediation, however, should not be dismissed entirely. It does provide a baseline to which all other recycling and reprocessing options should be compared. This paper discusses the implementation of a computer model designed to examine the economics of landfilling bio-organic processing waste streams. Not only are these results applicable to food processing operations, but any industrial or manufacturing firm would benefit from examining the trends discussed here.
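
    The economics in such a landfilling baseline typically reduce to a few cost terms that scale with waste mass. The sketch below is a hypothetical simplification of that kind of calculation; the tipping fee, haul distance, and rates are placeholder values, not figures from the paper's model.

```python
# Toy landfilling cost baseline: tipping plus transport, both scaling
# with mass. All rates and quantities are illustrative assumptions.

def landfill_cost(mass_t, tip_fee=45.0, haul_km=30.0, haul_rate=0.12):
    """Total cost in dollars.

    mass_t    -- waste mass in tonnes
    tip_fee   -- landfill tipping fee, $/tonne (assumed)
    haul_km   -- one-way haul distance, km (assumed)
    haul_rate -- transport cost, $/tonne-km (assumed)
    """
    return mass_t * (tip_fee + haul_km * haul_rate)

print(round(landfill_cost(100.0), 2))  # cost for 100 t of processing waste
```

    A recycling or reprocessing option is attractive, in this framing, whenever its net cost per tonne falls below the landfilling baseline computed this way.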

  12. Electrodialysis-based separation process for salt recovery and recycling from waste water

    DOEpatents

    Tsai, S.P.

    1997-07-08

    A method for recovering salt from a process stream containing organic contaminants is provided, comprising directing the waste stream to a desalting electrodialysis unit so as to create a concentrated and purified salt permeate and an organic contaminants-containing stream, and contacting said concentrated salt permeate to a water-splitting electrodialysis unit so as to convert the salt to its corresponding base and acid. 6 figs.

  13. Electrodialysis-based separation process for salt recovery and recycling from waste water

    DOEpatents

    Tsai, Shih-Perng

    1997-01-01

    A method for recovering salt from a process stream containing organic contaminants is provided, comprising directing the waste stream to a desalting electrodialysis unit so as to create a concentrated and purified salt permeate and an organic contaminants containing stream, and contacting said concentrated salt permeate to a water-splitting electrodialysis unit so as to convert the salt to its corresponding base and acid.

  14. Computation Methods for NASA Data-streams for Agricultural Efficiency Applications

    NASA Astrophysics Data System (ADS)

    Shrestha, B.; O'Hara, C. G.; Mali, P.

    2007-12-01

    Temporal Map Algebra (TMA) is a novel technique for analyzing time series of satellite imagery using simple algebraic operators; it treats the time series as a three-dimensional dataset, where two dimensions encode planimetric position on the Earth's surface and the third dimension encodes time. Spatio-temporal analytical processing methods such as TMA that utilize moderate spatial resolution satellite imagery having high temporal resolution to create multi-temporal composites are data intensive as well as computationally intensive. TMA analysis for multi-temporal composites provides dramatically enhanced usefulness that will yield previously unavailable capabilities to user communities, if deployment is coupled with significant High Performance Computing (HPC) capabilities and interfaces are designed to deliver the full potential for these new technological developments. In this research, cross-platform data fusion and adaptive filtering using TMA were employed to create highly useful daily datasets and cloud-free high-temporal-resolution vegetation index (VI) composites with enhanced information content for vegetation and bio-productivity monitoring, surveillance, and modeling. Fusion of Normalized Difference Vegetation Index (NDVI) data created from Aqua and Terra Moderate Resolution Imaging Spectroradiometer (MODIS) surface-reflectance data (MOD09) enables the creation of daily composites which are of immense value to a broad spectrum of global and national applications. Additionally, these products are highly desired by many natural resources agencies such as USDA/FAS/PECAD. Utilizing data streams collected by similar sensors on different platforms that transit the same areas at slightly different times of the day offers the opportunity to develop fused data products that have enhanced cloud-free and reduced noise characteristics. 
Establishing a Fusion Quality Confidence Code (FQCC) provides a metadata product that quantifies the method of fusion for a given pixel and enables a relative quality and confidence factor to be established for a given daily pixel value. When coupled with metadata that quantify the source sensor, day and time of acquisition, and the fusion method of each pixel to create the daily product; a wealth of information is available to assist in deriving new data and information products. These newly developed abilities to create highly useful daily data sets imply that temporal composites for a geographic area of interest may be created for user-defined temporal intervals that emphasize a user designated day of interest. At GeoResources Institute, Mississippi State University, solutions have been developed to create custom composites and cross-platform satellite data fusion using TMA which are useful for National Aeronautics and Space Administration (NASA) Rapid Prototyping Capability (RPC) and Integrated System Solutions (ISS) experiments for agricultural applications.
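
    The core TMA idea — operating algebraically on a (time, row, column) stack — can be illustrated with a per-pixel maximum fusion of two same-day NDVI grids followed by a maximum-value composite over time. The array sizes, random data, and the max-based fusion rule are assumptions for illustration, not the authors' actual processing chain.

```python
# Illustrative Temporal Map Algebra on a (time, row, col) NDVI stack.
# Shapes, data, and the per-pixel max fusion rule are assumptions.
import numpy as np

rng = np.random.default_rng(0)
terra = rng.uniform(-0.1, 0.9, size=(8, 4, 4))  # 8 days of Terra NDVI
aqua = rng.uniform(-0.1, 0.9, size=(8, 4, 4))   # 8 days of Aqua NDVI

# Cross-platform fusion: per-pixel, per-day maximum (a common
# cloud-minimizing choice, since clouds depress NDVI)
daily_fused = np.maximum(terra, aqua)

# Multi-temporal composite: maximum value over the time axis
composite = daily_fused.max(axis=0)
print(composite.shape)
```

    The same algebraic style extends to user-defined windows, e.g. slicing `daily_fused[d-3:d+4]` to build a 7-day composite centered on a user-designated day of interest.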

  15. Development of SNS Stream Analysis Based on Forest Disaster Warning Information Service System

    NASA Astrophysics Data System (ADS)

    Oh, J.; Kim, D.; Kang, M.; Woo, C.; Kim, D.; Seo, J.; Lee, C.; Yoon, H.; Heon, S.

    2017-12-01

    Forest disasters, such as landslides and wildfires, cause huge economic losses and casualties, and the cost of recovery is increasing every year. While forest disaster mitigation technologies have focused on the development of prevention and response technologies, they are now required to evolve toward evacuation and early warning, and to develop technologies fused with ICT. In this study, we analyze SNS (Social Network Service) streams and implement a system that detects messages indicating a forest disaster has occurred or is imminent, by searching in real time for keywords related to forest disasters. More accurate forest disaster messages can be detected by repeatedly learning from the retrieved results using machine learning techniques. To do this, we designed and implemented a system based on Hadoop and Spark, distributed parallel processing platforms, to handle messages from the Twitter stream, an open SNS. In order to develop the technology to deliver forest disaster risk information, linkages with technologies such as CBS (Cell Broadcasting System) based on mobile communication, the internet-based civil defense siren, and SNS, as well as the legal and institutional issues in applying these technologies, are examined. A protocol for the forest disaster warning information service system that can deliver the SNS analysis results was also developed. As a result, it was possible to grasp the real-time forest disaster situation through real-time big data analysis of SNS messages generated during forest disasters. In addition, we confirmed that alarms or warnings can be rapidly propagated according to the disaster situation by using the notification function of the forest disaster warning information service. However, limitations on system application due to restrictions on the opening and sharing of SNS data currently in service and on the disclosure of personal information remain problems to be solved in the future. 
Keywords: SNS stream, big data, machine learning techniques, CBS, forest disaster warning information service system. Acknowledgement: This research was supported by the Forestry Technology 2015 Forestry Technology Research and Development Project (Planning project).
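
    The keyword-search step of such a detector can be sketched very simply: score each incoming message against a forest-disaster keyword list and flag those above a threshold. Everything here — the keyword set, threshold, and example messages — is an illustrative assumption; the paper's system runs this at scale on Hadoop/Spark, with a learned classifier refining the matches.

```python
# Toy keyword-based detection of forest-disaster messages in an SNS
# stream. Keywords, threshold, and messages are illustrative only.
DISASTER_KEYWORDS = {"landslide", "wildfire", "forest fire", "evacuation"}

def score(message):
    """Count distinct disaster keywords appearing in the message."""
    text = message.lower()
    return sum(1 for kw in DISASTER_KEYWORDS if kw in text)

stream = [
    "Huge wildfire spreading near the ridge, evacuation under way",
    "Beautiful hike in the forest today",
]
alerts = [m for m in stream if score(m) >= 1]
print(alerts)
```

    In the distributed setting this per-message function would be applied as a map/filter over partitioned Twitter micro-batches, with the flagged messages fed to the machine-learning refinement stage.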

  16. Spatial and temporal patterns of stream burial and its effect on habitat connectivity across headwater stream communities of the Potomac River Basin, USA

    NASA Astrophysics Data System (ADS)

    Weitzell, R.; Guinn, S. M.; Elmore, A. J.

    2012-12-01

    The process of directing streams into culverts, pipes, or concrete-lined ditches during urbanization, known as stream burial, alters the primary physical, chemical, and biological processes of streams. Knowledge of the cumulative impacts of reduced structure and ecological function within buried stream networks is crucial for informing management of stream ecosystems, in light of continued growth in urban areas, and the uncertain response of freshwater ecosystems to the stresses of global climate change. To address this need, we utilized recently improved stream maps for the Potomac River Basin (PRB) to describe the extent and severity of stream burial across the basin. Observations of stream burial made from high resolution aerial photographs (>1% of total basin area) and a decision tree using spatial statistics from impervious cover data were used to predict stream burial at 4 time-steps (1975, 1990, 2001, 2006). Of the roughly 95,500 kilometers (km) of stream in the PRB, approximately 4551 km (4.76%) were buried by urban development as of 2001. Analysis of county-level burial trends shows differential patterns in the timing and rates of headwater stream burial, which may be due to local development policies, topographical constraints, and/or time since development. Consistently higher rates of stream burial were observed for small streams, decreasing with stream order. Headwater streams (1st-2nd order) are disproportionately affected, with burial rates continuing to increase over time in relation to larger stream orders. Beyond simple habitat loss, headwater burial decreases connectivity among headwater populations and habitats, with potential to affect a wide range of important ecological processes. To quantify changes to regional headwater connectivity we applied a connectivity model based on electrical circuit theory. 
Circuit-theoretical models function by treating the landscape as a resistance surface, representing hypothesized relationships between landscape features and their differential "resistance" to movement by organisms. A landscape resistance layer was developed and fine-tuned in terms of the habitat use/needs of aquatic invertebrates with terrestrial adult stages, organisms of critical importance to riparian and aquatic ecosystem health. Initial results show significant increases in landscape resistance (isolation) among headwater systems, and corresponding decreases in current flow (movement of organisms) across the increasingly urbanized PRB landscape. Of particular interest, the circuit model highlighted the importance of stream confluences and zero-order (non-channel) headwater areas for movement of organisms between headwater systems that are otherwise highly disconnected, and for which the latter currently receives no legal protection from development.
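
    The circuit-theoretic quantity behind these connectivity maps is effective resistance, which can be computed from the pseudoinverse of the graph Laplacian built from the resistance surface. The 4-node graph and resistance values below are illustrative stand-ins for the landscape layer, not the study's data.

```python
# Effective resistance between habitat nodes via the graph Laplacian's
# Moore-Penrose pseudoinverse. Graph and resistances are illustrative.
import numpy as np

# Edges as (i, j, resistance); higher resistance = harder movement
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.0), (2, 3, 1.0)]
n = 4
L = np.zeros((n, n))
for i, j, r in edges:
    g = 1.0 / r                  # conductance of the edge
    L[i, i] += g; L[j, j] += g
    L[i, j] -= g; L[j, i] -= g

Lp = np.linalg.pinv(L)           # pseudoinverse of the Laplacian

def effective_resistance(a, b):
    """Effective resistance (isolation) between nodes a and b."""
    return Lp[a, a] + Lp[b, b] - 2 * Lp[a, b]

print(round(effective_resistance(0, 3), 3))
```

    Raising the resistance of an edge (e.g. to represent a buried stream segment) raises the effective resistance between the headwater nodes it connects, which is exactly the increase in isolation the study reports for the urbanizing PRB landscape.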

  17. Method and apparatus for combination catalyst for reduction of NO.sub.x in combustion products

    DOEpatents

    Socha, Richard F.; Vartuli, James C.; El-Malki, El-Mekki; Kalyanaraman, Mohan; Park, Paul W.

    2010-09-28

    A method and apparatus for catalytically processing a gas stream passing therethrough to reduce the presence of NO.sub.x therein, wherein the apparatus includes a first catalyst composed of a silver containing alumina that is adapted for catalytically processing the gas stream at a first temperature range, and a second catalyst composed of a copper containing zeolite located downstream from the first catalyst, wherein the second catalyst is adapted for catalytically processing the gas stream at a lower second temperature range relative to the first temperature range.

  18. Gas separation process using membranes with permeate sweep to remove CO.sub.2 from gaseous fuel combustion exhaust

    DOEpatents

    Wijmans, Johannes G [Menlo Park, CA; Merkel, Timothy C [Menlo Park, CA; Baker, Richard W [Palo Alto, CA

    2012-05-15

    A gas separation process for treating exhaust gases from the combustion of gaseous fuels, and gaseous fuel combustion processes including such gas separation. The invention involves routing a first portion of the exhaust stream to a carbon dioxide capture step, while simultaneously flowing a second portion of the exhaust gas stream across the feed side of a membrane, flowing a sweep gas stream, usually air, across the permeate side, then passing the permeate/sweep gas back to the combustor.

  19. Combining multiple approaches and optimized data resolution for an improved understanding of stream temperature dynamics of a forested headwater basin in the Southern Appalachians

    NASA Astrophysics Data System (ADS)

    Belica, L.; Mitasova, H.; Caldwell, P.; McCarter, J. B.; Nelson, S. A. C.

    2017-12-01

    Thermal regimes of forested headwater streams continue to be an area of active research, as climatic, hydrologic, and land cover changes can influence water temperature, a key aspect of aquatic ecosystems. Widespread monitoring of stream temperatures has provided an important data source, yielding insights on the temporal and spatial patterns and the underlying processes that influence stream temperature. However, small forested streams remain challenging to model due to the high spatial and temporal variability of stream temperatures and the climatic and hydrologic conditions that drive them. Technological advances and increased computational power continue to provide new tools and measurement methods and have allowed spatially explicit analyses of dynamic natural systems at greater temporal resolutions than previously possible. With the goal of understanding how current stream temperature patterns and processes may respond to changing land cover and hydroclimatological conditions, we combined high-resolution, spatially explicit geospatial modeling with deterministic heat flux modeling approaches using data sources that ranged from traditional hydrological and climatological measurements to emerging remote sensing techniques. Initial analyses of stream temperature monitoring data revealed that high temporal resolution (5 minutes) and measurement resolution (<0.1°C) were needed to adequately describe diel stream temperature patterns and capture the differences between paired 1st-order and 4th-order forest streams draining north- and south-facing slopes. This finding, along with geospatial models of subcanopy solar radiation and channel morphology, was used to develop hypotheses and guide field data collection for further heat flux modeling. By integrating multiple approaches and optimizing data resolution for the processes being investigated, small but ecologically significant differences in stream thermal regimes were revealed. 
In this case, multi-approach research contributed to the identification of the dominant mechanisms driving stream temperature in the study area and advanced our understanding of the current thermal fluxes and how they may change as environmental conditions change in the future.
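
    The deterministic heat-flux approach referred to above ultimately reduces to a balance of the form dT/dt = Q_net / (ρ·c·d) for a well-mixed water column. The back-of-envelope sketch below shows only that relationship; the flux, depth, and timestep values are illustrative, not the study's measurements.

```python
# Temperature change of a well-mixed stream reach from a net surface
# heat flux: dT = Q_net * dt / (rho * c * d). Values are illustrative.

RHO_C = 4.18e6  # volumetric heat capacity of water, J m^-3 K^-1

def delta_t(q_net_w_m2, depth_m, dt_s):
    """Temperature change (K) over dt_s seconds for a given net flux."""
    return q_net_w_m2 * dt_s / (RHO_C * depth_m)

# e.g. 300 W/m^2 of net (mostly solar) input on a 0.2 m deep stream
# over one hour
print(round(delta_t(300.0, 0.2, 3600), 3))
```

    Under this simple balance, shallower reaches and reaches with higher subcanopy solar radiation warm faster, which is consistent with the role the study assigns to channel morphology and canopy shading.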

  20. Methods of making transportation fuel

    DOEpatents

    Roes, Augustinus Wilhelmus Maria [Houston, TX; Mo, Weijian [Sugar Land, TX; Muylle, Michel Serge Marie [Houston, TX; Mandema, Remco Hugo [Houston, TX; Nair, Vijay [Katy, TX

    2012-04-10

    A method for producing alkylated hydrocarbons is disclosed. Formation fluid is produced from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. The liquid stream is fractionated to produce at least a second gas stream including hydrocarbons having a carbon number of at least 3. The first gas stream and the second gas stream are introduced into an alkylation unit to produce alkylated hydrocarbons. At least a portion of the olefins in the first gas stream enhance alkylation. The alkylated hydrocarbons may be blended with one or more components to produce transportation fuel.

  1. Dual-stream accounts bridge the gap between monkey audition and human language processing. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael Arbib

    NASA Astrophysics Data System (ADS)

    Garrod, Simon; Pickering, Martin J.

    2016-03-01

    Over the last few years there has been a resurgence of interest in dual-stream dorsal-ventral accounts of language processing [4]. This has led to recent attempts to bridge the gap between the neurobiology of primate audition and human language processing with the dorsal auditory stream assumed to underlie time-dependent (and syntactic) processing and the ventral to underlie some form of time-independent (and semantic) analysis of the auditory input [3,10]. Michael Arbib [1] considers these developments in relation to his earlier Mirror System Hypothesis about the origins of human language processing [11].

  2. On the patterns and processes of wood in northern California streams

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Bigelow, Paul

    2014-03-01

Forest management and stream habitat can be improved by clarifying the primary riparian and geomorphic controls on streams. To this end, we evaluated the recruitment, storage, transport, and function of wood in 95 km of streams (most drainage areas < 30 km2) in northern California, crossing four coastal to inland regions with different histories of forest management (managed, less-managed, unmanaged). Variability in stream wood storage and recruitment is driven primarily by local variation in rates of bank erosion, forest mortality, and mass wasting. These processes are controlled by changes in watershed structure, including the location of canyons, floodplains, and tributary confluences; types of geology and topography; and forest types and management history. Average wood storage volumes in coastal streams are 5 to 20 times greater than at inland sites, primarily because of higher riparian forest biomass and growth rates (productivity), with some influence from the longer residence time of wood in streams and more wood from landsliding and logging sources. Wood recruitment by mortality (windthrow, disease, senescence) was substantial across all sites (mean 50%), followed by bank erosion (43%) and, more locally, mass wasting (7%). The distances to sources of stream wood are controlled by recruitment process and tree height. Ninety percent of wood recruitment occurs within 10 to 35 m of channels in managed and less-managed forests and upward of 50 m in unmanaged Sequoia and coast redwood forests; local landsliding extends the source distance. The recruitment of large wood pieces that create jams (mean diameter 0.7 m) is primarily by bank erosion in managed forests and by mortality in unmanaged forests. Formation of pools by wood is more frequent in streams with low stream power, indicating the further relevance of environmental context and watershed structure.
Forest management influences stream wood dynamics, where smaller trees in managed forests often generate shorter distances to sources of stream wood, lower stream wood storage, and smaller diameter stream wood. These findings can be used to improve riparian protection and inform spatially explicit riparian management.

  3. Two Visual Pathways in Primates Based on Sampling of Space: Exploitation and Exploration of Visual Information

    PubMed Central

    Sheth, Bhavin R.; Young, Ryan

    2016-01-01

Evidence is strong that the visual pathway is segregated into two distinct streams—ventral and dorsal. Two proposals theorize that the pathways are segregated in function: The ventral stream processes information about object identity, whereas the dorsal stream, according to one model, processes information about object location and, according to another, is responsible for executing movements under visual control. The models are influential; however, recent experimental evidence challenges them, e.g., the ventral stream is not solely responsible for object recognition; conversely, its function is not strictly limited to object vision; the dorsal stream is not responsible by itself for spatial vision or visuomotor control; conversely, its function extends beyond vision or visuomotor control. In their place, we suggest a robust dichotomy consisting of a ventral stream selectively sampling high-resolution/focal spaces, and a dorsal stream sampling nearly all of space with reduced foveal bias. The proposal hews closely to the theme of embodied cognition: Function arises as a consequence of an extant sensory underpinning. A continuous, not sharp, segregation based on function emerges, and carries with it an undercurrent of an exploitation-exploration dichotomy. Under this interpretation, cells of the ventral stream, which individually have more punctate receptive fields that generally include the fovea or parafovea, provide detailed information about object shapes and features and lead to the systematic exploitation of said information; cells of the dorsal stream, which individually have large receptive fields, contribute to visuospatial perception, provide information about the presence/absence of salient objects and their locations for novel exploration and subsequent exploitation by the ventral stream or, under certain conditions, the dorsal stream. 
We leverage the dichotomy to unify neuropsychological cases under a common umbrella, account for the increased prevalence of multisensory integration in the dorsal stream under a Bayesian framework, predict conditions under which object recognition utilizes the ventral or dorsal stream, and explain why cells of the dorsal stream drive sensorimotor control and motion processing and have poorer feature selectivity. Finally, the model speculates on a dynamic interaction between the two streams that underscores a unified, seamless perception. Existing theories are subsumed under our proposal. PMID:27920670

  4. Quantifying spatial differences in metabolism in headwater streams

    Treesearch

    Ricardo González-Pinzón; Roy Haggerty; Alba Argerich

    2014-01-01

    Stream functioning includes simultaneous interaction among solute transport, nutrient processing, and metabolism. Metabolism is measured with methods that have limited spatial representativeness and are highly uncertain. These problems restrict development of methods for up-scaling biological processes that mediate nutrient processing. We used the resazurin–resorufin (...

  5. Mass, energy and material balances of SRF production process. Part 3: solid recovered fuel produced from municipal solid waste.

    PubMed

    Nasrullah, Muhammad; Vainikka, Pasi; Hannula, Janne; Hurme, Markku; Kärki, Janne

    2015-02-01

This is the third and final part of a three-part article describing the mass, energy and material balances of the solid recovered fuel production process applied to various types of waste streams through mechanical treatment. This article focuses on the production of solid recovered fuel from municipal solid waste. The stream of municipal solid waste used here as input material to produce solid recovered fuel is energy waste collected from municipal households. This article presents the mass, energy and material balances of the solid recovered fuel production process. These balances are based on proximate and ultimate analyses and on the composition determination of the various material streams produced in a solid recovered fuel production plant. All process streams were sampled and treated according to CEN standard methods for solid recovered fuel. The mass balance of the solid recovered fuel production process showed that 72% of the input waste material was recovered in the form of solid recovered fuel; 2.6% as ferrous metal, 0.4% as non-ferrous metal, 11% was sorted out as reject material, 12% as fine fraction and 2% as heavy fraction. The energy balance showed that 86% of the total energy content of the input waste material was recovered in the form of solid recovered fuel; the remaining 14% was split among the streams of reject material, fine fraction and heavy fraction. The material balances showed that the mass fractions of paper and cardboard, plastic (soft) and wood recovered in the solid recovered fuel stream were 88%, 85% and 90%, respectively, of their input mass. A high mass fraction of rubber material, plastic (PVC) and inert material (stone/rock and glass particles) was found in the reject material stream. © The Author(s) 2014.
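    The reported fractions can be cross-checked with simple arithmetic: a closed mass balance requires the output streams to sum to the input mass. A minimal sketch in Python, using the percentages quoted in the abstract (stream labels are paraphrased):

```python
# Cross-check of the reported SRF mass balance: the output streams
# should account for ~100% of the input waste mass.
# Percentages are taken from the abstract; labels are paraphrased.
outputs = {
    "solid recovered fuel": 72.0,  # % of input mass
    "ferrous metal": 2.6,
    "non-ferrous metal": 0.4,
    "reject material": 11.0,
    "fine fraction": 12.0,
    "heavy fraction": 2.0,
}

total = sum(outputs.values())
print(f"Total accounted for: {total:.1f}% of input mass")  # 100.0%
```

    As the check shows, the published output fractions close the balance exactly.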

  6. Differential modulation of visual object processing in dorsal and ventral stream by stimulus visibility.

    PubMed

    Ludwig, Karin; Sterzer, Philipp; Kathmann, Norbert; Hesselmann, Guido

    2016-10-01

As a functional organization principle in cortical visual information processing, the influential 'two visual systems' hypothesis proposes a division of labor between a dorsal "vision-for-action" and a ventral "vision-for-perception" stream. A core assumption of this model is that the two visual streams are differentially involved in visual awareness: ventral stream processing is closely linked to awareness while dorsal stream processing is not. In this functional magnetic resonance imaging (fMRI) study with human observers, we directly probed the stimulus-related information encoded in fMRI response patterns in both visual streams as a function of stimulus visibility. We parametrically modulated the visibility of face and tool stimuli by varying the contrasts of the masks in a continuous flash suppression (CFS) paradigm. We found that visibility - operationalized by objective and subjective measures - decreased proportionally with increasing log CFS mask contrast. Neuronally, this relationship was closely matched by ventral visual areas, showing a linear decrease of stimulus-related information with increasing mask contrast. Stimulus-related information in dorsal areas also depended on mask contrast, but the decrease followed a step function rather than a linear function. Together, our results suggest that both the ventral and the dorsal visual stream are linked to visual awareness, but neural activity in ventral areas more closely reflects graded differences in awareness than does activity in dorsal areas. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Endogenous Delta/Theta Sound-Brain Phase Entrainment Accelerates the Buildup of Auditory Streaming.

    PubMed

    Riecke, Lars; Sack, Alexander T; Schroeder, Charles E

    2015-12-21

    In many natural listening situations, meaningful sounds (e.g., speech) fluctuate in slow rhythms among other sounds. When a slow rhythmic auditory stream is selectively attended, endogenous delta (1‒4 Hz) oscillations in auditory cortex may shift their timing so that higher-excitability neuronal phases become aligned with salient events in that stream [1, 2]. As a consequence of this stream-brain phase entrainment [3], these events are processed and perceived more readily than temporally non-overlapping events [4-11], essentially enhancing the neural segregation between the attended stream and temporally noncoherent streams [12]. Stream-brain phase entrainment is robust to acoustic interference [13-20] provided that target stream-evoked rhythmic activity can be segregated from noncoherent activity evoked by other sounds [21], a process that usually builds up over time [22-27]. However, it has remained unclear whether stream-brain phase entrainment functionally contributes to this buildup of rhythmic streams or whether it is merely an epiphenomenon of it. Here, we addressed this issue directly by experimentally manipulating endogenous stream-brain phase entrainment in human auditory cortex with non-invasive transcranial alternating current stimulation (TACS) [28-30]. We assessed the consequences of these manipulations on the perceptual buildup of the target stream (the time required to recognize its presence in a noisy background), using behavioral measures in 20 healthy listeners performing a naturalistic listening task. Experimentally induced cyclic 4-Hz variations in stream-brain phase entrainment reliably caused a cyclic 4-Hz pattern in perceptual buildup time. Our findings demonstrate that strong endogenous delta/theta stream-brain phase entrainment accelerates the perceptual emergence of task-relevant rhythmic streams in noisy environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Digital Multicasting of Multiple Audio Streams

    NASA Technical Reports Server (NTRS)

    Macha, Mitchell; Bullock, John

    2007-01-01

The Mission Control Center Voice Over Internet Protocol (MCC VOIP) system (see figure) comprises hardware and software that effect simultaneous, nearly real-time transmission of as many as 14 different audio streams to authorized listeners via the MCC intranet and/or the Internet. The original version of the MCC VOIP system was conceived to enable flight-support personnel located in offices outside a spacecraft mission control center to monitor audio loops within the mission control center. Different versions of the MCC VOIP system could be used for a variety of public and commercial purposes - for example, to enable members of the general public to monitor one or more NASA audio streams through their home computers, to enable air-traffic supervisors to monitor communication between airline pilots and air-traffic controllers in training, and to monitor conferences among brokers in a stock exchange. At the transmitting end, the audio-distribution process begins with feeding the audio signals to analog-to-digital converters. The resulting digital streams are sent through the MCC intranet, using a user datagram protocol (UDP), to a server that converts them to encrypted data packets. The encrypted data packets are then routed to the personal computers of authorized users by use of multicasting techniques. The total data-processing load on the portion of the system upstream of and including the encryption server is the total load imposed by all of the audio streams being encoded, regardless of the number of the listeners or the number of streams being monitored concurrently by the listeners. The personal computer of a user authorized to listen is equipped with special-purpose MCC audio-player software. When the user launches the program, the user is prompted to provide identification and a password. 
In one of two access-control provisions, the program is hard-coded to validate the user's identity and password against a list maintained on a domain-controller computer at the MCC. In the other access-control provision, the program verifies that the user is authorized to have access to the audio streams. Once both access-control checks are completed, the audio software presents a graphical display that includes audio-stream-selection buttons and volume-control sliders. The user can select all or any subset of the available audio streams and can adjust the volume of each stream independently of that of the other streams. The audio-player program spawns a "read" process for the selected stream(s). The spawned process sends, to the router(s), a "multicast-join" request for the selected streams. The router(s) responds to the request by sending the encrypted multicast packets to the spawned process. The spawned process receives the encrypted multicast packets and sends a decryption packet to audio-driver software. As the volume or muting features are changed by the user, interrupts are sent to the spawned process to change the corresponding attributes sent to the audio-driver software. The total latency of this system - that is, the total time from the origination of the audio signals to generation of sound at a listener's computer - lies between four and six seconds.
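    The "multicast-join" step described above has a direct analogue in the standard Berkeley sockets API. A minimal sketch in Python, assuming an illustrative group address and port (the actual MCC stream addresses are not published):

```python
import socket
import struct

def open_multicast_receiver(group: str, port: int) -> socket.socket:
    """Create a UDP socket subscribed to a multicast group, as the
    spawned 'read' process does for each selected audio stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # IP_ADD_MEMBERSHIP is the "multicast-join" request: it asks the
    # router(s) to start forwarding the group's packets to this host.
    mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Illustrative values only, not the real MCC configuration.
GROUP, PORT = "239.1.2.3", 5004
# sock = open_multicast_receiver(GROUP, PORT)
# data, addr = sock.recvfrom(2048)  # one encrypted packet, to be decrypted
```

    Each received datagram would then be decrypted and handed to the audio-driver software, as the abstract describes.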

  9. A Video Game Platform for Exploring Satellite and In-Situ Data Streams

    NASA Astrophysics Data System (ADS)

    Cai, Y.

    2014-12-01

Exploring spatiotemporal patterns of moving objects is essential to Earth Observation missions, such as tracking, modeling and predicting the movement of clouds, dust, plumes and harmful algal blooms. These missions involve high-volume, multi-source, and multi-modal imagery data analysis. Analytical models aim to reveal the inner structure, dynamics, and relationships of observed phenomena, but they are not necessarily intuitive to humans. Conventional scientific visualization methods are intuitive but limited by manual operations, such as area marking, measurement and alignment of multi-source data, which are expensive and time-consuming. A new video analytics platform has been under development that integrates a video game engine with satellite and in-situ data streams. The system converts Earth Observation data into articulated objects that are mapped from a high-dimensional space to a 3D space. The object tracking and augmented reality algorithms highlight the objects' features in colors, shapes and trajectories, creating visual cues for observing dynamic patterns. The head and gesture tracker enables users to navigate the data space interactively. To validate our design, we used NASA SeaWiFS satellite images of oceanographic remote sensing data and NOAA's in-situ cell count data. Our study demonstrates that the video game system can reduce the size and cost of traditional CAVE systems by two to three orders of magnitude. This system can also be used for satellite mission planning and public outreach.

  10. FireHose Streaming Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karl Anderson, Steve Plimpton

    2015-01-27

The FireHose Streaming Benchmarks are a suite of stream-processing benchmarks defined to enable comparison of streaming software and hardware, both quantitatively vis-a-vis the rate at which they can process data, and qualitatively by judging the effort involved to implement and run the benchmarks. Each benchmark has two parts. The first is a generator which produces and outputs datums at a high rate in a specific format. The second is an analytic which reads the stream of datums and is required to perform a well-defined calculation on the collection of datums, typically to find anomalous datums that have been created in the stream by the generator. The FireHose suite provides code for the generators, sample code for the analytics (which users are free to re-implement in their own custom frameworks), and a precise definition of each benchmark calculation.
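    The generator/analytic split can be sketched in a few lines. The following Python toy illustrates only the structure, not any actual FireHose benchmark: the datum format and anomaly rule are invented here, and a real analytic must detect anomalies from the datum contents alone.

```python
import random

def generator(n, anomaly_rate=0.01, seed=42):
    """Emit n datums at speed; a small fraction are marked anomalous."""
    rng = random.Random(seed)
    for key in range(n):
        value = rng.randint(0, 99)
        anomalous = rng.random() < anomaly_rate
        yield (key, value, anomalous)

def analytic(stream):
    """Scan the stream and report the keys of anomalous datums.
    (This toy reads the truth flag directly; a real analytic
    infers anomalies from the data itself.)"""
    return [key for key, _value, anomalous in stream if anomalous]

flagged = analytic(generator(10_000))
print(f"flagged {len(flagged)} of 10000 datums as anomalous")
```

    In the real suite, the generator and analytic run as separate processes, so the analytic's throughput (datums per second) is the quantitative figure of merit.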

  11. W-007H B Plant Process Condensate Treatment Facility. Revision 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rippy, G.L.

    1995-01-20

B Plant Process Condensate (BCP) liquid effluent stream is the condensed vapors originating from the operation of the B Plant low-level liquid waste concentration system. In the past, the BCP stream was discharged into the soil column under a compliance plan which expired January 1, 1987. Currently, the BCP stream is inactive, awaiting restart of the E-23-3 Concentrator. B Plant Steam Condensate (BCS) liquid effluent stream is the spent steam condensate used to supply heat to the E-23-3 Concentrator. The tube bundles in the E-23-3 Concentrator discharge to the BCS. In the past, the BCS stream was discharged into the soil column. Currently, the BCS stream is inactive. This project shall provide liquid effluent systems (BCP/BCS/BCE) capable of operating for a minimum of 20 years, which does not include the anticipated decontamination and decommissioning (D and D) period.

  12. How visual illusions illuminate complementary brain processes: illusory depth from brightness and apparent motion of illusory contours

    PubMed Central

    Grossberg, Stephen

    2014-01-01

Neural models of perception clarify how visual illusions arise from adaptive neural processes. Illusions also provide important insights into how adaptive neural processes work. This article focuses on two illusions that illustrate a fundamental property of global brain organization; namely, that advanced brains are organized into parallel cortical processing streams with computationally complementary properties. That is, each cortical stream can process certain combinations of properties only at the cost of being unable to process the complementary properties. Interactions between these streams, across multiple processing stages, overcome their complementary deficiencies to compute effective representations of the world, and to thereby achieve the property of complementary consistency. The two illusions concern how illusory depth can vary with brightness, and how apparent motion of illusory contours can occur. Illusory depth from brightness arises from the complementary properties of boundary and surface processes, notably boundary completion and surface filling-in, within the parvocellular form processing cortical stream. This illusion depends upon how surface contour signals from the V2 thin stripes to the V2 interstripes ensure complementary consistency of a unified boundary/surface percept. Apparent motion of illusory contours arises from the complementary properties of form and motion processes across the parvocellular and magnocellular cortical processing streams. This illusion depends upon how illusory contours help to complete boundary representations for object recognition, how apparent motion signals can help to form continuous trajectories for target tracking and prediction, and how formotion interactions from V2-to-MT enable completed object representations to be continuously tracked even when they move behind intermittently occluding objects through time. PMID:25389399

  13. Combined effects of hydrologic alteration and cyprinid fish in mediating biogeochemical processes in a Mediterranean stream.

    PubMed

    Rubio-Gracia, Francesc; Almeida, David; Bonet, Berta; Casals, Frederic; Espinosa, Carmen; Flecker, Alexander S; García-Berthou, Emili; Martí, Eugènia; Tuulaikhuu, Baigal-Amar; Vila-Gispert, Anna; Zamora, Lluis; Guasch, Helena

    2017-12-01

Flow regimes are important drivers of both stream community and biogeochemical processes. However, the interplay between community and biogeochemical responses under different flow regimes in streams is less understood. In this study, we investigated the structural and functional responses of periphyton and macroinvertebrates to different densities of the Mediterranean barbel (Barbus meridionalis, Cyprinidae) in two stream reaches differing in flow regime. The study was conducted in Llémena Stream, a small calcareous Mediterranean stream with high nutrient levels. We selected a reach with permanent flow (permanent reach) and another subjected to flow regulation (regulated reach) with periods of flow intermittency. At each reach, we used in situ cages to generate three levels of fish density. Cages with 10 barbels were used to simulate high fish density (>7 ind m-2); cages with open sides were used as controls (i.e. exposed to actual fish densities of each stream reach), thus having low fish density; and those with no fish were used to simulate the disappearance of fish that occurs with stream drying. Differences in fish density did not cause significant changes in periphyton biomass and macroinvertebrate density. However, phosphate uptake by periphyton was enhanced in treatments lacking fish in the regulated reach with intermittent flow but not in the permanent reach, suggesting that hydrologic alteration hampers the ability of biotic communities to compensate for the absence of fish. This study indicates that fish density can mediate the effects of anthropogenic alterations such as flow intermittence derived from hydrologic regulation on stream benthic communities and associated biogeochemical processes, at least in eutrophic streams. Copyright © 2017. Published by Elsevier B.V.

  14. Process for simultaneous removal of SO.sub.2 and NO.sub.x from gas streams

    DOEpatents

    Rosenberg, Harvey S.

    1987-01-01

A process for simultaneous removal of SO.sub.2 and NO.sub.x from a gas stream that includes flowing the gas stream to a spray dryer and absorbing a portion of the SO.sub.2 content of the gas stream and a portion of the NO.sub.x content of the gas stream with ZnO by contacting the gas stream with a spray of an aqueous ZnO slurry; controlling the gas outlet temperature of the spray dryer to within the range of about a 0.degree. to 125.degree. F. approach to the adiabatic saturation temperature; flowing the gas, unreacted ZnO and absorbed SO.sub.2 and NO.sub.x from the spray dryer to a fabric filter and collecting any solids therein and absorbing a portion of the SO.sub.2 remaining in the gas stream and a portion of the NO.sub.x remaining in the gas stream with ZnO; and controlling the ZnO content of the aqueous slurry so that sufficient unreacted ZnO is present in the solids collected in the fabric filter to react with SO.sub.2 and NO.sub.x as the gas passes through the fabric filter whereby the overall feed ratio of ZnO to SO.sub.2 plus NO.sub.x is about 1.0 to 4.0 moles of ZnO per mole of SO.sub.2 and about 0.5 to 2.0 moles of ZnO per mole of NO.sub.x. Particulates may be removed from the gas stream prior to treatment in the spray dryer. The process further allows regeneration of ZnO that has reacted to absorb SO.sub.2 and NO.sub.x from the gas stream and acid recovery.
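    The stoichiometric ratios in the claim imply a simple feed-rate window for the ZnO slurry. A back-of-envelope sketch in Python (the inlet gas quantities are illustrative assumptions, not values from the patent):

```python
# ZnO feed window from the claimed ratios: about 1.0-4.0 mol ZnO per
# mol SO2 and about 0.5-2.0 mol ZnO per mol NOx.
mol_so2 = 10.0  # mol SO2 entering the spray dryer (assumed)
mol_nox = 4.0   # mol NOx entering (assumed)

zno_min = 1.0 * mol_so2 + 0.5 * mol_nox  # lower ends of both ranges
zno_max = 4.0 * mol_so2 + 2.0 * mol_nox  # upper ends

print(f"ZnO feed window: {zno_min:.1f} to {zno_max:.1f} mol")  # 12.0 to 48.0
```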

  15. Process for simultaneous removal of SO[sub 2] and NO[sub x] from gas streams

    DOEpatents

    Rosenberg, H.S.

    1987-02-03

A process is described for simultaneous removal of SO[sub 2] and NO[sub x] from a gas stream that includes flowing the gas stream to a spray dryer and absorbing a portion of the SO[sub 2] content of the gas stream and a portion of the NO[sub x] content of the gas stream with ZnO by contacting the gas stream with a spray of an aqueous ZnO slurry; controlling the gas outlet temperature of the spray dryer to within the range of about a 0 to 125 F approach to the adiabatic saturation temperature; flowing the gas, unreacted ZnO and absorbed SO[sub 2] and NO[sub x] from the spray dryer to a fabric filter and collecting any solids therein and absorbing a portion of the SO[sub 2] remaining in the gas stream and a portion of the NO[sub x] remaining in the gas stream with ZnO; and controlling the ZnO content of the aqueous slurry so that sufficient unreacted ZnO is present in the solids collected in the fabric filter to react with SO[sub 2] and NO[sub x] as the gas passes through the fabric filter whereby the overall feed ratio of ZnO to SO[sub 2] plus NO[sub x] is about 1.0 to 4.0 moles of ZnO per mole of SO[sub 2] and about 0.5 to 2.0 moles of ZnO per mole of NO[sub x]. Particulates may be removed from the gas stream prior to treatment in the spray dryer. The process further allows regeneration of ZnO that has reacted to absorb SO[sub 2] and NO[sub x] from the gas stream and acid recovery. 4 figs.

  16. Carbon limitation patterns in buried and open urban streams

    EPA Science Inventory

    Urban streams alternate between darkened buried segments dominated by heterotrophic processes and lighted open segments dominated by autotrophic processes. We hypothesized that labile carbon leaking from autotrophic cells would reduce heterotrophic carbon limitation in open chan...

  17. Hot and Cool Spots of Primary Production, Respiration and 15N Nitrate and Ammonium Uptake: Spatial Heterogeneity in Tropical Streams and Rivers

    NASA Astrophysics Data System (ADS)

    Dodds, W. K.; Tromboni, F.; Neres-Lima, V.; Zandoná, E.; Moulton, T. P.

    2016-12-01

While whole-stream measures of metabolism and uptake have become common methods to characterize biogeochemical transport and processing, less is known about how nitrogen (N) uptake, gross primary production (GPP) and ecosystem respiration (ER) covary among different stream substrata at smaller scales. We measured 15N ammonium and nitrate uptake separately, and GPP and ER of ecosystem compartments (leaves, epilithon, sand-associated biota and macrophytes) in closed circulating chambers in three streams/rivers of varied size. The streams drain pristine Brazilian Atlantic Rainforest watersheds and are all within a few km of each other. The smallest stream had dense forest canopy cover; the largest river was almost completely open. GPP could not be detected in the closed canopy stream. Epilithon (biofilms on rocks) was a dominant compartment for GPP and N uptake in the two open streams, and macrophytes rivaled epilithon GPP and N uptake rates in the most open stream. Even though leaves covered only 1-3% of the stream bottom, they could account for around half of all the ER in the streams but almost no N uptake. Sand had minimal rates of N uptake, GPP and ER in all streams due to relatively low organic material content. The data suggest that N uptake, GPP and ER of different substrata are not closely linked over relatively small spatial (dm) scales, and that different biogeochemical processes may map to different hot and cool spots for ecosystem rates.

  18. Analyzing Cyber-Physical Threats on Robotic Platforms.

    PubMed

    Ahmad Yousef, Khalil M; AlMajali, Anas; Ghalyon, Salah Abu; Dweik, Waleed; Mohd, Bassam J

    2018-05-21

Robots are increasingly involved in our daily lives. Fundamental to robots are the communication link (or stream) and the applications that connect the robots to their clients or users. Such communication links and applications are usually supported through a client/server network connection. This networking system is open to attack and vulnerable to security threats. Ensuring security and privacy for robotic platforms is thus critical, as failures and attacks could have devastating consequences. In this paper, we examine several cyber-physical security threats that are unique to robotic platforms, specifically the communication link and the applications. The threats target the integrity, availability and confidentiality security requirements of robotic platforms that use the MobileEyes/arnlServer client/server applications. A robot attack tool (RAT) was developed to perform specific security attacks. An impact-oriented approach was adopted to analyze the assessment results of the attacks. Tests and experiments were conducted both in a simulation environment and physically on the robot. The simulation environment was based on MobileSim, a software tool for simulating, debugging and experimenting on MobileRobots/ActivMedia platforms and their environments. The robot platform PeopleBot™ was used for physical experiments. The analysis and testing results show that certain attacks were successful at breaching the robot security. Integrity attacks modified commands and manipulated the robot behavior. Availability attacks were able to cause Denial-of-Service (DoS), leaving the robot unresponsive to MobileEyes commands. Integrity and availability attacks also allowed sensitive information on the robot to be hijacked. To mitigate these threats, we provide possible mitigation techniques and suggestions to raise awareness of threats on robotic platforms, especially when the robots are involved in critical missions or applications.

  19. Analyzing Cyber-Physical Threats on Robotic Platforms †

    PubMed Central

    2018-01-01

Robots are increasingly involved in our daily lives. Fundamental to robots are the communication link (or stream) and the applications that connect the robots to their clients or users. Such communication links and applications are usually supported through a client/server network connection. This networking system is open to attack and vulnerable to security threats. Ensuring security and privacy for robotic platforms is thus critical, as failures and attacks could have devastating consequences. In this paper, we examine several cyber-physical security threats that are unique to robotic platforms, specifically the communication link and the applications. The threats target the integrity, availability and confidentiality security requirements of robotic platforms that use the MobileEyes/arnlServer client/server applications. A robot attack tool (RAT) was developed to perform specific security attacks. An impact-oriented approach was adopted to analyze the assessment results of the attacks. Tests and experiments were conducted both in a simulation environment and physically on the robot. The simulation environment was based on MobileSim, a software tool for simulating, debugging and experimenting on MobileRobots/ActivMedia platforms and their environments. The robot platform PeopleBotTM was used for physical experiments. The analysis and testing results show that certain attacks were successful at breaching the robot security. Integrity attacks modified commands and manipulated the robot behavior. Availability attacks were able to cause Denial-of-Service (DoS), leaving the robot unresponsive to MobileEyes commands. Integrity and availability attacks also allowed sensitive information on the robot to be hijacked. To mitigate these threats, we provide possible mitigation techniques and suggestions to raise awareness of threats on robotic platforms, especially when the robots are involved in critical missions or applications. PMID:29883403

  20. Geochemical results from stream-water and stream-sediment samples collected in Colorado and New Mexico

    USGS Publications Warehouse

    Hageman, Philip L.; Todd, Andrew S.; Smith, Kathleen S.; DeWitt, Ed; Zeigler, Mathew P.

    2013-01-01

    Scientists from the U.S. Geological Survey are studying the relationship between watershed lithology and stream-water chemistry. As part of this effort, 60 stream-water samples and 43 corresponding stream-sediment samples were collected in 2010 and 2011 from locations in Colorado and New Mexico. Sample sites were selected from small to midsize watersheds composed of a high percentage of one rock type or geologic unit. Stream-water and stream-sediment samples were collected, processed, preserved, and analyzed in a consistent manner. This report releases geochemical data for this phase of the study.

  1. Method for sequestering CO.sub.2 and SO.sub.2 utilizing a plurality of waste streams

    DOEpatents

    Soong, Yee [Monroeville, PA; Allen, Douglas E [Salem, MA; Zhu, Chen [Monroe County, IN

    2011-04-12

A neutralization/sequestration process is provided for concomitantly addressing capture and sequestration of both CO.sub.2 and SO.sub.2 from industrial gas byproduct streams. The invented process concomitantly treats and minimizes bauxite residues from aluminum production processes and brine wastewater from oil/gas production processes. The benefits of this integrated approach to coincidental treatment of multiple industrial waste byproduct streams include: neutralization of caustic byproducts such as bauxite residue, thereby decreasing the risks associated with the long-term storage and potential environmental impact of caustic materials; decreasing or obviating the need for costly treatment of byproduct brines; eliminating the need to purchase CaO or similar scrubber reagents typically required for SO.sub.2 treatment of such gases; and directly using CO.sub.2 from flue gas to neutralize bauxite residue/brine mixtures, without the need for costly separation of CO.sub.2 from the industrial byproduct gas stream by processes such as liquid amine-based scrubbers.

  2. Fish relationships with large wood in small streams

    Treesearch

    C. Andrew Dolloff; Melvin L. Warren

    2003-01-01

    Many ecological processes are associated with large wood in streams, such as forming habitat critical for fish and a host of other organisms. Wood loading in streams varies with age and species of riparian vegetation, stream size, time since last disturbance, and history of land use. Changes in the landscape resulting from homesteading, agriculture, and logging have...

  3. Particulate organic contributions from forests and streams: debris isn't so bad

    Treesearch

    C. Andrew Dolloff; Jackson R. Webster

    2000-01-01

It is clear that the input of "debris" from terrestrial plants falling into streams is one of the most significant processes occurring at the interface of terrestrial and stream ecosystems. Organic matter - leaves, twigs, branches, and whole trees - provides energy, nutrients, and structure to streams flowing through forests. A host of vertebrate and invertebrate...

  4. Riparian management in forests of the continental eastern United States

    Treesearch

    Elon S. Verry; James W. Hornbeck; C. Andrew Dolloff

    2000-01-01

As we meditate on the management of stream riparian areas, it is clear that the input of "debris" from terrestrial plants falling into streams is one of the most significant processes occurring at the interface of terrestrial and stream ecosystems. Organic matter - leaves, twigs, branches, and whole trees - provides energy, nutrients, and structure to streams...

  5. Increasing synchrony of high temperature and low flow in western North American streams: Double trouble for coldwater biota?

    Treesearch

    Ivan Arismendi; Mohammad Safeeq; Sherri L. Johnson; Jason B Dunham; Roy Haggerty

    2013-01-01

    Flow and temperature are strongly linked environmental factors driving ecosystem processes in streams. Stream temperature maxima (Tmax_w) and stream flow minima (Qmin) can create periods of stress for aquatic organisms. In mountainous areas, such as western North America, recent shifts toward an earlier spring peak flow and...

  6. Groundwater exchanges near a channelized versus unmodified stream mouth discharging to a subalpine lake

    USGS Publications Warehouse

    Constantz, James; Naranjo, Ramon C.; Niswonger, Richard G.; Allander, Kip K.; Neilson, B.; Rosenberry, Donald O.; Smith, David W.; Rosecrans, C.; Stonestrom, David A.

    2016-01-01

    The terminus of a stream flowing into a larger river, pond, lake, or reservoir is referred to as the stream-mouth reach or simply the stream mouth. The terminus is often characterized by rapidly changing thermal and hydraulic conditions that result in abrupt shifts in surface water/groundwater (sw/gw) exchange patterns, creating the potential for unique biogeochemical processes and ecosystems. Worldwide shoreline development is changing stream-lake interfaces through channelization of stream mouths, i.e., channel straightening and bank stabilization to prevent natural meandering at the shoreline. In the central Sierra Nevada (USA), Lake Tahoe's shoreline has an abundance of both “unmodified” (i.e., not engineered though potentially impacted by broader watershed engineering) and channelized stream mouths. Two representative stream mouths along the lake's north shore, one channelized and one unmodified, were selected to compare and contrast water and heat exchanges. Hydraulic and thermal properties were monitored during separate campaigns in September 2012 and 2013 and sw/gw exchanges were estimated within the stream mouth-shoreline continuum. Heat-flow and water-flow patterns indicated clear differences in the channelized versus the unmodified stream mouth. For the channelized stream mouth, relatively modulated, cool-temperature, low-velocity longitudinal streambed flows discharged offshore beneath warmer buoyant lakeshore water. In contrast, a seasonal barrier bar formed across the unmodified stream mouth, creating higher-velocity subsurface flow paths and higher diurnal temperature variations relative to shoreline water. As a consequence, channelization altered sw/gw exchanges potentially altering biogeochemical processing and ecological systems in and near the stream mouth.

  7. SOMKE: kernel density estimation over data streams by sequences of self-organizing maps.

    PubMed

    Cao, Yuan; He, Haibo; Man, Hong

    2012-08-01

In this paper, we propose a novel method, SOMKE, for kernel density estimation (KDE) over data streams based on sequences of self-organizing maps (SOMs). In many stream data mining applications, traditional KDE methods are infeasible because of their high computational cost, processing time, and memory requirements. To reduce the time and space complexity, we propose a SOM structure to obtain well-defined data clusters that estimate the underlying probability distributions of incoming data streams. The main idea is to build a series of SOMs over the data streams via two operations: creating and merging the SOM sequences. The creation phase produces the SOM sequence entries for windows of the data, which capture clustering information about the incoming data streams. The size of the SOM sequences can be further reduced by combining consecutive entries in the sequence based on the Kullback-Leibler divergence. Finally, the probability density functions over arbitrary time periods along the data streams can be estimated using such SOM sequences. We compare SOMKE with two other KDE methods for data streams, the M-kernel approach and the cluster kernel approach, in terms of accuracy and processing time for various stationary data streams. Furthermore, we also investigate the use of SOMKE over nonstationary (evolving) data streams, including a synthetic nonstationary data stream, a real-world financial data stream, and a group of network traffic data streams. The simulation results illustrate the effectiveness and efficiency of the proposed approach.
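The window-then-merge idea can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the SOM here is a toy 1-D competitive-learning map, and the window size, number of units, KL threshold, and kernel bandwidth are all illustrative assumptions.

```python
import numpy as np

def train_som(window, n_units=8, epochs=20, lr=0.3):
    """Tiny 1-D SOM: competitive learning with a shrinking neighborhood."""
    rng = np.random.default_rng(0)
    w = np.sort(rng.choice(window, n_units).astype(float))
    for e in range(epochs):
        sigma = max(n_units / 2 * (1 - e / epochs), 0.5)
        for x in window:
            bmu = np.argmin(np.abs(w - x))            # best-matching unit
            h = np.exp(-(np.arange(n_units) - bmu) ** 2 / (2 * sigma ** 2))
            w += lr * h * (x - w)                     # pull neighborhood toward x
    hits = np.bincount([int(np.argmin(np.abs(w - x))) for x in window],
                       minlength=n_units).astype(float)
    return w, hits / hits.sum()                       # prototypes + mixture weights

def density(grid, w, p, bw=0.3):
    """Gaussian-mixture density estimate over the SOM prototypes."""
    k = np.exp(-(grid[:, None] - w[None, :]) ** 2 / (2 * bw ** 2))
    return (k * p).sum(axis=1) / (bw * np.sqrt(2 * np.pi))

def kl(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence between normalized histograms."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# One SOM per window of the stream; merge consecutive entries whose
# densities are close in KL divergence (the sequence-reduction step).
rng = np.random.default_rng(1)
stream = rng.normal(0.0, 1.0, 3000)                   # stationary demo stream
grid = np.linspace(-4, 4, 200)
entries = []                                          # the SOM sequence
for start in range(0, len(stream), 500):
    w, p = train_som(stream[start:start + 500])
    d = density(grid, w, p)
    d /= d.sum()
    if entries and kl(entries[-1][2], d) < 0.05:      # similar -> merge entries
        w0, p0, _ = entries[-1]
        wm, pm = np.concatenate([w0, w]), np.concatenate([p0, p]) / 2
        dm = density(grid, wm, pm)
        entries[-1] = (wm, pm, dm / dm.sum())
    else:
        entries.append((w, p, d))
print(len(entries), "SOM sequence entries summarize", len(stream) // 500, "windows")
```

Because the demo stream is stationary, most windows merge into few entries; on an evolving stream the KL test would instead preserve separate entries whenever the distribution shifts.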

  8. Apparatus and method for materials processing utilizing a rotating magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muralidharan, Govindarajan; Angelini, Joseph A.; Murphy, Bart L.

An apparatus for materials processing utilizing a rotating magnetic field comprises a platform for supporting a specimen, and a plurality of magnets underlying the platform. The plurality of magnets are configured for rotation about an axis of rotation intersecting the platform. A heat source is disposed above the platform for heating the specimen during the rotation of the plurality of magnets. A method for materials processing utilizing a rotating magnetic field comprises providing a specimen on a platform overlying a plurality of magnets; rotating the plurality of magnets about an axis of rotation intersecting the platform, thereby applying a rotating magnetic field to the specimen; and, while rotating the plurality of magnets, heating the specimen to a desired temperature.

  9. Ecosystem Services Provided by Stream Fishes

    EPA Science Inventory

    Stream fish provide important services to people, including recreation and food, regulation of ecosystem processes, and aesthetic benefits. If the services provided by fish in different streams can be measured, then they can be valued and considered in restoration decisions. We...

  10. One project of Educational Innovation applying news Information and Communications Technologies (ICT): CyberAula 2.0

    NASA Astrophysics Data System (ADS)

    Mendiola, M. A.; Aguado, P. L.; Espejo, R.

    2012-04-01

The main objective of the CyberAula 2.0 project is to support, record, and validate videoconferencing and lecture recording services by means of integrating the Universidad Politécnica de Madrid (UPM) Moodle platform with both the GlobalPlaza platform and the Isabel videoconference tool. Each class session is broadcast on the Internet and then recorded, enabling geographically distant students to participate live in on-campus classes with questions through a chat tool or through the videoconference itself. All the software used in the project is open source. GlobalPlaza (Barra et al., 2011) is the web platform used to schedule, perform, stream, record, and publish the videoconferences automatically. It is integrated with the videoconferencing tool Isabel (Quemada et al., 2005), a real-time collaboration tool for the Internet that supports advanced collaborative web/videoconferencing with application sharing and TV-like media integration. Both are open source solutions developed at UPM and specifically designed for educational purposes. GlobalPlaza is a web application developed in the context of the GLOBAL project, a research project supported by the European Commission's Seventh Framework Programme. In order to provide educational support to GlobalPlaza, the CyberAula 2.0 project has been proposed. Students can review the recorded lectures when needed through Moodle. 
In this paper we present the project developed at the Escuela Técnica Superior de Ingenieros Agrónomos (ETSIA) for a Secondary Cycle free-elective subject, which is currently being phased out with the introduction of new curricula within the framework of the European Higher Education Area. Students participate in this subject with outstanding interest, acquiring transversal competences as they must prepare and present a report in the last week of the semester. The background for the project was the inclusion of the subject Plants of Agro-alimentary Interest, which has a remarkably attractive practical component within the subjects offered by this Center and is one of the most demanded free-elective subjects among students, whose active participation is notable both in practical workshops and in the individual presentation of reports. In the workshops, students must identify, describe, classify, and even taste several species of agro-alimentary interest (fruits of temperate or tropical regions, aromatic plants and spices, edible mushrooms, and cereals and pseudocereals), many of them previously unknown to most of them. They are asked to fill in questionnaires in order to consolidate concepts and to evaluate their personal participation in the subject's development.

  11. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

The incoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in the interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformations related to natural and man-made hazards. Thanks to its global coverage strategy and 12-day revisit time, together with its free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In this framework, the European Commission has been funding several projects through the GMES and Copernicus programs aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, FP7-DORIS, an advanced GMES downstream service coordinated by the Italian National Research Council (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be properly overcome through the development of modern infrastructures able to efficiently provide computing resources as well as advanced services for big data management, processing, and dissemination. 
In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make advanced and innovative tools for DInSAR product generation and exploitation available to a large audience. In particular, CNR is porting the multi-temporal DInSAR technique referred to as Small Baseline Subset (SBAS) onto the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on presenting the main results achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs, and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES: Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013, "SBAS-DInSAR time series generation on cloud computing platforms," IEEE IGARSS Conference, Melbourne (AU), July 2013.

  12. Reformer assisted lean NO.sub.x catalyst aftertreatment system and method

    DOEpatents

    Kalyanaraman, Mohan [Media, PA; Park, Paul W [Peoria, IL; Ragle, Christie S [Havana, IL

    2010-06-29

A method and apparatus for catalytically processing a gas stream passing therethrough to reduce the presence of NO.sub.x therein, wherein the apparatus includes a first catalyst composed of a silver-containing alumina that is adapted for catalytically processing the gas stream at a first temperature range, a second catalyst composed of a copper-containing zeolite located downstream from the first catalyst, wherein the second catalyst is adapted for catalytically processing the gas stream at a lower second temperature range relative to the first temperature range, a hydrocarbon compound for injection into the gas stream upstream of the first catalyst to provide a reductant, and a reformer for reforming a portion of the hydrocarbon compound into H.sub.2 and/or oxygenated hydrocarbon for injection into the gas stream upstream of the first catalyst. The second catalyst is adapted to facilitate the reaction of reducing NO.sub.x into N.sub.2, whereby the intermediates are produced via the first catalyst reacting with NO.sub.x and hydrocarbons.

  13. Decoupling of dissolved organic matter patterns between stream and riparian groundwater in a headwater forested catchment

    NASA Astrophysics Data System (ADS)

    Bernal, Susana; Lupon, Anna; Catalán, Núria; Castelar, Sara; Martí, Eugènia

    2018-03-01

Streams are important sources of carbon to the atmosphere, though knowing whether they merely outgas terrestrially derived carbon dioxide or mineralize terrestrial inputs of dissolved organic matter (DOM) is still a big challenge in ecology. The objective of this study was to investigate the influence of riparian groundwater (GW) and in-stream processes on the temporal pattern of stream DOM concentrations and quality in a forested headwater stream, and whether this influence differed between the leaf litter fall (LLF) period and the remaining part of the year (non-LLF). The spectroscopic indexes (fluorescence index, biological index, humification index, and parallel factor analysis components) indicated that DOM had an eminently protein-like character and most likely originated from microbial sources and recent biological activity in both stream water and riparian GW. However, paired samples of stream water and riparian GW showed that dissolved organic carbon (DOC) and nitrogen (DON) concentrations as well as the spectroscopic character of DOM differed between the two compartments throughout the year. A simple mass balance approach indicated that in-stream processes along the reach contributed to reducing DOC and DON fluxes by 50 and 30 %, respectively. Further, in-stream DOC and DON uptakes were unrelated to each other, suggesting that these two compounds underwent different biogeochemical pathways. During the LLF period, stream DOC and DOC : DON ratios were higher than during the non-LLF period, and spectroscopic indexes suggested a major influence of terrestrial vegetation on stream DOM. Our study highlights that stream DOM is not merely a reflection of riparian GW entering the stream and that headwater streams have the capacity to internally produce, transform, and consume DOM.
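The reach mass balance behind flux reductions of this kind is simple enough to show directly; the numbers below are hypothetical (the abstract does not give the underlying fluxes) and are chosen only to illustrate how a 50 % reduction would be computed.

```python
# Reach-scale mass balance: net in-stream uptake is the difference between
# inputs (upstream flux + riparian groundwater inflow) and downstream export.
def flux_reduction(upstream, gw_in, downstream):
    inputs = upstream + gw_in
    return (inputs - downstream) / inputs

# Hypothetical DOC fluxes in g C per day (illustrative values only).
print(f"DOC flux reduced by {flux_reduction(800.0, 200.0, 500.0):.0%}")  # prints "DOC flux reduced by 50%"
```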

  14. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

... 30 Mineral Resources 2 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  15. Developing Web-based Tools for Collaborative Science and Public Outreach

    NASA Astrophysics Data System (ADS)

    Friedman, A.; Pizarro, O.; Williams, S. B.

    2016-02-01

With the advances in high bandwidth communications and the proliferation of social media tools, education & outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics & other data collecting platforms have made it possible to collect copious amounts of oceanographic data. This data then typically undergoes laborious, manual processing to transform it into quantitative information, which normally occurs post-cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and leverage public engagement to complement science goals. We will present two software platforms: the first is a web browser based tool that was developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation & engagement and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real-time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration & analysis and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly streamlined interface that integrates advanced data management & online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions and a gamified ranking system to encourage "citizen science" participation. 
These examples show that through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility for acquiring large volumes of crowd-sourced data without compromising science objectives.

  16. Stochastic modeling of Cryptosporidium parvum to predict transport, retention, and downstream exposure

    NASA Astrophysics Data System (ADS)

    Drummond, J. D.; Boano, F.; Atwill, E. R.; Li, X.; Harter, T.; Packman, A. I.

    2016-12-01

    Rivers are a means of rapid and long-distance transmission of pathogenic microorganisms from upstream terrestrial sources. Thus, significant fluxes of pathogen loads from agricultural lands can occur due to transport in surface waters. Pathogens enter streams and rivers in a variety of processes, notably overland flow, shallow groundwater discharge, and direct inputs from host populations such as humans and other vertebrate species. Viruses, bacteria, and parasites can enter a stream and persist in the environment for varying amounts of time. Of particular concern is the protozoal parasite, Cryptosporidium parvum, which can remain infective for weeks to months under cool and moist conditions, with the infectious state (oocysts) largely resistant to chlorination. In order to manage water-borne diseases more effectively we need to better predict how microbes behave in freshwater systems, particularly how they are transported downstream in rivers and in the process interact with the streambed and other solid surfaces. Microbes continuously immobilize and resuspend during downstream transport due to a variety of processes, such as gravitational settling, attachment to in-stream structures such as submerged macrophytes, and hyporheic exchange and filtration within underlying sediments. These various interactions result in a wide range of microbial residence times in the streambed and therefore influence the persistence of pathogenic microbes in the stream environment. We developed a stochastic mobile-immobile model to describe these microbial transport and retention processes in streams and rivers that also accounts for microbial inactivation. We used the model to assess the transport, retention, and inactivation of C. parvum within stream environments, specifically under representative flow conditions of California streams where C. parvum exposure can be at higher risk due to agricultural nonpoint sources. 
The results demonstrate that the combination of stream reach-scale analysis and multi-scale stochastic modeling improves assessment of C. parvum transport and retention in streams in order to predict downstream exposure to human communities, wildlife, and livestock.
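A stochastic mobile-immobile transport model of this kind can be sketched as a Monte Carlo simulation: particles alternate between mobile episodes (advection at the stream velocity) and immobile episodes in the streambed, with first-order inactivation acting on the elapsed time. All parameter values, and the Pareto residence-time law, are illustrative assumptions for the sketch, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (not values from the study): mobile velocity,
# immobilization rate, immobile residence-time law, inactivation rate.
v = 0.2            # downstream velocity while mobile (m/s)
lam = 1e-3         # immobilization rate (1/s); mobile episodes ~ Exp(lam)
mu = 1e-6          # first-order inactivation rate of oocysts (1/s)
L = 1000.0         # distance to the downstream receptor (m)
T = 30 * 86400.0   # observation window (s)

def immobile_time():
    """Heavy-tailed (Pareto) residence time in the streambed, mimicking the
    broad range of retention times from settling and hyporheic exchange."""
    alpha, t_min = 1.5, 10.0
    return t_min * (1.0 - rng.random()) ** (-1.0 / alpha)

n, arrived_infective = 5000, 0
for _ in range(n):
    x = t = 0.0
    while x < L and t < T:
        dt_mobile = rng.exponential(1.0 / lam)   # time until next immobilization
        travel = min(dt_mobile, (L - x) / v)     # stop early at the reach end
        x += v * travel
        t += travel
        if x < L:
            t += immobile_time()                 # rest immobile in the bed
    if x >= L and t < T and rng.random() < np.exp(-mu * t):
        arrived_infective += 1                   # arrived and survived inactivation

print(f"{arrived_infective / n:.3f} of oocysts reach x = {L:.0f} m still infective")
```

The heavy-tailed immobile times are what give a small fraction of particles very long residence in the bed, which is why retention, rather than advection alone, controls downstream exposure in such models.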

  17. 40 CFR 63.149 - Control requirements for certain liquid streams in open systems within a chemical manufacturing...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... streams in open systems within a chemical manufacturing process unit. 63.149 Section 63.149 Protection of... open systems within a chemical manufacturing process unit. (a) The owner or operator shall comply with... Air Pollutants From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage...

  18. Perception of Shapes Targeting Local and Global Processes in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Grinter, Emma J.; Maybery, Murray T.; Pellicano, Elizabeth; Badcock, Johanna C.; Badcock, David R.

    2010-01-01

    Background: Several researchers have found evidence for impaired global processing in the dorsal visual stream in individuals with autism spectrum disorders (ASDs). However, support for a similar pattern of visual processing in the ventral visual stream is less consistent. Critical to resolving the inconsistency is the assessment of local and…

  19. Diel biogeochemical processes and their effect on the aqueous chemistry of streams: A review

    USGS Publications Warehouse

    Nimick, David A.; Gammons, Christopher H.; Parker, Stephen R.

    2011-01-01

    This review summarizes biogeochemical processes that operate on diel, or 24-h, time scales in streams and the changes in aqueous chemistry that are associated with these processes. Some biogeochemical processes, such as those producing diel cycles of dissolved O2 and pH, were the first to be studied, whereas processes producing diel concentration cycles of a broader spectrum of chemical species including dissolved gases, dissolved inorganic and organic carbon, trace elements, nutrients, stable isotopes, and suspended particles have received attention only more recently. Diel biogeochemical cycles are interrelated because the cyclical variations produced by one biogeochemical process commonly affect another. Thus, understanding biogeochemical cycling is essential not only for guiding collection and interpretation of water-quality data but also for geochemical and ecological studies of streams. Expanded knowledge of diel biogeochemical cycling will improve understanding of how natural aquatic environments function and thus lead to better predictions of how stream ecosystems might react to changing conditions of contaminant loading, eutrophication, climate change, drought, industrialization, development, and other factors.

  20. Applying the Principles of Lean Production to Gastrointestinal Biopsy Handling: From the Factory Floor to the Anatomic Pathology Laboratory.

    PubMed

    Sugianto, Jessica Z; Stewart, Brian; Ambruzs, Josephine M; Arista, Amanda; Park, Jason Y; Cope-Yokoyama, Sandy; Luu, Hung S

    2015-01-01

Our objective was to implement Lean principles to accommodate expanding volumes of gastrointestinal biopsies and to improve laboratory processes overall. Our continuous improvement (kaizen) project analyzed the current state of gastrointestinal biopsy handling using value-stream mapping for specimens obtained at a 487-bed tertiary care pediatric hospital in Dallas, Texas. We identified non-value-added time within the workflow process, from receipt of the specimen in the histology laboratory to the delivery of slides and paperwork to the pathologist. To eliminate non-value-added steps, we implemented the changes depicted in a revised-state value-stream map. Current-state value-stream mapping identified a total specimen processing time of 507 minutes, of which 358 minutes were non-value-added. This translated to a process cycle efficiency of 29%. Implementation of a revised-state value stream reduced the total process time to 238 minutes, of which 89 minutes were non-value-added, and improved the process cycle efficiency to 63%. Lean production principles of continuous improvement and waste elimination can be successfully implemented within the clinical laboratory.
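The efficiency figures follow directly from the reported minutes; process cycle efficiency is value-added time divided by total lead time:

```python
# Process cycle efficiency (PCE) = value-added time / total lead time,
# computed from the current-state and revised-state figures in the abstract.
def pce(total_min, non_value_added_min):
    return (total_min - non_value_added_min) / total_min

current = pce(507, 358)   # 149 value-added minutes of 507 total
revised = pce(238, 89)    # the same 149 value-added minutes of 238 total
print(f"current: {current:.0%}, revised: {revised:.0%}")  # prints "current: 29%, revised: 63%"
```

Note that the value-added time (149 minutes) is unchanged; the improvement comes entirely from removing 269 minutes of non-value-added time.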

Top