Sample records for source-to-source compiler infrastructure

  1. Integrating Emerging Data Sources into Operational Practice: Opportunities for Integration of Emerging Data for Traffic Management and TMCs.

    DOT National Transportation Integrated Search

    2017-11-01

    With the emergence of data generated from connected vehicles, connected travelers, and connected infrastructure, the capabilities of traffic management systems or centers (TMCs) will need to be improved to allow agencies to compile and benefit from u...

  2. LLVM Infrastructure and Tools Project Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, Patrick Sean

    2017-11-06

    This project works with the open source LLVM Compiler Infrastructure (http://llvm.org) to provide tools and capabilities that address needs and challenges faced by the ECP community (applications, libraries, and other components of the software stack). Our focus is on providing a more productive development environment that enables (i) improved compilation times and code generation for parallelism, (ii) additional features and capabilities within the design and implementation of LLVM components for improved platform/performance portability, and (iii) improved composition of the underlying implementation details of the programming environment, capturing resource utilization, overheads, and related concerns, including runtime systems that are often not easily addressed by application and library developers.
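
    For orientation, the classic starting point for extending LLVM is a small function pass; the sketch below follows the legacy-pass-manager example from the LLVM "Writing an LLVM Pass" tutorial (the pass name `Hello` is the tutorial's placeholder, not part of this project).

    ```cpp
    // Minimal LLVM function pass (legacy pass manager): prints each function name.
    #include "llvm/Pass.h"
    #include "llvm/IR/Function.h"
    #include "llvm/Support/raw_ostream.h"

    using namespace llvm;

    namespace {
    struct Hello : public FunctionPass {
      static char ID;                 // pass identification
      Hello() : FunctionPass(ID) {}

      bool runOnFunction(Function &F) override {
        errs() << "Hello: " << F.getName() << "\n";
        return false;                 // the IR is not modified
      }
    };
    } // namespace

    char Hello::ID = 0;
    static RegisterPass<Hello> X("hello", "Hello World Pass",
                                 false /* only looks at CFG */,
                                 false /* not an analysis pass */);
    ```

    Built as a shared library, such a pass runs with `opt -load ./libHello.so -hello input.ll` on releases that still default to the legacy pass manager; newer releases use the PassBuilder plugin interface instead.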

  3. ROSE Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinlan, D.; Yi, Q.; Buduc, R.

    2005-02-17

    ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is part of current research on telescoping languages, which optimizes the use of libraries in scientific applications. ROSE defines approaches to extend optimization techniques, common in well-defined languages, to the optimization of scientific applications that use well-defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of application codes. We currently support full C and C++ (including template instantiation, etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full-scale DOE application codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on full-scale DOE applications.
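
    The interface the abstract describes is easiest to see in ROSE's canonical "identity translator," which parses the input into an AST, applies whatever rewrites the tool author adds, and unparses back to source; a minimal sketch, following the example distributed with ROSE:

    ```cpp
    // ROSE identity translator: source in, (optionally transformed) source out.
    #include "rose.h"

    int main(int argc, char* argv[]) {
        // Build the AST for the input source file(s).
        SgProject* project = frontend(argc, argv);

        // A specialized translator would traverse and rewrite the AST here,
        // e.g. with a user-written AST visitor.

        // Unparse the AST to compilable source and invoke the backend compiler.
        return backend(project);
    }
    ```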

  4. Runtime Detection of C-Style Errors in UPC Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirkelbauer, P; Liao, C; Panas, T

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions for each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, the ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.
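
    The instrumentation pattern, rewriting each memory operation into a call that consults the runtime monitor, can be sketched as below. The helper name `rt_check_store` and its signature are invented for illustration; the actual ROSE-CIRM runtime interface is not given in the abstract.

    ```cpp
    // Hypothetical illustration of source-to-source instrumentation:
    // rt_check_store() stands in for a runtime monitor's entry point.
    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>

    static void rt_check_store(void* addr, std::size_t nbytes,
                               const char* file, int line) {
        // A real monitor would validate addr against its shadow state
        // (allocation bounds, shared/private partition, initialization).
        std::fprintf(stderr, "store of %zu bytes at %p (%s:%d)\n",
                     nbytes, addr, file, line);
    }

    int main() {
        int* a = static_cast<int*>(std::malloc(4 * sizeof(int)));

        // Original statement:  a[2] = 7;
        // Form inserted by the translator:
        rt_check_store(&a[2], sizeof(int), __FILE__, __LINE__);
        a[2] = 7;

        std::free(a);
        return 0;
    }
    ```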

  5. Application of crowd-sourced data to multi-scale evolutionary exposure and vulnerability models

    NASA Astrophysics Data System (ADS)

    Pittore, Massimiliano

    2016-04-01

    Seismic exposure, defined as the assets (population, buildings, infrastructure) exposed to earthquake hazard and susceptible to damage, is a critical, but often neglected, component of seismic risk assessment. This partly stems from the burden associated with the compilation of a useful and reliable model over wide spatial areas. While detailed engineering data still have to be collected in order to constrain exposure and vulnerability models, the availability of increasingly large crowd-sourced datasets (e.g., OpenStreetMap) opens up the exciting possibility of generating incrementally evolving models. Integrating crowd-sourced and authoritative data using statistical learning methodologies can reduce model uncertainties and also provide additional drive and motivation for volunteered geoinformation collection. A case study in Central Asia will be presented and discussed.

  6. Strengthening Software Authentication with the ROSE Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure, developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, and C++ languages, with current collaborations to support Fortran 90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  7. A Guide for Finding Biographical Sources.

    ERIC Educational Resources Information Center

    Huang, Samuel T., Comp.

    Intended to assist library users in finding biographical sources in various disciplines, this compilation lists selective biographical sources which are available in the Northern Illinois University Libraries. The compilation is divided into four major areas: indexes to biographies, sources of information on living persons, sources of information…

  8. The HACMS program: using formal methods to eliminate exploitable bugs

    PubMed Central

    Launchbury, John; Richards, Raymond

    2017-01-01

    For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA’s HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue ‘Verified trustworthy software systems’. PMID:28871050

  9. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast, and reliable. This is becoming increasingly important for scientific software as well, but it requires a sophisticated way to inspect, check, and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator, it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  10. The HACMS program: using formal methods to eliminate exploitable bugs.

    PubMed

    Fisher, Kathleen; Launchbury, John; Richards, Raymond

    2017-10-13

    For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA's HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue 'Verified trustworthy software systems'.

  11. Digital map of aquifer boundary for the High Plains aquifer in parts of Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming

    USGS Publications Warehouse

    Qi, Sharon

    2010-01-01

    This digital data set represents the extent of the High Plains aquifer in the central United States. The extent of the High Plains aquifer covers 174,000 square miles in eight states: Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming. This data set represents a compilation of information from digital and paper sources and personal communication. This boundary is an update to the boundary published in U.S. Geological Survey Professional Paper 1400-B, and this report supersedes Open-File Report 99-267. The purpose of this data set is to refine and update the extent of the High Plains aquifer based on currently available information. This data set represents a compilation of arcs from a variety of sources and scales that represent the 174,000 square-mile extent of the High Plains aquifer within the eight states. Where updated information was not available, the original boundary extent defined by OFR 99-267 was retained. The citations for the sources in each State are listed in the 00README.txt file. The boundary also contains internal polygons, or 'islands', that represent the areas within the aquifer boundary where the aquifer is not present due to erosion or non-deposition. The datasets that pertain to this report can be found on the U.S. Geological Survey's NSDI (National Spatial Data Infrastructure) Node; the links are provided on the sidebar.

  12. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    DOEpatents

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
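
    The idea can be pictured as keeping two compiled versions of the same routine and dispatching to the aggressive one until its speculation check fails. The sketch below is purely illustrative of that dispatch logic; the patented mechanism operates inside the compiler and runtime rather than in user code.

    ```cpp
    // Illustrative dual-version dispatch: run the aggressively optimized
    // version, fall back to the conservative version if it signals failure.
    #include <iostream>
    #include <optional>

    // Aggressive version: may decline (a speculated invariant fails).
    std::optional<long> sum_aggressive(const int* v, int n) {
        if (n % 4 != 0) return std::nullopt;    // speculation check fails
        long s = 0;
        for (int i = 0; i < n; i += 4)          // e.g. an unrolled/vectorized body
            s += v[i] + v[i + 1] + v[i + 2] + v[i + 3];
        return s;
    }

    long sum_conservative(const int* v, int n) {
        long s = 0;
        for (int i = 0; i < n; ++i) s += v[i];
        return s;
    }

    long sum(const int* v, int n) {
        if (auto r = sum_aggressive(v, n)) return *r;  // fast path
        return sum_conservative(v, n);                 // rollback path
    }

    int main() {
        int v[6] = {1, 2, 3, 4, 5, 6};
        std::cout << sum(v, 6) << "\n";  // falls back: 6 is not a multiple of 4
    }
    ```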

  13. Energy map of southwestern Wyoming, Part A - Coal and wind

    USGS Publications Warehouse

    Biewick, Laura; Jones, Nicholas R.

    2012-01-01

    To further advance the objectives of the Wyoming Landscape Conservation Initiative (WLCI), the U.S. Geological Survey (USGS) and the Wyoming State Geological Survey (WSGS) have compiled Part A of the Energy Map of Southwestern Wyoming. Focusing primarily on electrical power sources, Part A of the energy map is a compilation of both published and previously unpublished coal (including coalbed gas) and wind energy resources data, presented in a Geographic Information System (GIS) data package. Energy maps, data, documentation, and spatial data processing capabilities are available in a geodatabase, published map file (pmf), ArcMap document (mxd), Adobe Acrobat PDF map (plate 1), and other digital formats that can be downloaded at the USGS website. Accompanying the map (plate 1) and the geospatial data are four additional plates that describe the geology, energy resources, and related infrastructure. These tabular plates include coal mine (plate 2), coal field (plate 3), coalbed gas assessment unit (plate 4), and wind farm (plate 5) information with hyperlinks to source publications and data on the internet. The plates can be printed and examined in hardcopy, or accessed digitally. The data represent decades of research by the USGS, WSGS, BLM, and others, and can facilitate landscape-level science assessments and resource-management decisionmaking.

  14. Distributed Data Integration Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, T; Ludaescher, B; Vouk, M

    The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload for scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that can enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. In order to address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure, utilizing (1) abstract workflow definition, construction, and automatic deployment, (2) complex agent-based workflow execution, and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.

  15. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  16. Compiled MPI: Cost-Effective Exascale Applications Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) New set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) A compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) Novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) A novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
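
    CoMPI's annotation syntax is not given in the abstract; the sketch below only illustrates the general shape of the approach, with an invented `#pragma compi` spelling marking a communication phase on otherwise ordinary MPI code.

    ```cpp
    // Hypothetical source annotation of the kind CoMPI describes; the
    // "#pragma compi ..." spelling is invented here, not CoMPI's actual syntax.
    #include <mpi.h>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);
        int rank = 0, size = 1;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        double local = rank + 1.0, global = 0.0;

        // Annotation telling the compiler this reduction is the only
        // communication in the phase, so it may be moved or aggregated.
        #pragma compi phase(reduce_only)
        MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        MPI_Finalize();
        return 0;
    }
    ```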

  17. Compilation of geospatial data for the mineral industries and related infrastructure of Latin America and the Caribbean

    USGS Publications Warehouse

    Baker, Michael S.; Buteyn, Spencer D.; Freeman, Philip A.; Trippi, Michael H.; Trimmer III, Loyd M.

    2017-07-31

    This report describes the U.S. Geological Survey’s (USGS) ongoing commitment to its mission of understanding the nature and distribution of global mineral commodity supply chains by updating and publishing the georeferenced locations of mineral commodity production and processing facilities, mineral exploration and development sites, and mineral commodity exporting ports in Latin America and the Caribbean. The report includes an overview of data sources and an explanation of the geospatial PDF map format. The geodatabase and geospatial data layers described in this report create a new geographic information product in the form of a geospatial portable document format (PDF) map. The geodatabase contains additional data layers from USGS, foreign government, and open sources as follows: (1) coal occurrence areas, (2) electric power generating facilities, (3) electric power transmission lines, (4) hydrocarbon resource cumulative production data, (5) liquefied natural gas terminals, (6) oil and gas concession leasing areas, (7) oil and gas field center points, (8) oil and gas pipelines, (9) USGS petroleum provinces, (10) railroads, (11) recoverable proven plus probable hydrocarbon resources, (12) major cities, (13) major rivers, and (14) undiscovered porphyry copper tracts.

  18. Great Issues in American History: A Compilation of Primary Sources Related to Issues That Have Occupied the Attention of the American People from Colonial Days to the Present. Oregon ASCD Curriculum Bulletin, Vol. 30, No. 333.

    ERIC Educational Resources Information Center

    Nance, Elizabeth

    This publication is a compilation of primary source materials related to issues that have occupied the attention of the American people from colonial days to the present. It is intended for use at the secondary level. A prologue contains creation stories and poems on the origins of the world and man. Documentation of the primary sources is…

  19. Program structure-based blocking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.

    2017-09-26

    Embodiments relate to program structure-based blocking. An aspect includes receiving source code corresponding to a computer program by a compiler of a computer system. Another aspect includes determining a prefetching section in the source code by a marking module of the compiler. Yet another aspect includes performing, by a blocking module of the compiler, blocking of instructions located in the prefetching section into instruction blocks, such that the instruction blocks of the prefetching section only contain instructions that are located in the prefetching section.
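
    A loose user-level analogue of the idea, grouping a marked section's memory accesses into blocks and prefetching the next block ahead of use, can be written with the GCC/Clang `__builtin_prefetch` builtin. This is illustrative only; the patent describes compiler-internal marking and blocking modules, not user code.

    ```cpp
    // Illustrative only: a "prefetching section" whose accesses are grouped
    // into blocks, with the next block prefetched while the current one runs.
    #include <algorithm>
    #include <cstddef>
    #include <vector>

    double sum_blocked(const std::vector<double>& v) {
        constexpr std::size_t BLOCK = 64;   // elements per instruction block
        double s = 0.0;
        for (std::size_t i = 0; i < v.size(); i += BLOCK) {
            const std::size_t end = std::min(i + BLOCK, v.size());
            if (end < v.size())                        // GCC/Clang builtin:
                __builtin_prefetch(&v[end], /*rw=*/0, /*locality=*/1);
            for (std::size_t j = i; j < end; ++j)
                s += v[j];
        }
        return s;
    }
    ```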

  20. HAL/S-FC and HAL/S-360 compiler system program description

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.

  1. Aquarius Project: Research in the System Architecture of Accelerators for the High Performance Execution of Logic Programs.

    DTIC Science & Technology

    1991-05-31

    (Abstract garbled in the source record; only table-of-contents fragments survive.) Recoverable content: the report documents the Aquarius Prolog compiler; Appendix F lists the source code of the C and Prolog benchmarks, Appendix G lists the source code of the compiler itself, and the compiler transforms standard-form Prolog into kernel Prolog, using symbolic execution, on the way to generating code.

  2. ATLAS OF SOURCE EMISSION PARTICLES

    EPA Science Inventory

    An atlas of various source emission particles characterized by electron optical techniques has been compiled for use by air pollution investigators. The particles studied were emitted by mobile, stationary, and natural sources. Sources included automobiles, manufacturing operatio...

  3. Comprehensive trends assessment of nitrogen sources and loads to estuaries of the coterminous United States

    EPA Science Inventory

    Sources of nitrogen and phosphorus to estuaries and estuarine watersheds of the coterminous United States have been compiled from a variety of publicly available data sources (1985–2015). Atmospheric loading was obtained from two sources. Modelled and interpolated meas...

  4. Traleika Glacier X-Stack Extension Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fryman, Joshua

    The XStack Extension Project continued along the direction of the XStack program in exploring software tools and frameworks to support a task-based community runtime, toward the goal of Exascale programming. The momentum built as part of the XStack project, with the development of the task-based Open Community Runtime (OCR) and related tools, was carried through during the XStack Extension with the focus areas of easing application development, improving performance, and supporting more features. The infrastructure set up for community-driven open-source development continued to be used toward these areas, with continued co-development of the runtime and applications. A variety of OCR programming environments were studied, as described in the sections Revolutionary Programming Environments and Applications, to assist with application development on OCR, and we developed the OCR Translator, a ROSE-based source-to-source compiler that parses high-level annotations in an MPI program to generate equivalent OCR code. Figure 2 compares the number of OCR objects needed to generate the 2D stencil workload using the translator against manual approaches based on an SPMD library or native coding. The rate of increase with the translator, with an increase in the number of ranks, is consistent with the other approaches. This is explored further in the section OCR Translator.

  5. Defense and Development in Sub-Saharan Africa: Codebook.

    DTIC Science & Technology

    1988-03-01

    (Abstract garbled in the source record; only fragments survive.) Recoverable content: the codebook describes a database covering 41 African countries, presenting the different data sources and explaining how they were compiled; in addition to the economic and military data, some statistics monitor social and political conditions, with sources and notes on the collection of the data.

  6. Compilation of Physicochemical and Toxicological Information ...

    EPA Pesticide Factsheets

    The purpose of this product is to make accessible information about the 1,173 hydraulic fracturing-related chemicals that were listed in the external review draft of the Hydraulic Fracturing Drinking Water Assessment, which was released recently. The product consists of a series of spreadsheets with physicochemical and toxicological information pulled from several sources, including EPI Suite, LeadScope, QikProp, Reaxys, IRIS, PPRTV, and ATSDR, among others. The spreadsheets also contain background information about how the list of chemicals was compiled, what the different sources of chemical information are, and definitions and descriptions of the values presented.

  7. SOURCEBOOK: NOX CONTROL TECHNOLOGY DATA

    EPA Science Inventory

    The report, a compilation of available information on the control of nitrogen oxide (NOx) emissions from stationary sources, is provided to assist new source permitting activities by regulatory agencies. The sources covered are combustion turbines, internal combustion engines, non...

  8. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4-byte IBM floating-point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1, and use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
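
    The conversion at the heart of such a program, IBM System/360 32-bit hexadecimal floating point (sign bit, base-16 exponent biased by 64, 24-bit fraction) to a native value, can be sketched as a standalone reimplementation from the standard format definition; this is not the report's actual source code.

    ```cpp
    // Convert a 32-bit IBM System/360 float word (as stored in SEGY trace
    // data, already assembled from big-endian bytes) to a native double:
    //   value = (-1)^s * (fraction / 2^24) * 16^(exponent - 64)
    #include <cstdint>
    #include <cmath>
    #include <cstdio>

    double ibm32_to_double(std::uint32_t w) {
        const int sign = (w >> 31) & 1;
        const int exponent = (w >> 24) & 0x7F;        // base-16, bias 64
        const std::uint32_t fraction = w & 0x00FFFFFF; // 24-bit fraction
        if (fraction == 0) return 0.0;
        const double value = std::ldexp(static_cast<double>(fraction),
                                        4 * (exponent - 64) - 24);
        return sign ? -value : value;
    }

    int main() {
        // 0x42640000 encodes +100.0 (0.390625 * 16^2).
        std::printf("%g\n", ibm32_to_double(0x42640000u));
    }
    ```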

  9. Space Telecommunications Radio System (STRS) Compliance Testing

    NASA Technical Reports Server (NTRS)

    Handler, Louis M.

    2011-01-01

    The Space Telecommunications Radio System (STRS) defines an open architecture for software defined radios. This document describes the testing methodology to aid in determining the degree of compliance to the STRS architecture. Non-compliances are reported to the software and hardware developers as well as the NASA project manager so that any non-compliances may be fixed or waivers issued. Since the software developers may be divided into those that provide the operating environment including the operating system and STRS infrastructure (OE) and those that supply the waveform applications, the tests are divided accordingly. The static tests are also divided by the availability of an automated tool that determines whether the source code and configuration files contain the appropriate items. Thus, there are six separate step-by-step test procedures described as well as the corresponding requirements that they test. The six types of STRS compliance tests are: STRS application automated testing, STRS infrastructure automated testing, STRS infrastructure testing by compiling WFCCN with the infrastructure, STRS configuration file testing, STRS application manual code testing, and STRS infrastructure manual code testing. Examples of the input and output of the scripts are shown in the appendices as well as more specific information about what to configure and test in WFCCN for non-compliance. In addition, each STRS requirement is listed and the type of testing briefly described. Attached is also a set of guidelines on what to look for in addition to the requirements to aid in the document review process.

  10. Developing Conceptual Models for Assessing Climate Change Impacts to Contaminant Availability in Terrestrial Ecosystems

    DTIC Science & Technology

    2015-03-01

    (Abstract garbled in the source record; figure/table residue only.) Recoverable content: the framework links secondary sources/stressors to measures of effect, compiling individual scores into summary results for land-management actions (e.g., controlled fire, timber harvest); it builds on prior conceptual-model applications such as effects of dredged material (PIANC 2006) and ecosystem restoration (Fischenich 2008), among others. The process of developing a conceptual...

  11. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  12. The Chandra Source Catalog: Processing and Infrastructure

    NASA Astrophysics Data System (ADS)

    Evans, Janet; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Hall, Diane M.; Miller, Joseph B.; Plummer, David A.; Zografou, Panagoula; Primini, Francis A.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    Chandra Source Catalog processing recalibrates each observation using the latest available calibration data, and employs a wavelet-based source detection algorithm to identify all the X-ray sources in the field of view. Source properties are then extracted from each detected source that is a candidate for inclusion in the catalog. Catalog processing is completed by matching sources across multiple observations, merging common detections, and applying quality assurance checks. The Chandra Source Catalog processing system shares a common processing infrastructure and utilizes much of the functionality that is built into the Standard Data Processing (SDP) pipeline system that provides calibrated Chandra data to end-users. Other key components of the catalog processing system have been assembled from the portable CIAO data analysis package. Minimal new software tool development has been required to support the science algorithms needed for catalog production. Since processing pipelines must be instantiated for each detected source, the number of pipelines that are run during catalog construction is a factor of order 100 times larger than for SDP. The increased computational load, and inherent parallel nature of the processing, is handled by distributing the workload across a multi-node Beowulf cluster. Modifications to the SDP automated processing application to support catalog processing, and extensions to Chandra Data Archive software to ingest and retrieve catalog products, complete the upgrades to the infrastructure to support catalog processing.

  13. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    NASA Astrophysics Data System (ADS)

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing urban green infrastructure networks and protecting urban biodiversity and the ecological environment. With the support of GIS technology, a criterion for selecting source patches was developed according to existing planning. Ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were then selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, green infrastructure networks in Wuhan city were constructed with the minimum-path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological-protection importance of each source patch was evaluated comprehensively. The results showed that there are 23 important ecological source patches in Wuhan city, among which the Sushan Temple Forest Patch and the Lu Lake and Shangshe Lake Wetland Patches are the most important of all kinds of patches for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of natural conservation areas, and the protection of biological diversity.

  14. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    USGS Publications Warehouse

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public-supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and the requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  15. Driving toward guiding principles: a goal for privacy, confidentiality, and security of health information.

    PubMed

    Buckovich, S A; Rippen, H E; Rozen, M J

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate.

  16. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  17. A GIS-based time-dependent seismic source modeling of Northern Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear, or fault, sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports are studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. Completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
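
    The standard forms behind such relationships (textbook forms, not reproduced from the paper itself) are the Gutenberg-Richter frequency-magnitude law and the Poisson exceedance probability:

    ```latex
    % Gutenberg-Richter recurrence for an area source, where N(M) is the
    % annual rate of events with magnitude at least M:
    \log_{10} N(M) = a - b\,M

    % Poisson occurrence with annual rate \lambda = N(M): the probability
    % of at least one such event during an exposure time t is
    P(N_t \ge 1) = 1 - e^{-\lambda t}
    ```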

  18. Nitrogen Source and Loading Data for EPA Estuary Data Mapper

    EPA Science Inventory

    Nitrogen source and loading data have been compiled and aggregated at the scale of estuaries and associated watersheds of the conterminous United States, using the spatial framework in EPA's Estuary Data Mapper (EDM) to provide system boundaries. Original sources of data include...

  19. REVIEW OF INDOOR EMISSION SOURCE MODELS--PART 1. OVERVIEW

    EPA Science Inventory

    Indoor emission source models are mainly used as a component in indoor air quality (IAQ) and exposure modeling. They are also widely used to interpret the experimental data obtained from environmental chambers and buildings. This paper compiles 46 indoor emission source models fo...

  20. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.

  1. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    USGS Publications Warehouse

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    An indirect method for estimating irrigation withdrawals is presented, and results are compared to the 2005 USGS-reported irrigation withdrawals for selected States. This method is meant to demonstrate a way to check data reported or received from a third party when metered data are unavailable. Of the 11 States where this method was applied, 8 had estimated irrigation withdrawals within 15 percent of what was reported in the 2005 water-use compilation, and 3 had estimated irrigation withdrawals that differed by more than 20 percent from what was reported in 2005. Recommendations for improving estimates of irrigated acreage and irrigation withdrawals also are presented in this report. Conveyance losses and irrigation-system efficiencies should be considered in order to achieve a more accurate representation of irrigation withdrawals. Better documentation of data sources and methods used can help lead to more consistent information in future irrigation water-use compilations. Finally, a summary of data sources and methods used to estimate irrigated acreage and irrigation withdrawals for the 2000 and 2005 compilations for each WSC is presented in appendix 1.

  2. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    36 CFR (Parks, Forests, and Public Property), vol. 3, 2011-07-01 edition, § 902.57, Investigatory files compiled for law enforcement purposes. (a) Files compiled by the... (3) Constitute an unwarranted invasion of personal privacy; (4) Disclose the identity of a confidential source...

  3. Establishing Malware Attribution and Binary Provenance Using Multicompilation Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramshaw, M. J.

    2017-07-28

    Malware is a serious problem for computer systems; it costs businesses and customers billions of dollars a year and compromises their private information. Detecting malware is particularly difficult because malware source code can be compiled in many different ways and generate many different digital signatures, which causes problems for most anti-malware programs that rely on static signature detection. Our project uses a convolutional neural network to identify malware programs, but such networks require large amounts of data to be effective. Toward that end, we gather thousands of source code files from publicly available programming contest sites and compile them with several different compilers and flags. Building upon current research, we then transform these binary files into image representations and use them to train a long-term recurrent convolutional neural network that will eventually be used to identify how a malware binary was compiled. This information will include the compiler, the version of the compiler, and the options used in compilation, information which can be critical in determining where a malware program came from and even who authored it.
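
    The data-generation step, compiling each gathered source file under every compiler/flag combination, can be sketched as a small driver. The compiler names, flag sets, and file names below are placeholders, not the project's actual configuration.

    ```cpp
    // Sketch of a multicompilation driver: build one source file under every
    // compiler/flag combination to produce signature-diverse binaries.
    #include <cstdlib>
    #include <iostream>
    #include <string>
    #include <vector>

    int main() {
        const std::string src = "sample.c";  // placeholder input file
        const std::vector<std::string> compilers = {"gcc", "clang"};
        const std::vector<std::string> flagSets =
            {"-O0", "-O2", "-O3 -funroll-loops"};

        int n = 0;
        for (const auto& cc : compilers) {
            for (const auto& flags : flagSets) {
                const std::string out = "bin_" + std::to_string(n++);
                const std::string cmd = cc + " " + flags + " -o " + out + " " + src;
                std::cout << cmd << "\n";
                if (std::system(cmd.c_str()) != 0)   // invoke the compiler
                    std::cerr << "build failed: " << cmd << "\n";
            }
        }
        return 0;
    }
    ```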

  4. A translator writing system for microcomputer high-level languages and assemblers

    NASA Technical Reports Server (NTRS)

    Collins, W. R.; Knight, J. C.; Noonan, R. E.

    1980-01-01

    In order to implement high-level languages whenever possible, a translator writing system of advanced design was developed. It is intended for routine production use by many programmers working on different projects. As well as a fairly conventional parser generator, it includes a system for the rapid generation of table-driven code generators. The parser generator was developed from a prototype version. The translator writing system includes various tools for the management of the source text of a compiler under construction. In addition, it supplies various default source code sections so that its output is always compilable and executable. The system thereby encourages iterative enhancement as a development methodology by ensuring an executable program from the earliest stages of a compiler development project. The translator writing system includes a PASCAL/48 compiler, three assemblers, and two compilers for a subset of HAL/S.
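
    The phrase "table-driven code generator" can be illustrated with a toy: instruction selection reduced to a lookup from IR opcode to an output template, so that retargeting means swapping the table. This sketch is generic and is not the system the paper describes.

    ```cpp
    // Toy table-driven code generation: each IR opcode maps to an output
    // template; the generator is just the table plus a substitution loop.
    #include <cstdio>
    #include <map>
    #include <string>

    struct IrOp { std::string op, dst, lhs, rhs; };

    int main() {
        // The "table": opcode -> assembly template with %d/%l/%r placeholders.
        const std::map<std::string, std::string> table = {
            {"add", "ADD %d, %l, %r"},
            {"mul", "MUL %d, %l, %r"},
            {"mov", "MOV %d, %l"},
        };

        const IrOp program[] = {{"mov", "r1", "42", ""},
                                {"add", "r2", "r1", "r1"},
                                {"mul", "r3", "r2", "r1"}};

        for (const IrOp& i : program) {
            std::string out = table.at(i.op);
            auto subst = [&out](const std::string& key, const std::string& val) {
                if (auto p = out.find(key); p != std::string::npos)
                    out.replace(p, key.size(), val);
            };
            subst("%d", i.dst); subst("%l", i.lhs); subst("%r", i.rhs);
            std::puts(out.c_str());   // emit one "instruction"
        }
    }
    ```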

  5. Source Reduction Assistance Grant Program Guidance for Applicants

    EPA Pesticide Factsheets

    The following FAQs were compiled to benefit prospective applicants seeking to apply for grants or cooperative agreement funding under the Environmental Protection Agency's (EPA) Source Reduction Assistance (SRA) Grant Program.

  6. The Role of Free/Libre and Open Source Software in Learning Health Systems.

    PubMed

    Paton, C; Karopka, T

    2017-08-01

    Objective: To give an overview of the role of Free/Libre and Open Source Software (FLOSS) in the context of secondary use of patient data to enable Learning Health Systems (LHSs). Methods: We conducted an environmental scan of the academic and grey literature utilising the MedFLOSS database of open source systems in healthcare to inform a discussion of the role of open source in developing LHSs that reuse patient data for research and quality improvement. Results: A wide range of FLOSS is identified that contributes to the information technology (IT) infrastructure of LHSs including operating systems, databases, frameworks, interoperability software, and mobile and web apps. The recent literature around the development and use of key clinical data management tools is also reviewed. Conclusions: FLOSS already plays a critical role in modern health IT infrastructure for the collection, storage, and analysis of patient data. The nature of FLOSS systems to be collaborative, modular, and modifiable may make open source approaches appropriate for building the digital infrastructure for a LHS.

  7. Applying Standard Interfaces to a Process-Control Language

    NASA Technical Reports Server (NTRS)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, G.C.; Stevens, P.R.; Rittenberg, A.

    A compilation is presented of reaction data taken from experimental high energy physics journal articles, reports, preprints, theses, and other sources. Listings of all the data are given, and the data points are indexed by reaction and momentum, as well as by their source document. Much of the original compilation was done by others working in the field. The data presented also exist in the form of a computer-readable and searchable database; primitive access facilities for this database are available.

  9. Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana

    2014-05-01

    In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both the TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short- and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. Relevant vulnerability indicators were identified, allowing the most critical areas and territorial nodes to be located. At the national scale, the vulnerability of economic activities and the accessibility of critical infrastructures were assessed. At the continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining the hazard and vulnerability analyses.

  10. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    PubMed

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data.

  11. Sources: A Compilation of Useful Information for Teachers & Teacher-Librarians. Canadian Edition.

    ERIC Educational Resources Information Center

    School Libraries in Canada, 2002

    2002-01-01

    Includes a variety of sources for quality information for Canadian school libraries. Highlights include professional associations; award-winning books; Canadian children's and young adult authors and illustrators; educational films; Web sites; Canadian information sources on the Web; Canadian poetry; and professional resources. (LRW)

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Kim, Jungwon; Vetter, Jeffrey S

    This paper presents a directive-based, high-level programming framework for high-performance reconfigurable computing. It takes a standard, portable OpenACC C program as input and generates a hardware configuration file for execution on FPGAs. We implemented this prototype system using our open-source OpenARC compiler; it performs source-to-source translation and optimization of the input OpenACC program into an OpenCL code, which is further compiled into an FPGA program by the backend Altera Offline OpenCL compiler. Internally, the design of OpenARC uses a high-level intermediate representation that separates concerns of program representation from underlying architectures, which facilitates portability of OpenARC. In fact, this design allowed us to create the OpenACC-to-FPGA translation framework with minimal extensions to our existing system. In addition, we show that our proposed FPGA-specific compiler optimizations and novel OpenACC pragma extensions assist the compiler in generating more efficient FPGA hardware configuration files. Our empirical evaluation on an Altera Stratix V FPGA with eight OpenACC benchmarks demonstrates the benefits of our strategy. To demonstrate the portability of OpenARC, we show results for the same benchmarks executing on other heterogeneous platforms, including NVIDIA GPUs, AMD GPUs, and Intel Xeon Phis. This initial evidence helps support the goal of using a directive-based, high-level programming strategy for performance portability across heterogeneous HPC architectures.
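
    OpenARC's actual translation is a full compiler pipeline; purely as a hedged sketch of the kind of source-to-source lowering the abstract describes, the toy function below maps an OpenACC-style parallel loop onto the text of a flat OpenCL NDRange kernel (the directive handling, kernel signature, and names here are simplified assumptions, not OpenARC output):

        # Toy source-to-source lowering: an OpenACC-style parallel loop
        # becomes the text of a flat OpenCL NDRange kernel, one work-item
        # per iteration. Illustrative only; OpenARC performs full parsing,
        # analysis, and optimization before emitting OpenCL.
        def lower_parallel_loop(index, bound, body_stmt):
            return (
                "__kernel void generated(__global float *a,\n"
                "                        __global float *b,\n"
                "                        const int " + bound + ") {\n"
                "    int " + index + " = get_global_id(0); /* loop index -> work-item id */\n"
                "    if (" + index + " < " + bound + ") {\n"
                "        " + body_stmt + "\n"
                "    }\n"
                "}\n"
            )

        print(lower_parallel_loop("i", "n", "a[i] = 2.0f * b[i];"))

    A real translator also emits the host-side code that creates buffers, sets kernel arguments, and launches the kernel; the backend Altera Offline Compiler then synthesizes the OpenCL into an FPGA configuration.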

  13. TerraFERMA: The Transparent Finite Element Rapid Model Assembler for multiphysics problems in Earth sciences

    NASA Astrophysics Data System (ADS)

    Wilson, Cian R.; Spiegelman, Marc; van Keken, Peter E.

    2017-02-01

    We introduce and describe a new software infrastructure TerraFERMA, the Transparent Finite Element Rapid Model Assembler, for the rapid and reproducible description and solution of coupled multiphysics problems. The design of TerraFERMA is driven by two computational needs in Earth sciences. The first is the need for increased flexibility in both problem description and solution strategies for coupled problems where small changes in model assumptions can lead to dramatic changes in physical behavior. The second is the need for software and models that are more transparent so that results can be verified, reproduced, and modified in a manner such that the best ideas in computation and Earth science can be more easily shared and reused. TerraFERMA leverages three advanced open-source libraries for scientific computation that provide high-level problem description (FEniCS), composable solvers for coupled multiphysics problems (PETSc), and an options handling system (SPuD) that allows the hierarchical management of all model options. TerraFERMA integrates these libraries into an interface that organizes the scientific and computational choices required in a model into a single options file from which a custom compiled application is generated and run. Because all models share the same infrastructure, models become more reusable and reproducible, while still permitting the individual researcher considerable latitude in model construction. TerraFERMA solves partial differential equations using the finite element method. It is particularly well suited for nonlinear problems with complex coupling between components. TerraFERMA is open-source and available at http://terraferma.github.io, which includes links to documentation and example input files.

  14. A compilation of chemical quality data for ground and surface waters in Utah

    USGS Publications Warehouse

    Connor, John G.; Mitchell, C.G.

    1958-01-01

    An accelerated use of water resulting from a growing population, industrial expansion, and irrigation has brought into focus the importance of the quality as well as the quantity of this natural resource in Utah. As new demands are made on the existing supply, a search goes on for new sources of ground and surface water. These new sources must not only meet quantity requirements, but also must fall within certain limits of chemical composition in relation to their proposed use. The prime purpose of this report is to compile into one volume all of the available information that exists on the quality of ground and surface water in Utah. The various sources of information, named in the preface, have supplied data obtained through their own organizations. Analyses from these sources may be identified by reference to the indicated 2-letter code on the data sheets.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Rebecca L.; Turnbull, Laura; Earl, Stevan

    Urban watersheds are often sources of nitrogen (N) to downstream systems, contributing to poor water quality. However, it is unknown which components (e.g., land cover and stormwater infrastructure type) of urban watersheds contribute to N export and which may be sites of retention. In this study we investigated which watershed characteristics control N sourcing, biogeochemical processing of nitrate (NO3–) during storms, and the amount of rainfall N that is retained within urban watersheds. We used triple isotopes of NO3– (δ15N, δ18O, and Δ17O) to identify sources and transformations of NO3– during storms from 10 nested arid urban watersheds that varied in stormwater infrastructure type and drainage area. Stormwater infrastructure and land cover—retention basins, pipes, and grass cover—dictated the sourcing of NO3– in runoff. Urban watersheds can be strong sinks or sources of N to stormwater depending on the proportion of rainfall that leaves the watershed as runoff, but we found no evidence that denitrification occurred during storms. Our results suggest that watershed characteristics control the sources and transport of inorganic N in urban stormwater but that retention of inorganic N at the timescale of individual runoff events is controlled by hydrologic, rather than biogeochemical, mechanisms.

  16. Science and Technology Review October/November 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bearinger, J P

    2009-08-21

    This month's issue has the following articles: (1) Award-Winning Collaborations Provide Solutions--Commentary by Steven D. Liedle; (2) Light-Speed Spectral Analysis of a Laser Pulse--An optical device inspects and stops potentially damaging laser pulses; (3) Capturing Waveforms in a Quadrillionth of a Second--The femtoscope, a time microscope, improves the temporal resolution and dynamic range of conventional recording instruments; (4) Gamma-Ray Spectroscopy in the Palm of Your Hand--A miniature gamma-ray spectrometer provides increased resolution at a reduced cost; (5) Building Fusion Targets with Precision Robotics--A robotic system assembles tiny fusion targets with nanometer precision; (6) ROSE: Making Compiler Technology More Accessible--An open-source software infrastructure makes powerful compiler techniques available to all programmers; (7) Restoring Sight to the Blind with an Artificial Retina--A retinal prosthesis could restore vision to people suffering from eye diseases; (8) Eradicating the Aftermath of War--A remotely operated system precisely locates buried land mines; (9) Compact Alignment for Diagnostic Laser Beams--A smaller, less expensive device aligns diagnostic laser beams onto targets; and (10) Securing Radiological Sources in Africa--Livermore and other national laboratories are helping African countries secure their nuclear materials.

  17. Energy: A Guide to Organizations and Information Resources in the United States.

    ERIC Educational Resources Information Center

    Center for California Public Affairs, Claremont.

    A central source of information on the key organizations concerned with energy in the United States has been compiled. Chapter 2 covers organizations involved with broad questions of energy policy; Chapters 3-6 describe organizations having to do with sources of energy: oil, natural gas, coal, water power, nuclear fission, and alternate sources;…

  18. A spatial model to aggregate point-source and nonpoint-source water-quality data for large areas

    USGS Publications Warehouse

    White, D.A.; Smith, R.A.; Price, C.V.; Alexander, R.B.; Robinson, K.W.

    1992-01-01

    More objective and consistent methods are needed to assess water quality for large areas. A spatial model, one that capitalizes on the topologic relationships among spatial entities, is described that aggregates pollution sources from upstream drainage areas and can be implemented on land surfaces having heterogeneous water-pollution effects. An infrastructure of stream networks and drainage basins, derived from 1:250,000-scale digital elevation models, defines the hydrologic system in this spatial model. The spatial relationships between point- and nonpoint-pollution sources and measurement locations are referenced to the hydrologic infrastructure with the aid of a geographic information system. A maximum-branching algorithm has been developed to simulate the effects of distance from a pollutant source to an arbitrary downstream location, a function traditionally employed in deterministic water-quality models.
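
    The paper's maximum-branching algorithm is not reproduced here; as a hedged sketch of the underlying idea, the fragment below accumulates upstream pollutant loads down a toy stream network, attenuating each source exponentially with travel distance (the network, loads, and decay constant are invented):

        import math

        # Illustrative upstream-load aggregation on a toy stream network
        # (not the paper's exact maximum-branching algorithm). Each reach
        # drains to one downstream reach; source loads decay
        # exponentially with travel distance.
        downstream = {"A": "C", "B": "C", "C": "D"}   # reach -> next reach
        length_km  = {"A": 4.0, "B": 2.5, "C": 3.0, "D": 1.0}
        load_kg    = {"A": 10.0, "B": 6.0, "C": 0.0, "D": 2.0}  # local sources
        DECAY = 0.05    # hypothetical first-order decay per km

        def delivered_load(target):
            """Sum all source loads reaching target, attenuated by distance."""
            total = 0.0
            for reach, load in load_kg.items():
                node, dist = reach, 0.0
                while node != target and node in downstream:
                    dist += length_km[node]
                    node = downstream[node]
                if node == target:
                    total += load * math.exp(-DECAY * dist)
            return total

        print(f"load delivered to reach D: {delivered_load('D'):.2f} kg/day")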

  19. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  20. Statistical yearbook

    DOT National Transportation Integrated Search

    2010-01-01

    The Statistical Yearbook is an annual compilation of a wide range of international economic, social and environmental statistics on over 200 countries and areas, compiled from sources including UN agencies and other international, national and specia...

  1. Geologic map of Detrital, Hualapai, and Sacramento Valleys and surrounding areas, northwest Arizona

    USGS Publications Warehouse

    Beard, L. Sue; Kennedy, Jeffrey; Truini, Margot; Felger, Tracey

    2011-01-01

    A 1:250,000-scale geologic map and report covering the Detrital, Hualapai, and Sacramento valleys in northwest Arizona is presented for the purpose of improving understanding of the geology and geohydrology of the basins beneath those valleys. The map was compiled from existing geologic mapping, augmented by digital photogeologic reconnaissance mapping. The most recent geologic map for the area, and the only digital one, is the 1:1,000,000-scale Geologic Map of Arizona. The larger scale map presented here includes significantly more detailed geology than the Geologic Map of Arizona in terms of accuracy of geologic unit contacts, number of faults, fault type, fault location, and details of Neogene and Quaternary deposits. Many sources were used to compile the geology; the accompanying geodatabase includes a source field in the polygon feature class that lists source references for polygon features. The citations for the source field are included in the reference section.

  2. Debris flow susceptibility mapping using a qualitative heuristic method and Flow-R along the Yukon Alaska Highway Corridor, Canada

    NASA Astrophysics Data System (ADS)

    Blais-Stevens, A.; Behnia, P.

    2016-02-01

    This research activity aimed at reducing risk to infrastructure, such as a proposed pipeline route roughly parallel to the Yukon Alaska Highway Corridor (YAHC), by filling geoscience knowledge gaps in geohazards. Hence, the Geological Survey of Canada compiled an inventory of landslides including debris flow deposits, which were subsequently used to validate two different debris flow susceptibility models. A qualitative heuristic debris flow susceptibility model was produced for the northern region of the YAHC, from Kluane Lake to the Alaska border, by integrating data layers with assigned weights and class ratings. These were slope angle, slope aspect, surficial geology, plan curvature, and proximity to drainage system. Validation of the model was carried out by calculating a success rate curve, which revealed a good correlation between the susceptibility model and the debris flow deposit inventory compiled from air photos, high-resolution satellite imagery, and field verification. In addition, the quantitative Flow-R method was tested in order to define the potential source and debris flow susceptibility for the southern region of Kluane Lake, an area where documented debris flow events have blocked the highway in the past (e.g., 1988). Trial and error calculations were required for this method because there was not detailed information on the debris flows for the YAHC to allow us to define threshold values for some parameters when calculating source areas, spreading, and runout distance. Nevertheless, correlation with known documented events helped define these parameters and produce a map that captures most of the known events and displays debris flow susceptibility in other, usually smaller, steep channels that had not been previously documented.

  3. Debris flow susceptibility mapping using a qualitative heuristic method and Flow-R along the Yukon Alaska Highway Corridor, Canada

    NASA Astrophysics Data System (ADS)

    Blais-Stevens, A.; Behnia, P.

    2015-05-01

    This research activity aimed at reducing risk to infrastructure, such as a proposed pipeline route roughly parallel to the Yukon Alaska Highway Corridor (YAHC), by filling geoscience knowledge gaps in geohazards. Hence, the Geological Survey of Canada compiled an inventory of landslides including debris flow deposits, which were subsequently used to validate two different debris flow susceptibility models. A qualitative heuristic debris flow susceptibility model was produced for the northern region of the YAHC, from Kluane Lake to the Alaska border, by integrating data layers with assigned weights and class ratings. These were slope angle, slope aspect (derived from a 5 m × 5 m DEM), surficial geology, permafrost distribution, and proximity to drainage system. Validation of the model was carried out by calculating a success rate curve, which revealed a good correlation between the susceptibility model and the debris flow deposit inventory compiled from air photos, high-resolution satellite imagery, and field verification. In addition, the quantitative Flow-R method was tested in order to define the potential source and debris flow susceptibility for the southern region of Kluane Lake, an area where documented debris flow events have blocked the highway in the past (e.g., 1988). Trial and error calculations were required for this method because there was not detailed information on the debris flows for the YAHC to allow us to define threshold values for some parameters when calculating source areas, spreading, and runout distance. Nevertheless, correlation with known documented events helped define these parameters and produce a map that captures most of the known events and displays debris flow susceptibility in other, usually smaller, steep channels that had not been previously documented.

  4. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
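
    OMPC's actual mechanism combines syntax adaptation with a numerical emulation layer; as a minimal illustration of the emulation half only (the class below is invented for this sketch and is not OMPC's API), a wrapper can give a NumPy array MATLAB-style 1-based indexing:

        import numpy as np

        # Minimal illustration of the emulation idea: wrap a NumPy array
        # so it accepts MATLAB-style 1-based indices. The class is
        # invented for this sketch; it is not OMPC's actual API.
        class MArray:
            def __init__(self, data):
                self._a = np.asarray(data, dtype=float)

            def __getitem__(self, idx):      # MATLAB a(3) -> a[3] here
                return self._a[idx - 1]      # shift to 0-based

            def __setitem__(self, idx, value):
                self._a[idx - 1] = value

        a = MArray([10.0, 20.0, 30.0])
        print(a[1])    # 10.0, as MATLAB's a(1) would give
        a[3] = 99.0    # MATLAB's a(3) = 99
        print(a[3])    # 99.0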

  5. Freshwater Biological Traits Database (Traits)

    EPA Pesticide Factsheets

    The traits database was compiled for a project on climate change effects on river and stream ecosystems. The traits data, gathered from multiple sources, focused on information published or otherwise well-documented by trustworthy sources.

  6. Application of remote sensor data to geologic analysis of the Bonanza Test Site Colorado

    NASA Technical Reports Server (NTRS)

    Lee, K. (Compiler)

    1973-01-01

    A geologic map of the Bonanza Test Site is nearing completion. Using published large scale geologic maps from various sources, the geology of the area is being compiled on a base scaled at 1:250,000. Sources of previously published geologic mapping include: (1) USGS Bulletins; (2) professional papers and geologic quadrangle maps; (3) Bureau of Mines reports; (4) Colorado School of Mines quarterlies; and (5) Rocky Mountain Association of Geologist Guidebooks. This compilation will be used to evaluate ERTS, Skylab, and remote sensing underflight data.

  7. [Life cycle assessment of the infrastructure for hydrogen sources of fuel cell vehicles].

    PubMed

    Feng, Wen; Wang, Shujuan; Ni, Weidou; Chen, Changhe

    2003-05-01

    In order to promote the application of life cycle assessment and provide references for China in planning the infrastructure for hydrogen sources of fuel cell vehicles in the near future, 10 feasible plans for that infrastructure were designed according to current technologies for producing, storing and transporting hydrogen. Life cycle assessment was then used as a tool to evaluate the environmental performances of the 10 plans. The standard indexes of the classified environmental impacts of every plan were obtained, and sensitivity analyses for several parameters were carried out. The results showed that the best plan was to produce hydrogen by natural gas steam reforming in a central factory, transport it to refuelling stations through pipelines, and finally dispense it to fuel cell vehicles as hydrogen gas.

  8. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    NASA Astrophysics Data System (ADS)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales, from ~1:4,000 to 1:250,000, and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased erosion hazards, (3) limestone, chert, sedimentary rocks - paleontological resources (Potential Fossil Yield Classification maps), (4) calcareous rocks (cave resources, water chemistry), and (5) lava flows - lava tubes (more caves). Map unit groupings (e.g., belts, terranes, tectonic & geomorphic provinces) can also be derived from the geodatabase. Digital geologic mapping was used in ground water modeling to predict effects of tunneling through the San Bernardino Mountains. Bedrock mapping is used in models that characterize watershed sediment regimes and quantify anthropogenic influences. When combined with digital geomorphology mapping, this geodatabase helps to assess landslide hazards.
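
    As a hedged sketch of the kind of query the abstract mentions, such as totaling ultramafic rock area for asbestos-hazard screening, the fragment below filters toy polygon records by lithology (field names and records are invented; the real geodatabase uses ESRI feature classes and lookup tables):

        # Sketch of the kind of query the geodatabase supports: filter
        # polygon records by lithology and total their areas. Field
        # names and records are invented; the real geodatabase uses
        # ESRI feature classes and lookup tables.
        polygons = [
            {"unit": "um1", "lithology": "ultramafic", "area_km2": 12.5},
            {"unit": "gr3", "lithology": "granitic",   "area_km2": 48.0},
            {"unit": "um2", "lithology": "ultramafic", "area_km2": 3.1},
        ]

        ultramafic = [p for p in polygons if p["lithology"] == "ultramafic"]
        area = sum(p["area_km2"] for p in ultramafic)
        print(f"{len(ultramafic)} ultramafic polygons, {area:.1f} km2")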

  9. A compiled catalog of rotation measures of radio point sources

    NASA Astrophysics Data System (ADS)

    Xu, Jun; Han, Jin-Lin

    2014-08-01

    We compiled a catalog of Faraday rotation measures (RMs) for 4553 extragalactic radio point sources published in literature. These RMs were derived from multi-frequency polarization observations. The RM data are compared to those in the NRAO VLA Sky Survey (NVSS) RM catalog. We reveal a systematic uncertainty of about 10.0 ± 1.5 rad m-2 in the NVSS RM catalog. The Galactic foreground RM is calculated through a weighted averaging method by using the compiled RM catalog together with the NVSS RM catalog, with careful consideration of uncertainties in the RM data. The data from the catalog and the interface for the Galactic foreground RM calculations are publicly available on the webpage: http://zmtt.bao.ac.cn/RM/.
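
    The catalog's weighted averaging is described only in outline; assuming the standard inverse-variance weighting of measurements with individual uncertainties (the sample values below are invented), the computation looks like this:

        import numpy as np

        # Inverse-variance weighted mean of rotation measures, the
        # standard way to combine values carrying individual 1-sigma
        # uncertainties (the sample numbers are invented; the paper's
        # exact weighting details may differ).
        rm    = np.array([12.3, 15.1, 9.8])   # rad m^-2
        sigma = np.array([2.0, 4.5, 1.5])     # 1-sigma uncertainties

        w       = 1.0 / sigma**2
        rm_mean = np.sum(w * rm) / np.sum(w)
        rm_err  = np.sqrt(1.0 / np.sum(w))    # uncertainty of the mean

        print(f"foreground RM = {rm_mean:.1f} +/- {rm_err:.1f} rad m^-2")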

  10. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Qing; Whaley, Richard Clint; Qasem, Apan

    This report summarizes our effort and results of building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully automated tuning to semi-automated development and to manual programmable control.
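
    The ROSE/POET toolchain itself is not shown in this excerpt; as a hedged sketch of the empirical tuning loop it describes, the fragment below times each configuration of a parameterized code variant and keeps the fastest, with a Python blocking factor standing in for a POET-exposed optimization parameter (the real system reconfigures and times compiled C/C++/Fortran variants):

        import timeit

        # Sketch of the empirical tuning loop: time every configuration
        # of a parameterized variant and keep the fastest. A Python
        # blocking factor stands in for a POET-exposed parameter; the
        # real system times compiled C/C++/Fortran variants.
        def make_variant(block):
            def summed(n=200_000):
                total = 0
                for start in range(0, n, block):     # blocked accumulation
                    total += sum(range(start, min(start + block, n)))
                return total
            return summed

        best_time, best_block = min(
            (timeit.timeit(make_variant(b), number=5), b)
            for b in (64, 256, 1024, 4096)
        )
        print(f"best block size: {best_block} ({best_time:.3f} s for 5 runs)")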

  11. mmpdb: An Open-Source Matched Molecular Pair Platform for Large Multiproperty Data Sets.

    PubMed

    Dalke, Andrew; Hert, Jérôme; Kramer, Christian

    2018-05-29

    Matched molecular pair analysis (MMPA) enables the automated and systematic compilation of medicinal chemistry rules from compound/property data sets. Here we present mmpdb, an open-source matched molecular pair (MMP) platform to create, compile, store, retrieve, and use MMP rules. mmpdb is suitable for the large data sets typically found in pharmaceutical and agrochemical companies and provides new algorithms for fragment canonicalization and stereochemistry handling. The platform is written in Python and based on the RDKit toolkit. It is freely available from https://github.com/rdkit/mmpdb.
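
    The abstract implies a two-step create/compile workflow; a minimal sketch is below, assuming the `fragment` and `index` subcommands named in the project README (exact flags may differ between mmpdb versions, so verify against `mmpdb --help` before relying on them):

        import subprocess

        # Sketch of the two-step workflow implied by the abstract:
        # fragment the input structures, then index the fragments into an
        # MMP database. The subcommand names follow the project README;
        # exact flags may differ between versions, so check
        # `mmpdb --help` before relying on this.
        subprocess.run(["mmpdb", "fragment", "compounds.smi",
                        "-o", "compounds.fragments"], check=True)
        subprocess.run(["mmpdb", "index", "compounds.fragments",
                        "-o", "compounds.mmpdb"], check=True)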

  12. Ground Operations Aerospace Language (GOAL). Volume 2: Compiler

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The principal elements and functions of the Ground Operations Aerospace Language (GOAL) compiler are presented. The technique used to transcribe the syntax diagrams into machine processable format for use by the parsing routines is described. An explanation of the parsing technique used to process GOAL source statements is included. The compiler diagnostics and the output reports generated during a GOAL compilation are explained. A description of the GOAL program package is provided.

  13. On Fusing Recursive Traversals of K-d Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    Loop fusion is a key program transformation for data locality optimization that is implemented in production compilers. But optimizing compilers currently cannot exploit fusion opportunities across a set of recursive tree traversal computations with producer-consumer relationships. In this paper, we develop a compile-time approach to dependence characterization and program transformation to enable fusion across recursively specified traversals over k-ary trees. We present the FuseT source-to-source code transformation framework to automatically generate fused composite recursive operators from an input program containing a sequence of primitive recursive operators. We use our framework to implement fused operators for MADNESS, Multiresolution Adaptive Numerical Environment for Scientific Simulation. We show that locality optimization through fusion can offer more than an order of magnitude performance improvement.
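
    FuseT's dependence characterization is beyond a short excerpt; purely as an illustration of what fusing producer-consumer traversals means, the sketch below replaces two recursive passes over a binary tree (a stand-in for the paper's k-ary trees) with one fused pass that visits each node once:

        # Illustration of traversal fusion on a binary tree (a stand-in
        # for the paper's k-ary trees): a producer pass (scale) and a
        # consumer pass (total) are fused into one recursive operator
        # that visits each node once.
        class Node:
            def __init__(self, value, left=None, right=None):
                self.value, self.left, self.right = value, left, right

        def scale(node, c):                  # producer: first traversal
            if node:
                node.value *= c
                scale(node.left, c)
                scale(node.right, c)

        def total(node):                     # consumer: second traversal
            if node is None:
                return 0.0
            return node.value + total(node.left) + total(node.right)

        def scale_and_total(node, c):        # fused: one traversal
            if node is None:
                return 0.0
            node.value *= c
            return (node.value + scale_and_total(node.left, c)
                    + scale_and_total(node.right, c))

        t = Node(1.0, Node(2.0), Node(3.0, Node(4.0)))
        print(scale_and_total(t, 2.0))       # 20.0, same as scale then total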

  14. Estimated water use, by county, in North Carolina, 1990

    USGS Publications Warehouse

    Terziotti, Silvia; Schrader, Tony P.; Treece, M.W.

    1994-01-01

    Data on water use in North Carolina were compiled for 1990 as part of a cooperative agreement between the U.S. Geological Survey and the Division of Water Resources of the North Carolina Department of Environment, Health, and Natural Resources. Data were compiled from a number of Federal, State, and private sources for the offstream water-use categories of public supply, domestic, commercial, industrial, mining, livestock, irrigation, and thermoelectric-power generation. Data also were collected for instream use from hydroelectric facilities. Total estimated offstream water use in the State for 1990 was about 8,940 million gallons per day. About 95 percent of the water withdrawn was from surface-water sources. Thermoelectric-power generation accounted for about 81 percent of all withdrawals. Data for instream water use for hydroelectric-power generation also were compiled. This instream water use totaled about 66,900 million gallons per day. Each water-use category is summarized in this report by county and source of water supply.

  15. caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oster, S.; Langella, S.; Hastings, S.

    Objective: To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community-provided services, and application programming interfaces for building client applications. Results: caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components, and the caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: .

  16. A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D; Panas, T

    2010-01-25

    OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers are key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is often complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries. This work simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni and present comparative performance results against other OpenMP compilers.
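
    As a hedged illustration of the decoupling idea, not ROSE output, the sketch below shows translated code targeting a thin wrapper entry point so the translation is independent of the runtime underneath; the name xomp_parallel_for merely echoes the paper's XOMP layer, which in reality wraps GOMP or Omni for compiled C/C++/Fortran:

        from concurrent.futures import ThreadPoolExecutor

        # Toy stand-in for the XOMP idea: translated code calls a thin
        # wrapper entry point, and the wrapper maps onto whichever
        # runtime sits underneath (a thread pool here; GOMP or Omni in
        # the real XOMP layer).
        def xomp_parallel_for(n_iters, body, num_threads=4):
            """Wrapper entry point: run body(i) for i in range(n_iters)."""
            with ThreadPoolExecutor(max_workers=num_threads) as pool:
                list(pool.map(body, range(n_iters)))

        # Roughly what a translated '#pragma omp parallel for' loop
        # might reduce to after outlining the loop body:
        results = [0] * 8

        def body(i):
            results[i] = i * i

        xomp_parallel_for(8, body)
        print(results)                       # [0, 1, 4, 9, 16, 25, 36, 49]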

  17. The HEAO A-1 X Ray Source Catalog (Wood Et Al. 1984): Documentation for the Machine-Readable Version

    NASA Technical Reports Server (NTRS)

    Warren, Wayne H., Jr.

    1990-01-01

    The machine-readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The catalog is a compilation of data for 842 sources detected with the U.S. Naval Research Laboratory Large Area Sky Survey Experiment flown aboard the HEAO 1 satellite. The data include source identifications, positions, error boxes, mean X-ray intensities, and cross identifications to other source designations.

  18. Perspex machine: V. Compilation of C programs

    NASA Astrophysics Data System (ADS)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  19. Compilation of fatigue, fatigue-crack propagation, and fracture data for 2024 and 7075 aluminum, Ti-6Al-4V titanium, and 300M steel. Volume 1: Description of data and data storage on magnetic tape. Volume 2: Data tape (7-track magnetic tape)

    NASA Technical Reports Server (NTRS)

    Rice, R. C.; Reynolds, J. L.

    1976-01-01

    Fatigue, fatigue-crack-propagation, and fracture data compiled and stored on magnetic tape are documented. Data for 2024 and 7075 aluminum alloys, Ti-6Al-4V titanium alloy, and 300M steel are included in the compilation. Approximately 4,500 fatigue, 6,500 fatigue-crack-propagation, and 1,500 fracture data points are stored on magnetic tape. Descriptions of the data, an index to the data on the magnetic tape, information on data storage format on the tape, a listing of all data source references, and abstracts of other pertinent test information from each data source reference are included.

  20. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  1. America's freight transportation gateways : connecting our nation to places and markets abroad : [2009

    DOT National Transportation Integrated Search

    2009-11-01

    This report ranks freight gateways by the value of merchandise trade they handle. Value data were compiled from multiple sources, allowing comparison of all the freight modes. See box 2 for a detailed description of the freight data sources. Th...

  2. Incorporating Green Infrastructure into TMDLs

    EPA Pesticide Factsheets

    The fact sheet provides examples of how some states describe green infrastructure and low impact development activities in their TMDL implementation sections to address stormwater-source impaired waters.

  3. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

    Research into language concepts for the Mission Control Center is presented in the form of source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  4. Almanac of American Education, 2011

    ERIC Educational Resources Information Center

    Bernan Press, 2011

    2011-01-01

    "The Almanac of American Education" is an easy-to-use, single-volume source designed to help users understand and compare the quality of education at the national, state, and county levels. Compiled from official U.S. government and reliable private sources, "The Almanac" contains historical and current data, insightful…

  5. Principal Facts for Gravity Data Collected in Wisconsin: A Web Site and CD-ROM for Distribution of Data

    USGS Publications Warehouse

    Snyder, Stephen L.; Geister, Daniel W.; Daniels, David L.; Ervin, C. Patrick

    2004-01-01

    Principal facts for 40,488 gravity stations covering the entire state of Wisconsin are presented here in digital form. This is a compilation of previously published data collected between 1948 and 1992 from numerous sources, along with over 10,000 new gravity stations collected by the USGS since 1999 and 550 gravity stations from previously unpublished sources. Observed gravity and complete-Bouguer gravity anomaly data are included for this statewide compilation. Altogether, 14 individual surveys are represented.

  6. Selected Instrumentation Films, 1969-1970.

    ERIC Educational Resources Information Center

    Simmons, Raymond L., Ed.

    This list of currently available films and filmstrips pertinent to instrumentation has been compiled from information solicited from many government and private sources. The 1969 compilation has been organized into the following eight categories: (1) principles of measurement and basic measurements; (2) analysis instrumentation; (3) automation and…

  7. ACToR Chemical Structure processing using Open Source ChemInformatics Libraries (FutureToxII)

    EPA Science Inventory

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from ove...

  8. S3DB core: a framework for RDF generation and management in bioinformatics infrastructures

    PubMed Central

    2010-01-01

    Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well-defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way of the personalization of medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed was found to address the design criteria of biomedical computational infrastructure, such as those supporting large-scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315

  9. Marketing the Arts: A Selected and Annotated Bibliography.

    ERIC Educational Resources Information Center

    Nakamoto, Kent, Comp.; Levin, Kathi, Comp.

    Compiled from the marketing, arts, and arts management literature, this selected list includes materials ranging from theoretical articles and reports of studies to practical guides for marketing techniques. It consists of three sections: Primary Sources, Secondary Sources, and Surveys. Of particular interest to the arts administrator, the first…

  10. HAL/S-360 compiler test activity report

    NASA Technical Reports Server (NTRS)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  11. PCAL: Language Support for Proof-Carrying Authorization Systems

    DTIC Science & Technology

    2009-10-16

    Fragments of the available abstract state that the behavior of a compiled program is the same as that of the source program (Theorem 4.1) and that successfully compiled programs cannot fail due to access... The authors give a formal semantics, formalize the compilation procedure, and show that it preserves the behavior of programs, abstracting various details for simplicity of presentation. More precisely, if...

  12. Sectoral network on occupational health and safety in agriculture to support enterprises and family farms in Estonia.

    PubMed

    Kurppa, Kari; Tammaru, Eva; Kempinen, Marina; Rünkla, Ester; Sõrra, Jaan; Lehtinen, Suvi

    2006-01-01

    A sectoral network on occupational health and safety in agriculture has been established in Estonia as part of a project that provided support for Estonian accession into the European Union. Participating organizations represent farmers' unions at the county level, agricultural enterprises, workers' representatives, universities and agricultural expert institutions, and government agencies. The purpose is to provide a shared infrastructure that combines the information and other capacities of several organizations and provides a platform for dialogue and co-operation in order to make a greater impact with available resources. The network has a decentralized architecture and is technically managed by an institutionalized secretariat. The network's task forces have compiled a network directory, summarised the capacities and interests of member organizations, made an inventory of existing information and training materials, developed an overall strategy for information management, established an information repository on the Internet, prepared promotional materials, and devised a protocol for agricultural walk-through assessment. A profile of occupational health and safety in Estonian agriculture has been compiled with a rapid assessment approach that collected both quantitative and qualitative information from secondary sources (statistics, documents) and from focus group discussions. The profile is used as an instrument for bringing occupational health and safety needs in agriculture into discussion in the political arena.

  13. DISTRIBUTION, TYPE, ACCUMULATION AND SOURCE OF MARINE DEBRIS IN THE UNITED STATES, 1989-93

    EPA Science Inventory

    Distribution, type, accumulation, & source of marine debris on coastal beaches and in harbors of the United States were examined from 1989 to 1993. Information was compiled from annual beach cleanups coordinated by the Center for Marine Conservation, quarterly beach surveys at eig...

  14. Preliminary Evaluation of Removing Used Nuclear Fuel from Shutdown Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maheras, Steven J.; Best, Ralph E.; Ross, Steven B.

    A preliminary evaluation of removing spent nuclear fuel (SNF) from 13 shutdown nuclear power plant sites was performed. At these shutdown sites the nuclear power reactors have been permanently shut down and the sites have been decommissioned or are undergoing decommissioning. The shutdown sites were Maine Yankee, Yankee Rowe, Connecticut Yankee, Humboldt Bay, Big Rock Point, Rancho Seco, Trojan, La Crosse, Zion, Crystal River, Kewaunee, San Onofre, and Vermont Yankee. The evaluation was divided into four components: (1) characterization of the SNF and greater-than-Class C low-level radioactive waste (GTCC waste) inventory; (2) a description of the on-site infrastructure at the shutdown sites; (3) an evaluation of the near-site transportation infrastructure and transportation experience at the shutdown sites; and (4) an evaluation of the actions necessary to prepare for and remove SNF and GTCC waste. The primary sources for the inventory of SNF and GTCC waste were the U.S. Department of Energy (DOE) spent nuclear fuel inventory database, industry publications such as StoreFUEL, and government sources such as the U.S. Nuclear Regulatory Commission. The primary sources for information on the conditions of on-site infrastructure and near-site transportation infrastructure and experience included information collected during site visits, information provided by managers at the shutdown sites, Facility Interface Data Sheets compiled for DOE in 2005, Services Planning Documents prepared for DOE in 1993 and 1994, industry publications such as Radwaste Solutions, and Google Earth. State staff, State Regional Group representatives, a Tribal representative, and a Federal Railroad Administration representative have participated in nine of the shutdown site visits. Every shutdown site was found to have at least one off-site transportation mode option for removing its SNF and GTCC waste; some have multiple options. Experience removing large components during reactor decommissioning provided an important source of information used to identify the transportation mode options for the sites. Especially important in conducting the evaluation were site visits, through which information was obtained that would not have been available otherwise; extensive photographs taken during the site visits proved particularly useful in documenting current conditions at or near the sites. Additional conclusions from this evaluation include the following: (1) the 13 shutdown sites use designs from 4 different suppliers involving 11 different (horizontal and vertical) dry storage systems that would require the use of 9 different transportation cask designs to remove the SNF and GTCC waste from the shutdown sites; (2) although some changes to transportation certificates of compliance will be required, the SNF at the initial 9 shutdown sites (Maine Yankee, Yankee Rowe, Connecticut Yankee, Humboldt Bay, Big Rock Point, Rancho Seco, Trojan, La Crosse, and Zion) is in dual-purpose dry storage canisters that can be transported, including a small amount of high-burnup fuel; (3) most sites indicated that 2-3 years of advance time would be required for preparations before shipments could begin, and some sites could be ready in less time; and (4) as additional sites such as Fort Calhoun, Clinton, Quad Cities, Pilgrim, Oyster Creek, and Diablo Canyon shut down, these sites will be included in updates to the evaluation.

  15. Chemical investigation of three plutonium–beryllium neutron sources

    DOE PAGES

    Byerly, Benjamin; Kuhn, Kevin; Colletti, Lisa; ...

    2017-02-03

    Thorough physical and chemical characterization of plutonium–beryllium (PuBe) neutron sources is an important capability with applications ranging from material accountancy to nuclear forensics. However, characterization of PuBe sources is not trivial owing to the range of existing source designs and the need for adequate infrastructure to deal with radiation and protect the analyst. Our study demonstrates a method for characterization of three PuBe sources that includes physical inspection and imaging followed by controlled disassembly and destructive analysis.

  16. Chemical investigation of three plutonium–beryllium neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byerly, Benjamin; Kuhn, Kevin; Colletti, Lisa

    Thorough physical and chemical characterization of plutonium–beryllium (PuBe) neutron sources is an important capability with applications ranging from material accountancy to nuclear forensics. However, characterization of PuBe sources is not trivial owing to the range of existing source designs and the need for adequate infrastructure to deal with radiation and protect the analyst. Our study demonstrates a method for characterization of three PuBe sources that includes physical inspection and imaging followed by controlled disassembly and destructive analysis.

  17. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  18. 78 FR 17168 - Approval and Promulgation of Implementation Plans; Tennessee; 110(a)(1) and (2) Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... and new source review (NSR). This rulemaking only addresses requirements for this element as... 110(a)(2)(E): Adequate personnel, funding, and authority. 110(a)(2)(F): Stationary source monitoring and... related to excess emissions during periods of start-up, shutdown, or malfunction at sources (SSM), that...

  19. Simulation and analysis of support hardware for multiple instruction rollback

    NASA Technical Reports Server (NTRS)

    Alewine, Neil J.

    1992-01-01

    Recently, a compiler-assisted approach to multiple instruction retry was developed. In this scheme, a read buffer of size 2N, where N represents the maximum instruction rollback distance, is used to resolve one type of data hazard. This hardware support helps to reduce code growth, compilation time, and some of the performance impacts associated with hazard resolution. The 2N read buffer size requirement of the compiler-assisted approach is worst case, assuring data redundancy for all data required but also providing some unnecessary redundancy. By adding extra bits in the operand field for source 1 and source 2 it becomes possible to design the read buffer to save only those values required, thus reducing the read buffer size requirement. This study measures the effect on performance of a DECstation 3100 running 10 application programs using 6 read buffer configurations at varying read buffer sizes.
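
    The hardware scheme is only summarized above; as a software sketch of the read-buffer idea (sizes and encoding simplified, and saving one value per register write rather than the paper's operand-bit selection), rollback of up to N instructions can be supported by a sliding window of overwritten values:

        from collections import deque

        # Software sketch of a rollback read buffer: before an
        # instruction overwrites a register, the old value is saved, and
        # rolling back up to N instructions restores values newest-first.
        # The paper's point is that marking which operands actually need
        # saving shrinks the buffer below the worst-case 2N entries; this
        # sketch simply saves one value per write.
        N = 4                                # maximum rollback distance
        regs = {"r1": 0, "r2": 0}
        buffer = deque(maxlen=N)             # (register, old value) pairs

        def write(reg, value):
            buffer.append((reg, regs[reg]))  # save old value for rollback
            regs[reg] = value

        def rollback(k):
            for _ in range(min(k, len(buffer))):
                reg, old = buffer.pop()      # undo newest write first
                regs[reg] = old

        write("r1", 5); write("r2", 7); write("r1", 9)
        rollback(2)                          # undo the last two writes
        print(regs)                          # {'r1': 5, 'r2': 0}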

  20. Vista-LA: Mapping methane-emitting infrastructure in the Los Angeles megacity

    NASA Astrophysics Data System (ADS)

    Carranza, Valerie; Rafiq, Talha; Frausto-Vicencio, Isis; Hopkins, Francesca M.; Verhulst, Kristal R.; Rao, Preeti; Duren, Riley M.; Miller, Charles E.

    2018-03-01

    Methane (CH4) is a potent greenhouse gas (GHG) and a critical target of climate mitigation efforts. However, actionable emission reduction efforts are complicated by large uncertainties in the methane budget on relevant scales. Here, we present Vista, a Geographic Information System (GIS)-based approach to map potential methane emissions sources in the South Coast Air Basin (SoCAB) that encompasses Los Angeles, an area with a dense, complex mixture of methane sources. The goal of this work is to provide a database that, together with atmospheric observations, improves methane emissions estimates in urban areas with complex infrastructure. We aggregated methane source location information into three sectors (energy, agriculture, and waste) following the frameworks used by the State of California GHG Inventory and the Intergovernmental Panel on Climate Change (IPCC) Guidelines for GHG Reporting. Geospatial modeling was applied to publicly available datasets to precisely geolocate facilities and infrastructure comprising major anthropogenic methane source sectors. The final database, Vista-Los Angeles (Vista-LA), is presented as maps of infrastructure known or expected to emit CH4. Vista-LA contains over 33 000 features concentrated on < 1 % of land area in the region. Currently, Vista-LA is used as a planning and analysis tool for atmospheric measurement surveys of methane sources, particularly for airborne remote sensing, and methane hotspot detection using regional observations. This study represents a first step towards developing an accurate, spatially resolved methane flux estimate for point sources in SoCAB, with the potential to address discrepancies between bottom-up and top-down methane emissions accounting in this region. The Vista-LA datasets and associated metadata are available from the Oak Ridge National Laboratory Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC; https://doi.org/10.3334/ORNLDAAC/1525).
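
    The sector grouping described above can be illustrated in a few lines of Python. This is a hypothetical sketch, not the Vista-LA schema: the facility types, the type-to-sector mapping, and the coordinates are invented stand-ins.

        # Group geolocated methane sources into the three sectors (energy,
        # agriculture, waste); all names and values are invented examples.
        SECTOR_OF = {
            "refinery": "energy", "natural_gas_station": "energy",
            "dairy": "agriculture",
            "landfill": "waste", "wastewater_plant": "waste",
        }

        facilities = [
            {"name": "Facility A", "type": "refinery", "lat": 33.92, "lon": -118.08},
            {"name": "Facility B", "type": "dairy", "lat": 33.98, "lon": -117.50},
            {"name": "Facility C", "type": "landfill", "lat": 34.02, "lon": -118.05},
        ]

        by_sector = {}
        for f in facilities:
            by_sector.setdefault(SECTOR_OF[f["type"]], []).append(f["name"])
        print(by_sector)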

  1. Chronology of KSC and KSC Related Events for 2011

    NASA Technical Reports Server (NTRS)

    Liston, Elaine E. (Compiler)

    2012-01-01

    This document is intended to serve as a record of KSC events and is a reference source for historians and other researchers. Arrangement is by day and month and individual articles are attributed to published sources. Materials were researched and compiled by the KSC Library Archivist for KSC Library Services Contractor, Abacus Technology Corporation.

  2. Chronology of KSC and KSC Related Events for 2008

    NASA Technical Reports Server (NTRS)

    Liston, Elaine E.

    2009-01-01

    This document is intended to serve as a record of KSC events and is a reference source for historians and other researchers. Arrangement is by day and month and individual articles are attributed to published sources. Materials were researched and compiled by the KSC Library Archivist for KSC Library Services Contractor, Abacus Technology Corporation.

  3. Chronology of KSC and KSC Related Events for 2007

    NASA Technical Reports Server (NTRS)

    Liston, Elaine E.

    2008-01-01

    This document is intended to serve as a record of KSC events and is a reference source for historians and other researchers. Arrangement is by day and month and individual articles are attributed to published sources. Materials were researched and compiled by the KSC Library Archivist for KSC Library Services Contractor, InDyne, Inc.

  4. Chronology of KSC and KSC Related Events for 2006

    NASA Technical Reports Server (NTRS)

    Liston, Elaine E.

    2007-01-01

    This document is intended to serve as a record of KSC events and is a reference source for historians and other researchers. Arrangement is by day and month and individual articles are attributed to published sources. Materials were researched and compiled by the KSC Library Archivist for KSC Library Services Contractor, InDyne, Inc.

  5. Ada Compiler Validation Summary Report: Certificate Number: 901112W1. 11116 Cray Research, Inc., Cray Ada Compiler, Release 2.0, Cray X-MP/EA (Host & Target)

    DTIC Science & Technology

    1990-11-12

    This feature prevents any significant unexpected and undesired size overhead introduced by the automatic inlining of a called subprogram. Any...PRESERVELAYOUT forces the 5.5.1 compiler to maintain the Ada source order of a given record type, thereby preventing the compiler from performing this...Environment, Volume 2: Programming Guide assignments to the copied array in Ada do not affect the Fortran version of the array. The dimensions and order of

  6. NASA Remote Sensing Observations for Water Resource and Infrastructure Management

    NASA Astrophysics Data System (ADS)

    Granger, S. L.; Armstrong, L.; Farr, T.; Geller, G.; Heath, E.; Hyon, J.; Lavoie, S.; McDonald, K.; Realmuto, V.; Stough, T.; Szana, K.

    2008-12-01

    Decision support tools employed by water resource and infrastructure managers often utilize data products obtained from local sources or national/regional databases of historic surveys and observations. Incorporation of data from these sources can be laborious and time consuming as new products must be identified, cleaned and archived for each new study site. Adding remote sensing observations to the list of sources holds promise for a timely, consistent, global product to aid decision support at regional and global scales by providing global observations of geophysical parameters including soil moisture, precipitation, atmospheric temperature, derived evapotranspiration, and snow extent needed for hydrologic models and decision support tools. However, issues such as spatial and temporal resolution arise when attempting to integrate remote sensing observations into existing decision support tools. We are working to overcome these and other challenges through partnerships with water resource managers, tool developers and other stakeholders. We are developing a new data processing framework, enabled by a core GIS server, to seamlessly pull together observations from disparate sources for synthesis into information products and visualizations useful to the water resources community. A case study approach is being taken to develop the system by working closely with water infrastructure and resource managers to integrate remote observations into infrastructure, hydrologic and water resource decision tools. We present the results of a case study utilizing observations from the PALS aircraft instrument as a proxy for NASA's upcoming Soil Moisture Active Passive (SMAP) mission and an existing commercial decision support tool.

  7. Increasing Usability in Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chase, A. C.; Gomes, K.; O'Reilly, T.

    2005-12-01

    As observatory systems move to more advanced techniques for instrument configuration and data management, standardized frameworks are being developed to benefit from economies of scale. ACE (A Configuror and Editor) is a tool that was developed for SIAM (Software Infrastructure and Application for MOOS), a framework for the seamless integration of self-describing plug-and-work instruments into the Monterey Ocean Observing System. As a comprehensive solution, the SIAM infrastructure requires a number of processes to be run to configure an instrument for use within its framework. As solutions move from the lab to the field, the steps needed to implement the solution must be made bulletproof so that they may be used in the field with confidence. Loosely defined command-line interfaces don't always provide enough user feedback, and business logic can be difficult to maintain over a series of scripts. ACE is a tool developed for guiding the user through a number of complicated steps, removing the reliance on command-line utilities and reducing the difficulty of completing the necessary steps, while also preventing operator error and enforcing system constraints. Utilizing the cross-platform nature of the Java programming language, ACE provides a complete solution for deploying an instrument within the SIAM infrastructure without depending on special software being installed on the user's computer. Requirements such as the installation of a Unix emulator for users running Windows machines, and the installation of, and ability to use, a CVS client, have all been removed by providing the equivalent functionality from within ACE. In order to achieve a "one stop shop" for configuring instruments, ACE had to be written to handle a wide variety of functionality, including compiling Java code, interacting with a CVS server and maintaining client-side CVS information, editing XML, interacting with a server-side database, and negotiating serial-port communications through Java. This paper will address the relative tradeoffs of including all the aforementioned functionality in a single tool, its effects on user adoption of the framework (SIAM) it provides access to, as well as further discussion of some of the functionality generally pertinent to data management (XML editing, source code management and compilation, etc.).

  8. A Source Book for Taxation: Myths and Realities.

    ERIC Educational Resources Information Center

    Hellman, Mary A.

    This sourcebook is one of two supplementary materials for a newspaper course about taxes and tax reform. Program ideas and sources of related resources compiled in the sourcebook are designed to help civic and group leaders and educators plan educational community programs based on the course topics. Section one describes ways in which the program…

  9. An Enrollment Profile of Nebraska Postsecondary Education...A Staff Report to the Commission.

    ERIC Educational Resources Information Center

    Nebraska Coordinating Commission for Postsecondary Education, Lincoln.

    This enrollment profile is a compilation of enrollment information pertinent to postsecondary education planning, collected from a variety of state and national data sources including IPEDS and NEEDS surveys. Nebraska's public and private postsecondary institutions have seen a stable…

  10. Intercultural Communication Bibliography (Part 3).

    ERIC Educational Resources Information Center

    Kitao, Kenji; Kitao, S. Kathleen

    Compiled for Japanese teachers of English who are interested in intercultural communication, this bibliography is broken down into 16 subcategories. Most items in the bibliography come from primary sources, and for each source available at the Michigan State University Library, the bibliography provides call numbers. The first section lists books…

  11. Open source system OpenVPN in a function of Virtual Private Network

    NASA Astrophysics Data System (ADS)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    The use of Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. A VPN applies different security and management rules inside networks, and it can be set up using different communication channels such as the Internet or a separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open-source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses dedicated security protocols and 256-bit encryption, and it is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special emphasis on OpenVPN. The paper also compares solutions and discusses the financial benefits of using open-source VPN software in a business environment.

  12. System Data Model (SDM) Source Code

    DTIC Science & Technology

    2012-08-23

    CROSS_COMPILE=/opt/gumstix/build_arm_nofpu/staging_dir/bin/arm-linux-uclibcgnueabi- CC=$(CROSS_COMPILE)gcc CXX=$(CROSS_COMPILE)g++ AR... and flags to pass to it LEX=flex LEXFLAGS=-B ## The parser generator to invoke and flags to pass to it YACC=bison YACCFLAGS... # Point to default PetaLinux root directory ifndef ROOTDIR ROOTDIR=$(PETALINUX)/software/petalinux-dist endif PATH:=$(PATH

  13. Command in Air War. Centralized Versus Decentralized Control of Combat Airpower

    DTIC Science & Technology

    2007-06-01

    per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing...facts in chronological order; instead, historians are guided by an agenda formed by their particular expertise, affecting source selection and...convey their ideas, notes and logs they compiled at the time, and their official reports on lessons learned. Because these sources come from

  14. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments collect a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g., operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and at different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
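
    As a concrete illustration of the "cleaned and aggregated repository" step, the sketch below normalizes a small invented log table and writes it to a columnar format. It is an assumption-laden example: the columns are hypothetical, pandas with a Parquet engine (e.g., pyarrow) is assumed, and the abstract itself only specifies the Hadoop ecosystem, not these tools.

        import pandas as pd

        # Invented records standing in for one heterogeneous log source.
        raw = pd.DataFrame({
            "Timestamp ": ["2015-06-01T12:00:00Z", "2015-06-01T12:05:00Z"],
            "Job_ID": ["j-001", None],
            "CPU_s": [120.0, 88.0],
        })
        raw.columns = [c.strip().lower() for c in raw.columns]  # normalize headers
        raw["timestamp"] = pd.to_datetime(raw["timestamp"], utc=True)
        clean = raw.dropna(subset=["job_id"])   # drop unusable records

        # Columnar, compressed, splittable: suited to MapReduce-style scans.
        clean.to_parquet("service_log.parquet", index=False)
        print(len(clean), "clean records written")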

  15. Estimation of diffuse and point source microbial pollution in the ribble catchment discharging to bathing waters in the north west of England.

    PubMed

    Wither, A; Greaves, J; Dunhill, I; Wyer, M; Stapleton, C; Kay, D; Humphrey, N; Watkins, J; Francis, C; McDonald, A; Crowther, J

    2005-01-01

    Achieving compliance with the mandatory standards of the 1976 Bathing Water Directive (76/160/EEC) is required at all U.K. identified bathing waters. In recent years, the Fylde coast has been an area of significant investment in 'point source' control, which has not proven, in isolation, to satisfactorily achieve compliance with the mandatory, let alone the guide, levels of water quality in the Directive. The potential impact of riverine sources of pollution was first confirmed by a study in 1997. The completion of sewerage system enhancements offered the potential to study faecal indicator delivery from upstream sources, comprising both point sources and diffuse agricultural sources. A research project to define these elements commenced in 2001. Initially, a desk study, reported here, estimated the principal infrastructure contributions within the Ribble catchment. A second phase of this investigation has involved acquisition of empirical water quality and hydrological data from the catchment during the 2002 bathing season. These data have been used further to calibrate the 'budgets' and 'delivery' modelling and are still being analysed. This paper reports the initial desk study approach to faecal indicator budget estimation using available data from the sewerage infrastructure and catchment sources of faecal indicators.

  16. Arid Green Infrastructure for Water Control and Conservation State of the Science and Research Needs for Arid/Semi-Arid Regions

    EPA Science Inventory

    Green infrastructure is an approach to managing wet weather flows using systems and practices that mimic natural processes. It is designed to manage stormwater as close to its source as possible and protect the quality of receiving waters. Although most green infrastructure pract...

  17. Peregrine Software Toolchains | High-Performance Computing | NREL

    Science.gov Websites

    toolchain is an open-source alternative against which many technical applications are natively developed and tested. The Portland Group (PGI) C/C++ and Fortran compilers are not fully supported, but are available to the HPC community. The PGI Accelerator compilers include NVIDIA GPU

  18. Landslide databases for applied landslide impact research: the example of the landslide database for the Federal Republic of Germany

    NASA Astrophysics Data System (ADS)

    Damm, Bodo; Klose, Martin

    2014-05-01

    This contribution presents an initiative to develop a national landslide database for the Federal Republic of Germany. It highlights the structure and contents of the landslide database and outlines its major data sources and the strategy of information retrieval. Furthermore, the contribution exemplifies the database's potential in applied landslide impact research, including statistics of landslide damage, repair, and mitigation. Thanks to systematic regional data compilation, the landslide database offers a differentiated data pool of more than 5,000 data sets and over 13,000 single data files. It dates back to 1137 AD and covers landslide sites throughout Germany. In seven main data blocks, the landslide database stores, besides information on landslide types, dimensions, and processes, additional data on soil and bedrock properties, geomorphometry, and climatic or other major triggering events. A peculiarity of this landslide database is its storage of data sets on land use effects, damage impacts, hazard mitigation, and landslide costs. Compilation of landslide data is based on a two-tier strategy of data collection. The first step of information retrieval includes systematic web content mining and exploration of online archives of emergency agencies, fire and police departments, and news organizations. Using web and RSS feeds, and soon also a focused web crawler, this enables effective nationwide data collection for recent landslides. On the basis of this information, in-depth data mining is performed to deepen and diversify the data pool in key landslide areas. This makes it possible to gather detailed landslide information from, amongst others, agency records, geotechnical reports, climate statistics, maps, and satellite imagery. Landslide data are extracted from these information sources using a mix of methods, including statistical techniques, imagery analysis, and qualitative text interpretation. The landslide database is currently being migrated to a spatial database system running on PostgreSQL/PostGIS; a sketch of the kind of query this enables follows below. This provides advanced functionality for spatial data analysis and forms the basis for future data provision and visualization using a WebGIS application. Analysis of landslide database contents shows that in most parts of Germany landslides primarily affect transportation infrastructures. Although with distinctly lower frequency, recent landslides are also recorded to cause serious damage to hydraulic facilities and waterways, supply and disposal infrastructures, sites of cultural heritage, as well as forest, agricultural, and mining areas. The main types of landslide damage are failure of cut and fill slopes, destruction of retaining walls, street lights, and forest stocks, burial of roads, backyards, and garden areas, as well as crack formation in foundations, sewer lines, and building walls. Landslide repair and mitigation at transportation infrastructures is dominated by simple solutions such as catch barriers or rock fall drapery. These solutions are often undersized and fail under stress. The use of costly slope stabilization or protection systems has proven to reduce these risks effectively over longer maintenance cycles. The right balancing of landslide mitigation is thus a crucial problem in managing landslide risks. Development and analysis of such landslide databases helps to support decision-makers in finding efficient solutions that minimize landslide risks for human beings, infrastructures, and financial assets.
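
    The move to PostgreSQL/PostGIS makes damage statistics like those above queryable spatially. The sketch below is hypothetical: the table and column names are invented, and psycopg2 is assumed as the client library; it only illustrates the kind of query such a backend enables.

        import psycopg2

        # Invented schema: landslide_events(geom, ...) and roads(geom, ...).
        conn = psycopg2.connect("dbname=landslides user=analyst")
        with conn, conn.cursor() as cur:
            # Landslide events within 500 m of any road segment.
            cur.execute("""
                SELECT e.event_id, e.event_date, e.damage_type
                FROM landslide_events e
                JOIN roads r
                  ON ST_DWithin(e.geom::geography, r.geom::geography, 500)
            """)
            for event_id, event_date, damage_type in cur.fetchall():
                print(event_id, event_date, damage_type)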

  19. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinina, Elena Arkadievna; Samsa, Michael

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also on an extensive review of the available literature for similar and past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and a description of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task, since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the previous studies.

  20. Managing for enhancement of riparian and wetland areas of the Western United States: an annotated bibliography

    Treesearch

    David A. Koehler; Allan E. Thomas

    2000-01-01

    This annotated bibliography contains 1,905 citations from professional journals, symposia, workshops, proceedings, technical reports, and other sources. The intent of this compilation was to: (1) assemble, to the extent possible, all available and accessible publications relating to riparian management within a single source or document; (2) provide managers, field...

  1. Curriculum-Guided Crowd Sourcing of Assessments in a Developing Country

    ERIC Educational Resources Information Center

    Zualkernan, Imran A.; Raza, Anjana; Karim, Asad

    2012-01-01

    The success of Wikipedia has opened a number of possibilities for crowd sourcing learning resources. However, not all crowd sourcing initiatives are successful. For developing countries, adoption factors such as lack of infrastructure and poor teacher training can have an impact on the success of such systems. This paper presents an exploratory study to…

  2. Monitoring the Status of Educational Research and Development in the United States.

    ERIC Educational Resources Information Center

    Shapiro, Karen; And Others

    This technical report is the companion volume to the 1975 Databook, the purpose of which was the compilation of available data series focused on the organization and functioning of the knowledge production and utilization (KPU) system. This technical report is intended to (1) provide more accurate data sources, (2) analyze existing data sources,…

  3. MetaJC++: A flexible and automatic program transformation technique using meta framework

    NASA Astrophysics Data System (ADS)

    Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.

    2014-09-01

    A compiler is a tool that translates abstract code containing natural-language terms into machine code. Meta compilers are available that compile more than one language. We have developed a meta framework that intends to combine two dissimilar programming languages, namely C++ and Java, to provide a flexible object-oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger meta-language. The framework is developed using the compiler-writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both languages. Two intermediate representations are used in translating the source program to machine code. An abstract syntax tree is used as a high-level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent, stack-based byte-code has also been devised to act as a low-level intermediate representation. The byte-code is essentially organised into an output class file that can be used to produce an interpreted output. The results, especially in the sphere of providing C++ concepts in Java, give insight into the potentially strong features of the resultant meta-language.
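
    The two intermediate representations can be pictured with a generic example. The sketch below is not the MetaJC++ implementation; it is a minimal Python illustration of lowering an abstract syntax tree into a stack-based byte-code via a post-order walk.

        # Generic AST -> stack byte-code lowering (illustrative only).
        class Num:
            def __init__(self, value):
                self.value = value

        class BinOp:
            def __init__(self, op, left, right):
                self.op, self.left, self.right = op, left, right

        def emit(node, code):
            """Post-order walk: operands are pushed before their operator."""
            if isinstance(node, Num):
                code.append(("PUSH", node.value))
            else:
                emit(node.left, code)
                emit(node.right, code)
                code.append(("ADD" if node.op == "+" else "MUL", None))
            return code

        # (2 + 3) * 4  ->  PUSH 2, PUSH 3, ADD, PUSH 4, MUL
        tree = BinOp("*", BinOp("+", Num(2), Num(3)), Num(4))
        for op, arg in emit(tree, []):
            print(op, "" if arg is None else arg)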

  4. Proposal on a sustainable strategy to avoid point source pollution of water with plant protection products.

    PubMed

    Mestdagh, Inge; Bonicelli, Bernard; Laplana, Ramon; Roettele, Manfred

    2009-01-01

    Based on the results and lessons learned from the TOPPS project (Training the Operators to prevent Pollution from Point Sources), a proposal on a sustainable strategy to avoid point source pollution from Plant Protection Products (PPPs) was made. Within this TOPPS project (2005-2008), stakeholders were interviewed and research and analysis were done in 6 pilot catchment areas (BE, FR, DE, DK, IT, PL). Next, there was a repeated survey on operators' perception and opinion to measure changes resulting from TOPPS activities, and good and bad practices were defined based on the Best Management Practices (risk analysis). The aim of the proposal is to suggest a strategy, considering the differences between countries, that can be implemented at the Member State level in order to avoid PPP pollution of water through point sources. The methodology used for the up-scaling proposal consists of an analysis of the current situation, a gap analysis, a consistency analysis, and organisational structures for implementation. The up-scaling proposal focuses on the behaviour of the operators and on the equipment and infrastructure available to the operators. The proposal defines implementation structures to support correct behaviour through the development and updating of Best Management Practices (BMPs) and through the transfer and implementation of these BMPs. Next, the proposal also defines requirements for the improvement of equipment and infrastructure based on the defined key factors related to point source pollution. It also contains cost estimates for technical and infrastructure upgrades to comply with BMPs.

  5. Evaluation of arctic multibeam sonar data quality using nadir crossover error analysis and compilation of a full-resolution data product

    NASA Astrophysics Data System (ADS)

    Flinders, Ashton F.; Mayer, Larry A.; Calder, Brian A.; Armstrong, Andrew A.

    2014-05-01

    We document a new high-resolution multibeam bathymetry compilation for the Canada Basin and Chukchi Borderland in the Arctic Ocean - United States Arctic Multibeam Compilation (USAMBC Version 1.0). The compilation preserves the highest native resolution of the bathymetric data, allowing for more detailed interpretation of seafloor morphology than has been previously possible. The compilation was created from multibeam bathymetry data available through openly accessible government and academic repositories. Much of the new data was collected during dedicated mapping cruises in support of the United States effort to map extended continental shelf regions beyond the 200 nm Exclusive Economic Zone. Data quality was evaluated using nadir-beam crossover-error statistics, making it possible to assess the precision of multibeam depth soundings collected from a wide range of vessels and sonar systems. Data were compiled into a single high-resolution grid through a vertical stacking method, preserving the highest quality data source in any specific grid cell. The crossover-error analysis and method of data compilation can be applied to other multi-source multibeam data sets, and is particularly useful for government agencies targeting extended continental shelf regions but with limited hydrographic capabilities. Both the gridded compilation and an easily distributed geospatial PDF map are freely available through the University of New Hampshire's Center for Coastal and Ocean Mapping (ccom.unh.edu/theme/law-sea). The geospatial PDF is a full-resolution, small-file-size product that supports interpretation of Arctic seafloor morphology without the need for specialized gridding/visualization software.
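
    The "vertical stacking" gridding rule is simple enough to show directly. The numpy sketch below is an invented miniature (2x2 grids, made-up depths), not USAMBC code: for each cell it keeps the depth from the highest-quality source that covers it, with NaN marking no coverage.

        import numpy as np

        # Three gridded sources, ordered best quality first; NaN = no coverage.
        best  = np.array([[np.nan, -3550.0], [-3720.0,  np.nan]])
        good  = np.array([[-3600.0, -3548.0], [np.nan, -3810.0]])
        rough = np.array([[-3590.0, -3560.0], [-3700.0, -3800.0]])

        stacked = np.full_like(best, np.nan)
        for layer in (best, good, rough):   # best-quality source wins per cell
            stacked = np.where(np.isnan(stacked), layer, stacked)
        print(stacked)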

  6. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement VIII.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and…

  7. A survey of compiler optimization techniques

    NASA Technical Reports Server (NTRS)

    Schneck, P. B.

    1972-01-01

    Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
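
    A small worked example may help fix the last category. The sketch below uses Python's own ast module (Python 3.9+ for ast.unparse) to perform constant folding, a classic architecture-independent optimization applied at source-code level; it is an illustration in the spirit of the survey, not code from it.

        import ast

        class FoldConstants(ast.NodeTransformer):
            """Replace binary operations on two constants with their value."""
            def visit_BinOp(self, node):
                self.generic_visit(node)  # fold children first
                if (isinstance(node.left, ast.Constant)
                        and isinstance(node.right, ast.Constant)
                        and isinstance(node.op, (ast.Add, ast.Sub, ast.Mult))):
                    ops = {ast.Add: lambda a, b: a + b,
                           ast.Sub: lambda a, b: a - b,
                           ast.Mult: lambda a, b: a * b}
                    value = ops[type(node.op)](node.left.value, node.right.value)
                    return ast.copy_location(ast.Constant(value), node)
                return node

        tree = FoldConstants().visit(ast.parse("y = (2 + 3) * x + (10 - 4)"))
        ast.fix_missing_locations(tree)
        print(ast.unparse(tree))  # y = 5 * x + 6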

  8. Safety impacts of bicycle infrastructure: A critical review.

    PubMed

    DiGioia, Jonathan; Watkins, Kari Edison; Xu, Yanzhi; Rodgers, Michael; Guensler, Randall

    2017-06-01

    This paper takes a critical look at the present state of bicycle infrastructure treatment safety research, highlighting data needs. Safety literature relating to 22 bicycle treatments is examined, including findings, study methodologies, and data sources used in the studies. Some preliminary conclusions related to research efficacy are drawn from the available data and findings in the research. While the current body of bicycle safety literature points toward some defensible conclusions regarding the safety and effectiveness of certain bicycle treatments, such as bike lanes and removal of on-street parking, the vast majority of treatments are still in need of rigorous research. Fundamental questions arise regarding appropriate exposure measures, crash measures, and crash data sources. This research will aid transportation departments with regard to decisions about bicycle infrastructure and guide future research efforts toward understanding the safety impacts of bicycle infrastructure. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  9. ROSE::FTTransform - A Source-to-Source Translation Framework for Exascale Fault-Tolerance Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lidman, J; Quinlan, D; Liao, C

    2012-03-26

    Exascale computing systems will require sufficient resilience to tolerate numerous types of hardware faults while still assuring correct program execution. Such extreme-scale machines are expected to be dominated by processors driven at lower voltages (near the minimum 0.5 volts for current transistors). At these voltage levels, the rate of transient errors increases dramatically due to the sensitivity to transient and geographically localized voltage drops on parts of the processor chip. To achieve power efficiency, these processors are likely to be streamlined and minimal, and thus they cannot be expected to handle transient errors entirely in hardware. Here we present an open, compiler-based framework to automate the armoring of High Performance Computing (HPC) software to protect it from these types of transient processor errors. We develop an open infrastructure to support research work in this area, and we define tools that, in the future, may provide more complete automated and/or semi-automated solutions to support software resiliency on future exascale architectures. Results demonstrate that our approach is feasible, pragmatic in how it can be separated from the software development process, and reasonably efficient (0% to 30% overhead for the Jacobi iteration on common hardware; and 20%, 40%, 26%, and 2% overhead for a randomly selected subset of benchmarks from the Livermore Loops [1]).
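
    The abstract does not spell out the transformation itself. One common armoring strategy for transient errors is redundant execution with majority voting, and the hand-written Python sketch below only illustrates the shape of code such a source-to-source tool might emit; it is not the ROSE::FTTransform output.

        def vote(a, b, c):
            """Return the majority value; total disagreement is an error."""
            if a == b or a == c:
                return a
            if b == c:
                return b
            raise RuntimeError("no two redundant results agree")

        def jacobi_step(u, i):
            return 0.5 * (u[i - 1] + u[i + 1])

        u = [0.0, 1.0, 4.0, 9.0]
        # Execute the protected statement three times and vote on the result.
        r = vote(jacobi_step(u, 1), jacobi_step(u, 1), jacobi_step(u, 1))
        print(r)  # 2.0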

  10. Scientific names of organisms: attribution, rights, and licensing

    PubMed Central

    2014-01-01

    Background As biological disciplines extend into the ‘big data’ world, they will need a names-based infrastructure to index and interconnect distributed data. The infrastructure must have access to all names of all organisms if it is to manage all information. Those who compile lists of species hold different views as to the intellectual property rights that apply to the lists. This creates uncertainty that impedes the development of a much-needed infrastructure for sharing biological data in the digital world. Findings The laws in the United States of America and European Union are consistent with the position that scientific names of organisms and their compilation in checklists, classifications or taxonomic revisions are not subject to copyright. Compilations of names, such as classifications or checklists, are not creative in the sense of copyright law. Many content providers desire credit for their efforts. Conclusions A ‘blue list’ identifies elements of checklists, classifications and monographs to which intellectual property rights do not apply. To promote sharing, authors of taxonomic content, compilers, intermediaries, and aggregators should receive citable recognition for their contributions, with the greatest recognition being given to the originating authors. Mechanisms for achieving this are discussed. PMID:24495358

  11. Bibliography on Criterion Referenced Measurement.

    ERIC Educational Resources Information Center

    Ellsworth, Randolph A.; Franz, Carleen

    This bibliography contains 262 references on Criterion Referenced Measurement (CRM) that were obtained from the following sources: (1) the author's personal files; (2) a bibliography compiled by Hsu and Boston (ERIC Document #ED 068 531) containing 52 references; (3) a bibliography compiled by Keller (ERIC Document #ED 060 041) containing 116…

  12. Phytochemica: a platform to explore phytochemicals of medicinal plants

    PubMed Central

    Pathania, Shivalika; Ramakrishnan, Sai Mukund; Bagler, Ganesh

    2015-01-01

    Plant-derived molecules (PDMs) are known to be a rich source of diverse scaffolds that could serve as the basis for rational drug design. Structured compilation of phytochemicals from traditional medicinal plants can facilitate prospection for novel PDMs and their analogs as therapeutic agents. Atropa belladonna, Catharanthus roseus, Heliotropium indicum, Picrorhiza kurroa and Podophyllum hexandrum are important Himalayan medicinal plants reported to have immense therapeutic properties against various diseases. We present Phytochemica, a structured compilation of 963 PDMs from these plants, inclusive of their plant part source, chemical classification, IUPAC names, SMILES notations, physicochemical properties, and 3-dimensional structures with associated references. Phytochemica is an exhaustive resource of natural molecules, facilitating prospection for therapeutic molecules from medicinally important plants. It also offers a refined search option to explore the neighbourhood of chemical space against the ZINC database to identify analogs of natural molecules at a user-defined cut-off. Availability of this structured phytochemical dataset may enable its direct use in in silico drug discovery, hastening the process of lead identification from natural products and helping to meet the urgent need for phytomedicines. Compilation and accessibility of indigenous phytochemicals and their derivatives can be a source of considerable advantage to research institutes as well as industries. Database URL: home.iitj.ac.in/∼bagler/webservers/Phytochemica PMID:26255307
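
    A cut-off search of the kind described can be sketched with standard cheminformatics tooling. The example below assumes RDKit is available and uses invented SMILES strings as stand-ins; it is not Phytochemica's implementation, only an illustration of Tanimoto screening at a user-defined threshold.

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        query = Chem.MolFromSmiles("c1ccccc1O")  # phenol, a stand-in query
        library = {"mol1": "c1ccccc1N", "mol2": "CCO", "mol3": "c1ccc(O)cc1C"}

        def fp(mol):
            # Morgan (circular) fingerprint, radius 2, 2048 bits.
            return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

        qfp = fp(query)
        cutoff = 0.4  # user-defined similarity threshold
        for name, smi in library.items():
            sim = DataStructs.TanimotoSimilarity(qfp, fp(Chem.MolFromSmiles(smi)))
            if sim >= cutoff:
                print(name, round(sim, 2))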

  13. Challenges for Data Archival Centers in Evolving Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Cook, R. B.; Gu, L.; Santhana Vannan, S. K.; Beaty, T.

    2015-12-01

    Environmental science has entered into a big data era as enormous data about the Earth environment are continuously collected through field and airborne missions, remote sensing observations, model simulations, sensor networks, etc. An open-access and open-management data infrastructure for data-intensive science is a major grand challenge in global environmental research (BERAC, 2010). Such an infrastructure, as exemplified in EOSDIS, GEOSS, and NSF EarthCube, will provide a complete lifecycle for environmental data and ensure that data flow smoothly among the different phases of collection, preservation, integration, and analysis. Data archival centers, as the data integration units closest to data providers, serve as the starting points for compiling and integrating heterogeneous environmental data into this global infrastructure. This presentation discusses the interoperability challenges and practices of geosciences from the perspective of data archival centers, based on the operational experiences of the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and related environmental data management activities. Specifically, we will discuss the challenges to 1) encourage and help scientists to more actively share data with the broader scientific community, so that valuable environmental data, especially the dark data collected by individual scientists in small independent projects, can be shared and integrated into the infrastructure to tackle big science questions; 2) curate heterogeneous multi-disciplinary data, focusing on the key aspects of identification, format, metadata, data quality, and semantics to make them ready to be plugged into a global data infrastructure. We will highlight data curation practices at the ORNL DAAC for global campaigns such as BOREAS, LBA, and SAFARI 2000; and 3) enhance the capabilities to more effectively and efficiently expose and deliver "big" environmental data to a broad range of users and systems. Experiences and challenges with integrating large data sets via the ORNL DAAC's data discovery and delivery Web services will be discussed.

  14. Compilation of Earthquakes from 1850-2007 within 200 miles of the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Seth Carpenter

    2010-07-01

    An updated earthquake compilation was created for the years 1850 through 2007 within 200 miles of the Idaho National Laboratory. To generate this compilation, earthquake catalogs were collected from several contributing sources and searched for redundant events using the search criteria established for this effort. For all sets of duplicate events, a preferred event was selected, largely based on epicenter-network proximity. All unique magnitude information for each event was added to the preferred event records and these records were used to create the compilation referred to as “INL1850-2007”.
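
    The duplicate search can be pictured with a toy example. In the Python sketch below, the 30-second and 0.3-degree thresholds and the event list are invented for illustration; they are not the report's actual search criteria.

        from math import hypot

        events = [  # (catalog, time_s, lat, lon, magnitude) -- invented
            ("net_A", 1000.0, 43.50, -112.00, 3.1),
            ("net_B", 1012.0, 43.52, -112.03, 3.0),  # likely the same event
            ("net_A", 9000.0, 44.10, -111.50, 2.4),
        ]

        def duplicates(evts, dt=30.0, ddeg=0.3):
            """Flag event pairs close in both origin time and epicenter."""
            pairs = []
            for i in range(len(evts)):
                for j in range(i + 1, len(evts)):
                    _, t1, la1, lo1, _ = evts[i]
                    _, t2, la2, lo2, _ = evts[j]
                    if abs(t1 - t2) <= dt and hypot(la1 - la2, lo1 - lo2) <= ddeg:
                        pairs.append((i, j))
            return pairs

        print(duplicates(events))  # [(0, 1)]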

  15. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    Presented is a compilation of over 3,000 abstracts on print and non-print materials related to water quality and water resources education. Entries are included from all levels of governmental sources, private concerns, and educational institutions. Each entry includes: title, author, cross references, descriptors, and availability. (CLS)

  16. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  17. VLSI (Very Large Scale Integrated Circuits) Design with the MacPitts Silicon Compiler.

    DTIC Science & Technology

    1985-09-01

    the background. If the algorithm is not fully debugged, then issue instead macpitts basename herald so MacPitts diagnostics and Liszt diagnostics both...command interpreter. Upon compilation, however, the following LISP compiler (Liszt) diagnostic results, Error: Non-number to minus nil where the first...language used in the MacPitts source code. The more instructive solution is to write the Franz LISP code to decide if a jumper wire is needed, and if so, to

  18. A Selected Bibliography on Employee Attitude Surveys. Special Report.

    ERIC Educational Resources Information Center

    Blasingame, Margaret C.

    This bibliography provides an up-to-date compilation of literature covering all aspects of the employee attitude survey process. It is intended for both researcher and practitioner. A wide variety of sources are represented with a focus on the past 10-12 years of attitude survey research. The 473 citations compiled are categorized under 7 separate…

  19. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper is intended to promote research leading to a closed-loop control system that actively suppresses thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based simulation is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process requires modifying the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file is created. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window, without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation, and it also includes results for a default simulation included with the source code.

  20. Arid Green Infrastructure for Water Control and Conservation ...

    EPA Pesticide Factsheets

    Green infrastructure is an approach to managing wet weather flows using systems and practices that mimic natural processes. It is designed to manage stormwater as close to its source as possible and protect the quality of receiving waters. Although most green infrastructure practices were first developed in temperate climates, green infrastructure also can be a cost-effective approach to stormwater management and water conservation in arid and semi-arid regions, such as those found in the western and southwestern United States. Green infrastructure practices can be applied at the site, neighborhood and watershed scales. In addition to water management and conservation, implementing green infrastructure confers many social and economic benefits and can address issues of environmental justice. The U.S. Environmental Protection Agency (EPA) commissioned a literature review to identify the state-of-the science practices dealing with water control and conservation in arid and semi-arid regions, with emphasis on these regions in the United States. The search focused on stormwater control measures or practices that slow, capture, treat, infiltrate and/or store runoff at its source (i.e., green infrastructure). The material in Chapters 1 through 3 provides background to EPA’s current activities related to the application of green infrastructure practices in arid and semi-arid regions. An introduction to the topic of green infrastructure in arid and semi-arid regions i

  1. caGrid 1.0: An Enterprise Grid Infrastructure for Biomedical Research

    PubMed Central

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2008-01-01

    Objective To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. Measurements The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. Conclusions While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community. PMID:18096909

  2. caGrid 1.0: an enterprise Grid infrastructure for biomedical research.

    PubMed

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2008-01-01

    To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community.

  3. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 28, 1986.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and nonprint materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  4. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 29, 1987.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  5. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources, Supplement XIV (1983).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  6. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 23 (1985).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  7. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XII.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  8. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement XIX (1984).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  9. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 25 (1986).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to hazardous wastes and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  10. Water Quality Instructional Resources Information System (IRIS). A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XVI.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  11. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement IX.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  12. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 24 (1985).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  13. Water Quality Instructional Resources Information System (IRIS). A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XIII.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  14. Water Quality Instructional Resources Information System (IRIS). A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XV.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  15. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XI.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  16. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement X.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  17. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 26, 1986.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  18. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement XX (1984).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  19. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 22 (1985).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  20. Water Quality Instructional Resources Information System (IRIS). A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement XVII.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  1. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 21 (1985).

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  2. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement 27, 1986.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  3. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials, Supplement XVIII.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  4. A Technique for Removing an Important Class of Trojan Horses from High-Order Languages

    DTIC Science & Technology

    1988-01-01

    A Technique for Removing an Important Class of Trojan Horses from High Order Languages, John McDermott, Center for Secure Information Technology… Ken Thompson described a sophisticated Trojan horse attack on a compiler, one that is undetectable by any search of the compiler source code. The object of the compiler Trojan horse is to modify the semantics of the high order language in a way that breaks the security of a trusted system generated…
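
    The attack this abstract refers to is the classic "trusting trust" Trojan: the malice lives only in the compiler binary, never in its source, so inspecting the source reveals nothing and recompiling the clean source reproduces the attack. A purely illustrative Python sketch of the idea follows; every name in it (trojaned_compile, LOGIN_BACKDOOR, and so on) is hypothetical and merely stands in for the machinery Thompson describes.

      LOGIN_BACKDOOR = 'if user == "trojan": return True   # injected'
      SELF_REPLICATOR = "# (code that re-inserts both injections)"

      def trojaned_compile(source: str) -> str:
          """Stand-in for the compromised compiler *binary* (its source is clean)."""
          # Injection 1: recognize the login program and add a backdoor.
          if "def check_password" in source:
              source = source.replace("return False",
                                      LOGIN_BACKDOOR + "\n    return False")
          # Injection 2: recognize the compiler itself and re-install the Trojan,
          # so the attack survives recompilation of the clean compiler source.
          if "def trojaned_compile" in source:
              source += "\n" + SELF_REPLICATOR
          return f"<object code for>\n{source}"   # pretend code generation

      clean_login = "def check_password(user, pw):\n    return False"
      print(trojaned_compile(clean_login))        # backdoor appears in the output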

  5. Impacts of transportation infrastructure on storm water and surface waters in Chittenden County, Vermont, USA.

    DOT National Transportation Integrated Search

    2014-06-01

    Transportation infrastructure is a major source of stormwater runoff that can alter hydrology and contribute significant loading of nutrients, sediment, and other pollutants to surface waters. These increased loads can contribute to impairment of...

  6. Public engagement in 3D flood modelling through integrating crowd sourced imagery with UAV photogrammetry to create a 3D flood hydrograph.

    NASA Astrophysics Data System (ADS)

    Bond, C. E.; Howell, J.; Butler, R.

    2016-12-01

    With an increase in flood and storm events affecting infrastructure, the role and impact of weather systems in a changing climate is of increasing interest. Here we present a new workflow that integrates crowd-sourced imagery from the public with UAV photogrammetry to create the first 3D hydrograph of a major flooding event. On December 30th 2015, Storm Frank brought high-magnitude rainfall to the Dee catchment in Aberdeenshire, resulting in the highest river level ever recorded for the Dee, with significant impact on infrastructure and river morphology. The worst of the flooding occurred during daylight hours and was digitally captured by the public on smartphones and cameras. After the flood event, a UAV photogrammetric survey was used to create a textured elevation model of the area around Aboyne Bridge on the River Dee. A media campaign solicited crowd-sourced digital imagery, resulting in over 1,000 images submitted by the public. EXIF metadata recording the time and date of capture were used to sort the images into a time series. Markers such as signs, walls, fences and roads within the images were used to determine river level height through the flood, and matched onto the elevation model to contour the change in river level. The resulting 3D hydrograph shows the build-up of water on the upstream side of the bridge that resulted in significant scouring and undermining during the flood. We have created the first known data-based 3D hydrograph for a river section from a UAV photogrammetric model and crowd-sourced imagery. For future flood warning and infrastructure management, a solution that creates a real-time hydrograph, using augmented reality to integrate the river-level information from crowd-sourced imagery directly onto a 3D model, would significantly improve management planning and infrastructure resilience assessment.
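
    The EXIF time-sorting step is straightforward to reproduce. A minimal sketch follows, assuming the Pillow imaging library and a hypothetical directory of submitted JPEGs; EXIF tag 306 (DateTime) is read here, though some cameras instead populate DateTimeOriginal in the Exif sub-IFD.

      from pathlib import Path
      from datetime import datetime
      from typing import Optional
      from PIL import Image   # Pillow

      def capture_time(path: Path) -> Optional[datetime]:
          """Return the EXIF capture timestamp, or None if absent."""
          raw = Image.open(path).getexif().get(306)   # 306 = DateTime tag
          return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S") if raw else None

      # Hypothetical folder of crowd-sourced submissions, ordered into a series.
      stamped = [(capture_time(p), p) for p in Path("crowd_images").glob("*.jpg")]
      for when, image in sorted((s for s in stamped if s[0]), key=lambda s: s[0]):
          print(when, image.name)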

  7. Giving USA: A Compilation of Facts and Trends on American Philanthropy for the Year 1973.

    ERIC Educational Resources Information Center

    American Association of Fund-Raising Counsel, Inc., New York, NY.

    This document presents a compilation of facts and trends on American philanthropy for the year 1973. Emphasis is placed on donors, recipients, sources of philanthropy (individuals, bequests, foundations, and corporations), and areas of philanthropic opportunity (religion, education, health and hospitals, social welfare, arts and humanities, civic…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao

    Sparx, a new environment for Cryo-EM image processing; Cryo-EM, Single particle reconstruction, principal component analysis; Hardware Req.: PC, MAC, Supercomputer, Mainframe, Multiplatform, Workstation. Software Req.: operating system is Unix; Compiler C++; type of files: source code, object library, executable modules, compilation instructions; sample problem input data. Location/transmission: http://sparx-em.org; User manual & paper: http://sparx-em.org;

  9. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
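
    As a sketch of the CSV-scripted protocol idea described here (the abstract does not specify the actual column layout of the Bioflo software's scripting files, so the columns and the apply_setpoint stub below are hypothetical), a time-stamped setpoint schedule can be replayed like this:

      import csv, io, time

      # Hypothetical protocol: seconds from start, parameter name, setpoint value.
      protocol_csv = "elapsed_s,parameter,setpoint\n0,temperature_C,37.0\n2,agitation_rpm,250\n4,pH,6.8\n"

      def apply_setpoint(parameter: str, value: float) -> None:
          print(f"set {parameter} -> {value}")      # stand-in for the hardware call

      start = time.monotonic()
      for row in csv.DictReader(io.StringIO(protocol_csv)):
          delay = float(row["elapsed_s"]) - (time.monotonic() - start)
          if delay > 0:
              time.sleep(delay)                     # hold until the scheduled time
          apply_setpoint(row["parameter"], float(row["setpoint"]))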

  10. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  11. 49 CFR 15.5 - Sensitive security information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... sources and methods used to gather or develop threat information, including threats against cyber infrastructure. (8) Security measures. Specific details of aviation or maritime transportation security measures... infrastructure asset information. Any list identifying systems or assets, whether physical or virtual, so vital...

  12. 49 CFR 15.5 - Sensitive security information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... sources and methods used to gather or develop threat information, including threats against cyber infrastructure. (8) Security measures. Specific details of aviation or maritime transportation security measures... infrastructure asset information. Any list identifying systems or assets, whether physical or virtual, so vital...

  13. 49 CFR 15.5 - Sensitive security information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... sources and methods used to gather or develop threat information, including threats against cyber infrastructure. (8) Security measures. Specific details of aviation or maritime transportation security measures... infrastructure asset information. Any list identifying systems or assets, whether physical or virtual, so vital...

  14. 49 CFR 15.5 - Sensitive security information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... sources and methods used to gather or develop threat information, including threats against cyber infrastructure. (8) Security measures. Specific details of aviation or maritime transportation security measures... infrastructure asset information. Any list identifying systems or assets, whether physical or virtual, so vital...

  15. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
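
    The Python-embedded syntax the abstract mentions is compact. A minimal example, assuming the open source projectq package and its default simulator backend, compiles and runs a one-qubit circuit:

      from projectq import MainEngine
      from projectq.ops import H, Measure

      eng = MainEngine()            # uses the built-in simulator by default
      qubit = eng.allocate_qubit()
      H | qubit                     # "gate | qubit" is ProjectQ's invocation syntax
      Measure | qubit
      eng.flush()                   # push the circuit through the compiler engines
      print("Measured:", int(qubit))

    Swapping the backend passed to MainEngine is how the same program is retargeted from simulation to hardware.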

  16. Comparing drinking water treatment costs to source water protection costs using time series analysis.

    EPA Science Inventory

    We present a framework to compare water treatment costs to source water protection costs, an important knowledge gap for drinking water treatment plants (DWTPs). This trade-off helps to determine what incentives a DWTP has to invest in natural infrastructure or pollution reductio...

  17. SNL Five-Year Facilities & Infrastructure Plan FY2015-2019

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipriani, Ralph J.

    2014-12-01

    Sandia’s development vision is to provide an agile, flexible, safer, more secure, and efficient enterprise that leverages the scientific and technical capabilities of the workforce and supports national security requirements in multiple areas. Sandia’s Five-Year Facilities & Infrastructure Planning program represents a tool to budget and prioritize immediate and short-term actions from indirect funding sources in light of the bigger picture of proposed investments from direct-funded, Work for Others and other funding sources. As a complementary F&I investment program, Sandia’s indirect investment program supports incremental achievement of the development vision within a constrained resource environment.

  18. Open source GIS for HIV/AIDS management

    PubMed Central

    Vanmeulebrouk, Bas; Rivett, Ulrike; Ricketts, Adam; Loudon, Melissa

    2008-01-01

    Background Reliable access to basic services can improve a community's resilience to HIV/AIDS. Accordingly, work is being done to upgrade the physical infrastructure in affected areas, often employing a strategy of decentralised service provision. Spatial characteristics are one of the major determinants in implementing services, even in the smaller municipal areas, and good quality spatial information is needed to inform decision making processes. However, limited funds, technical infrastructure and human resource capacity result in little or no access to spatial information for crucial infrastructure development decisions at local level. This research investigated whether it would be possible to develop a GIS for basic infrastructure planning and management at local level. Given the resource constraints of the local government context, particularly in small municipalities, it was decided that open source software should be used for the prototype system. Results The design and development of a prototype system illustrated that it is possible to develop an open source GIS system that can be used within the context of local information management. Usability tests show a high degree of usability for the system, which is important considering the heavy workload and high staff turnover that characterises local government in South Africa. Local infrastructure management stakeholders interviewed in a case study of a South African municipality see the potential for the use of GIS as a communication tool and are generally positive about the use of GIS for these purposes. They note security issues that may arise through the sharing of information, lack of skills and resource constraints as the major barriers to adoption. Conclusion The case study shows that spatial information is an identified need at local level. Open source GIS software can be used to develop a system to provide local-level stakeholders with spatial information. However, the suitability of the technology is only a part of the system – there are wider information and management issues which need to be addressed before the implementation of a local-level GIS for infrastructure management can be successful. PMID:18945338

  19. Low-Temperature Hydrothermal Resource Potential

    DOE Data Explorer

    Katherine Young

    2016-06-30

    Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maheras, Steven J.; Best, Ralph E.; Ross, Steven B.

    This report presents a preliminary evaluation of removing used nuclear fuel (UNF) from 12 shutdown nuclear power plant sites. At these shutdown sites the nuclear power reactors have been permanently shut down and the sites have been decommissioned or are undergoing decommissioning. The shutdown sites are Maine Yankee, Yankee Rowe, Connecticut Yankee, Humboldt Bay, Big Rock Point, Rancho Seco, Trojan, La Crosse, Zion, Crystal River, Kewaunee, and San Onofre. The evaluation was divided into four components: characterization of the UNF and greater-than-Class C low-level radioactive waste (GTCC waste) inventory; a description of the on-site infrastructure and conditions relevant to transportation of UNF and GTCC waste; an evaluation of the near-site transportation infrastructure and experience relevant to shipping transportation casks containing UNF and GTCC waste, including identification of gaps in information; and an evaluation of the actions necessary to prepare for and remove UNF and GTCC waste. The primary sources for the inventory of UNF and GTCC waste are the U.S. Department of Energy (DOE) RW-859 used nuclear fuel inventory database, industry sources such as StoreFUEL and SpentFUEL, and government sources such as the U.S. Nuclear Regulatory Commission. The primary sources for information on the conditions of site and near-site transportation infrastructure and experience included observations and information collected during visits to the Maine Yankee, Yankee Rowe, Connecticut Yankee, Humboldt Bay, Big Rock Point, Rancho Seco, Trojan, La Crosse, and Zion sites; information provided by managers at the shutdown sites; Facility Interface Data Sheets compiled for DOE in 2005; Services Planning Documents prepared for DOE in 1993 and 1994; industry publications such as Radwaste Solutions; and Google Earth. State and Regional Group representatives, a Tribal representative, and a Federal Railroad Administration representative participated in six of the shutdown site visits. Every site was found to have at least one off-site transportation mode option for removing its UNF and GTCC waste; some have multiple options. Experience removing large components during reactor decommissioning provided an important source of information used to identify the transportation mode options for the sites. Especially important in conducting the evaluation were site visits, through which information was obtained that would not have been available otherwise. Extensive photographs taken during the site visits proved to be particularly useful in documenting the current conditions at or near the sites.
Additional conclusions from this evaluation include: The 12 shutdown sites use designs from 4 different suppliers involving 9 different (horizontal and vertical) dry storage systems that would require the use of 8 different transportation cask designs to remove the UNF and GTCC waste from the shutdown sites; Although there are common aspects, each site has some unique features and/or conditions; Although some regulatory actions will be required, all UNF at the initial 9 shutdown sites (Maine Yankee, Yankee Rowe, Connecticut Yankee, Humboldt Bay, Big Rock Point, Rancho Seco, Trojan, La Crosse, and Zion) is in licensed systems that can be transported, including a small amount of high-burnup fuel; Each site indicated that 2-3 years of advance time would be required for its preparations before shipments could begin; Most sites have more than one transportation option, e.g., rail, barge, or heavy haul truck, as well as constraints and preferences. It is expected that additional site visits will be conducted to add to the information presented in the evaluation.

  1. The Potential for Soviet Penetration of the Pacific Islands: An Assessment,

    DTIC Science & Technology

    1984-12-01

    economies also implies diseconomies of scale in both the public and private sectors. Investments in economic and social infrastructures such as...source) was less to be feared by the Pacific Islands under current circumstances than economic domination (again, whatever the source). Economic

  2. Etymological and Phonetic Changes among Foreign Words in Kiswahili.

    ERIC Educational Resources Information Center

    Patel, R.B.

    1967-01-01

    The author considers the etymological sources and phonetic changes which have occurred in such words as "bangi,""butu,""kalua,""mrututu," and "sambarau." The source of these words, which have found a place in Swahili, has been doubted or could not be established by compilers of different Swahili dictionaries. The author feels that the study and…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gydesen, S.P.

    The purpose of this letter report is to reconstruct from available information the data that can be used to develop a daily reactor operating history for 1960--1964. The information needed for source term calculations (as determined by the Source Terms Task Leader) was extracted and included in this report. The data on the amount of uranium dissolved by the separations plants (expressed both as tons and as MW) is also included in this compilation.

  4. Trading Time with Space - Development of subduction zone parameter database for a maximum magnitude correlation assessment

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Wenzel, Friedemann

    2017-04-01

    Subduction zones are generally the sources of the earthquakes with the highest magnitudes. Not only in Japan or Chile, but also in Pakistan, the Solomon Islands or the Lesser Antilles, subduction zones pose a significant hazard to people. To understand the behavior of subduction zones, especially to identify their capability to produce maximum-magnitude earthquakes, various physical models have been developed, leading to a large number of datasets, e.g. from geodesy, geomagnetics, structural geology, etc. There have been various studies that utilize these data to compile subduction zone parameter databases, but they mostly concentrate on the major zones. Here, we compile the largest dataset of subduction zone parameters to date, both in parameter diversity and in the number of subduction zones considered. In total, more than 70 individual sources have been assessed, and the aforementioned parametric data have been combined with seismological data and many more sources, leading to more than 60 individual parameters. Not all parameters have been resolved for each zone, since completeness depends on the data availability and quality for each source. In addition, the 3D down-dip geometry of a majority of the subduction zones has been resolved using historical earthquake hypocenter data and centroid moment tensors where available, and compared with and verified against results from previous studies. With such a database, a statistical study has been undertaken to identify correlations between these parameters, both to derive a parameter-driven way of estimating the maximum possible magnitude and to identify similarities between the sources themselves. This identification of similarities leads to a classification system for subduction zones: if two sources share enough common characteristics, other characteristics of interest may be expected to be similar as well. This concept effectively trades time with space, considering subduction zones where the maximum possible event has likely not yet been observed. By identifying sources of the same class, the not-yet-observed temporal behavior can be replaced by spatial similarity among different subduction zones. This database aims to enhance the research and understanding of subduction zones and to quantify their potential for producing mega-earthquakes, considering potential strong-motion impact on nearby cities as well as their tsunami potential.
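
    A hedged sketch of the correlation-and-classification step described above (this is not the authors' code; the parameters, values and zone names below are hypothetical placeholders) could look as follows, using pandas and SciPy:

      import pandas as pd
      from scipy.cluster.hierarchy import linkage, fcluster

      zones = pd.DataFrame(
          {
              "convergence_rate_mm_yr": [78, 65, 23, 70],
              "slab_dip_deg":           [17, 25, 30, 12],
              "max_observed_mw":        [9.5, 9.1, 8.1, 9.2],
          },
          index=["ZoneA", "ZoneB", "ZoneC", "ZoneD"],
      )

      # Parameter-parameter correlations (e.g. which parameters track max Mw).
      print(zones.corr())

      # "Trading time with space": group zones by similarity of standardized
      # parameters, so the class of a zone hints at its unobserved behavior.
      z = (zones - zones.mean()) / zones.std()
      labels = fcluster(linkage(z.values, method="ward"), t=2, criterion="maxclust")
      print(dict(zip(zones.index, labels)))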

  5. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Includes May 1979 edition and Supplements 1-15.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    Compiled are abstracts and indexes to selected print and non-print materials related to wastewater treatment and water quality education and instruction, as well as materials related to pesticides, hazardous wastes, and public participation. Sources of abstracted/indexed materials include all levels of government, private concerns, and educational…

  6. Phytochemica: a platform to explore phytochemicals of medicinal plants.

    PubMed

    Pathania, Shivalika; Ramakrishnan, Sai Mukund; Bagler, Ganesh

    2015-01-01

    Plant-derived molecules (PDMs) are known to be a rich source of diverse scaffolds that could serve as the basis for rational drug design. Structured compilation of phytochemicals from traditional medicinal plants can facilitate prospection for novel PDMs and their analogs as therapeutic agents. Atropa belladonna, Catharanthus roseus, Heliotropium indicum, Picrorhiza kurroa and Podophyllum hexandrum are important Himalayan medicinal plants, reported to have immense therapeutic properties against various diseases. We present Phytochemica, a structured compilation of 963 PDMs from these plants, inclusive of their plant part source, chemical classification, IUPAC names, SMILES notations, physicochemical properties and 3-dimensional structures with associated references. Phytochemica is an exhaustive resource of natural molecules facilitating prospection for therapeutic molecules from medicinally important plants. It also offers a refined search option to explore the neighbourhood of chemical space against the ZINC database and identify analogs of natural molecules at a user-defined cut-off. Availability of this structured phytochemical dataset may enable its direct use in in silico drug discovery, hastening the process of lead identification from natural products under a proposed hypothesis, and may help meet the urgent need for phytomedicines. Compilation and accessibility of indigenous phytochemicals and their derivatives can be a source of considerable advantage to research institutes as well as industries. home.iitj.ac.in/~bagler/webservers/Phytochemica. © The Author(s) 2015. Published by Oxford University Press.

  7. Lake Ontario geological and geophysical data sources

    USGS Publications Warehouse

    Hutchinson, Deborah R.; Wold, Richard J.

    1979-01-01

    A bibliography of various geological and geophysical data sources was compiled as part of an overall effort to evaluate the status of research on the Great Lakes. We hope that such a summary will be a catalyst for additional work and an aid in planning future work. Our presentation has two forms: maps showing the locations of the different data types, and a bibliography listing the references from the maps plus additional relevant papers. The charts shown in this map summarize the data sources for Lake Ontario.

  8. Alternative Fuels Data Center: Federal Legislation Supports and Advances

    Science.gov Websites

    the Act addressed mobile source pollution and set forth limitations intended to reduce emissions. The Energy Policy and Conservation Act of 1975 further regulated mobile sources through the implementation of and alternative fuel infrastructure. For additional information about these and other relevant federal

  9. Safe, Multiphase Bounds Check Elimination in Java

    DTIC Science & Technology

    2010-01-28

    production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses...invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual...unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds

  10. Physical properties of alternatives to the fully halogenated chlorofluorocarbons

    NASA Technical Reports Server (NTRS)

    Mclinden, Mark O.

    1990-01-01

    Presented here are recommended values and correlations of selected physical properties of several alternatives to the fully halogenated chlorofluorocarbons. The quality of the data used in this compilation varies widely, ranging from well-documented, high accuracy measurements from published sources to completely undocumented values listed on anonymous data sheets. That some of the properties for some fluids are available only from the latter type of source is clearly not the desired state of affairs. While some would reject all such data, the compilation given here is presented in the spirit of laying out the present state of knowledge and making available a set of data in a timely manner, even though its quality is sometimes uncertain. The correlations presented here are certain to change quickly as additional information becomes available.

  11. Spectral comparisons of sunlight and different lamps

    NASA Technical Reports Server (NTRS)

    Deitzer, Gerald

    1994-01-01

    The tables in this report were compiled to characterize the spectra of available lamp types and provide comparison to the spectra of sunlight. Table 1 reports the spectral distributions for various lamp sources and compares them to those measured for sunlight. Table 2 provides the amount of energy in W m^-2 relative to the number of photons of PAR (photosynthetically active radiation) (400-700 nm) for each light source.

  12. Assistive Technology: A Funding Workbook. Part I, A Road Map to Funding Sources. Part II, An Outline of Federal Laws and Rules. 1991 Edition. RESNA Technical Assistance Project.

    ERIC Educational Resources Information Center

    Morris, Michael W.; Golinker, Lewis A.

    This compilation on assistive technology devices and services aims to improve understanding of funding streams and creative ways to eliminate funding barriers in Medicaid, special education, and vocational rehabilitation. The first part comprises a workbook titled "A Road Map to Funding Sources." It assists in creating a systematic…

  13. Low-Temperature Hydrothermal Resource Potential Estimate

    DOE Data Explorer

    Katherine Young

    2016-06-30

    Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.

  14. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    USGS Publications Warehouse

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation of open-ocean cliff edges for the California coast is a separate yet related study to Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California coastline at http://pubs.usgs.gov/of/2007/1133/ for additional information regarding methods and results (Hapke and others, 2007). Data in this report are organized into downloadable layers by region (Northern, Central and Southern California) and are provided as vector datasets with accompanying metadata. Vector cliff edges may represent a compilation of data from one or more sources and the sources used are included in the dataset metadata. This project employs the Environmental Systems Research Institute's (ESRI) ArcGIS as its Geographic Information System (GIS) mapping tool and contains several data layers (shapefiles) that are used to create a geographic view of the California coast. The vector data form a basemap comprising polygon and line themes that include a U.S. coastline (1:80,000), U.S. cities, and state boundaries.
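
    The end-point rate calculation mentioned above reduces to the net cliff-edge displacement divided by the elapsed time between the two surveys. A small sketch, with a hypothetical transect:

      from datetime import date

      def end_point_rate(pos_old_m: float, pos_new_m: float,
                         t_old: date, t_new: date) -> float:
          """Retreat rate in metres per year (positive = landward retreat)."""
          years = (t_new - t_old).days / 365.25
          return (pos_new_m - pos_old_m) / years

      # Hypothetical transect: edge moved 18 m landward between 1932 and 2000.
      print(end_point_rate(0.0, 18.0, date(1932, 7, 1), date(2000, 7, 1)))  # ~0.26 m/yr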

  15. Alaska’s energy anxiety

    Treesearch

    Rachel White; David Nicholls; Allen Brackley; Robert Deering

    2015-01-01

    A severe climate, high transportation costs, and limited infrastructure have led to high energy costs in many places in Alaska, particularly in remote villages. Some communities are seeking opportunities to convert oil and electrical heat to a renewable energy source such as woody biomass. There are different types of wood energy appliances and fuel sources, and...

  16. Coastal estuaries and lagoons: The delicate balance at the edge of the sea

    USGS Publications Warehouse

    Conrads, Paul A.; Rodgers, Kirk D.; Passeri, Davina L.; Prinos, Scott T.; Smith, Christopher; Swarzenski, Christopher M.; Middleton, Beth A.

    2018-04-19

    Coastal communities are increasingly concerned about the dynamic balance between freshwater and saltwater because of its implications for societal, economic, and ecological resources. While the mixing of freshwater and saltwater sources defines coastal estuaries and lagoons, sudden changes in this balance can have a large effect on critical ecosystems and infrastructure. Any change to the delivery of water from either source has the potential to affect the health of both humans and natural biota and also to damage coastal infrastructure. This fact sheet discusses the potential of major shifts in the dynamic freshwater-saltwater balance to alter the environment and coastal stability.

  17. A Workflow to Model Microbial Loadings in Watersheds ...

    EPA Pesticide Factsheets

    Many watershed models simulate overland and instream microbial fate and transport, but few actually provide loading rates on land surfaces and point sources to the water body network. This paper describes the underlying general equations for microbial loading rates associated with 1) land-applied manure on undeveloped areas from domestic animals; 2) direct shedding on undeveloped lands by domestic animals and wildlife; 3) urban or engineered areas; and 4) point sources that directly discharge to streams from septic systems and shedding by domestic animals. A microbial source module, which houses these formulations, is linked within a workflow containing eight models and a set of databases that form a loosely configured modeling infrastructure which supports watershed-scale microbial source-to-receptor modeling by focusing on animal-impacted catchments. A hypothetical example application – accessing, retrieving, and using real-world data – demonstrates the ability of the infrastructure to automate many of the manual steps associated with a standard watershed assessment, culminating with calibrated flow and microbial densities at the pour point of a watershed. Presented at 2016 Biennial Conference, International Environmental Modelling & Software Society.
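
    As an illustration of the kind of loading-rate equation the paper describes for direct shedding on undeveloped lands (the actual formulations are in the paper; the function and all parameter values below are hypothetical placeholders):

      def shedding_load_cfu_per_day(animals):
          """Direct-shedding load: sum over species of count * faecal output * concentration."""
          return sum(a["count"] * a["kg_feces_per_day"] * a["cfu_per_kg"]
                     for a in animals.values())

      # Hypothetical animal-impacted catchment.
      herd = {
          "cattle": {"count": 120, "kg_feces_per_day": 25.0, "cfu_per_kg": 1.0e8},
          "deer":   {"count": 40,  "kg_feces_per_day": 1.5,  "cfu_per_kg": 5.0e7},
      }
      print(f"{shedding_load_cfu_per_day(herd):.3e} cfu/day")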

  18. Biographical Sources in the Sciences--Life, Earth and Physical Sciences (1989-2006). LC Science Tracer Bullet. TB 06-4

    ERIC Educational Resources Information Center

    Freitag, Ruth, Comp.; Bradley, Michelle Cadoree, Comp.

    2006-01-01

    This guide offers a systematic approach to the wide variety of published biographical information on men and women of science in the life, earth and physical sciences, primarily from 1989 to 2006, and complements Library of Congress Science Tracer Bullet "TB88-3" ("Biographical Sources in the Sciences," compiled 1988 [ED306074]) and "TB06-7"…

  19. Chronology of KSC and KSC Related Events for 2012

    NASA Technical Reports Server (NTRS)

    Liston, Elaine E.

    2013-01-01

    This 2012 Chronology is published to describe and document KSC's role in NASA's progress. Materials for this Chronology were selected from a number of published sources. The document records KSC events of interest to historians and other researchers. Arrangement is by date of occurrence, though the source cited may be dated one or more days after the event. Materials were researched and compiled for publication by Archivist Elaine Liston.

  20. Map showing selected surface-water data for the Manti 30 x 60-minute Quadrangle, Utah

    USGS Publications Warehouse

    Price, Don

    1984-01-01

    This is one of a series of maps that describe the geology and related natural resources of the Manti 30 x 60 minute quadrangle. Streamflow records used to compile this map were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights, and the Utah Department of Transportation. The principal runoff-producing areas shown on the map were delineated from a work map (scale 1:250,000) compiled to estimate water yields in Utah (Bagley and others, 1964). Sources of information about recorded floods resulting from cloudbursts included Woolley (1946) and Butler and Marsell (1972); sources of information about the chemical quality of streamflow included Hahl and Cabell (1965) and Mundorff and Thompson (1982).

  1. Map showing selected surface-water data for the Huntington 30 x 60-minute quadrangle, Utah

    USGS Publications Warehouse

    Price, Don

    1984-01-01

    This is one of a series of maps that describe the geology and related natural resources of the Huntington 30 x 60-minute quadrangle, Utah. Streamflow records used to compile this map were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights, and the Utah Department of Transportation. The principal runoff-producing area shown on the map was delineated from a work map (scale 1:250,000) compiled to estimate water yields in Utah (Bagley and others, 1964). Sources of information about recorded floods resulting from cloudbursts included Woolley (1946) and Butler and Marsell (1972); sources of information about the chemical quality of streamflow included Mundorff (1972) and Mundorff and Thompson (1982).

  2. Map showing selected surface-water data for the Price 30 x 60-minute Quadrangle, Utah

    USGS Publications Warehouse

    Price, Don

    1984-01-01

    This is one of a series of maps that describe the geology and related natural resources of the Price 30 x 60-minute quadrangle, Utah. Streamflow records used to compile this map were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights, and the Utah Department of Transportation. The principal runoff-producing areas shown on the map were delineated from a work map (scale 1:250,000) compiled to estimate water yields in Utah (Bagley and others, 1964). Sources of information about recorded floods resulting from cloudbursts included Woolley (1946) and Butler and Marsell (1972); sources of information about the chemical quality of streamflow included Mundorff (1972; 1977), and Waddell and others (1982).

  3. A Bibliography of Library Materials for Vocational-Technical Programs in Community Colleges.

    ERIC Educational Resources Information Center

    Sullivan, Peggy, Ed.

    Compiled at a 1-week institute held at Oregon State University in June 1969, this bibliography has four major sections: (1) an alphabetized listing of sources of materials including the names and addresses of the organizations and the major subject areas of concern, (2) an alphabetized listing of materials arranged by title and including source or…

  4. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University, has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  5. Retrofitting impervious urban infrastructure with green technology for rainfall-runoff restoration, indirect reuse and pollution load reduction.

    PubMed

    Sansalone, John; Raje, Saurabh; Kertesz, Ruben; Maccarone, Kerrilynn; Seltzer, Karl; Siminari, Michele; Simms, Peter; Wood, Brandon

    2013-12-01

    The built environs alter hydrology and water resource chemistry. Florida is subject to nutrient criteria and is promulgating "no-net-load-increase" criteria for runoff and constituents (nutrients and particulate matter, PM). With such criteria, green infrastructure, hydrologic restoration, indirect reuse and source control are potential design solutions. The study simulates runoff and constituent load control through urban source area re-design to provide long-term "no-net-load-increases". A long-term continuous simulation of pre- and post-development response for an existing surface parking facility is quantified. Retrofits include a biofiltration area reactor (BAR) for hydrologic and denitrification control. A linear infiltration reactor (LIR) of cementitious permeable pavement (CPP) provides infiltration, adsorption and filtration. Pavement cleaning provided source control. Simulation of climate and source area data indicates re-design achieves "no-net-load-increases" at lower costs compared to standard construction. The retrofit system yields lower cost per nutrient load treated compared to Best Management Practices (BMPs). Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. A Note on Compiling Fortran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L. E.

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: The first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
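
    A sketch of the two-pass scheme, assuming gfortran (whose -fsyntax-only option also writes .mod files; the note reports analogous options in ifort and xlf) and a hypothetical file list given in dependency order:

      import subprocess
      from concurrent.futures import ThreadPoolExecutor

      sources = ["b_module.f90", "a_uses_b.f90", "main.f90"]  # dependency order

      # Pass 1: fast, serial, produces the .mod files that create the dependencies.
      for src in sources:
          subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

      # Pass 2: slow code generation, now safe to run in parallel since every
      # .mod file already exists.
      def compile_obj(src: str) -> None:
          subprocess.run(["gfortran", "-c", "-O2", src], check=True)

      with ThreadPoolExecutor() as pool:
          list(pool.map(compile_obj, sources))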

  7. Multiobjective optimization of cluster-scale urban water systems investigating alternative water sources and level of decentralization

    NASA Astrophysics Data System (ADS)

    Newman, J. P.; Dandy, G. C.; Maier, H. R.

    2014-10-01

    In many regions, conventional water supplies are unable to meet projected consumer demand. Consequently, interest has arisen in integrated urban water systems, which involve the reclamation or harvesting of alternative, localized water sources. However, this makes the planning and design of water infrastructure more difficult, as multiple objectives need to be considered, water sources need to be selected from a number of alternatives, and end uses of these sources need to be specified. In addition, the scale at which each treatment, collection, and distribution network should operate needs to be investigated. In order to deal with this complexity, a framework for planning and designing water infrastructure taking into account integrated urban water management principles is presented in this paper and applied to a rural greenfield development. Various options for water supply, and the scale at which they operate were investigated in order to determine the life-cycle trade-offs between water savings, cost, and GHG emissions as calculated from models calibrated using Australian data. The decision space includes the choice of water sources, storage tanks, treatment facilities, and pipes for water conveyance. For each water system analyzed, infrastructure components were sized using multiobjective genetic algorithms. The results indicate that local water sources are competitive in terms of cost and GHG emissions, and can reduce demand on the potable system by as much as 54%. Economies of scale in treatment dominated the diseconomies of scale in collection and distribution of water. Therefore, water systems that connect large clusters of households tend to be more cost efficient and have lower GHG emissions. In addition, water systems that recycle wastewater tended to perform better than systems that captured roof-runoff. Through these results, the framework was shown to be effective at identifying near optimal trade-offs between competing objectives, thereby enabling informed decisions to be made when planning water systems for greenfield developments.
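
    The multiobjective trade-off at the heart of this study can be illustrated with a simple Pareto-dominance filter over candidate designs (a sketch only; the design names and objective values below are hypothetical, and the study itself used multiobjective genetic algorithms to generate such fronts):

      designs = [  # (name, cost M$, GHG kt/yr, % potable demand saved)
          ("roof_runoff_lot",    4.1, 1.9, 20),
          ("wastewater_cluster", 5.0, 1.6, 54),
          ("potable_only",       3.5, 2.4,  0),
          ("hybrid_cluster",     6.2, 1.7, 50),
      ]

      def dominates(a, b):
          """a dominates b: no worse in every objective, better in at least one."""
          na = (a[1], a[2], -a[3])   # minimize cost, GHG, and negative savings
          nb = (b[1], b[2], -b[3])
          return all(x <= y for x, y in zip(na, nb)) and na != nb

      pareto = [d for d in designs if not any(dominates(e, d) for e in designs)]
      print([d[0] for d in pareto])   # the non-dominated trade-off set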

  8. Alternative Fuels Data Center

    Science.gov Websites

    legislation dates back to the Clean Air Act of 1970, which created initiatives to reduce mobile sources of pollution… These acts also include provisions related to alternative fuel vehicles (AFVs) and infrastructure. The Energy… alternative fuel use and infrastructure development. The Energy Independence and Security Act of 2007 included…

  9. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program development system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (for example, VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed in several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.
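
    A toy sketch of the master-source idea (MAX's actual directive syntax is not documented in this summary, so the tagged-line convention below is invented for illustration): lines tagged for a target are kept and the rest dropped, yielding one compilable instance per machine or operating system.

      # Hypothetical master source: lines beginning "*TAG" belong to one target.
      master_lines = [
          "      PROGRAM DEMO",
          "*VAX CALL VMS_INIT()",
          "*CRAY CALL CRAY_INIT()",
          "      PRINT *, 'HELLO'",
          "      END",
      ]

      def instantiate(lines, target):
          """Emit the compilable instance for one target from the master copy."""
          out = []
          for line in lines:
              if line.startswith("*"):              # version-tagged line: *TAG code
                  tag, _, code = line[1:].partition(" ")
                  if tag == target and code.strip():
                      out.append("      " + code.strip())
              else:                                 # common line, kept everywhere
                  out.append(line)
          return "\n".join(out)

      print(instantiate(master_lines, "VAX"))       # emits the VAX/VMS instance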

  10. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially when the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and thoroughly check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT is implemented with an MVC architecture using open-source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and compiling it. The test results show that the AGT application runs well.
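
    The core of any such grader is a compile-and-test loop. The Python fragment below sketches that step only; AGT itself is a Laravel/PHP web application, and the file name, test input, and use of gcc here are illustrative assumptions.

        import os
        import subprocess
        import tempfile

        def grade(source_path, stdin_data, expected_stdout):
            # Compile the submission; a failed build is reported back verbatim.
            exe = os.path.join(tempfile.mkdtemp(), "a.out")
            build = subprocess.run(["gcc", source_path, "-o", exe],
                                   capture_output=True, text=True)
            if build.returncode != 0:
                return "compile error:\n" + build.stderr
            # Run against one test case; the timeout guards against infinite
            # loops (a real grader would catch subprocess.TimeoutExpired).
            run = subprocess.run([exe], input=stdin_data, capture_output=True,
                                 text=True, timeout=5)
            return "pass" if run.stdout.strip() == expected_stdout.strip() else "fail"

        # e.g. print(grade("sum.c", "2 3\n", "5"))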

  11. Practical Parenting Tips: Over 1,500 Helpful Hints for the First Five Years. Revised and Updated.

    ERIC Educational Resources Information Center

    Lansky, Vicki

    Noting that other parents can be an excellent source of practical parenting advice, this book compiles over a thousand practical tips--those not generally found in baby care books or pediatrician's offices--with over 400 new tips compiled since the book's 1982 version. Major topics include: (1) new baby care, including cesarean deliveries,…

  12. Programming in HAL/S

    NASA Technical Reports Server (NTRS)

    Ryer, M. J.

    1978-01-01

    HAL/S is a computer programming language; it is a representation for algorithms which can be interpreted by either a person or a computer. HAL/S compilers transform blocks of HAL/S code into machine language which can then be directly executed by a computer. When the machine language is executed, the algorithm specified by the HAL/S code (source) is performed. This document describes how to read and write HAL/S source.

  13. [Urologists in National Socialism: persecuted, displaced and murdered urologists: list of persons and overview of sources used].

    PubMed

    Bellmann, J

    2012-07-01

    The names of 247 urologists or physicians working in the field of urology are mentioned, including their biographical data and the sources used to compile their biographies. Each name represents an individual experience of persecution, displacement and in some cases deliberate murder. Investigating these life histories has not yet been completed, as additional information becomes available in the course of time.

  14. VizieR Online Data Catalog: Rotation measures of radio point sources (Xu+, 2014)

    NASA Astrophysics Data System (ADS)

    Xu, J.; Han, J.-L.

    2015-04-01

    We compiled a catalog of Faraday rotation measures (RMs) for 4553 extragalactic radio point sources published in the literature. These RMs were derived from multi-frequency polarization observations. The RM data are compared to those in the NRAO VLA Sky Survey (NVSS) RM catalog. We reveal a systematic uncertainty of about 10.0+/-1.5 rad/m2 in the NVSS RM catalog. The Galactic foreground RM is calculated through a weighted averaging method by using the compiled RM catalog together with the NVSS RM catalog, with careful consideration of uncertainties in the RM data. The data from the catalog and the interface for the Galactic foreground RM calculations are publicly available on the webpage: http://zmtt.bao.ac.cn/RM/ . (2 data files).
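
    The abstract does not give the exact weighting scheme, so the sketch below assumes the standard inverse-variance form, in which each rotation measure contributes with weight 1/sigma^2; it illustrates the arithmetic of a weighted average with uncertainties, nothing more.

        def weighted_rm(rms, sigmas):
            # Inverse-variance weighted mean and its formal uncertainty.
            weights = [1.0 / s ** 2 for s in sigmas]
            mean = sum(w * r for w, r in zip(weights, rms)) / sum(weights)
            err = (1.0 / sum(weights)) ** 0.5
            return mean, err

        # e.g. three RMs (rad/m^2) toward neighbouring sight lines:
        print(weighted_rm([12.0, 18.0, 15.0], [3.0, 6.0, 4.0]))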

  15. Relationships and trends of E. Coli, human-associated Bacteroides, and pathogens in the Proctor Creek Watershed

    EPA Science Inventory

    Urban surface waters can be impacted by anthropogenic sources such as impervious surfaces, sanitary and storm sewers, and failing infrastructure. Fecal indicator bacteria (FIB) and microbial source tracking (MST) markers are common gauges of stream water quality; however, little...

  16. The TENCompetence Infrastructure: A Learning Network Implementation

    NASA Astrophysics Data System (ADS)

    Vogten, Hubert; Martens, Harrie; Lemmers, Ruud

    The TENCompetence project developed a first release of a Learning Network infrastructure to support individuals, groups and organisations in professional competence development. This Learning Network infrastructure was released as open source to the community, thereby allowing users and organisations to use and contribute to the development as they see fit. The infrastructure consists of client applications providing the user experience and server components that provide the services to these clients. These services implement the domain model (Koper 2006) by provisioning the entities of the domain model (see also Sect. 18.4) and henceforth will be referenced as domain entity services.

  17. Digital data used to relate nutrient inputs to water quality in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Brakebill, John W.; Preston, Stephen D.

    1999-01-01

    Digital data sets were compiled by the U.S. Geological Survey (USGS) and used as input for a collection of Spatially Referenced Regressions On Watershed attributes for the Chesapeake Bay region. These regressions relate streamwater loads to nutrient sources and the factors that affect the transport of these nutrients throughout the watershed. A digital segmented network based on watershed boundaries serves as the primary foundation for spatially referencing total nitrogen and total phosphorus source and land-surface characteristic data sets within a Geographic Information System. Digital data sets of atmospheric wet deposition of nitrate, point-source discharge locations, land cover, and agricultural sources such as fertilizer and manure were created and compiled from numerous sources and represent nitrogen and phosphorus inputs. Some land-surface characteristics representing factors that affect the transport of nutrients include land use, land cover, average annual precipitation and temperature, slope, and soil permeability. Nutrient input and land-surface characteristic data sets merged with the segmented watershed network provide the spatial detail by watershed segment required by the models. Nutrient stream loads were estimated for total nitrogen, total phosphorus, nitrate/nitrite, ammonium, phosphate, and total suspended solids at as many as 109 sites within the Chesapeake Bay watershed. The total nitrogen and total phosphorus load estimates are the dependent variables for the regressions and were used for model calibration. Other nutrient-load estimates may be used for calibration in future applications of the models.

  18. Improvements in the EQ-10 electrodeless Z-pinch EUV source for metrology applications

    NASA Astrophysics Data System (ADS)

    Horne, Stephen F.; Gustafson, Deborah; Partlow, Matthew J.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2011-04-01

    Now that EUV lithography systems are beginning to ship to fabs for next-generation chips, it is increasingly critical that EUV infrastructure development keep pace. Energetiq Technology has been shipping the EQ-10 Electrodeless Z-pinch™ light source since 2005. The source is currently used for metrology, mask inspection, and resist development. These applications require especially stable performance in both power and source size. Over the last 5 years Energetiq has made many source modifications, including better thermal management as well as high-pulse-rate operation. Recently we have further increased the system's power handling and electrical pulse reproducibility. The impact of these modifications on source performance will be reported.

  19. The management challenge for household waste in emerging economies like Brazil: realistic source separation and activation of reverse logistics.

    PubMed

    Fehr, M

    2014-09-01

    Business opportunities in the household waste sector in emerging economies still revolve around the activities of bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging economy context. Experimentation constructed and confirmed the recycling trilogy that consists of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions. Raw waste contained 70% biodegradable and 30% inert matter. Source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic. Only farmers would take the material as animal feed. No composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% of source-separated biodegradable discards from the landfills. © The Author(s) 2014.

  20. Footprints of air pollution and changing environment on the sustainability of built infrastructure.

    PubMed

    Kumar, Prashant; Imam, Boulent

    2013-02-01

    Over 150 research articles relating three multi-disciplinary topics (air pollution, climate change and civil engineering structures) are reviewed to examine the footprints of air pollution and a changing environment on the sustainability of building and transport structures (referred to as built infrastructure). The aim of this review is to synthesize the existing knowledge on this topic, highlight recent advances in our understanding and discuss research priorities. The article begins with background information on sources and emission trends of global warming (CO2, CH4, N2O, CFCs, SF6) and corrosive (SO2, O3, NOx) gases and their role in the deterioration of building materials (e.g. steel, stone, concrete, brick and wood) exposed in outdoor environments. A further section covers the impacts of climate- and pollution-derived chemical pathways, generally represented by dose-response functions (DRFs), and of changing environmental conditions on built infrastructure. The article concludes with a discussion of the topic areas covered and research challenges. A comprehensive inventory of DRFs is compiled. A case study analysing the inter-comparability of various DRFs on four different materials (carbon steel, limestone, zinc and copper) produced comparable results. Results of another case study revealed that future projected changes in temperature and/or relative humidity are expected to have a modest effect on the material deterioration rate, whereas changes in precipitation were found to have a more dominant impact. Evidence suggests that both changing and extreme environmental conditions are expected to affect the integrity of built infrastructure, both in terms of direct structural damage and indirect losses of transport network functionality. Unlike for stone and metals, substantially limited information is available on the deterioration of brick, concrete and wooden structures. Further research is warranted to develop more robust and theoretical DRFs for generalising their application, accurately mapping corrosion losses in an area, and costing the risk of corrosion damage. Copyright © 2012 Elsevier B.V. All rights reserved.
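
    Published DRFs typically express mass loss or surface recession as a function of pollutant dose, climate variables, and exposure time. The Python sketch below shows how such a function is applied in practice; its functional form and every coefficient are illustrative assumptions, not values from the review or from any standard DRF.

        import math

        def mass_loss(so2_ugm3, rain_mm, temp_c, years,
                      a=1.0, b=0.5, c=0.01, n=0.9):  # hypothetical coefficients
            # Dose-response shape many published DRFs share: a pollutant dose
            # term, a climate modifier, and a sub-linear time exponent.
            return (a * so2_ugm3 ** b * math.exp(c * temp_c)
                    * (1 + rain_mm / 1000.0) * years ** n)

        print(f"{mass_loss(so2_ugm3=20, rain_mm=800, temp_c=10, years=25):.1f} (arbitrary units)")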

  1. The National Assessment of Shoreline Change:A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    USGS Publications Warehouse

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change of the California coastline at http://pubs.usgs.gov/of/2006/1219/ for additional information regarding methods and results (Hapke et al., 2006). Data in this report are organized into downloadable layers by region (Northern, Central and Southern California) and are provided as vector datasets with metadata. Vector shorelines may represent a compilation of data from one or more sources, and these sources are included in the dataset metadata. This project employs the Environmental Systems Research Institute's (ESRI) ArcGIS as its GIS mapping tool and contains several data layers (shapefiles) that are used to create a geographic view of the California Coast. These vector data form a basemap comprised of polygon and line themes that include a U.S. coastline (1:80,000), U.S. cities, and state boundaries.
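
    The two rate calculations described above are easy to make concrete. The Python sketch below computes a long-term rate by linear regression over four shoreline positions and a short-term end-point rate from the two most recent; the dates and positions are invented for illustration.

        def linreg_rate(years, positions):
            # Ordinary least-squares slope: metres of shoreline change per year.
            n = len(years)
            ybar = sum(years) / n
            pbar = sum(positions) / n
            num = sum((y - ybar) * (p - pbar) for y, p in zip(years, positions))
            den = sum((y - ybar) ** 2 for y in years)
            return num / den

        years = [1870, 1930, 1960, 2000]
        pos = [0.0, -12.0, -20.0, -31.0]  # negative = landward retreat (erosion)
        long_term = linreg_rate(years, pos)
        short_term = (pos[-1] - pos[-2]) / (years[-1] - years[-2])  # end-point rate
        print(f"long-term {long_term:+.2f} m/yr, short-term {short_term:+.2f} m/yr")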

  2. Performance of Compiler-Assisted Memory Safety Checking

    DTIC Science & Technology

    2014-08-01

    ...software developer has in mind a particular object to which the pointer should point: the intended referent. A memory access error occurs when an ac... Performance of Compiler-Assisted Memory Safety Checking. David Keaton; Robert C. Seacord. August 2014. Technical Note CMU/SEI-2014-TN-... ...based memory safety checking tool and the performance that can be achieved with two such tools whose source code is freely available. The note then...

  3. Transformation Systems at NASA Ames

    NASA Technical Reports Server (NTRS)

    Buntine, Wray; Fischer, Bernd; Havelund, Klaus; Lowry, Michael; Pressburger, Tom; Roach, Steve; Robinson, Peter; VanBaalen, Jeffrey

    1999-01-01

    In this paper, we describe the experiences of the Automated Software Engineering Group at the NASA Ames Research Center in the development and application of three different transformation systems. The systems span the entire technology range, from deductive synthesis, to logic-based transformation, to almost compiler-like source-to-source transformation. These systems also span a range of NASA applications, including solving solar system geometry problems, generating data analysis software, and analyzing multi-threaded Java code.
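
    To make the last category concrete, the Python sketch below performs a small compiler-like source-to-source transformation using Python's own ast module, rewriting every x ** 2 into x * x; it illustrates the general technique only and is not a reconstruction of any of the NASA Ames systems, which operated on other languages.

        import ast

        class SquareToMul(ast.NodeTransformer):
            # Rewrite e ** 2 as e * e, leaving all other expressions intact.
            def visit_BinOp(self, node):
                self.generic_visit(node)
                if (isinstance(node.op, ast.Pow)
                        and isinstance(node.right, ast.Constant)
                        and node.right.value == 2):
                    return ast.BinOp(left=node.left, op=ast.Mult(), right=node.left)
                return node

        tree = SquareToMul().visit(ast.parse("y = (a + b) ** 2"))
        print(ast.unparse(ast.fix_missing_locations(tree)))  # y = (a + b) * (a + b)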

  4. Sunlamp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nutaro, James J

    The purpose of this model was to facilitate the design of a control system that uses fine-grained control of residential and small commercial HVAC loads to counterbalance voltage swings caused by intermittent solar power sources (e.g., rooftop panels) installed on the distribution circuit. Included are the source code and a pre-compiled 64-bit DLL for adding building HVAC loads to an OpenDSS distribution circuit. As written, the Makefile assumes you are using the Microsoft C++ development tools.

  5. Geologic map of Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hults, Chad P.; Mull, Charles G.; Karl, Susan M.

    2015-12-31

    This Alaska compilation is unique in that it is integrated with a rich database of information provided in the spatial datasets and standalone attribute databases. Within the spatial files every line and polygon is attributed to its original source; the references to these sources are contained in related tables, as well as in stand-alone tables. Additional attributes include typical lithology, geologic setting, and age range for the map units. Also included are tables of radiometric ages.

  6. Data for Quaternary faults, liquefaction features, and possible tectonic features in the Central and Eastern United States, east of the Rocky Mountain Front

    USGS Publications Warehouse

    Crone, Anthony J.; Wheeler, Russell L.

    2000-01-01

    The USGS is currently leading an effort to compile published geological information on Quaternary faults, folds, and earthquake-induced liquefaction in order to develop an internally consistent database on the locations, ages, and activity rates of major earthquake-related features throughout the United States. This report is the compilation for such features in the Central and Eastern United States (CEUS), which for the purposes of the compilation is defined as the region extending from the Rocky Mountain Front eastward to the Atlantic seaboard. A key objective of this national compilation is to provide a comprehensive database of Quaternary features that might generate strong ground motion and therefore should be considered in assessing the seismic hazard throughout the country. In addition to printed versions of regional and individual state compilations, the database will be available on the World-Wide Web, where it will be readily available to everyone. The primary purpose of these compilations and the derivative database is to provide a comprehensive, uniform source of geological information that can be used to complement the other types of data that are used in seismic-hazard assessments. Within our CEUS study area, which encompasses more than 60 percent of the conterminous U.S., we summarize the geological information on 69 features that are categorized into four classes (Class A, B, C, and D) based on what is known about the feature's Quaternary activity. The CEUS contains only 13 features of tectonic origin for which there is convincing evidence of Quaternary activity (Class A features). Of the remaining 56 features, 11 require further study in order to confidently define their potential as possible sources of earthquake-induced ground motion (Class B), whereas the remaining features either lack convincing geologic evidence of Quaternary tectonic faulting or have been studied carefully enough to determine that they do not pose a significant seismic hazard (Classes C and D). The correlation between historical seismicity and Quaternary faults and liquefaction features in the CEUS is generally poor, which probably reflects the long return times between successive movements on individual structures. Some Quaternary faults and liquefaction features are located in aseismic areas or where historical seismicity is sparse. These relations indicate that the record of historical seismicity does not identify all potential seismic sources in the CEUS. Furthermore, geological studies of some currently aseismic faults have shown that the faults have generated strong earthquakes in the geologically recent past. Thus, the combination of geological information and seismological data can provide better insight into potential earthquake sources and thereby contribute to better, more comprehensive seismic-hazard assessments.

  7. Region 9 Tribal Lands

    EPA Pesticide Factsheets

    Dataset of all Indian Reservations in US EPA Region 9 (California, Arizona and Nevada), with some reservation border areas of adjacent states included (adjacent areas of Colorado, New Mexico and Utah). Reservation boundaries are compiled from multiple sources and are derived from several different source scales. Information such as reservation type and primary tribe name is included with the feature dataset. Public Domain Allotments are not included in this data set.

  8. Mobile mapping of methane emissions and isoscapes

    NASA Astrophysics Data System (ADS)

    Takriti, Mounir; Ward, Sue; Wynn, Peter; Elias, Dafydd; McNamara, Niall

    2017-04-01

    Methane (CH4) is a potent greenhouse gas emitted from a variety of natural and anthropogenic sources. It is crucial to detect CH4 emissions accurately and efficiently and to identify their sources, both to improve our understanding of changing emission patterns and to identify ways to curtail their release into the atmosphere. With established methods, however, this can be challenging as well as time and resource intensive, owing to the temporal and spatial heterogeneity of many sources. To address this problem, we have developed a vehicle-mounted mobile system that combines high-precision CH4 measurements with isotopic mapping and dual-isotope source characterisation. Here we present details of the development and testing of a unique system for the detection and isotopic analysis of CH4 plumes, built around a Picarro isotopic (13C/12C) gas analyser and a high-precision Los Gatos greenhouse gas analyser. Combined with micrometeorological measurements and a mechanism for collecting discrete samples for high-precision dual-isotope (13C/12C, 2H/1H) analysis, the system enables mapping of concentrations as well as directional and isotope-based source verification. We then present findings from our mobile methane surveys around the north-west of England. This area includes a variety of natural and anthropogenic methane sources within a relatively small geographical area, including livestock farming, urban and industrial gas infrastructure, landfills and waste water treatment facilities, and wetlands. We show that the system successfully located leaks from natural gas infrastructure and emissions from agricultural activities, and distinguished the isotope signatures of these sources.
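
    The isotope-based source discrimination mentioned above rests on end-member mixing. The Python sketch below applies a simple two-end-member 13C mass balance with typical literature signatures (biogenic methane near -60 per mil, thermogenic near -40 per mil); these values and the function are illustrative assumptions, not measurements or methods from this study.

        def thermogenic_fraction(d13c_sample, d13c_bio=-60.0, d13c_thermo=-40.0):
            # Fraction of thermogenic (e.g. gas-infrastructure) methane implied
            # by a two-end-member mixing balance, clipped to [0, 1].
            f = (d13c_sample - d13c_bio) / (d13c_thermo - d13c_bio)
            return min(max(f, 0.0), 1.0)

        print(f"thermogenic fraction: {thermogenic_fraction(-47.0):.2f}")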

  9. Relationships and trends of E. Coli, human-associated bacteroides, and pathogens in the Proctor Creek watershed (GWRC 2017)

    EPA Science Inventory

    Urban surface waters can be impacted by anthropogenic sources such as impervious surfaces, sanitary and storm sewers, and failing infrastructure. Fecal indicator bacteria (FIB) and microbial source tracking (MST) markers are common gauges of stream water quality; however, little...

  10. Previous experience in manned space flight: A survey of human factors lessons learned

    NASA Technical Reports Server (NTRS)

    Chandlee, George O.; Woolford, Barbara

    1993-01-01

    Previous experience in manned space flight programs can be used to compile a data base of human factors lessons learned for the purpose of developing aids in the future design of inhabited spacecraft. The objectives are to gather information available from relevant sources, to develop a taxonomy of human factors data, and to produce a data base that can be used in the future for those people involved in the design of manned spacecraft operations. A study is currently underway at the Johnson Space Center with the objective of compiling, classifying, and summarizing relevant human factors data bearing on the lessons learned from previous manned space flights. The research reported defines sources of data, methods for collection, and proposes a classification for human factors data that may be a model for other human factors disciplines.

  11. Glossary of Water Resource Terms.

    ERIC Educational Resources Information Center

    Titelbaum, Olga Adler

    Twelve reference sources were used in the compilation of this glossary of water pollution control terminology. Definitions for 364 words, acronyms, and phrases are included with cross references. (KP)

  12. Central Appalachian basin natural gas database: distribution, composition, and origin of natural gases

    USGS Publications Warehouse

    Román Colón, Yomayra A.; Ruppert, Leslie F.

    2015-01-01

    The U.S. Geological Survey (USGS) has compiled a database consisting of three worksheets of central Appalachian basin natural gas analyses and isotopic compositions from published and unpublished sources of 1,282 gas samples from Kentucky, Maryland, New York, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. The database includes field and reservoir names, well and State identification number, selected geologic reservoir properties, and the composition of natural gases (methane; ethane; propane; butane; iso-butane [i-butane]; normal butane [n-butane]; iso-pentane [i-pentane]; normal pentane [n-pentane]; cyclohexane; and hexanes). In the first worksheet, location and American Petroleum Institute (API) numbers from public or published sources are provided for 1,231 of the 1,282 gas samples. A second worksheet of 186 gas samples was compiled from published sources and augmented with public location information and contains carbon, hydrogen, and nitrogen isotopic measurements of natural gas. The third worksheet is a key for all abbreviations in the database. The database can be used to better constrain the stratigraphic distribution, composition, and origin of natural gas in the central Appalachian basin.

  13. Evaluating changes in water quality with respect to nonpoint source nutrient management strategies in the Chesapeake Bay Watershed

    NASA Astrophysics Data System (ADS)

    Keisman, J.; Sekellick, A.; Blomquist, J.; Devereux, O. H.; Hively, W. D.; Johnston, M.; Moyer, D.; Sweeney, J.

    2014-12-01

    Chesapeake Bay is a eutrophic ecosystem with periodic hypoxia and anoxia, algal blooms, diminished submerged aquatic vegetation, and degraded stocks of marine life. Knowledge of the effectiveness of actions taken across the watershed to reduce nitrogen (N) and phosphorus (P) loads to the bay (i.e. "best management practices" or BMPs) is essential to its restoration. While nutrient inputs from point sources (e.g. wastewater treatment plants and other industrial and municipal operations) are tracked, inputs from nonpoint sources, including atmospheric deposition, farms, lawns, septic systems, and stormwater, are difficult to measure. Estimating reductions in nonpoint source inputs attributable to BMPs requires compilation and comparison of data on water quality, climate, land use, point source discharges, and BMP implementation. To explore the relation of changes in nonpoint source inputs and BMP implementation to changes in water quality, a subset of small watersheds (those containing at least 10 years of water quality monitoring data) within the Chesapeake Watershed were selected for study. For these watersheds, data were compiled on geomorphology, demographics, land use, point source discharges, atmospheric deposition, and agricultural practices such as livestock populations, crop acres, and manure and fertilizer application. In addition, data on BMP implementation for 1985-2012 were provided by the Environmental Protection Agency Chesapeake Bay Program Office (CBPO) and the U.S. Department of Agriculture. A spatially referenced nonlinear regression model (SPARROW) provided estimates attributing N and P loads associated with receiving waters to different nutrient sources. A recently developed multiple regression technique ("Weighted Regressions on Time, Discharge and Season" or WRTDS) provided an enhanced understanding of long-term trends in N and P loads and concentrations. A suite of deterministic models developed by the CBPO was used to estimate expected nutrient load reductions attributable to BMPs. Further quantification of the relation of land-based nutrient sources and BMPs to water quality in the bay and its tributaries must account for inconsistency in BMP data over time and uncertainty regarding BMP locations and effectiveness.

  14. Obtaining correct compile results by absorbing mismatches between data types representations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.
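
    The mechanism described in this and the two related records below is easy to sketch. The Python fragment maps a source-language AST through a type-conversion table into a target-language AST and, where no mapping exists, substitutes a special error node that stores the offending token so it can be emitted verbatim when unparsing; the dictionary AST encoding and the table contents are illustrative, not taken from the patent.

        # Hypothetical conversion table from source-language type names to
        # target-language type names.
        TYPE_TABLE = {"int32": "int", "float64": "double", "utf8": "String"}

        def convert(node):
            if isinstance(node, dict):            # interior node: recurse
                return {k: convert(v) for k, v in node.items()}
            if node in TYPE_TABLE:                # leaf: a convertible type name
                return TYPE_TABLE[node]
            return {"error_node": node}           # keep token for error processing

        src_ast = {"decl": {"type": "int32", "init": {"type": "complex128"}}}
        print(convert(src_ast))
        # {'decl': {'type': 'int', 'init': {'type': {'error_node': 'complex128'}}}}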

  15. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-03-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  16. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-11-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  17. Workshop on Mare Volcanism and Basalt Petrogenesis: Astounding Fundamental Concepts (AFC) Developed Over the Last Fifteen Years

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence A. (Editor); Longhi, John (Editor)

    1991-01-01

    Papers presented at the workshop on mare volcanism and basalt petrogenesis are compiled. The discussion of recent ideas and concepts within the context of this workshop allowed participants to catch up on developments over the last 15 years. The following subject areas were covered: (1) geological setting; (2) magma evolution and source regions; (3) magma source and ascent processes; and (4) history of volcanism.

  18. Pocket guide to transportation, 2000

    DOT National Transportation Integrated Search

    1999-12-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into five sections: (1) transportation system extent, (2) transportation and safety, (3) mobility, (4) transportation and the economy,...

  19. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    NASA Astrophysics Data System (ADS)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
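
    The two measurement ingredients named in the abstract, noise cross-correlation and logarithmic amplitude ratios, can be sketched in a few lines of Python with NumPy. The synthetic traces and the causal/acausal energy ratio below are only a toy illustration of the idea; the production system applies suitably pre-processed correlations and sensitivity kernels at scale.

        import numpy as np

        rng = np.random.default_rng(0)
        a = rng.standard_normal(2048)
        b = np.roll(a, 37) + 0.5 * rng.standard_normal(2048)  # delayed, noisy copy

        corr = np.correlate(b, a, mode="full")                # cross-correlation
        lags = np.arange(-len(a) + 1, len(a))
        causal = corr[lags > 0]
        acausal = corr[lags < 0][::-1]
        # Logarithmic ratio of causal to acausal correlation energy: a simple
        # proxy for source asymmetry along the station pair.
        log_ratio = np.log(np.sum(causal ** 2) / np.sum(acausal ** 2))
        print(f"peak lag: {lags[np.argmax(corr)]}, log energy ratio: {log_ratio:+.2f}")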

  20. Map showing selected surface-water data for the Nephi 30 x 60-minute quadrangle, Utah

    USGS Publications Warehouse

    Price, Don

    1984-01-01

    This is one of a series of maps that describe the geology and related natural resources of the Nephi 30 x 60-minute quadrangle, Utah. Streamflow records used to compile this map were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights, and the Utah Department of Transportation. The principal runoff-producing areas shown on the map were delineated from a work map (scale 1:250,000) compiled to estimate water yields in Utah (Bagley and others, 1964). Sources of information about recorded floods resulting from cloudbursts included Woolley (1946) and Butler and Marsell (1972); sources of information about the chemical quality of streamflow included Hahl and Cabell (1965), Mundorff (1972 and 1974), and Waddell and others (1982).

  1. Evaluation of Current Water Treatment and Distribution System Optimization to Provide Safe Drinking Water from Various Source Water Types and Conditions (Deliverable 5.2.C.1)

    EPA Science Inventory

    Increasingly, drinking water treatment plants (DWTPs) are being challenged by changes in the quality of their source waters and by their aging treatment and distribution system infrastructure. Individually or in combination, factors such as shrinking water and financial resources...

  2. A Type-Preserving Compiler Infrastructure

    DTIC Science & Technology

    2002-12-01

    ...understand this code. This is, in essence, the object encoding we use to compile Java. Before embarking on the formal translation, we must explore one more... call. This solution works quite well. We used Jasmin, a JVML assembler (Meyer and Downing 1997), to generate a... European Symp. on Program. 135–149. Flanagan, Cormac, Amr Sabry, Bruce F. Duba, and Matthias Felleisen. 1993, June. "The Essence of Compiling with...

  3. Informatics Infrastructure for the Materials Genome Initiative

    NASA Astrophysics Data System (ADS)

    Dima, Alden; Bhaskarla, Sunil; Becker, Chandler; Brady, Mary; Campbell, Carelyn; Dessauw, Philippe; Hanisch, Robert; Kattner, Ursula; Kroenlein, Kenneth; Newrock, Marcus; Peskin, Adele; Plante, Raymond; Li, Sheng-Yen; Rigodiat, Pierre-François; Amaral, Guillaume Sousa; Trautt, Zachary; Schmitt, Xavier; Warren, James; Youssef, Sharief

    2016-08-01

    A materials data infrastructure that enables the sharing and transformation of a wide range of materials data is an essential part of achieving the goals of the Materials Genome Initiative. We describe two high-level requirements of such an infrastructure as well as an emerging open-source implementation consisting of the Materials Data Curation System and the National Institute of Standards and Technology Materials Resource Registry.

  4. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems are limited in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that HSPF be upgraded so that it can evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (hpc) language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented, high-level language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway, and preliminary results will be presented.
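
    The Numba approach mentioned above can be shown in miniature. The sketch below JIT-compiles a simple linear-reservoir recession loop, the kind of per-timestep numerical kernel that dominates hydrologic codes; the routine is an invented illustration, not actual HSPF code.

        import numpy as np
        from numba import njit

        @njit
        def linear_reservoir(inflow, k, s0=0.0):
            # Release a fixed fraction k of storage each timestep.
            storage = s0
            outflow = np.empty(inflow.size)
            for t in range(inflow.size):
                storage += inflow[t]
                outflow[t] = k * storage
                storage -= outflow[t]
            return outflow

        q = linear_reservoir(np.ones(8760), k=0.05)  # one year of hourly unit inflow
        print(q[:5])

    The first call pays a one-time compilation cost; subsequent calls run the loop as machine code, which is consistent with the abstract's report of execution times similar to the FORTRAN original.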

  5. The EDRN knowledge environment: an open source, scalable informatics platform for biological sciences research

    NASA Astrophysics Data System (ADS)

    Crichton, Daniel; Mahabal, Ashish; Anton, Kristen; Cinquini, Luca; Colbert, Maureen; Djorgovski, S. George; Kincaid, Heather; Kelly, Sean; Liu, David

    2017-05-01

    We describe here the Early Detection Research Network (EDRN) for Cancer's knowledge environment. It is an open-source platform built by NASA's Jet Propulsion Laboratory with contributions from the California Institute of Technology and the Geisel School of Medicine at Dartmouth. It uses tools like Apache OODT, Plone, and Solr, and borrows heavily from JPL's Planetary Data System's ontological infrastructure. It has accumulated data on hundreds of thousands of biospecimens and serves over 1300 registered users across the National Cancer Institute (NCI). The scalable computing infrastructure is built such that we are able to reach out to other agencies, provide homogeneous access, and provide seamless analytics support and bioinformatics tools through community engagement.

  6. Network Interdependency Modeling for Risk Assessment on Built Infrastructure Systems

    DTIC Science & Technology

    2013-10-01

    ...does begin to address infrastructure decay as a source of risk comes from the Department of Homeland Security (DHS). In 2009, the DHS Science and... network of connected edges and nodes. The National Research Council (2005) reported that the study of networks as a science and applications of... principles from this science are still in its early stages. As modern infrastructures have become more interlinked, knowledge of an infrastructure's network...

  7. Pocket guide to transportation, 1998

    DOT National Transportation Integrated Search

    1998-12-01

    The statistics in this Pocket Guide to Transportation were compiled by the Bureau of Transportation Statistics from multiple sources. The guide is divided into four sections: (1) transportation system extent and use, (2) transportation and the econom...

  8. 42 CFR 456.244 - Data sources for studies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... appropriate hospital data. (b) External organizations that compile statistics, design profiles, and produce other comparative data. (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  9. 42 CFR 456.144 - Data sources for studies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... appropriate hospital data; (b) External organizations that compile statistics, design profiles, and produce other comparative data; (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  10. 42 CFR 456.144 - Data sources for studies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... appropriate hospital data; (b) External organizations that compile statistics, design profiles, and produce other comparative data; (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  11. 42 CFR 456.144 - Data sources for studies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... appropriate hospital data; (b) External organizations that compile statistics, design profiles, and produce other comparative data; (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  12. 42 CFR 456.244 - Data sources for studies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... appropriate hospital data. (b) External organizations that compile statistics, design profiles, and produce other comparative data. (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  13. 42 CFR 456.244 - Data sources for studies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... appropriate hospital data. (b) External organizations that compile statistics, design profiles, and produce other comparative data. (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  14. 42 CFR 456.144 - Data sources for studies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... appropriate hospital data; (b) External organizations that compile statistics, design profiles, and produce other comparative data; (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  15. 42 CFR 456.244 - Data sources for studies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... appropriate hospital data. (b) External organizations that compile statistics, design profiles, and produce other comparative data. (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  16. 42 CFR 456.244 - Data sources for studies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... appropriate hospital data. (b) External organizations that compile statistics, design profiles, and produce other comparative data. (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  17. 42 CFR 456.144 - Data sources for studies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... appropriate hospital data; (b) External organizations that compile statistics, design profiles, and produce other comparative data; (c) Cooperative endeavors with— (1) QIOs; (2) Fiscal agents; (3) Other service...

  18. Opening Up to Open Source: Looking at How Moodle Was Adopted in Higher Education

    ERIC Educational Resources Information Center

    Costello, Eamon

    2013-01-01

    The virtual learning environment (VLE) has grown to become a piece of complex infrastructure that is now deemed critical to higher educational provision. This paper looks at Moodle and its adoption in higher education. Moodle's origins, as an open source VLE, are investigated and its growth examined in the context of how higher educational…

  19. A Workflow to Model Microbial Loadings in Watersheds ...

    EPA Pesticide Factsheets

    Many watershed models simulate overland and instream microbial fate and transport, but few actually provide loading rates on land surfaces and point sources to the water body network. This paper describes the underlying general equations for microbial loading rates associated with 1) land-applied manure on undeveloped areas from domestic animals; 2) direct shedding on undeveloped lands by domestic animals and wildlife; 3) urban or engineered areas; and 4) point sources that directly discharge to streams from septic systems and shedding by domestic animals. A microbial source module, which houses these formulations, is linked within a workflow containing eight models and a set of databases that form a loosely configured modeling infrastructure which supports watershed-scale microbial source-to-receptor modeling by focusing on animal-impacted catchments. A hypothetical example application – accessing, retrieving, and using real-world data – demonstrates the ability of the infrastructure to automate many of the manual steps associated with a standard watershed assessment, culminating with calibrated flow and microbial densities at the pour point of a watershed. In the Proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Modelling and Software, Toulouse, France
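
    The bookkeeping such a source module performs can be illustrated with a toy loading calculation: the total microbial load on a catchment as the sum of the four source categories listed above. Every per-source rate below is an invented placeholder; the module's actual equations are considerably more detailed.

        def total_load(cattle, wildlife, urban_ha, septic_systems,
                       shed_cattle=1e10, shed_wild=1e8,   # organisms/animal/day (assumed)
                       urban_rate=1e9,                    # organisms/ha/day (assumed)
                       septic_rate=1e7):                  # organisms/system/day (assumed)
            manure_and_shedding = cattle * shed_cattle + wildlife * shed_wild
            urban = urban_ha * urban_rate
            point = septic_systems * septic_rate
            return manure_and_shedding + urban + point    # organisms per day

        print(f"{total_load(cattle=120, wildlife=500, urban_ha=40, septic_systems=25):.3e}")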

  20. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  1. Atmospheric State, Cloud Microphysics and Radiative Flux

    DOE Data Explorer

    Mace, Gerald

    2008-01-15

    Atmospheric thermodynamics, cloud properties, radiative fluxes and radiative heating rates for the ARM Southern Great Plains (SGP) site. The data represent a characterization of the physical state of the atmospheric column compiled on a five-minute temporal and 90m vertical grid. Sources for this information include raw measurements, cloud property and radiative retrievals, retrievals and derived variables from other third-party sources, and radiative calculations using the derived quantities.

  2. The Athabasca University eduSource Project: Building an Accessible Learning Object Repository

    ERIC Educational Resources Information Center

    Cleveland-Innes, Martha; McGreal, Rory; Anderson, Terry; Friesen, Norm; Ally, Mohamed; Tin, Tony; Graham, Rodger; Moisey, Susan; Petrinjak, Anita; Schafer, Steve

    2005-01-01

    Athabasca University--Canada's Open University (AU) made the commitment to put all of its courses online as part of its Strategic University Plan. In pursuit of this goal, AU participated in the eduSource project, a pan-Canadian effort to build the infrastructure for an interoperable network of learning object repositories. AU acted as a leader in…

  3. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    PubMed

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community-defined standard, the mzIdentML file format. M2Lite's source code is available as open source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  4. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    PubMed

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    We review technical and methodological challenges for big data research in biomedicine and health: we discuss sources of big datasets, survey infrastructures for big data storage and processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data, such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform the analytical methods used in the biomedical and health domain to fit the distributed storage and processing model required to handle big data, while ensuring confidentiality of the data being analyzed.

  5. Pocket guide to transportation, 2001

    DOT National Transportation Integrated Search

    2000-12-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into five sections and a glossary: (1) Transportation System Extent and Use, (2) Transportation and Safety, (3) Mobility, (4) Transpor...

  6. Pocket guide to transportation, 2002

    DOT National Transportation Integrated Search

    2002-02-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into five sections and a glossary: (1) Transportation System Extent and Use, (2) Transportation and Safety, (3) Mobility, (4) Transpor...

  7. Pocket guide to transportation, 2004

    DOT National Transportation Integrated Search

    2004-01-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) Transportation System Extent and Use, (2) Transportation Safety, (3) Transportation Security, (4...

  8. Pocket guide to transportation, 2005

    DOT National Transportation Integrated Search

    2005-01-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) Transportation System Extent and Use, (2) Transportation Safety, (3) Transportation Security, (4...

  9. Pocket guide to transportation, 2003

    DOT National Transportation Integrated Search

    2003-01-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) Transportation System Extent and Use, (2) Transportation Safety, (3) Transportation Security, (4...

  10. caCORE: a common infrastructure for cancer informatics.

    PubMed

    Covitz, Peter A; Hartel, Frank; Schaefer, Carl; De Coronado, Sherri; Fragoso, Gilberto; Sahni, Himanso; Gustafson, Scott; Buetow, Kenneth H

    2003-12-12

    Sites with substantive bioinformatics operations are challenged to build data processing and delivery infrastructure that provides reliable access and enables data integration. Locally generated data must be processed and stored such that relationships to external data sources can be presented. Consistency and comparability across data sets requires annotation with controlled vocabularies and, further, metadata standards for data representation. Programmatic access to the processed data should be supported to ensure the maximum possible value is extracted. Confronted with these challenges at the National Cancer Institute Center for Bioinformatics, we decided to develop a robust infrastructure for data management and integration that supports advanced biomedical applications. We have developed an interconnected set of software and services called caCORE. Enterprise Vocabulary Services (EVS) provide controlled vocabulary, dictionary and thesaurus services. The Cancer Data Standards Repository (caDSR) provides a metadata registry for common data elements. Cancer Bioinformatics Infrastructure Objects (caBIO) implements an object-oriented model of the biomedical domain and provides Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. caCORE has been used to develop scientific applications that bring together data from distinct genomic and clinical science sources. caCORE downloads and web interfaces can be accessed from links on the caCORE web site (http://ncicb.nci.nih.gov/core). caBIO software is distributed under an open source license that permits unrestricted academic and commercial use. Vocabulary and metadata content in the EVS and caDSR, respectively, is similarly unrestricted, and is available through web applications and FTP downloads. http://ncicb.nci.nih.gov/core/publications contains links to the caBIO 1.0 class diagram and the caCORE 1.0 Technical Guide, which provide detailed information on the present caCORE architecture, data sources and APIs. Updated information appears on a regular basis on the caCORE web site (http://ncicb.nci.nih.gov/core).

  11. A global database of sap flow measurements (SAPFLUXNET) to link plant and ecosystem physiology

    NASA Astrophysics Data System (ADS)

    Poyatos, Rafael; Granda, Víctor; Flo, Víctor; Molowny-Horas, Roberto; Mencuccini, Maurizio; Oren, Ram; Katul, Gabriel; Mahecha, Miguel; Steppe, Kathy; Martínez-Vilalta, Jordi

    2017-04-01

    Regional and global networks of ecosystem CO2 and water flux monitoring have dramatically increased our understanding of ecosystem functioning in the last 20 years. More recently, analyses of ecosystem-level fluxes have successfully incorporated data streams at coarser (remote sensing) and finer (plant traits) organisational scales. However, there are few data sources that capture the diel to seasonal dynamics of whole-plant physiology and that can provide a link between organism- and ecosystem-level function. Sap flow measured in plant stems reveals the temporal patterns in plant water transport, as mediated by stomatal regulation and hydraulic architecture. The widespread use of thermometric methods of sap flow measurement since the 1990s has resulted in numerous data sets for hundreds of species and sites worldwide, but these data have remained fragmentary and generally unavailable for syntheses of regional to global scope. We are compiling the first global database of sub-daily sap flow measurements in individual plants (SAPFLUXNET), aimed at unravelling the environmental and biotic drivers of plant transpiration regulation globally. I will present the SAPFLUXNET data infrastructure and workflow, which is built upon flexible, open-source computing tools within the R environment (dedicated R packages and classes, interactive documents and apps with Rmarkdown and Shiny). Data collection started in mid-2016, and we have already incorporated > 50 datasets representing > 40 species and > 350 individual plants, globally distributed, and the number of contributed data sets is increasing rapidly. I will provide a general overview of the distribution of available data sets according to climate, measurement method, species, functional groups and plant size attributes. In parallel to the sap flow data compilation, we have also collated published results from calibrations of sap flow methods, to provide a first quantification of the variability associated with different sap flow methods. The SAPFLUXNET database is not only a promising resource to investigate the physiological and environmental controls on transpiration by whole plants and stands, but also a tool to link ecosystem fluxes (FLUXNET) with plant functional traits (TRY). Finally, SAPFLUXNET will encourage data sharing and the adoption of common instrumental and analysis protocols among ecophysiologists.

  12. Multi-Level Data-Security and Data-Protection in a Distributed Search Infrastructure for Digital Medical Samples.

    PubMed

    Witt, Michael; Krefting, Dagmar

    2016-01-01

    Human sample data is stored in biobanks, with software managing digitally derived sample data. When these stand-alone components are connected and a search infrastructure is employed, users become able to collect required research data from different data sources. Data protection, patient rights, data heterogeneity and access control are major challenges for such an infrastructure. This dissertation will investigate concepts for a multi-level security architecture to comply with these requirements.

  13. Water scarcity and urban forest management: introduction

    Treesearch

    E. Gregory McPherson; Robert Prince

    2013-01-01

    Between 1997 and 2009 a serious drought affected much of Australia. Whether reasoned or unintentional, water policy decisions closed the tap, turning much of the urban forest’s lifeline into a trickle. Green infrastructure became brown infrastructure, exposing its standing as a low priority relative to other consumptive sources. To share new solutions to water scarcity...

  14. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    PubMed Central

    Longman, Ryan J.; Giambelluca, Thomas W.; Nullet, Michael A.; Frazier, Abby G.; Kodama, Kevin; Crausbay, Shelley D.; Krushelnycky, Paul D.; Cordell, Susan; Clark, Martyn P.; Newman, Andy J.; Arnold, Jeffrey R.

    2018-01-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai‘i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National Centers for Environmental Information (NCEI). Researchers attempting to make use of available data are faced with a series of challenges that include: (1) identifying potential data sources; (2) acquiring data; (3) establishing data quality assurance and quality control (QA/QC) protocols; and (4) implementing robust gap filling techniques. This paper addresses these challenges by providing: (1) a summary of the available climate data in Hawai‘i including a detailed description of the various meteorological observation networks and data accessibility, and (2) a quality controlled meteorological dataset across the Hawaiian Islands for the 25-year period 1990-2014. The dataset draws on observations from 471 climate stations and includes rainfall, maximum and minimum surface air temperature, relative humidity, wind speed, downward shortwave and longwave radiation data. PMID:29437162

  15. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    NASA Astrophysics Data System (ADS)

    Longman, Ryan J.; Giambelluca, Thomas W.; Nullet, Michael A.; Frazier, Abby G.; Kodama, Kevin; Crausbay, Shelley D.; Krushelnycky, Paul D.; Cordell, Susan; Clark, Martyn P.; Newman, Andy J.; Arnold, Jeffrey R.

    2018-02-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai‘i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National Centers for Environmental Information (NCEI). Researchers attempting to make use of available data are faced with a series of challenges that include: (1) identifying potential data sources; (2) acquiring data; (3) establishing data quality assurance and quality control (QA/QC) protocols; and (4) implementing robust gap filling techniques. This paper addresses these challenges by providing: (1) a summary of the available climate data in Hawai‘i including a detailed description of the various meteorological observation networks and data accessibility, and (2) a quality controlled meteorological dataset across the Hawaiian Islands for the 25-year period 1990-2014. The dataset draws on observations from 471 climate stations and includes rainfall, maximum and minimum surface air temperature, relative humidity, wind speed, downward shortwave and longwave radiation data.
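
    To make the gap-filling step above concrete, the short Python sketch below fills brief gaps in a daily temperature series by time-based interpolation with pandas. It illustrates one common technique under simple assumptions; it is not the QA/QC and gap-filling protocol used to build the Hawai‘i dataset.

        import numpy as np
        import pandas as pd

        # Toy daily minimum-temperature record with missing observations (NaN).
        dates = pd.date_range("2014-01-01", periods=10, freq="D")
        tmin = pd.Series(
            [18.2, 18.0, np.nan, np.nan, 17.5, 17.9, np.nan, 18.4, 18.1, 18.3],
            index=dates)

        # Interpolate gaps no longer than 2 days; longer gaps stay missing so a
        # more robust (e.g., neighbor-station) method can handle them.
        filled = tmin.interpolate(method="time", limit=2)
        print(filled)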

  16. Real-Time Data Management, IP Telemetry, Data Integration, and Data Center Operations for the Source Physics Experiment (SPE), Nevada National Security Site

    NASA Astrophysics Data System (ADS)

    Plank, G.; Slater, D.; Torrisi, J.; Presser, R.; Williams, M.; Smith, K. D.

    2012-12-01

    The Nevada Seismological Laboratory (NSL) manages time-series data and high-throughput IP telemetry for the National Center for Nuclear Security (NCNS) Source Physics Experiment (SPE), underway on the Nevada National Security Site (NNSS). During active-source experiments, SPE's heterogeneous systems record over 350 channels of a variety of data types including seismic, infrasound, acoustic, and electro-magnetic. During the interim periods, broadband and short period instruments record approximately 200 channels of continuous, high-sample-rate seismic data. Frequent changes in sensor and station configurations create a challenging meta-data environment. Meta-data account for complete operational histories, including sensor types, serial numbers, gains, sample rates, orientations, instrument responses, data-logger types etc. To date, these catalogue 217 stations, over 40 different sensor types, and over 1000 unique recording configurations (epochs). Facilities for processing, backup, and distribution of time-series data currently span four Linux servers, 60Tb of disk capacity, and two data centers. Bandwidth, physical security, and redundant power and cooling systems for acquisition, processing, and backup servers are provided by NSL's Reno data center. The Nevada System of Higher Education (NSHE) System Computer Services (SCS) in Las Vegas provides similar facilities for the distribution server. NSL staff handle setup, maintenance, and security of all data management systems. SPE PIs have remote access to meta-data, raw data, and CSS3.0 compilations, via SSL-based transfers such as rsync or secure-copy, as well as shell access for data browsing and limited processing. Meta-data are continuously updated and posted on the Las Vegas distribution server as station histories are better understood and errors are corrected. Raw time series and refined CSS3.0 data compilations with standardized formats are transferred to the Las Vegas data server as available. For better data availability and station monitoring, SPE is beginning to leverage NSL's wide-area digital IP network with nine SPE stations and six Rock Valley area stations that stream continuous recordings in real time to the NSL Reno data center. These stations, in addition to eight regional legacy stations supported by National Security Technologies (NSTec), are integrated with NSL's regional monitoring network and constrain a high-quality local earthquake catalog for NNSS. The telemetered stations provide critical capabilities for SPE, and infrastructure for earthquake response on NNSS as well as southern Nevada and the Las Vegas area.

  17. Deconvolution Methods and Systems for the Mapping of Acoustic Sources from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Humphreys, Jr., William M. (Inventor); Brooks, Thomas F. (Inventor)

    2012-01-01

    Mapping coherent/incoherent acoustic sources as determined from a phased microphone array. A linear configuration of equations and unknowns is formed by accounting for a reciprocal influence of one or more cross-beamforming characteristics thereof at varying grid locations among the plurality of grid locations. An equation derived from the linear configuration of equations and unknowns can then be iteratively determined. The equation can be attained by the solution requirement of a constraint equivalent to the physical assumption that the coherent sources have only in-phase coherence. The size of the problem may then be reduced using zoning methods. An optimized noise source distribution is then generated over an identified aeroacoustic source region associated with a phased microphone array (microphones arranged in an optimized grid pattern including a plurality of grid locations) in order to compile an output presentation thereof, thereby removing beamforming characteristics from the resulting output presentation.

  18. Deconvolution methods and systems for the mapping of acoustic sources from phased microphone arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F. (Inventor); Humphreys, Jr., William M. (Inventor)

    2010-01-01

    A method and system for mapping acoustic sources determined from a phased microphone array. A plurality of microphones are arranged in an optimized grid pattern including a plurality of grid locations thereof. A linear configuration of N equations and N unknowns can be formed by accounting for a reciprocal influence of one or more beamforming characteristics thereof at varying grid locations among the plurality of grid locations. A full-rank equation derived from the linear configuration of N equations and N unknowns can then be iteratively determined. Full rank can be attained by the solution requirement of the positivity constraint equivalent to the physical assumption of statistically independent noise sources at each of the N locations. An optimized noise source distribution is then generated over an identified aeroacoustic source region associated with the phased microphone array in order to compile an output presentation thereof, thereby removing the beamforming characteristics from the resulting output presentation.
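
    The iterative determination subject to a positivity constraint described in these patent abstracts can be illustrated numerically. The Python sketch below applies a projected Gauss-Seidel sweep to a small system A x = b with x >= 0, in the general spirit of deconvolution schemes such as DAMAS; it is an illustrative analogue, not the patented method.

        import numpy as np

        def projected_gauss_seidel(A, b, iterations=100):
            """Iteratively solve A x = b subject to x >= 0 (toy illustration)."""
            n = len(b)
            x = np.zeros(n)
            for _ in range(iterations):
                for i in range(n):
                    # Residual excluding the diagonal term, then project onto x >= 0.
                    residual = b[i] - A[i] @ x + A[i, i] * x[i]
                    x[i] = max(0.0, residual / A[i, i])
            return x

        # Toy 3x3 "beamforming" system: b is the beamformed map and A couples
        # neighboring grid points (invented coupling values).
        A = np.array([[1.0, 0.3, 0.1],
                      [0.3, 1.0, 0.3],
                      [0.1, 0.3, 1.0]])
        x_true = np.array([2.0, 0.0, 1.0])
        b = A @ x_true

        print(projected_gauss_seidel(A, b))  # recovers approximately [2, 0, 1]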

  19. Mobile measurement of methane emissions from natural gas developments in northeastern British Columbia, Canada

    NASA Astrophysics Data System (ADS)

    Atherton, Emmaline; Risk, David; Fougère, Chelsea; Lavoie, Martin; Marshall, Alex; Werring, John; Williams, James P.; Minions, Christina

    2017-10-01

    North American leaders recently committed to reducing methane emissions from the oil and gas sector, but information on current emissions from upstream oil and gas developments in Canada is lacking. This study examined the occurrence of methane plumes in an area of unconventional natural gas development in northwestern Canada. From August to September 2015 we completed almost 8000 km of vehicle-based survey campaigns on public roads dissecting oil and gas infrastructure, such as well pads and processing facilities. We surveyed six routes 3-6 times each, which brought us past over 1600 unique well pads and facilities managed by more than 50 different operators. To attribute on-road plumes to oil- and gas-related sources we used gas signatures of residual excess concentrations (anomalies above background) less than 500 m downwind from potential oil and gas emission sources. All results represent emissions greater than our minimum detection limit of 0.59 g s-1 at our average detection distance (319 m). Unlike many other oil and gas developments in the US for which methane measurements have been reported recently, the methane concentrations we measured were close to normal atmospheric levels, except inside natural gas plumes. Roughly 47 % of active wells emitted methane-rich plumes above our minimum detection limit. Multiple sites that pre-date the recent unconventional natural gas development were found to be emitting, and we observed that the majority of these older wells were associated with emissions on all survey repeats. We also observed emissions from gas processing facilities that were highly repeatable. Emission patterns in this area were best explained by infrastructure age and type. Extrapolating our results across all oil and gas infrastructure in the Montney area, we estimate that the emission sources we located (emitting at a rate > 0.59 g s-1) contribute more than 111 800 t of methane annually to the atmosphere. This value exceeds reported bottom-up estimates of 78 000 t of methane for all oil and gas sector sources in British Columbia. Current bottom-up methods for estimating methane emissions do not normally calculate the fraction of emitting oil and gas infrastructure with thorough on-ground measurements. However, this study demonstrates that mobile surveys could provide a more accurate representation of the number of emission sources in an oil and gas development. This study presents the first mobile collection of methane emissions from oil and gas infrastructure in British Columbia, and these results can be used to inform policy development in an era of methane emission reduction efforts.
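
    For readers who want to check the unit conversions behind such extrapolations, the sketch below converts a per-source emission rate in g s-1 to tonnes per year and scales it by a source count; the source count used here is a hypothetical placeholder, not the study's inventory.

        # Convert a per-source methane emission rate from g/s to tonnes/year,
        # then scale by a hypothetical count of emitting sources.
        SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.156e7 s

        rate_g_per_s = 0.59   # the survey's minimum detection limit (g/s)
        n_sources = 1000      # hypothetical source count, for illustration only

        tonnes_per_source_per_year = rate_g_per_s * SECONDS_PER_YEAR / 1e6
        total_tonnes_per_year = tonnes_per_source_per_year * n_sources
        print(f"{tonnes_per_source_per_year:.1f} t/yr per source; "
              f"{total_tonnes_per_year:.0f} t/yr for {n_sources} sources")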

  20. Using infrastructure optimization to reduce greenhouse gas emissions from oil sands extraction and processing.

    PubMed

    Middleton, Richard S; Brandt, Adam R

    2013-02-05

    The Alberta oil sands are a significant source of oil production and greenhouse gas emissions, and their importance will grow as the region is poised for decades of growth. We present an integrated framework that simultaneously considers economic and engineering decisions for the capture, transport, and storage of oil sands CO2 emissions. The model optimizes CO2 management infrastructure at a variety of carbon prices for the oil sands industry. Our study reveals several key findings. We find that the oil sands industry lends itself well to development of CO2 trunk lines due to geographic coincidence of sources and sinks. This reduces the relative importance of transport costs compared to nonintegrated transport systems. Also, the amount of managed oil sands CO2 emissions, and therefore the CCS infrastructure, is very sensitive to the carbon price; significant capture and storage occurs only above $110/tonne CO2 in our simulations. Deployment of infrastructure is also sensitive to CO2 capture decisions and technology, particularly the fraction of capturable CO2 from oil sands upgrading and steam generation facilities. The framework will help stakeholders and policy makers understand how CCS infrastructure, including an extensive pipeline system, can be safely and cost-effectively deployed.
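
    A drastically simplified version of the source-to-sink allocation inside such frameworks can be posed as a linear program. The Python sketch below minimizes transport cost for two CO2 sources and two storage sinks with scipy; all costs and capacities are invented for illustration, and the authors' integrated model is far richer.

        import numpy as np
        from scipy.optimize import linprog

        # Toy transportation LP: ship CO2 from 2 sources to 2 sinks at minimum cost.
        # Decision variables: x = [s1->k1, s1->k2, s2->k1, s2->k2] in Mt/yr.
        cost = np.array([4.0, 7.0, 6.0, 3.0])  # $/t transport cost (illustrative)

        # Each source must ship all of its captured CO2 (equality constraints).
        A_eq = np.array([[1, 1, 0, 0],
                         [0, 0, 1, 1]])
        b_eq = np.array([5.0, 8.0])            # Mt/yr captured at each source

        # Each sink has a limited injection capacity (inequality constraints).
        A_ub = np.array([[1, 0, 1, 0],
                         [0, 1, 0, 1]])
        b_ub = np.array([10.0, 6.0])           # Mt/yr storage capacity at each sink

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * 4)
        print(res.x, res.fun)                  # optimal flows and total cost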

  1. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

    evidence-based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open-source, vendor-agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open-source, open-standard KMR and Clinical Decision Support Engine can enable organizations to share domain

  2. A compilation of mineral occurrences and the relationship of occurrences to structural elements of the Kentucky and Tennessee region

    NASA Technical Reports Server (NTRS)

    Mayhew, M. A.; Myers, D. M.

    1984-01-01

    A very prominent magnetic anomaly measured by MAGSAT over the eastern mid-continent of the United States was inferred to have a source region beneath Kentucky and Tennessee. Prominent aeromagnetic and gravity anomalies are also associated with the inferred source region. A crustal model constructed to fit these anomalies interpreted the complex as a large mafic plutonic intrusion of Precambrian age. The complex was named the Kentucky body. It was noticed that the Jessamine Dome, which is a locus of intense faulting and mineralization, occurs near the northern end of the Kentucky body, and that more generally there seemed to be a spatial relationship between mineral occurrence and the body. The relationship between mineral deposits in Kentucky and Tennessee and the Kentucky body was investigated. A compilation of mineral occurrences in the region, classified according to type and age, is presented.

  3. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available in a Microsoft Visual Studio® 2013 solution; Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.

  4. The National Geographic Names Data Base: Phase II instructions

    USGS Publications Warehouse

    Orth, Donald J.; Payne, Roger L.

    1987-01-01

    not recorded on topographic maps be added. The systematic collection of names from other sources, including maps, charts, and texts, is termed Phase II. In addition, specific types of features not compiled during Phase I are encoded and added to the data base. Other names of importance to researchers and users, such as historical and variant names, are also included. The rules and procedures for Phase II research, compilation, and encoding are contained in this publication.

  5. Understanding and Capturing People’s Mobile App Privacy Preferences

    DTIC Science & Technology

    2013-10-28

    The entire apps’ metadata takes up about 500MB of storage space when stored in a MySQL database and all the binary files take approximately 300GB of... functionality that can de-compile Dalvik bytecodes to Java source code faster than other de-compilers. Given the scale of the app analysis we planned on... Java libraries, such as parsers, SQL connectors, etc. Targeted Ads 137 admob, adwhirl, greystripe… Provided by mobile behavioral ads companies to

  6. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  7. Developing a search engine for pharmacotherapeutic information that is not published in biomedical journals.

    PubMed

    Do Pazo-Oubiña, F; Calvo Pita, C; Puigventós Latorre, F; Periañez-Párraga, L; Ventayol Bosch, P

    2011-01-01

    To identify publishers of pharmacotherapeutic information not found in biomedical journals that focuses on evaluating and providing advice on medicines and to develop a search engine to access this information. Compiling web sites that publish information on the rational use of medicines and have no commercial interests. Free-access web sites in Spanish, Galician, Catalan or English. Designing a search engine using the Google "custom search" application. Overall 159 internet addresses were compiled and were classified into 9 labels. We were able to recover the information from the selected sources using a search engine, which is called "AlquimiA" and available from http://www.elcomprimido.com/FARHSD/AlquimiA.htm. The main sources of pharmacotherapeutic information not published in biomedical journals were identified. The search engine is a useful tool for searching and accessing "grey literature" on the internet.

  8. SerpentinaDB: a database of plant-derived molecules of Rauvolfia serpentina.

    PubMed

    Pathania, Shivalika; Ramakrishnan, Sai Mukund; Randhawa, Vinay; Bagler, Ganesh

    2015-08-04

    Plant-derived molecules (PDMs) are known to be a rich source of diverse scaffolds that could serve as a basis for rational drug design. Structured compilation of phytochemicals from traditional medicinal plants can facilitate prospection for novel PDMs and their analogs as therapeutic agents. Rauvolfia serpentina is an important medicinal plant, endemic to the Himalayan mountain ranges of the Indian subcontinent, reported to be of immense therapeutic value against various diseases. We present SerpentinaDB, a structured compilation of 147 R. serpentina PDMs, inclusive of their plant part source, chemical classification, IUPAC, SMILES, physicochemical properties, and 3D chemical structures with associated references. It also provides a refined search option for identification of analogs of natural molecules against the ZINC database at a user-defined cut-off. SerpentinaDB is an exhaustive resource of R. serpentina molecules facilitating prospection for therapeutic molecules from a medicinally important source of natural products. It also provides a refined search option to explore the neighborhood of chemical space against the ZINC database to identify analogs of natural molecules obtained as leads. In a previous study, we demonstrated the utility of this resource by identifying novel aldose reductase inhibitors towards intervention in the complications of diabetes.
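
    Analog searching of the kind described above is commonly implemented with fingerprint similarity. The sketch below uses the open-source RDKit toolkit to rank candidate SMILES by Tanimoto similarity to a query at a user-defined cut-off; it illustrates the general technique only and is not SerpentinaDB's actual implementation.

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        def analogs(query_smiles, candidate_smiles, cutoff=0.4):
            """Return candidates with Tanimoto similarity to the query >= cutoff."""
            query = Chem.MolFromSmiles(query_smiles)
            query_fp = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
            hits = []
            for smi in candidate_smiles:
                mol = Chem.MolFromSmiles(smi)
                if mol is None:
                    continue  # skip unparsable SMILES
                fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
                similarity = DataStructs.TanimotoSimilarity(query_fp, fp)
                if similarity >= cutoff:
                    hits.append((smi, similarity))
            return sorted(hits, key=lambda pair: -pair[1])

        # Toy query and candidates (illustrative structures, not R. serpentina PDMs).
        print(analogs("c1ccccc1O", ["c1ccccc1N", "c1ccc(O)cc1C", "CCO"]))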

  9. Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D J; Willcock, J J

    2008-12-12

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-based computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.
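
    As a toy illustration of the kind of source-to-source rewrite such a compiler automates, the Python sketch below prefixes simple C/C++ for loops with an OpenMP pragma in program text. A real system like ROSE works on a full AST with dependence and semantic analysis before parallelizing; this regex pass only conveys the flavor of the transformation.

        import re

        # Naive source-to-source pass: prefix C/C++ 'for' loops with an OpenMP
        # pragma. Real compilers (e.g., ROSE) operate on an AST and check
        # dependences before parallelizing; this sketch does neither.
        FOR_LOOP = re.compile(r"^(\s*)for\s*\(", re.MULTILINE)

        def insert_omp_pragmas(source: str) -> str:
            return FOR_LOOP.sub(r"\1#pragma omp parallel for\n\1for (", source)

        c_code = """\
        for (int i = 0; i < n; i++) {
            a[i] = b[i] + c[i];
        }
        """
        print(insert_omp_pragmas(c_code))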

  10. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  11. The cost of getting CCS wrong: Uncertainty, infrastructure design, and stranded CO2

    DOE PAGES

    Middleton, Richard Stephen; Yaw, Sean Patrick

    2018-01-11

    Carbon capture and storage (CCS) infrastructure will require industry—such as fossil-fuel power, ethanol production, and oil and gas extraction—to make massive investment in infrastructure. The cost of getting these investments wrong will be substantial and will impact the success of CCS technology. Multiple factors can and will impact the success of commercial-scale CCS, including significant uncertainties regarding capture, transport, and injection-storage decisions. Uncertainties throughout the CCS supply chain include policy, technology, engineering performance, economics, and market forces. In particular, large uncertainties exist for the injection and storage of CO2. Even taking into account upfront investment in site characterization, the final performance of the storage phase is largely unknown until commercial-scale injection has started. We explore and quantify the impact of getting CCS infrastructure decisions wrong based on uncertain injection rates and uncertain CO2 storage capacities using a case study managing CO2 emissions from the Canadian oil sands industry in Alberta. We use SimCCS, a widely used CCS infrastructure design framework, to develop multiple CCS infrastructure scenarios. Each scenario consists of a CCS infrastructure network that connects CO2 sources (oil sands extraction and processing) with CO2 storage reservoirs (acid gas storage reservoirs) using a dedicated CO2 pipeline network. Each scenario is analyzed under a range of uncertain storage estimates, and infrastructure performance is assessed and quantified in terms of the cost to build additional infrastructure to store all CO2. We also include the role of stranded CO2: CO2 that a source was expecting to capture but cannot, due to substandard performance in the transport and storage infrastructure. Results show that the cost of getting the original infrastructure design wrong is significant and that comprehensive planning will be required to ensure that CCS becomes a successful climate mitigation technology. Here, we show that the concept of stranded CO2 can transform a seemingly high-performing infrastructure design into the worst-case scenario.

  12. The cost of getting CCS wrong: Uncertainty, infrastructure design, and stranded CO2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Richard Stephen; Yaw, Sean Patrick

    Carbon capture and storage (CCS) infrastructure will require industry—such as fossil-fuel power, ethanol production, and oil and gas extraction—to make massive investment in infrastructure. The cost of getting these investments wrong will be substantial and will impact the success of CCS technology. Multiple factors can and will impact the success of commercial-scale CCS, including significant uncertainties regarding capture, transport, and injection-storage decisions. Uncertainties throughout the CCS supply chain include policy, technology, engineering performance, economics, and market forces. In particular, large uncertainties exist for the injection and storage of CO2. Even taking into account upfront investment in site characterization, the final performance of the storage phase is largely unknown until commercial-scale injection has started. We explore and quantify the impact of getting CCS infrastructure decisions wrong based on uncertain injection rates and uncertain CO2 storage capacities using a case study managing CO2 emissions from the Canadian oil sands industry in Alberta. We use SimCCS, a widely used CCS infrastructure design framework, to develop multiple CCS infrastructure scenarios. Each scenario consists of a CCS infrastructure network that connects CO2 sources (oil sands extraction and processing) with CO2 storage reservoirs (acid gas storage reservoirs) using a dedicated CO2 pipeline network. Each scenario is analyzed under a range of uncertain storage estimates, and infrastructure performance is assessed and quantified in terms of the cost to build additional infrastructure to store all CO2. We also include the role of stranded CO2: CO2 that a source was expecting to capture but cannot, due to substandard performance in the transport and storage infrastructure. Results show that the cost of getting the original infrastructure design wrong is significant and that comprehensive planning will be required to ensure that CCS becomes a successful climate mitigation technology. Here, we show that the concept of stranded CO2 can transform a seemingly high-performing infrastructure design into the worst-case scenario.

  13. The Experiment Factory: Standardizing Behavioral Experiments.

    PubMed

    Sochat, Vanessa V; Eisenberg, Ian W; Enkavi, A Zeynep; Li, Jamie; Bissett, Patrick G; Poldrack, Russell A

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms.

  14. The Experiment Factory: Standardizing Behavioral Experiments

    PubMed Central

    Sochat, Vanessa V.; Eisenberg, Ian W.; Enkavi, A. Zeynep; Li, Jamie; Bissett, Patrick G.; Poldrack, Russell A.

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms. PMID:27199843

  15. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    NASA Technical Reports Server (NTRS)

    Valente, Andre; Sathyendranath, Shubha; Brotas, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; et al.

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
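
    The "averaging of observations that were close in time and space" mentioned above can be sketched with pandas: bin records into time windows and rounded coordinate cells, then average within each bin. The sketch below is a generic illustration of that kind of merging, with invented values, not the OC-CCI processing chain.

        import pandas as pd

        # Toy in situ chlorophyll-a observations with near-duplicate sampling.
        df = pd.DataFrame({
            "time": pd.to_datetime(["2005-06-01 10:00", "2005-06-01 10:20",
                                    "2005-06-01 10:40", "2005-06-02 09:00"]),
            "lat": [38.01, 38.02, 38.01, 41.50],
            "lon": [-9.30, -9.31, -9.30, -8.75],
            "chla": [0.21, 0.25, 0.23, 1.10],
        })

        # Average observations falling in the same 1-hour window and 0.1-degree cell.
        df["t_bin"] = df["time"].dt.floor("1h")
        df["lat_bin"] = df["lat"].round(1)
        df["lon_bin"] = df["lon"].round(1)
        merged = df.groupby(["t_bin", "lat_bin", "lon_bin"],
                            as_index=False)["chla"].mean()
        print(merged)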

  16. ROS Hexapod

    NASA Technical Reports Server (NTRS)

    Davis, Kirsch; Bankieris, Derek

    2016-01-01

    As an intern project for NASA Johnson Space Center (JSC), my job was to familiarize myself with and operate the Robot Operating System (ROS). The project converted existing software assets into ROS nodes, enabling a robotic Hexapod to communicate with, and be controlled by, an existing PlayStation 3 (PS3) controller. The existing control algorithms and libraries in the Hexapod C++ source code had no ROS capabilities when the internship started, but that changed over the course of my internship: converting the C++ code made it compatible with ROS, and the Hexapod is now controlled using an existing PS3 controller. Furthermore, my job was to design ROS messages and script programs that enabled assets to participate in the ROS ecosystem by subscribing and publishing messages. The software source code is written in C++ and organized in directories. Testing of software assets included compiling the code within the Linux environment from a terminal, which ran the code from a directory. Several problems occurred while compiling, so the code was modified until the C++ compiler accepted it. Once the code compiled and ran, it was uploaded to the Hexapod, which was then controlled by a PS3 controller. The project outcome is a Hexapod that is fully functional, compatible with ROS, and operated with a PlayStation 3 controller. In addition, an Arduino board with its open-source software (IDE) will be integrated into the ecosystem, with circuitry designed on a breadboard to add additional behavior through push buttons, potentiometers and other simple electrical elements. Other Arduino projects will include a GPS module and a digital clock that runs off up to 22 satellites, using a GPS signal and an internal patch antenna, to show accurate real time. This internship experience has also led me to learn coding more efficiently and effectively so that I can write, subscribe and publish my own source code in different programming languages. This familiarity with software programming will enhance my skills in the electrical engineering field. My experience at JSC with the Simulation and Graphics Branch (ER7) has pushed me to become more proficient at coding, to increase my knowledge of software programming, and to sharpen my skills in ROS. This knowledge will be taken back to my university and applied in a school project on the PR2 robot, which is controlled by ROS software. The skills learned here will be used to write C++ code that subscribes and publishes ROS messages to the PR2 robot, controlled by an existing PS3 controller. Overall, the skills obtained here will not be lost, but increased.
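
    A minimal example of the subscribe/publish pattern described in this abstract, written against the standard rospy client library: a node publishes string commands on a topic and a callback handles incoming messages. The node and topic names are placeholders, not the project's actual interfaces.

        #!/usr/bin/env python
        import rospy
        from std_msgs.msg import String

        def on_command(msg):
            # Callback invoked for every message received on the topic.
            rospy.loginfo("hexapod received: %s", msg.data)

        def main():
            rospy.init_node("hexapod_demo")        # register with the ROS master
            rospy.Subscriber("/hexapod/cmd", String, on_command)
            pub = rospy.Publisher("/hexapod/cmd", String, queue_size=10)
            rate = rospy.Rate(1)                   # publish at 1 Hz
            while not rospy.is_shutdown():
                pub.publish(String(data="walk_forward"))  # placeholder command
                rate.sleep()

        if __name__ == "__main__":
            main()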

  17. Minnesota Freight Flows, 1990 : a compilation of data by mode and commodity

    DOT National Transportation Integrated Search

    1995-02-01

    Extensive transportation systems, which include highways, rivers, Great Lakes ports, railroads, airports, and pipelines, link Minnesota to markets throughout the North American Continent. This study provides a comprehensive source of freight flow inf...

  18. Pocket guide to transportation, 2007

    DOT National Transportation Integrated Search

    2007-01-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) System Extent and Use, (2) Safety, (3) Security, (4) Mobility, (5) Economy, (6) Environment, and...

  19. Pocket guide to transportation, 2006

    DOT National Transportation Integrated Search

    2006-01-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) System Extent and Use, (2) Safety, (3) Security, (4) Mobility, (5) Economy, (6) Environment, and...

  20. Pocket guide to transportation, 2008

    DOT National Transportation Integrated Search

    2008-02-01

    The Bureau of Transportation Statistics compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) System Extent and Use, (2) Safety, (3) Security, (4) Mobility, (5) Economy, (6) Environment, and...

  1. Compiling global name-space programs for distributed execution

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush

    1990-01-01

    Distributed memory machines do not provide hardware support for a global address space. Thus programmers are forced to partition the data across the memories of the architecture and use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for the analysis of a high-level source program and its translation to a set of independently executing tasks communicating via messages. If the compiler has enough information, this translation can be carried out at compile-time. Otherwise run-time code is generated to implement the required data movement. The analysis required in both situations is described and the performance of the generated code on the Intel iPSC/2 is presented.
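
    The message-passing code such a compiler emits can be pictured with a hand-written halo exchange. In the mpi4py sketch below, a one-dimensional array is block-partitioned across processors and each rank exchanges boundary elements with its neighbours, the kind of data movement a global name-space translator would generate automatically; it is an illustrative sketch, not the paper's generated code.

        # Run with, e.g.: mpiexec -n 4 python halo.py   (file name hypothetical)
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each rank owns a block of the (conceptually global) array, with one
        # ghost cell at each end; the interior is local[1:-1].
        local = np.full(6, float(rank))
        left = rank - 1 if rank > 0 else MPI.PROC_NULL
        right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

        # Exchange boundary values with neighbours to fill the ghost cells.
        comm.Sendrecv(sendbuf=local[1:2], dest=left,
                      recvbuf=local[-1:], source=right)
        comm.Sendrecv(sendbuf=local[-2:-1], dest=right,
                      recvbuf=local[0:1], source=left)
        print(f"rank {rank}: ghosts = {local[0]}, {local[-1]}")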

  2. Astronomical, physical, and meteorological parameters for planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Allison, Michael; Travis, Larry D.

    1986-01-01

    A newly compiled table of astronomical, physical, and meteorological parameters for planetary atmospheres is presented. Formulae and explanatory notes for their application and a complete listing of sources are also given.

  3. 24 CFR 16.14 - General exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... existence and of the name and address of the source agency. For any further information concerning the...,” primarily consist of information compiled for the purpose of criminal investigations and are associated with...

  4. RANGER-DTL 2.0: Rigorous Reconstruction of Gene-Family Evolution by Duplication, Transfer, and Loss.

    PubMed

    Bansal, Mukul S; Kellis, Manolis; Kordi, Misagh; Kundu, Soumya

    2018-04-24

    RANGER-DTL 2.0 is a software program for inferring gene family evolution using Duplication-Transfer-Loss reconciliation. This new software is highly scalable and easy to use, and offers many new features not currently available in any other reconciliation program. RANGER-DTL 2.0 has a particular focus on reconciliation accuracy and can account for many sources of reconciliation uncertainty including uncertain gene tree rooting, gene tree topological uncertainty, multiple optimal reconciliations, and alternative event cost assignments. RANGER-DTL 2.0 is open-source and written in C++ and Python. Pre-compiled executables, source code (open-source under GNU GPL), and a detailed manual are freely available from http://compbio.engr.uconn.edu/software/RANGER-DTL/. mukul.bansal@uconn.edu.

  5. Minimizing Overhead for Secure Computation and Fully Homomorphic Encryption: Overhead

    DTIC Science & Technology

    2015-11-01

    many inputs. We also improved our compiler infrastructure to handle very large circuits in a more scalable way. In Jan '13, we employed the AESNI and... Amazon's elastic compute infrastructure, and is running under a Xen hypervisor. Since we do not have direct access to the bare metal, we cannot... creating novel opportunities for compressing authentication overhead. It is especially compelling that existing public key infrastructures can be used

  6. Technical Review and Analysis of Center for Night Vision and Electro-Optics Life Cycle Cost Analysis Model (CNVEO LCCAM),

    DTIC Science & Technology

    1986-09-01

    source of the module/system. Source options are: battery, gas, cartridge, valve, and miscellaneous costs. NAMELIST OPERAT is used to compile the... hardware costs allocated to transportation for packing. TF1 = Initial transportation factor. WEIGHT = Shipping weight of total system. XSUM = System float... CD(6,I)+CD(9,I). AROC(7,I) = Replenishment spares by year. CD(4,I) = Valve replacement cost by year. CD(5,I) = Cartridge replacement cost by year

  7. Top-down Estimate of Methane Emissions from Natural Gas Production in Northeastern Pennsylvania Using Aircraft and Tower Observations

    NASA Astrophysics Data System (ADS)

    Barkley, Z.; Lauvaux, T.; Davis, K. J.; Deng, A.; Miles, N. L.; Richardson, S.; Martins, D. K.; Cao, Y.; Sweeney, C.; McKain, K.; Schwietzke, S.; Smith, M. L.; Kort, E. A.

    2016-12-01

    Leaks in natural gas infrastructure release CH4, a potent greenhouse gas, into the atmosphere. The estimated emission rate associated with the production and transportation of natural gas is uncertain, hindering our understanding of natural gas's greenhouse footprint. This study presents two applications of inverse methodology for estimating regional emission rates from natural gas production and gathering facilities in northeastern Pennsylvania. First, we used the WRF-Chem mesoscale model at 3km resolution to simulate CH4 enhancements and compared them to observations obtained from a three-week flight campaign in May 2015 over the Marcellus shale region. Methane emission rates were adjusted to minimize the errors between aircraft observations and the model-simulated concentrations for each flight. Second, we present the first tower-based high resolution atmospheric inversion of CH4 emission rates from unconventional natural gas production activities. A year of continuous CH4 and calibrated δ13C isotope measurements were collected at four tower locations in northeastern Pennsylvania. The adjoint model used here combines a backward-in-time Lagrangian Particle Dispersion Model with the WRF-Chem model at the same resolution. The prior for both optimization systems was compiled for major sources of CH4 within the Mid-Atlantic states, accounting for emissions from natural gas sources as well as emissions related to farming, waste management, coal, and other sources. Optimized natural gas emission rates are found to be 0.36% of total gas production, with a 2σ confidence interval between 0.27-0.45% of production. We present the results from the tower inversion over one year at 3km resolution providing additional information on spatial and temporal variability of emission rates from production and gathering facilities within the natural gas industry in comparison to flux estimates from the aircraft campaign.
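
    At its core, the emission-rate adjustment described above is a search for the scaling that minimizes model-observation mismatch. The one-parameter least-squares sketch below captures that idea with synthetic numbers; the actual study solves a much larger inversion with transport-model footprints.

        import numpy as np

        # Synthetic CH4 enhancements (ppb): a model run with a prior emission rate,
        # and "observations" generated from a true rate 1.3x the prior plus noise.
        rng = np.random.default_rng(0)
        model_prior = np.array([12.0, 8.5, 15.2, 6.1, 10.4])
        obs = 1.3 * model_prior + rng.normal(0.0, 0.5, model_prior.size)

        # Enhancements are linear in the emission rate, so the least-squares
        # scale factor is <obs, model> / <model, model>.
        scale = obs @ model_prior / (model_prior @ model_prior)
        print(f"posterior/prior emission ratio ~= {scale:.2f}")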

  8. Water use trends in Washington, 1985-2005

    USGS Publications Warehouse

    Lane, R.C.

    2010-01-01

    Since 1950, the U.S. Geological Survey Washington Water Science Center (USGS-WAWSC) has collected, compiled, and published, at 5-year intervals, statewide estimates of the amounts of water withdrawn and used for various purposes in Washington State. As new data and methods became available, some of the original datasets were recompiled. The most recent versions of these datasets were used in this fact sheet. The datasets are available online along with other USGS-WAWSC water-use publications at the USGS-WAWSC water use web page: http://wa.water.usgs.gov/data/wuse/. Values on these datasets and in this fact sheet may not sum to the indicated total due to independent rounding. Due to variations in data requirements, collection methods, terminology, and data sources, the direct assessment of water-use trends between compilations is difficult. This fact sheet focuses on the trends in total State and public-supplied populations, freshwater withdrawals and use, public-supply withdrawals and deliveries, and crop irrigation withdrawals and acreage in Washington from 1985 through 2005. These four categories were included in all five compilations and were the most stable in terms of data requirements, collection methods, terminology, and data sources.

  9. A Speculative Approach to Design A Hybrid System for Green Energy

    NASA Astrophysics Data System (ADS)

    Sharma, Dinesh; Sharma, Purnima K.; Naidu, Praveen V.

    2017-08-01

    Nowadays, the demand for energy is increasing all over the world. Because of this demand, fossil fuels are being depleted day by day to meet the energy requirements of daily human life. It is necessary to balance this growing energy demand by taking an optimistic view of natural renewable energy sources such as sun, wind and hydro. Only these energy sources can redress the imbalance between dwindling fossil fuels and increasing energy demand. Renewable energy systems are suitable for off-grid power generation, providing services to remote areas without the need to build complex grid infrastructures. India has abundant sources of solar and wind energy. Individually, these energy sources have their own advantages and disadvantages; to overcome the disadvantages of the individual sources, we can combine them into an efficient hybrid renewable energy source. In this paper we propose a hybrid model, a combination of four renewable energy sources (solar, wind, RF signals and living plants), to increase energy efficiency.

  10. Quantifying methane emissions from natural gas production in north-eastern Pennsylvania

    NASA Astrophysics Data System (ADS)

    Barkley, Zachary R.; Lauvaux, Thomas; Davis, Kenneth J.; Deng, Aijun; Miles, Natasha L.; Richardson, Scott J.; Cao, Yanni; Sweeney, Colm; Karion, Anna; Smith, MacKenzie; Kort, Eric A.; Schwietzke, Stefan; Murphy, Thomas; Cervone, Guido; Martins, Douglas; Maasakkers, Joannes D.

    2017-11-01

    Natural gas infrastructure releases methane (CH4), a potent greenhouse gas, into the atmosphere. The estimated emission rate associated with the production and transportation of natural gas is uncertain, hindering our understanding of its greenhouse footprint. This study presents a new application of inverse methodology for estimating regional emission rates from natural gas production and gathering facilities in north-eastern Pennsylvania. An inventory of CH4 emissions was compiled for major sources in Pennsylvania. This inventory served as input emission data for the Weather Research and Forecasting model with chemistry enabled (WRF-Chem), and atmospheric CH4 mole fraction fields were generated at 3 km resolution. Simulated atmospheric CH4 enhancements from WRF-Chem were compared to observations obtained from a 3-week flight campaign in May 2015. Modelled enhancements from sources not associated with upstream natural gas processes were assumed constant and known and therefore removed from the optimization procedure, creating a set of observed enhancements from natural gas only. Simulated emission rates from unconventional production were then adjusted to minimize the mismatch between aircraft observations and model-simulated mole fractions for 10 flights. To evaluate the method, an aircraft mass balance calculation was performed for four flights where conditions permitted its use. Using the model optimization approach, the weighted mean emission rate from unconventional natural gas production and gathering facilities in north-eastern Pennsylvania approach is found to be 0.36 % of total gas production, with a 2σ confidence interval between 0.27 and 0.45 % of production. Similarly, the mean emission estimates using the aircraft mass balance approach are calculated to be 0.40 % of regional natural gas production, with a 2σ confidence interval between 0.08 and 0.72 % of production. These emission rates as a percent of production are lower than rates found in any other basin using a top-down methodology, and may be indicative of some characteristics of the basin that make sources from the north-eastern Marcellus region unique.

  11. Nineteenth International Cosmic Ray Conference. OG Sessions, Volume 1

    NASA Technical Reports Server (NTRS)

    Jones, F. C. (Compiler)

    1985-01-01

    Contributed papers addressing cosmic ray origin and galactic phenomena are compiled. The topic areas covered in this volume include gamma ray bursts, gamma rays from point sources, and diffuse gamma ray emission.

  12. College Stores and Computers and Students and Faculty.

    ERIC Educational Resources Information Center

    Newcomb, Jack, Ed.

    1982-01-01

    Information on the computer industry and computer use by students, faculty, and the publishing industry that may be useful in planning college store merchandising is compiled from a variety of sources. (MSE)

  13. New Mexico conservative ion water chemistry data and chalcedony geothermometry

    DOE Data Explorer

    Shari Kelley

    2015-10-21

    Compilation of boron, lithium, bromine, and silica data from wells and springs throughout New Mexico from a wide variety of sources. The chalcedony geothermometry calculation is included in this file.
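
    For readers unfamiliar with the calculation, a widely used chalcedony geothermometer is Fournier's (1977) expression T(°C) = 1032 / (4.69 - log10 SiO2) - 273.15, with dissolved silica in mg/kg. The sketch below assumes the compilation uses this or a closely related form; the data file itself documents the exact equation used.

        import math

        def chalcedony_temp_c(silica_mg_per_kg: float) -> float:
            """Fournier (1977) chalcedony geothermometer: T in deg C, SiO2 in mg/kg.
            Assumed form; check the data file's documentation for the exact equation."""
            return 1032.0 / (4.69 - math.log10(silica_mg_per_kg)) - 273.15

        # Example: 50 mg/kg dissolved silica gives roughly 72 deg C.
        print(f"{chalcedony_temp_c(50.0):.0f} deg C")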

  14. Nineteenth International Cosmic Ray Conference. OG Sessions, Volume 3

    NASA Technical Reports Server (NTRS)

    Jones, F. C. (Compiler)

    1985-01-01

    Papers submitted for presentation at the 19th International Cosmic Ray Conference are compiled. This volume addresses cosmic ray sources and acceleration, interstellar propagation and nuclear interactions, and detection techniques and instrumentation.

  15. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards, but not all hazards can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to that of other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging the expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinto, F. T.; Iglesias, G.; Santos, P. R.

    Marine renewable energy (MRE) is generated from waves, currents, tides, and thermal resources in the ocean. MRE has been identified as a potential commercial-scale source of renewable energy. This special topic presents a compilation of works selected from the 3rd IAHR Europe Congress, held in Porto, Portugal, in 2014. It covers different subjects relevant to MRE, including resource assessment, marine energy sector policies, energy source comparisons based on levelized cost, proof-of-concept and new-technology development for wave and tidal energy exploitation, and assessment of possible interference between wave energy converters (WECs).
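
    As an aside on the levelized-cost comparisons listed above: the standard LCOE definition divides discounted lifetime costs by discounted lifetime energy. The sketch below uses invented numbers for a hypothetical wave-energy device and is not drawn from the congress papers.

        def lcoe(costs, energy_mwh, discount_rate):
            """Levelized cost of energy: discounted lifetime cost per discounted MWh."""
            pv_cost = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
            pv_energy = sum(e / (1 + discount_rate) ** t for t, e in enumerate(energy_mwh))
            return pv_cost / pv_energy

        # Year 0: 5 MEUR capital cost, no generation; years 1-20: 0.2 MEUR/yr O&M, 3 GWh/yr.
        costs = [5.0e6] + [0.2e6] * 20
        energy = [0.0] + [3000.0] * 20
        print(f"LCOE: {lcoe(costs, energy, 0.08):.0f} EUR/MWh")  # ~236 EUR/MWh at 8 %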

  17. The 1977 emissions inventory for southeastern Virginia. [environment model of air quality based on exhaust emission from urban areas

    NASA Technical Reports Server (NTRS)

    Brewer, D. A.; Remsberg, E. E.; Woodbury, G. E.; Quinn, L. C.

    1979-01-01

    Regional tropospheric air pollution modeling and data compilation to simulate the time variation of species concentrations in and around an urban area are discussed. The methods used to compile an emissions inventory are outlined. Emission factors for vehicular travel in the urban area are presented along with an analysis of the emitted gases. Emission sources other than vehicular travel, including industrial wastes, residential solid waste disposal, aircraft emissions, and railroad emissions, are investigated.

  18. A Language-Based Approach To Wireless Sensor Network Security

    DTIC Science & Technology

    2014-03-06

    Figure 1: SpartanRPC memory overhead (L) and impact on messaging (R). Figure 2: Scalaness/nesT compilation and ... language for developing real WSN applications. This language, called Scalaness/nesT, extends Scala with staging features for executing programs on hubs ... Of particular note here is the fact that cross-stage type safety of Scalaness source code ensures that compiled bytecode can be deployed to, and run on ...

  19. A compilation of redshifts and velocity dispersions for Abell clusters (Struble and Rood 1987): Documentation for the machine-readable version

    NASA Technical Reports Server (NTRS)

    Warren, Wayne H., Jr.

    1989-01-01

    The machine-readable version of the compilation, as it is currently being distributed from the Astronomical Data Center, is described. The catalog contains redshifts and velocity dispersions for all Abell clusters for which these data had been published up to 1986 July. Also included are 1950 equatorial coordinates for the centers of the listed clusters, the numbers of observations used to determine the redshifts, and bibliographical references citing the data sources.

  20. Historical records of the geomagnetic field

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Heilig, Balázs; Vadasz, Gergely; Valach, Fridrich; Dolinský, Peter; Hejda, Pavel; Fabian, Karl; Hammerl, Christa; Leonhardt, Roman

    2014-05-01

    Records of historical direct measurements of the geomagnetic field are invaluable sources for reconstructing temporal variations of the Earth's magnetic field. They provide information about the field's evolution back to the late Middle Ages. We have investigated such records with a focus on Austria and some neighbouring countries. A variety of new sources and source types are examined. These include 19th century land survey and observatory records of the Imperial and Royal "Centralanstalt f. Meteorologie und Erdmagnetismus", which are not included in the existing compilations. Daily measurements at the Imperial and Royal Observatory in Prague have been digitized. The Imperial and Royal Navy carried out observations in the Adriatic Sea during several surveys. Declination values have been collected from famous mining areas in the former Austro-Hungarian Empire; in this connection, a time series for Banska Stiavnica has been compiled. In the meteorological yearbooks of the monastery Kremsmünster, regular declination measurements for the first half of the 19th century were registered. Marsigli's observations during military mapping works in 1696 are also included in our collection. Moreover, compass roses on historical maps and declination values marked on compasses, sundials or globes also provide information about ancient field declination. An evaluation of church orientations in Lower Austria and Northern Germany did not support the hypothesis that church naves had been aligned along the East-West direction by means of magnetic compasses; therefore, this potential source of information must be excluded from our collection. The gathered records are integrated into a database together with corresponding metadata, such as the measurement instruments and methods used. This information allows an assessment of the quality and reliability of the historical observations. The combination of compilations of historical measurements with high-quality archeo- and paleomagnetic data in a single database enables a reliable joint evaluation of all types of magnetic field records from different origins. This collection forms the basis for a combined inverse modelling of the geomagnetic field evolution.

  1. Towards Networked Knowledge: The Learning Registry, an Infrastructure for Sharing Online Learning Resources

    ERIC Educational Resources Information Center

    Lee, Ashley; Hobson, Joe; Bienkowski, Marie; Midgley, Steve; Currier, Sarah; Campbell, Lorna M.; Novoselova, Tatiana

    2012-01-01

    In this article, the authors describe an open-source, open-data digital infrastructure for sharing information about open educational resources (OERs) across disparate systems and platforms. The Learning Registry, which began as a project funded by the U.S. Departments of Education and Defense, currently has an active international community…

  2. Front Range Infrastructure Resources Project--Aggregate Resources Activities

    USGS Publications Warehouse

    ,

    1998-01-01

    Infrastructure, such as roads, buildings, airports, and dams, is built and maintained by use of large quantities of aggregate—sand, gravel, and stone. As urban areas expand, local sources of these resources become inaccessible. Other competitive land uses have a higher value than aggregate resources. For example, gravel cannot be mined from under a subdivision. The failure to plan for the protection and extraction of infrastructure resources often results in increased consumer cost, environmental damage, and an adversarial relationship between the industry and the community.

  3. Exploratory study of atmospheric methane enhancements derived from natural gas use in the Houston urban area

    DOE PAGES

    Sanchez, Nancy P.; Zheng, Chuantao; Ye, Weilin; ...

    2018-01-04

    Here, the extensive use of natural gas (NG) in urban areas for heating, cooking and as a vehicular fuel is associated with potentially significant emissions of methane (CH4) to the atmosphere. Methane, a potent greenhouse gas that influences the chemistry of the atmosphere, can be emitted from different sources including leakage from NG infrastructure, transportation activities, end-use uncombusted NG, landfills and livestock. Although significant CH4 leakage associated with aging local NG distribution systems in the U.S. has been reported, further investigation is required to study the role of this infrastructure component and other NG-related sources in atmospheric CH4 enhancements in urban centers. In this study, neighborhood-scale mobile-based monitoring of potential CH4 emissions associated with NG in the Greater Houston area (GHA) is reported. A novel dual-gas 3.337 μm interband cascade laser-based sensor system was developed and mobile-mode deployed for simultaneous CH4 and ethane (C2H6) monitoring during a period of over 14 days, corresponding to ~90 hours of effective data collection during summer 2016. The sampling campaign covered ~250 road miles and was primarily concentrated on eight residential zones with distinct infrastructure age and NG usage levels. A moderate number of elevated CH4 concentration events (37 episodes) with mixing ratios not exceeding 3.60 ppmv and associated with atmospheric background enhancements below 1.21 ppmv were observed during the field campaign. Source discrimination analyses based on the covariance between CH4 and C2H6 levels indicated the predominance of thermogenic sources (e.g., NG) in the elevated CH4 concentration episodes. The volumetric fraction of C2H6 in the sources associated with the thermogenic CH4 spikes varied between 2.7 and 5.9%, concurring with the C2H6 content in NG distributed in the GHA. Isolated CH4 peak events with significantly higher C2H6 enhancements (~11%) were observed at industrial areas and locations with high density of petroleum and gas pipelines in the GHA, indicating potential variability in Houston’s thermogenic CH4 sources.
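
    A minimal sketch of this kind of source discrimination, with invented numbers rather than the study's data: the slope of the C2H6 enhancements against the CH4 enhancements estimates the ethane fraction of the emitting source, which can then be compared with the 2.7-5.9% ethane content of distributed natural gas.

        import numpy as np

        # Hypothetical co-measured enhancements above background (ppbv) during one event.
        ch4 = np.array([120.0, 260.0, 410.0, 530.0, 350.0])
        c2h6 = np.array([4.0, 9.5, 14.8, 19.6, 12.9])

        # Slope of a least-squares fit through the origin.
        ratio = ch4.dot(c2h6) / ch4.dot(ch4)
        print(f"C2H6/CH4 enhancement ratio: {ratio * 100:.1f} %")

        # Ratios near the ethane content of pipeline gas suggest a thermogenic source;
        # near-zero ratios point to biogenic sources such as landfills or livestock.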

  4. Exploratory study of atmospheric methane enhancements derived from natural gas use in the Houston urban area

    NASA Astrophysics Data System (ADS)

    Sanchez, Nancy P.; Zheng, Chuantao; Ye, Weilin; Czader, Beata; Cohan, Daniel S.; Tittel, Frank K.; Griffin, Robert J.

    2018-03-01

    The extensive use of natural gas (NG) in urban areas for heating and cooking and as a vehicular fuel is associated with potentially significant emissions of methane (CH4) to the atmosphere. Methane, a potent greenhouse gas that influences the chemistry of the atmosphere, can be emitted from different sources including leakage from NG infrastructure, transportation activities, end-use uncombusted NG, landfills and livestock. Although significant CH4 leakage associated with aging local NG distribution systems in the U.S. has been reported, further investigation is required to study the role of this infrastructure component and other NG-related sources in atmospheric CH4 enhancements in urban centers. In this study, neighborhood-scale mobile-based monitoring of potential CH4 emissions associated with NG in the Greater Houston area (GHA) is reported. A novel dual-gas 3.337 μm interband cascade laser-based sensor system was developed and mobile-mode deployed for simultaneous CH4 and ethane (C2H6) monitoring during a period of over 14 days, corresponding to ∼ 90 h of effective data collection during summer 2016. The sampling campaign covered ∼250 exclusive road miles and was primarily concentrated on eight residential zones with distinct infrastructure age and NG usage levels. A moderate number of elevated CH4 concentration events (37 episodes) with mixing ratios not exceeding 3.60 ppmv and associated with atmospheric background enhancements below 1.21 ppmv were observed during the field campaign. Source discrimination analyses based on the covariance between CH4 and C2H6 levels indicated the predominance of thermogenic sources (e.g., NG) in the elevated CH4 concentration episodes. The volumetric fraction of C2H6 in the sources associated with the thermogenic CH4 spikes varied between 2.7 and 5.9%, concurring with the C2H6 content in NG distributed in the GHA. Isolated CH4 peak events with significantly higher C2H6 enhancements (∼11%) were observed at industrial areas and locations with high density of petroleum and gas pipelines in the GHA, indicating potential variability in Houston's thermogenic CH4 sources.

  5. Exploratory study of atmospheric methane enhancements derived from natural gas use in the Houston urban area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, Nancy P.; Zheng, Chuantao; Ye, Weilin

    Here, the extensive use of natural gas (NG) in urban areas for heating, cooking and as a vehicular fuel is associated with potentially significant emissions of methane (CH4) to the atmosphere. Methane, a potent greenhouse gas that influences the chemistry of the atmosphere, can be emitted from different sources including leakage from NG infrastructure, transportation activities, end-use uncombusted NG, landfills and livestock. Although significant CH4 leakage associated with aging local NG distribution systems in the U.S. has been reported, further investigation is required to study the role of this infrastructure component and other NG-related sources in atmospheric CH4 enhancements in urban centers. In this study, neighborhood-scale mobile-based monitoring of potential CH4 emissions associated with NG in the Greater Houston area (GHA) is reported. A novel dual-gas 3.337 μm interband cascade laser-based sensor system was developed and mobile-mode deployed for simultaneous CH4 and ethane (C2H6) monitoring during a period of over 14 days, corresponding to ~90 hours of effective data collection during summer 2016. The sampling campaign covered ~250 road miles and was primarily concentrated on eight residential zones with distinct infrastructure age and NG usage levels. A moderate number of elevated CH4 concentration events (37 episodes) with mixing ratios not exceeding 3.60 ppmv and associated with atmospheric background enhancements below 1.21 ppmv were observed during the field campaign. Source discrimination analyses based on the covariance between CH4 and C2H6 levels indicated the predominance of thermogenic sources (e.g., NG) in the elevated CH4 concentration episodes. The volumetric fraction of C2H6 in the sources associated with the thermogenic CH4 spikes varied between 2.7 and 5.9%, concurring with the C2H6 content in NG distributed in the GHA. Isolated CH4 peak events with significantly higher C2H6 enhancements (~11%) were observed at industrial areas and locations with high density of petroleum and gas pipelines in the GHA, indicating potential variability in Houston’s thermogenic CH4 sources.

  6. A 23-GROUP NEUTRON THERMALIZATION CROSS SECTION LIBRARY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doctor, R.D.; Boling, M.A.

    1963-07-15

    A set of 23-group neutron cross sections for use in the calculation of neutron thermalization and thermal neutron spectral effects in SNAP reactors is compiled. The sources and methods used to obtain the cross sections are described. (auth)

  7. Pocket guide to large truck and bus statistics, 2015.

    DOT National Transportation Integrated Search

    2015-04-01

    The Federal Motor Carrier Safety Administration compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) Overview: Large Trucks and Buses, (2) Roadside Inspections and Violations, (3) Reviews, ...

  8. Pocket guide to large truck and bus statistics, 2014.

    DOT National Transportation Integrated Search

    2014-10-01

    The Federal Motor Carrier Safety Administration compiled the data in this guide from multiple sources. The guide is divided into six sections and a glossary: (1) Overview: Large Trucks and Buses, (2) Roadside Inspections and Violations, (3) Reviews, ...

  9. Geologic map of the Priest Rapids 1:100,000 quadrangle, Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reidel, S.P.; Fecht, K.R.

    1993-09-01

    This map of the Priest Rapids 1:100,000-scale quadrangle, Washington, shows the geology of one of fifteen complete or partial 1:100,000-scale quadrangles that cover the southeast quadrant of Washington. Geologic maps of these quadrangles have been compiled by geologists with the Washington Division of Geology and Earth Resources (DGER) and Washington State University and are the principal data sources for a 1:250,000-scale geologic map of the southeast quadrant of Washington, which is in preparation. Eleven of those quadrangles are being released as DGER open-file reports (listed below). The map of the Wenatchee quadrangle has been published by the US Geological Survey (Tabor and others, 1982), and the Moses Lake (Gulick, 1990a), Ritzville (Gulick, 1990b), and Rosalia (Waggoner, 1990) quadrangles have already been released. The geology of the Priest Rapids quadrangle has not previously been compiled at 1:100,000 scale. Furthermore, this is the first 1:100,000 or smaller scale geologic map of the area to incorporate both bedrock and surficial geology. This map was compiled in 1992, using published and unpublished geologic maps as sources of data.

  10. Poem: A Fast Monte Carlo Code for the Calculation of X-Ray Transition Zone Dose and Current

    DTIC Science & Technology

    1975-01-15

    ... stored on the photon interaction data tape. Following the photoelectric ionization the atom will relax, emitting either a fluorescent photon or an Auger electron ... shell fluorescence yields have been obtained from the Storm and Israel and Bambynek, et al. compilations, with preference given to the Bambynek compilation, and stored on the photon interaction data tape. The mean M fluorescence yield wM is approximated by zero. The total electron source ...

  11. 8 CFR 1208.12 - Reliance on information compiled by other sources.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... application, or in deciding whether the alien has a credible fear of persecution or torture pursuant to § 1208.30 of this part, or a reasonable fear of persecution or torture pursuant to § 1208.31, the asylum...

  12. ggCyto: Next Generation Open-Source Visualization Software for Cytometry.

    PubMed

    Van, Phu; Jiang, Wenxin; Gottardo, Raphael; Finak, Greg

    2018-06-01

    Open source software for computational cytometry has gained in popularity over the past few years. Efforts such as FlowCAP, the Lyoplate and Euroflow projects have highlighted the importance of standardizing both experimental and computational aspects of cytometry data analysis. The R/BioConductor platform hosts the largest collection of open source cytometry software covering all aspects of data analysis and providing infrastructure to represent and analyze cytometry data with all relevant experimental, gating, and cell population annotations, enabling fully reproducible data analysis. Data visualization frameworks to support this infrastructure have lagged behind. ggCyto is a new open-source BioConductor software package for cytometry data visualization built on ggplot2 that enables ggplot-like functionality with the core BioConductor flow cytometry data structures. Amongst its features are the ability to transform data and axes on-the-fly using cytometry-specific transformations, plot faceting by experimental meta-data variables, and partial matching of channel, marker and cell population names to the contents of the BioConductor cytometry data structures. We demonstrate the salient features of the package using publicly available cytometry data, with complete reproducible examples in a supplementary material vignette. Availability: https://bioconductor.org/packages/devel/bioc/html/ggcyto.html. Contact: gfinak@fredhutch.org. Supplementary information: Supplementary data are available at Bioinformatics online and at http://rglab.org/ggcyto/.

  13. SLEEC: Semantics-Rich Libraries for Effective Exascale Computation. Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulkarni, Milind

    SLEEC (Semantics-rich Libraries for Effective Exascale Computation) was a project funded by the Department of Energy X-Stack Program, award number DE-SC0008629. The initial project period was September 2012–August 2015. The project was renewed for an additional year, expiring August 2016. Finally, the project received a no-cost extension, leading to a final expiry date of August 2017. Modern applications, especially those intended to run at exascale, are not written from scratch. Instead, they are built by stitching together various carefully-written, hand-tuned libraries. Correctly composing these libraries is difficult, but traditional compilers are unable to effectively analyze and transform across abstraction layers. Domain-specific compilers integrate semantic knowledge into compilers, allowing them to transform applications that use particular domain-specific languages, or domain libraries. But they do not help when new domains are developed, or applications span multiple domains. SLEEC aims to fix these problems. To do so, we are building generic compiler and runtime infrastructures that are semantics-aware but not domain-specific. By performing optimizations related to the semantics of a domain library, the same infrastructure can be made generic and apply across multiple domains.

  14. Mother Lode: The Untapped Rare Earth Mineral Resources of Vietnam

    DTIC Science & Technology

    2013-11-01

    Library of Congress, Congressional Research Service, Rare Earth Elements: The Global Supply Chain, 4. Tse, Pui-Kwan, China's Rare-Earth Industry: U.S. Geological Survey Open-File Report 2011–1042, 2. Figure 2: Global REO production, 1960–2011. Source: Tse, Pui-Kwan, China's Rare-Earth Industry ... Figure 3 compiled from three sources: Tse, Pui-Kwan, China's Rare-Earth Industry: U.S. Geological Survey Open-File Report 2011–1042, 4; Areddy, James T ...

  15. Preliminary geologic map of the Piru 7.5' quadrangle, southern California: a digital database

    USGS Publications Warehouse

    Yerkes, R.F.; Campbell, Russell H.

    1995-01-01

    This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1995). More specific information about the units may be available in the original sources.

  16. The NASA earth resources spectral information system: A data compilation

    NASA Technical Reports Server (NTRS)

    Leeman, V.; Earing, D.; Vincent, R. K.; Ladd, S.

    1971-01-01

    The NASA Earth Resources Spectral Information System and the information contained therein are described. It contains an ordered, indexed compilation of spectra of natural targets in the optical region from 0.3 to 45.0 microns. The data compilation includes approximately 100 rock and mineral, 2600 vegetation, 1000 soil, and 60 water spectral reflectance, transmittance, and emittance curves. Most of the data have been categorized by subject, and the curves in those subject areas have been plotted on a single graph. Those categories with too few curves, and miscellaneous categories, have been plotted as single-curve graphs. Each graph, composite or single, is fully titled to indicate curve source and is indexed by subject to facilitate user retrieval.

  17. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    USGS Publications Warehouse

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File Report 2004-1043) for additional information regarding methods and results. Data in this report are organized into data layers by state and are provided as single-point vector datasets with metadata. Vector shorelines may represent a compilation of data from one or more sources and these sources are attributed in the dataset. All data are intended to be GIS-ready inasmuch as the data should not require any additional cleanup, formatting, or renaming of fields in order to use the data in a Geographic Information System (GIS). This project employs the Environmental Systems Research Institute's (ESRI) ArcView as its GIS mapping tool and contains several data layers (or themes) that are used to create a geographic view of the margin off the U.S. Gulf of Mexico. These vector data form a basemap comprised of polygon and line themes that include a U.S. coastline (1:80,000), U.S. cities, and state boundaries.
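
    The two rate definitions quoted above translate directly into code; the transect positions below are invented and serve only to illustrate the linear-regression and end-point calculations.

        import numpy as np

        # Hypothetical shoreline positions (m seaward of a fixed baseline) at one transect.
        years = np.array([1855.0, 1934.0, 1973.0, 2001.0])
        position = np.array([152.0, 138.0, 120.0, 104.0])

        # Long-term rate: slope of a linear regression through all four shorelines.
        long_term_rate = np.polyfit(years, position, 1)[0]

        # Short-term rate: end-point rate from the two most recent shorelines.
        short_term_rate = (position[-1] - position[-2]) / (years[-1] - years[-2])

        print(f"long-term:  {long_term_rate:+.2f} m/yr")   # negative values indicate erosion
        print(f"short-term: {short_term_rate:+.2f} m/yr")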

  18. A Shared Infrastructure for Federated Search Across Distributed Scientific Metadata Catalogs

    NASA Astrophysics Data System (ADS)

    Reed, S. A.; Truslove, I.; Billingsley, B. W.; Grauch, A.; Harper, D.; Kovarik, J.; Lopez, L.; Liu, M.; Brandt, M.

    2013-12-01

    The vast amount of science metadata can be overwhelming and highly complex. Comprehensive analysis and sharing of metadata is difficult since institutions often publish to their own repositories. There are many disjoint standards used for publishing scientific data, making it difficult to discover and share information from different sources. Services that publish metadata catalogs often have different protocols, formats, and semantics. The research community is limited by the exclusivity of separate metadata catalogs, and thus it is desirable to have federated search interfaces capable of unified search queries across multiple sources. Aggregation of metadata catalogs also enables users to critique metadata more rigorously. With these motivations in mind, the National Snow and Ice Data Center (NSIDC) and the Advanced Cooperative Arctic Data and Information Service (ACADIS) implemented two search interfaces for the community. Both the NSIDC Search and the ACADIS Arctic Data Explorer (ADE) use a common infrastructure, which keeps maintenance costs low. The search clients are designed to make OpenSearch requests against Solr, an open source search platform. Solr applies indexes to specific fields of the metadata, which in this instance optimizes queries containing keywords, spatial bounds and temporal ranges. NSIDC metadata is reused by both search interfaces, but the ADE also brokers additional sources. Users can quickly find relevant metadata with minimal effort, which ultimately lowers research costs. This presentation will highlight the reuse of data and code between NSIDC and ACADIS, discuss challenges and milestones for each project, and identify the creation and use of Open Source libraries.
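
    A schematic of the kind of keyword/spatial/temporal query the search clients hand to Solr. The endpoint, core and field names below are invented for illustration; q, fq, rows and wt are standard Solr request parameters.

        import requests

        SOLR_URL = "http://search.example.org/solr/metadata/select"  # hypothetical endpoint

        params = {
            "q": "sea ice extent",  # free-text keywords
            "fq": [
                "temporal_start:[1990-01-01T00:00:00Z TO *]",  # temporal range filter
                "location:[60,-180 TO 90,180]",                # lat/lon bounding box filter
            ],
            "rows": 25,
            "wt": "json",
        }

        response = requests.get(SOLR_URL, params=params, timeout=30)
        for doc in response.json()["response"]["docs"]:
            print(doc.get("title"))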

  19. Overcoming Intermediary Bias Through the Use of Social Media Intelligence

    DTIC Science & Technology

    2015-05-23

    PMESII-PT: Politics, Military, Economy, Social, Information, Infrastructure, Physical Environment and Time. OSINT: Open Source Intelligence. ... Intelligence (SOCMINT) is a sub-set of open source intelligence (OSINT). OSINT has existed for an extended period of time, but it is a relatively new ... the author determined that there are two ... 1 Walter R. Draeger, "Take Advantage of OSINT," Military ...

  20. Patient adherence to prescribed antimicrobial drug dosing regimens.

    PubMed

    Vrijens, Bernard; Urquhart, John

    2005-05-01

    This article reviews current knowledge about the clinical impact of patients' variable adherence to prescribed anti-infective drug dosing regimens, with the aim of renewing interest in and exploration of this important but largely neglected area of therapeutics. Central to the estimation of a patient's adherence to a prescribed drug regimen is a reliably compiled drug dosing history. Electronic monitoring methods have emerged as the virtual 'gold standard' for compiling drug dosing histories in ambulatory patients. Reliably compiled drug dosing histories are consistently downwardly skewed, with varying degrees of under-dosing. In particular, consideration of the time intervals between protease inhibitor doses has revealed that ambulatory patients' variable execution of prescribed dosing regimens is a leading source of variance in viral response. Such analyses reveal the need for a new discipline, called pharmionics, which is the study of how ambulatory patients use prescription drugs. Properly analysed, reliable data on the time-course of patients' actual intake of prescription drugs can eliminate a major source of unallocated variance in drug responses, including the non-response that occurs, and is easily misinterpreted, when a patient's complete non-execution of a prescribed drug regimen goes unrecognized clinically. As such, reliable compilation of ambulatory patients' drug dosing histories promises to be a key step in reducing unallocated variance in drug response and in improving the informational yield of clinical trials. It is also the basis for sound, measurement-guided steps to improve a patient's execution of a prescribed dosing regimen.

  1. IDENTIFICATION AND COMPILATION OF UNSATURATED/VADOSE ZONE MODELS

    EPA Science Inventory

    Many ground-water contamination problems are derived from sources at or near the soil surface. Consequently, the physical and (bio-)chemical behavior of contaminants in the shallow subsurface is of critical importance to the development of protection and remediation strategies. M...

  2. Railway tunnels in Europe and North America

    DOT National Transportation Integrated Search

    2002-05-01

    This list of railway tunnels (longer than 1,000 m) was compiled by the secretariat from various national and international sources. The list is intended to serve as a reference inventory of long railway tunnels in Europe and North America. Tunnel...

  3. Ambient Sound-Based Collaborative Localization of Indeterministic Devices

    PubMed Central

    Kamminga, Jacob; Le, Duc; Havinga, Paul

    2016-01-01

    Localization is essential in wireless sensor networks. To our knowledge, no prior work has utilized low-cost devices for collaborative localization based on only ambient sound, without the support of local infrastructure. The reason may be the fact that most low-cost devices are indeterministic and suffer from uncertain input latencies. This uncertainty makes accurate localization challenging. Therefore, we present a collaborative localization algorithm (Cooperative Localization on Android with ambient Sound Sources (CLASS)) that simultaneously localizes the position of indeterministic devices and ambient sound sources without local infrastructure. The CLASS algorithm deals with the uncertainty by splitting the devices into subsets so that outliers can be removed from the time difference of arrival values and localization results. Since Android is indeterministic, we select Android devices to evaluate our approach. The algorithm is evaluated with an outdoor experiment and achieves a mean Root Mean Square Error (RMSE) of 2.18 m with a standard deviation of 0.22 m. Estimated directions towards the sound sources have a mean RMSE of 17.5° and a standard deviation of 2.3°. These results show that it is feasible to simultaneously achieve a relative positioning of both devices and sound sources with sufficient accuracy, even when using non-deterministic devices and platforms, such as Android. PMID:27649176
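
    A minimal sketch of time-difference-of-arrival localization of a sound source. This is not the CLASS algorithm itself, which additionally splits the devices into subsets to reject outliers; the positions and arrival times below are invented.

        import numpy as np
        from scipy.optimize import least_squares

        SPEED_OF_SOUND = 343.0  # m/s

        # Hypothetical device positions (m) and measured arrival times (s) of one sound event.
        devices = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        toa = np.array([0.0292, 0.0184, 0.0261, 0.0130])

        def residuals(source_xy):
            # Predicted TDOAs relative to device 0 must match the measured ones.
            dist = np.linalg.norm(devices - source_xy, axis=1)
            predicted = (dist - dist[0]) / SPEED_OF_SOUND
            measured = toa - toa[0]
            return (predicted - measured)[1:]

        fit = least_squares(residuals, x0=np.array([5.0, 5.0]))
        print("estimated source position (m):", fit.x.round(2))  # close to (8, 6)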

  4. The Mystro system: A comprehensive translator toolkit

    NASA Technical Reports Server (NTRS)

    Collins, W. R.; Noonan, R. E.

    1985-01-01

    Mystro is a system that facilitates the construction of compilers, assemblers, code generators, query interpreters, and similar programs. It provides features to encourage the use of iterative enhancement. Mystro was developed in response to the needs of NASA Langley Research Center (LaRC) and enjoys a number of advantages over similar systems. There are other programs available that can be used in building translators. These typically build parser tables, usually supply the source of a parser and parts of a lexical analyzer, but provide little or no aid for code generation. In general, only the front end of the compiler is addressed. Mystro, on the other hand, emphasizes tools for both ends of a compiler.

  5. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, there was team experience with this component that helped drive the choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little past experience, but given our service-based approach it seemed a natural fit. Given our exposure to open source, we will discuss the tradeoffs and benefits received from the choices made. Moreover, we will dive into the context of how the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that shaped our impressions of each. This comprehensive account of our endeavor should aid others in their assessment and use of open source.
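
    As a generic illustration of the thin REST layer pattern mentioned above (not LMMP code; the route and records are invented), a small service can expose a catalog lookup behind a stable HTTP interface so clients never touch the underlying storage technology.

        from flask import Flask, jsonify

        app = Flask(__name__)

        # Stand-in for an underlying catalog component (e.g. a product database).
        CATALOG = {"NAC-MOSAIC-001": {"mission": "LRO", "product": "NAC mosaic"}}

        @app.get("/products/<product_id>")
        def get_product(product_id):
            record = CATALOG.get(product_id)
            if record is None:
                return jsonify(error="not found"), 404
            return jsonify(record)  # clients see JSON, not the storage layer

        if __name__ == "__main__":
            app.run(port=8080)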

  6. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments in High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  7. 41 CFR 101-6.217 - Laws authorizing Federal financial assistance for programs to which this subpart applies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Credit Union Act, 12 U.S.C. 1770). (j) Donation of surplus property to the American National Red Cross..., preserving and compiling, and publishing of documentary sources significant to the history of the United...

  8. 41 CFR 101-6.217 - Laws authorizing Federal financial assistance for programs to which this subpart applies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Credit Union Act, 12 U.S.C. 1770). (j) Donation of surplus property to the American National Red Cross..., preserving and compiling, and publishing of documentary sources significant to the history of the United...

  9. 41 CFR 101-6.217 - Laws authorizing Federal financial assistance for programs to which this subpart applies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Credit Union Act, 12 U.S.C. 1770). (j) Donation of surplus property to the American National Red Cross..., preserving and compiling, and publishing of documentary sources significant to the history of the United...

  10. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are used either individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement for reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to addressing these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), built primarily with the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. Its ability to deal with metadata, normalize image file formats, and store and view different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an electronic laboratory notebook for effective knowledge management. In addition, the centralized approach gives on-the-fly access to all of MIRA's features remotely through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform to accelerate advances in cancer diagnosis and therapeutics.

  11. Virtual Labs (Science Gateways) as platforms for Free and Open Source Science

    NASA Astrophysics Data System (ADS)

    Lescinsky, David; Car, Nicholas; Fraser, Ryan; Friedrich, Carsten; Kemp, Carina; Squire, Geoffrey

    2016-04-01

    The Free and Open Source Software (FOSS) movement promotes community engagement in software development, as well as provides access to a range of sophisticated technologies that would be prohibitively expensive if obtained commercially. However, as geoinformatics and eResearch tools and services become more dispersed, it becomes more complicated to identify and interface between the many required components. Virtual Laboratories (VLs, also known as Science Gateways) simplify the management and coordination of these components by providing a platform linking many, if not all, of the steps in particular scientific processes. These enable scientists to focus on their science, rather than the underlying supporting technologies. We describe a modular, open source, VL infrastructure that can be reconfigured to create VLs for a wide range of disciplines. Development of this infrastructure has been led by CSIRO in collaboration with Geoscience Australia and the National Computational Infrastructure (NCI) with support from the National eResearch Collaboration Tools and Resources (NeCTAR) and the Australian National Data Service (ANDS). Initially, the infrastructure was developed to support the Virtual Geophysical Laboratory (VGL), and has subsequently been repurposed to create the Virtual Hazards Impact and Risk Laboratory (VHIRL) and the reconfigured Australian National Virtual Geophysics Laboratory (ANVGL). During each step of development, new capabilities and services have been added and/or enhanced. We plan on continuing to follow this model using a shared, community code base. The VL platform facilitates transparent and reproducible science by providing access to both the data and methodologies used during scientific investigations. This is further enhanced by the ability to set up and run investigations using computational resources accessed through the VL. Data is accessed using registries pointing to catalogues within public data repositories (notably including the NCI National Environmental Research Data Interoperability Platform), or by uploading data directly from user supplied addresses or files. Similarly, scientific software is accessed through registries pointing to software repositories (e.g., GitHub). Runs are configured by using or modifying default templates designed by subject matter experts. After the appropriate computational resources are identified by the user, Virtual Machines (VMs) are spun up and jobs are submitted to service providers (currently the NeCTAR public cloud or Amazon Web Services). Following completion of the jobs the results can be reviewed and downloaded if desired. By providing a unified platform for science, the VL infrastructure enables sophisticated provenance capture and management. The source of input data (including both collection and queries), user information, software information (version and configuration details) and output information are all captured and managed as a VL resource which can be linked to output data sets. This provenance resource provides a mechanism for publication and citation for Free and Open Source Science.

  12. Thomson scattering laser-electron X-ray source for reduction of patient radiation dose in interventional coronary angiography

    NASA Astrophysics Data System (ADS)

    Artyukov, I. A.; Dyachkov, N. V.; Feshchenko, R. M.; Polunina, A. V.; Popov, N. L.; Shvedunov, V. I.; Vinogradov, A. V.

    2017-05-01

    It was medical applications that stimulated F. Carrol in the early 1990s to start research on relativistic Thomson scattering X-ray sources as a part of the infrastructure of the future society. The possibility of using such a source in interventional cardiology is discussed in this paper. Replacing the X-ray tube with a relativistic Thomson scattering X-ray source is predicted to lower the patient radiation dose by a factor of 3 while image quality remains the same. The required general characteristics of the accelerator and laser units are found; they can be reached with existing technology. A semiempirical method for simulation of the medical and technical parameters of interventional coronary angiography systems is suggested.
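
    For context, a textbook relation rather than the paper's semiempirical model: for head-on Thomson back-scattering, the on-axis X-ray energy is approximately 4 * gamma^2 times the laser photon energy, which fixes the electron-beam energy the accelerator must deliver for a given X-ray energy.

        import math

        M_E_C2_MEV = 0.511   # electron rest energy (MeV)
        E_LASER_EV = 1.17    # 1064 nm Nd:YAG photon energy (eV); an assumed laser
        E_XRAY_KEV = 33.0    # target X-ray energy near the iodine K-edge (keV)

        # E_X ~= 4 * gamma^2 * E_laser  =>  gamma = sqrt(E_X / (4 * E_laser))
        gamma = math.sqrt(E_XRAY_KEV * 1e3 / (4.0 * E_LASER_EV))
        print(f"gamma ~ {gamma:.0f}, beam energy ~ {gamma * M_E_C2_MEV:.0f} MeV")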

  13. A compilation of global bio-optical in situ data for ocean-colour satellite applications

    NASA Astrophysics Data System (ADS)

    Valente, André; Sathyendranath, Shubha; Brotas, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; Barlow, Ray; Bélanger, Simon; Berthon, Jean-François; Beşiktepe, Şükrü; Brando, Vittorio; Canuti, Elisabetta; Chavez, Francisco; Claustre, Hervé; Crout, Richard; Frouin, Robert; García-Soto, Carlos; Gibb, Stuart W.; Gould, Richard; Hooker, Stanford; Kahru, Mati; Klein, Holger; Kratzer, Susanne; Loisel, Hubert; McKee, David; Mitchell, Brian G.; Moisan, Tiffany; Muller-Karger, Frank; O'Dowd, Leonie; Ondrusek, Michael; Poulton, Alex J.; Repecaud, Michel; Smyth, Timothy; Sosik, Heidi M.; Twardowski, Michael; Voss, Kenneth; Werdell, Jeremy; Wernand, Marcel; Zibordi, Giuseppe

    2016-06-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
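
    The merging step described above, averaging observations that are close in time and space, can be sketched as spatiotemporal binning. The bin sizes and column names below are invented; the actual matchup criteria are those given in the paper.

        import pandas as pd

        # Hypothetical in situ records: time, position, chlorophyll-a concentration (mg/m3).
        df = pd.DataFrame({
            "time": pd.to_datetime(["2005-06-01 10:02", "2005-06-01 10:14", "2005-06-03 09:50"]),
            "lat": [38.70, 38.71, 38.70],
            "lon": [-9.42, -9.41, -9.42],
            "chla": [0.21, 0.25, 0.18],
        })

        # Bin to 1-hour / 0.1-degree cells and average observations within each cell.
        df["t_bin"] = df["time"].dt.floor("1h")
        df["lat_bin"] = (df["lat"] / 0.1).round().astype(int)
        df["lon_bin"] = (df["lon"] / 0.1).round().astype(int)
        merged = df.groupby(["t_bin", "lat_bin", "lon_bin"], as_index=False)["chla"].mean()
        print(merged)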

  14. 8 CFR 208.12 - Reliance on information compiled by other sources.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... whether the alien has a credible fear of persecution or torture pursuant to § 208.30 of this part, or a reasonable fear of persecution or torture pursuant to § 208.31, the asylum officer may rely on material...

  15. Toward Broadband Source Modeling for the Himalayan Collision Zone

    NASA Astrophysics Data System (ADS)

    Miyake, H.; Koketsu, K.; Kobayashi, H.; Sharma, B.; Mishra, O. P.; Yokoi, T.; Hayashida, T.; Bhattarai, M.; Sapkota, S. N.

    2017-12-01

    The Himalayan collision zone is characterized by a distinctive tectonic setting. There are earthquakes with low-angle thrust faulting as well as continental outer-rise earthquakes. Recently, several historical earthquakes have been identified by active fault surveys [e.g., Sapkota et al., 2013]. Here we investigate source scaling for the Himalayan collision zone as a fundamental ingredient for constructing source models for seismic hazard assessment. Regarding source scaling in collision zones, Yen and Ma [2011] reported the subduction-zone source scaling in Taiwan and pointed out non-self-similar scaling due to the finite crustal thickness. On the other hand, current global analyses of stress drop do not show abnormal values for continental collision zones [e.g., Allmann and Shearer, 2009]. Based on compiled profiles of the finite thickness of the crust and dip-angle variations, we discuss whether a bend exists in the Himalayan source scaling, and its implications for the stress drop that will control strong ground motions. Owing to their quite low-angle dip faulting, recent earthquakes in the Himalayan collision zone plot at the upper bound of the current source scaling of rupture area vs. seismic moment (< Mw 8.0) and do not show significant bending of the source scaling. Toward broadband source modeling for ground motion prediction, we perform empirical Green's function simulations for the 2009 Bhutan and 2015 Gorkha earthquake sequences to quantify both long- and short-period source spectral levels.
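
    As a reference point for the scaling discussion, standard self-similar relations rather than the authors' result: constant stress drop ties seismic moment to the cube of rupture dimension (equivalently, to area to the 3/2 power); for a circular crack M0 = (16/7) * stress_drop * r^3, and Mw follows from Hanks and Kanamori (1979).

        import math

        def circular_crack_moment(stress_drop_pa: float, radius_m: float) -> float:
            """Eshelby circular crack: M0 = (16/7) * delta_sigma * r**3, in N*m."""
            return (16.0 / 7.0) * stress_drop_pa * radius_m ** 3

        def moment_magnitude(m0_nm: float) -> float:
            """Hanks & Kanamori (1979): Mw = (2/3) * (log10 M0[N*m] - 9.1)."""
            return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

        # Example: a 3 MPa stress drop over a 40 km radius rupture gives Mw ~ 7.7.
        m0 = circular_crack_moment(3.0e6, 40.0e3)
        print(f"M0 = {m0:.2e} N*m, Mw = {moment_magnitude(m0):.1f}")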

  16. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy-applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  17. flexCloud: Deployment of the FLEXPART Atmospheric Transport Model as a Cloud SaaS Environment

    NASA Astrophysics Data System (ADS)

    Morton, Don; Arnold, Dèlia

    2014-05-01

    FLEXPART (FLEXible PARTicle dispersion model) is a Lagrangian transport and dispersion model used by a growing international community. We have used it to simulate and forecast the atmospheric transport of wildfire smoke, volcanic ash and radionuclides. Additionally, FLEXPART may be run in backwards mode to provide information for the determination of emission sources such as nuclear emissions and greenhouse gases. This open source software is distributed in source code form and has several compiler and library dependencies that users need to address. Although FLEXPART is well documented, getting it compiled, set up, running, and post-processed is often tedious, making it difficult for the inexperienced user. Our interest is in moving scientific modeling and simulation activities from site-specific clusters and supercomputers to a cloud model-as-a-service paradigm. Choosing FLEXPART for our prototyping, our vision is to construct customised IaaS images containing fully-compiled and configured FLEXPART codes, including pre-processing, execution and post-processing components. In addition, with the inclusion of a small web server in the image, we introduce a web-accessible graphical user interface that drives the system. A further initiative being pursued is the deployment of multiple, simultaneous FLEXPART ensembles in the cloud. A single front-end web interface is used to define the ensemble members, and separate cloud instances are launched, on demand, to run the individual models and to conglomerate the outputs into a unified display. The outcome of this work is a Software as a Service (SaaS) deployment whereby the details of the underlying modeling systems are hidden, allowing modelers to perform their science activities without the burden of considering implementation details.

  18. Geospatial datasets for assessing the effects of rangeland conditions on dissolved-solids yields in the Upper Colorado River Basin

    USGS Publications Warehouse

    Tillman, Fred D.; Flynn, Marilyn E.; Anning, David W.

    2015-01-01

    In 2009, the U.S. Geological Survey (USGS) developed a Spatially Referenced Regressions on Watershed Attributes (SPARROW) surface-water quality model for the Upper Colorado River Basin (UCRB) relating dissolved-solids sources and transport in the 1991 water year to upstream catchment characteristics. The SPARROW model focused on geologic and agricultural sources of dissolved solids in the UCRB and was calibrated using water-year 1991 dissolved-solids loads from 218 monitoring sites. A new UCRB SPARROW model is planned that will update the investigation of dissolved-solids sources and transport in the basin to circa 2010 conditions and will improve upon the 2009 model by incorporating more detailed information about agricultural-irrigation and rangeland-management practices, among other improvements. Geospatial datasets relating to circa 2010 rangeland conditions are required for the new UCRB SPARROW modeling effort. This study compiled geospatial datasets for the UCRB that relate to the biotic alterations and rangeland conditions of grazing, fire and other land disturbance, and vegetation type and cover. Datasets representing abiotic alterations of access control (off-highway vehicles) and sediment generation and transport in general were also compiled. These geospatial datasets may be tested in the upcoming SPARROW model to better understand the potential contribution of rangelands to dissolved-solids loading in UCRB streams.

  19. Estimated water use, by county, in North Carolina, 1995

    USGS Publications Warehouse

    Walters, D.A.

    1997-01-01

    Data on water use in North Carolina were compiled for 1995 as part of a cooperative agreement between the U.S. Geological Survey and the Division of Water Resources of the North Carolina Department of Environment and Natural Resources. Data were compiled from a number of Federal, State, and private sources for the offstream water-use categories of public supply, domestic, commercial, industrial, mining, livestock, irrigation, and thermoelectric-power generation. Data also were collected for instream use from hydroelectric facilities. Total withdrawals (fresh and saline) during 1995 were an estimated 9,286 million gallons per day for the offstream water-use categories. About 94 percent of the water withdrawn was from surface water. Thermoelectric-power generation accounted for 80 percent of all withdrawals. Instream water use for hydroelectric-power generation totaled about 56,400 million gallons per day. Each water-use category is summarized in this report by county and source of water supply.

  20. Turning a remotely controllable observatory into a fully autonomous system

    NASA Astrophysics Data System (ADS)

    Swindell, Scott; Johnson, Chris; Gabor, Paul; Zareba, Grzegorz; Kubánek, Petr; Prouza, Michael

    2014-08-01

    We describe the complex process of turning an existing, aging, operational observatory (the Steward Observatory's 61" Kuiper Telescope) into a fully autonomous system that observes without an observer. For this purpose, we employed RTS2, an open-source, Linux-based observatory control system, together with other open-source programs and tools (GNU compilers, the Python language for scripting, jQuery UI for the web user interface). This presentation provides a guide, with time estimates, for newcomers to the field taking on the challenging task of fully autonomous observatory operation.

  1. Experiences Using a Meta-Data Based Integration Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, T.; Masick, R.; Slezak, T.

    1999-07-08

    A data warehouse that presents data from many of the genomics community data sources in a consistent, intuitive fashion has long been a goal of bioinformatics. Unfortunately, it is one of the goals that has not yet been achieved. One of the major problems encountered by previous attempts has been the high cost of creating and maintaining a warehouse in a dynamic environment. In this abstract we have outlined a meta-data based approach to integrating data sources that begins to address this problem. We have used this infrastructure to successfully integrate new sources into an existing warehouse in substantially less time than would have traditionally been required--and the resulting mediators are more maintainable than the traditionally defined ones would have been. In the final paper, we will describe in greater detail both our architecture and our experiences using this framework. In particular, we will outline the new, XML based representation of the meta-data, describe how the mediator generator works, and highlight other potential uses for the meta-data.

  2. [Access to health information sources in Spain. how to combat "infoxication"].

    PubMed

    Navas-Martin, Miguel Ángel; Albornos-Muñoz, Laura; Escandell-García, Cintia

    2012-01-01

    The Internet has become an invaluable source of health information for both patients and healthcare professionals. However, the universality and abundance of information can lead to unfounded conclusions about health issues, confusing rather than clarifying. This overload produces an intoxication by information: 'infoxication'. The question lies in knowing how to filter the information that is useful, accurate, and relevant for our purposes. In this regard, integrative portals, such as the Biblioteca Virtual de Salud, compile information at different levels (international, national, and regional) and from different types of resources (databases, repositories, bibliographic sources, etc.), providing a starting point for obtaining quality information. Copyright © 2011 Elsevier España, S.L. All rights reserved.

  3. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
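
    The tool-chain integration the abstract emphasizes can be sketched through the library's Python interface (a minimal sketch assuming the freesasa Python module is installed and a local file protein.pdb exists; call names follow the project's documented API):

        import freesasa

        # Parse a structure from a PDB file (the path is a placeholder).
        structure = freesasa.Structure("protein.pdb")

        # Run the default calculation (Lee & Richards approximation).
        result = freesasa.calc(structure)
        print(f"Total SASA: {result.totalArea():.1f} A^2")

        # Break the total into polar/apolar classes.
        classes = freesasa.classifyResults(result, structure)
        for name, area in classes.items():
            print(f"{name}: {area:.1f} A^2")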

  4. Unveiling Deeply Embedded Sources by Near-Infrared Polarimetric Imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yongqiang; Ishii, Miki; Nagata, Tetsuya; Nakaya, Hidehiko; Sato, Shuji

    2000-10-01

    Near-infrared polarimetric images are presented for six molecular outflow sources: IRAS 20050+2720, IRAS 20126+4104, IRAS 20188+3928, S233, AFGL 5180, and AFGL 6366S. All the regions are found to exhibit reflection nebulae and to be associated with massive and clustered star formation. By inspecting polarimetric patterns in the nebulae, we have identified six deeply embedded sources (DESs) which illuminate circumstellar nebulosity but are not detectable at wavelengths shorter than 2 μm. While the DES in IRAS 20050 coincides with an infrared source in a previous, longer-wavelength observation and the one in IRAS 20126 with a hot molecular core, the nature of the other newly discovered DESs is not known. From a compilation of observations of DESs over a large wavelength range, we suspect that the DESs possess characteristics similar to hot molecular cores and are likely to be in the pre-ultracompact H II region phase of massive star formation.

  5. What you should know about land-cover data

    USGS Publications Warehouse

    Gallant, Alisa L.

    2009-01-01

    Wildlife biologists are using land-characteristics data sets for a variety of applications. Many kinds of landscape variables have been characterized and the resultant data sets or maps are readily accessible. Often, too little consideration is given to the accuracy or traits of these data sets, most likely because biologists do not know how such data are compiled and rendered, or the potential pitfalls that can be encountered when applying these data. To increase understanding of the nature of land-characteristics data sets, I introduce aspects of source information and data-handling methodology that include the following: ambiguity of land characteristics; temporal considerations and the dynamic nature of the landscape; type of source data versus landscape features of interest; data resolution, scale, and geographic extent; data entry and positional problems; rare landscape features; and interpreter variation. I also include guidance for determining the quality of land-characteristics data sets through metadata or published documentation, visual clues, and independent information. The quality or suitability of the data sets for wildlife applications may be improved with thematic or spatial generalization, avoidance of transitional areas on maps, and merging of multiple data sources. Knowledge of the underlying challenges in compiling such data sets will help wildlife biologists to better assess the strengths and limitations and determine how best to use these data.

  6. Ada Implementation Guide. Software Engineering With Ada. Volume 2

    DTIC Science & Technology

    1994-04-01

    Excerpts from the guide's 'Helpful Sources' appendix: ADA-BIB (10/15/91) - how to obtain the latest Ada Compiler Validation Capability (ACVC) validation test suite from the AJPO; AF-INT91 (8/12/91) - text of the Air Force's 1991 interpretation of the Congressional Ada mandate; CREASE (11/27/91) - how to obtain the AJPO's April 1988 CREASE Version 5.0.

  7. Deploying Crowd-Sourced Formal Verification Systems in a DoD Network

    DTIC Science & Technology

    2013-09-01

    In 2014, cyber attacks on critical infrastructure are expected to increase. DARPA plans to deploy CSFV systems on the Internet, possibly using cloud infrastructure (Dean, 2013). By using Amazon Elastic Compute Cloud (EC2) systems, DARPA will let ordinary users participate through standard access methods; those clients could be mobile phones, laptops, netbooks, tablet computers, or personal digital assistants (PDAs) (Smoot ...).

  8. Experimental criticality specifications. An annotated bibliography through 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paxton, H.C.

    1978-05-01

    The compilation of approximately 300 references gives sources of experimental criticality parameters of systems containing ²³⁵U, ²³³U, and ²³⁹Pu. The intent is to cover basic data for criticality safety applications. The references are arranged by subject.

  9. Minefields Associated with Mining Data from Peer-reviewed Literature

    EPA Science Inventory

    The USEPA’s ECOTOX database is the largest compilation of ecotoxicity study results, providing information on the adverse effects of single chemical stressors to ecologically relevant aquatic and terrestrial species. The primary source of data included in the ECOTOX database is t...

  10. Privacy and Security: A Bibliography.

    ERIC Educational Resources Information Center

    Computer and Business Equipment Manufacturers Association, Washington, DC.

    Compiled at random from many sources, this bibliography attempts to cite as many publications concerning privacy and security as are available. The entries are organized under seven headings: (1) systems security, technical security, clearance of personnel, (2) corporate physical security, (3) administrative security, (4) miscellaneous--privacy…

  11. Innovative finance : strategic research project.

    DOT National Transportation Integrated Search

    2013-08-01

    It's time to rethink how we fund transportation infrastructure because most transportation experts agree: there's a transportation funding and financing crisis looming. Projected revenues from current sources of transportation funding will am...

  12. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data-analytics expertise, now and as needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data-analytics expertise needed to perform Earth science data analytics; and seek graduate data-analytics (Data Science) student internship opportunities.

  13. Stream restoration and sanitary infrastructure alter sources and fluxes of water, carbon, and nutrients in urban watersheds

    NASA Astrophysics Data System (ADS)

    Pennino, M. J.; Kaushal, S. S.; Mayer, P. M.; Utz, R. M.; Cooper, C. A.

    2015-12-01

    An improved understanding of sources and timing of water and nutrient fluxes associated with urban stream restoration is critical for guiding effective watershed management. We investigated how sources, fluxes, and flowpaths of water, carbon (C), nitrogen (N), and phosphorus (P) shift in response to differences in stream restoration and sanitary infrastructure. We compared a restored stream with 3 unrestored streams draining urban development and stormwater management over a 3-year period. We found significantly decreased peak discharge in response to precipitation events following stream restoration. Similarly, the restored stream showed significantly lower monthly peak runoff (9.4 ± 1.0 mm d-1) compared with two urban unrestored streams (ranging from 44.9 ± 4.5 to 55.4 ± 5.8 mm d-1) draining higher impervious surface cover. Peak runoff in the restored stream was more similar to a less developed stream draining extensive stormwater management (13.2 ± 1.9 mm d-1). Interestingly, the restored stream exported most carbon, nitrogen, and phosphorus loads at relatively lower streamflow than the 2 more urban streams, which exported most of their loads at higher and less frequent streamflow. Annual exports of total carbon (6.6 ± 0.5 kg ha-1 yr-1), total nitrogen (4.5 ± 0.3 kg ha-1 yr-1), and total phosphorus (161 ± 15 g ha-1 yr-1) were significantly lower in the restored stream than in both urban unrestored streams (p < 0.05) and similar to the stream draining stormwater management. Although stream restoration appeared to influence hydrology to some degree, nitrate isotope data suggested that 55 ± 1 % of the nitrate in the restored stream was derived from leaky sanitary sewers (during baseflow), similar to the unrestored streams. Longitudinal synoptic surveys of water and nitrate isotopes along all 4 watersheds suggested the importance of urban groundwater contamination from leaky piped infrastructure. Urban groundwater contamination was also suggested by additional tracer measurements, including fluoride (added to drinking water) and iodide (contained in dietary salt). Our results suggest that integrating stream restoration with restoration of aging sanitary infrastructure can be critical to more effectively minimize watershed nutrient export. Given that stream restoration and sanitary pipe repairs both involve extensive channel manipulation, they can be considered simultaneously in management strategies. In addition, groundwater can be a major source of nutrient fluxes in urban watersheds, one that has received less consideration than upland sources and storm drains. Groundwater sources, fluxes, and flowpaths should also be targeted in efforts to improve stream restoration strategies and to prioritize hydrologic "hot spots" along watersheds where stream restoration is most likely to succeed.

  14. Local SPTHA through tsunami inundation simulations: a test case for two coastal critical infrastructures in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.

    2016-12-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability of different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to seismic sources. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for the inundation calculations. Here we use a method based on an event tree for the treatment of the aleatory variability of the seismic source; a cluster analysis on the offshore results to define the important sources; and an ensemble modeling approach for the treatment of epistemic uncertainty. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, yielding a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.

  15. GNU Data Language (GDL) - a free and open-source implementation of IDL

    NASA Astrophysics Data System (ADS)

    Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter

    2010-05-01

    GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens, the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge, where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically typed, vectorized, and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS, and data input/output. GDL supports several data formats, such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, and DICOM. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last allowing output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as the GNU Scientific Library, PLplot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL, are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key accomplishments, and the weaknesses - areas where contributions and users' feedback are welcome. While still in the beta stage of development, GDL has proved to be a useful tool for classroom work on data analysis. Its usage for teaching meteorological-data processing at the University of Warsaw will serve as an example.
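
    The Python bridge mentioned above can be sketched from the Python side. This is a speculative sketch: it assumes GDL was compiled as a Python module, and the GDL.function/GDL.pro entry points are taken from the project's Python-bridge notes and may differ by version; treat the names as assumptions.

        # Speculative sketch: assumes GDL was built as a Python module; the
        # GDL.function / GDL.pro entry points follow the project's
        # Python-bridge notes and may differ by version.
        import GDL

        # Call a GDL/IDL library function from Python; FINDGEN(100)
        # returns a float array 0..99, as in IDL.
        x = GDL.function("findgen", (100,))

        # Call a GDL procedure, e.g. plot the array with GDL's graphics.
        GDL.pro("plot", (x,))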

  16. Mapping habitat for multiple species in the Desert Southwest

    USGS Publications Warehouse

    Inman, Richard D.; Nussear, Kenneth E.; Esque, Todd C.; Vandergast, Amy G.; Hathaway, Stacie A.; Wood, Dustin A.; Barr, Kelly R.; Fisher, Robert N.

    2014-01-01

    Many utility-scale renewable energy projects are currently proposed across the Mojave Ecoregion. Agencies that manage biological resources throughout this region need to understand the potential impacts of these renewable energy projects and their associated infrastructure (for example, transmission corridors, substations, access roads, etc.) on species movement, genetic exchange among populations, and species' abilities to adapt to changing environmental conditions. Understanding these factors will help managers select appropriate project sites and possibly mitigate for anticipated effects of management activities. We used species distribution models to map habitat for 15 species across the Mojave Ecoregion to aid regional land-use management planning. Models were developed using a common 1 × 1 kilometer resolution with maximum entropy and generalized additive models. Occurrence data were compiled from multiple sources, including VertNet (http://vertnet.org/), HerpNET (http://www.herpnet.org), and MaNIS (http://manisnet.org), as well as from internal U.S. Geological Survey databases and other biologists. Background data included 20 environmental covariates representing terrain, vegetation, and climate covariates. This report summarizes these environmental covariates and species distribution models used to predict habitat for the 15 species across the Mojave Ecoregion.

  17. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    USGS Publications Warehouse

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality include a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCL), performing trends tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
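
    The bulk analyses described (counting MCL exceedances per site, trend tests, correlations) can be sketched in a few lines; a toy illustration with pandas and SciPy, where the column names and records are placeholders and the 10 mg/L threshold is the EPA MCL for nitrate (as N):

        import pandas as pd
        from scipy.stats import kendalltau

        # Toy records, one row per sample (column names are placeholders).
        df = pd.DataFrame({
            "site":    ["A", "A", "A", "B", "B", "B"],
            "year":    [2000, 2005, 2010, 2000, 2005, 2010],
            "nitrate": [3.1, 6.2, 11.4, 1.0, 0.9, 1.2],  # mg/L
        })
        MCL_NITRATE = 10.0  # mg/L, the EPA MCL for nitrate (as N)

        # Count samples above the MCL at each site.
        exceedances = (df.assign(above=df["nitrate"] > MCL_NITRATE)
                         .groupby("site")["above"].sum())
        print(exceedances)

        # Simple monotonic-trend test per site (Kendall's tau vs. time).
        for site, grp in df.groupby("site"):
            tau, p = kendalltau(grp["year"], grp["nitrate"])
            print(f"{site}: tau={tau:.2f}, p={p:.2f}")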

  18. Spatial aspects of the research on tourist infrastructure with the use of the cartographic method on the basis of Roztoczański National Park

    NASA Astrophysics Data System (ADS)

    Kałamucki, Krzysztof; Kamińska, Anna; Buk, Dorota

    2012-01-01

    The aim of the research was to demonstrate changes in tourist trails and in the distribution of tourist-infrastructure sites in the area of Roztoczański National Park and its vicinity. Another, equally important, aim was to test the usefulness of the cartographic method, both for researching tourist infrastructure and for presenting the results. The research covered the region of Roztoczański National Park. The following elements of tourist infrastructure were selected for the analysis: linear elements (walking trails, education paths) and point elements (accommodation, eating places, and accompanying facilities). In order to recreate the state of the infrastructure during the last 50 years, it was necessary to analyse the following source material: tourist maps issued as independent publications, maps issued as supplements to tour guides, and aerial photography. Information from text sources was also used, e.g. from tourist guides, leaflets, and monographs. The temporal framework was defined as the 50 years from the 1960s until 2009, divided into five 10-year periods. In order to present the state of tourist infrastructure and its spatial and qualitative changes, six maps were produced (maps of states and of types of changes). The spatial analyses and interpretations of these maps made it possible to capture both qualitative and quantitative changes. The changes in the trails were not regular: some parts of trails did not change for 40 years, while others were constructed during the last decade. Presently, the area is densely covered with tourist trails and education paths. Measurements of the lengths of tourist trails and their parts with regard to land cover and road category made it possible to determine the character of the trails and the scope of changes. The analyses proved the usefulness of cartographic methods in researching tourist infrastructure in its spatial and quantitative aspects.

  19. A Complete Public Archive for the Einstein Imaging Proportional Counter

    NASA Technical Reports Server (NTRS)

    Helfand, David J.

    1996-01-01

    Consistent with our proposal to the Astrophysics Data Program in 1992, we have completed the design, construction, documentation, and distribution of a flexible and complete archive of the data collected by the Einstein Imaging Proportional Counter. Along with software and data delivered to the High Energy Astrophysics Science Archive Research Center at Goddard Space Flight Center, we have compiled and, where appropriate, published catalogs of point sources, soft sources, hard sources, extended sources, and transient flares detected in the database, along with extensive analyses of the instrument's backgrounds and other anomalies. We include in this document a brief summary of the archive's functionality, a description of the scientific catalogs and other results, a bibliography of publications supported in whole or in part under this contract, and a list of personnel whose pre- and post-doctoral education included participation in this project.

  20. A large-scale solar dynamics observatory image dataset for computer vision applications.

    PubMed

    Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and biggest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found on high-resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the amount of time spent performing data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate wider adoption of and interest in the data from both the computer vision and solar physics communities.

  1. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers, contains readers and writers for the mzML data format. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution, and further development are strongly encouraged.
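
    As a sketch of where the command-line tools sit in a pipeline, a conversion step can be scripted around ProteoWizard's msconvert tool (a minimal sketch; the input filename and output directory are placeholders, and msconvert must be installed and on the PATH):

        import subprocess
        from pathlib import Path

        raw_file = Path("sample.raw")   # placeholder vendor-format input
        out_dir = Path("converted")
        out_dir.mkdir(exist_ok=True)

        # Convert a vendor raw file to mzML with ProteoWizard's msconvert.
        subprocess.run(
            ["msconvert", str(raw_file), "--mzML", "-o", str(out_dir)],
            check=True,
        )
        print(f"Wrote mzML output to {out_dir}/")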

  2. Mapping of forested wetland: use of Seasat radar images to complement conventional sources ( USA).

    USGS Publications Warehouse

    Place, J.L.

    1985-01-01

    Distinguishing forested wetland from dry forest using aerial photographs is handicapped because photographs often do not reveal the presence of water below tree canopies. Radar images obtained by the Seasat satellite reveal forested wetland as highly reflective patterns on the coastal plain between Maryland and Florida. Seasat radar images may complement aerial photographs for compiling maps of wetland. A test with experienced photointerpreters revealed that interpretation accuracy was significantly higher when using Seasat radar images than when using only conventional sources.-Author

  3. Monitor-based evaluation of pollutant load from urban stormwater runoff in Beijing.

    PubMed

    Liu, Y; Che, W; Li, J

    2005-01-01

    As a major pollutant source for urban receiving waters, non-point source pollution from urban runoff needs to be well studied and effectively controlled. Based on monitoring data from urban runoff pollutant sources, this article describes a systematic estimation of total pollutant loads from the urban areas of Beijing. A numerical model was developed to quantify the main pollutant loads of urban runoff in Beijing. The method includes a sub-procedure accounting for the flush process, which influences both the quantity and quality of stormwater runoff. A statistics-based method was applied to compute the annual pollutant load carried by the runoff. The proportions of pollutants from point and non-point sources were compared. This provides a scientific basis for proper assessment of urban stormwater pollution inputs to receiving waters, improvement of infrastructure performance, implementation of urban stormwater management, and utilization of stormwater.
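
    The statistics-based annual-load computation described above ultimately reduces to summing event runoff volumes multiplied by event mean concentrations; a toy sketch of that bookkeeping (all events and concentrations below are invented for illustration):

        # Each event: (runoff volume in m^3, event mean concentration in mg/L).
        # All values are invented for illustration only.
        events = [
            (12_000, 45.0),  # storm 1
            (8_500, 60.0),   # storm 2
            (20_000, 30.0),  # storm 3
        ]

        # volume (m^3) * EMC (mg/L) gives grams, since 1 m^3 = 1000 L.
        annual_load_kg = sum(vol * emc for vol, emc in events) / 1000.0
        print(f"Annual pollutant load: {annual_load_kg:.0f} kg")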

  4. Alaska IPASS database preparation manual.

    Treesearch

    P. McHugh; D. Olson; C. Schallau

    1989-01-01

    Describes the data, their sources, and the calibration procedures used in compiling a database for the Alaska IPASS (interactive policy analysis simulation system) model. Although this manual is for Alaska, it provides generic instructions for analysts preparing databases for other geographical areas.

  5. On search guide phrase compilation for recommending home medical products.

    PubMed

    Luo, Gang

    2010-01-01

    To help people find desired home medical products (HMPs), we developed an intelligent personal health record (iPHR) system that can automatically recommend HMPs based on users' health issues. Using nursing knowledge, we pre-compile a set of "search guide" phrases that provides semantic translation from words describing health issues to their underlying medical meanings. Then iPHR automatically generates queries from those phrases and uses them and a search engine to retrieve HMPs. To avoid missing relevant HMPs during retrieval, the compiled search guide phrases need to be comprehensive. Such compilation is a challenging task because nursing knowledge updates frequently and contains numerous details scattered in many sources. This paper presents a semi-automatic tool facilitating such compilation. Our idea is to formulate the phrase compilation task as a multi-label classification problem. For each newly obtained search guide phrase, we first use nursing knowledge and information retrieval techniques to identify a small set of potentially relevant classes with corresponding hints. Then a nurse makes the final decision on assigning this phrase to proper classes based on those hints. We demonstrate the effectiveness of our techniques by compiling search guide phrases from an occupational therapy textbook.
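
    A sketch of the multi-label classification formulation described above, using scikit-learn as a stand-in (the phrases, class labels, and model choice are illustrative assumptions, not the authors' implementation):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.preprocessing import MultiLabelBinarizer

        # Toy search-guide phrases and their (multi-)label class assignments.
        phrases = ["difficulty walking", "shortness of breath", "trouble gripping objects"]
        labels = [["mobility"], ["respiratory"], ["mobility", "dexterity"]]

        mlb = MultiLabelBinarizer()
        Y = mlb.fit_transform(labels)

        vec = TfidfVectorizer()
        X = vec.fit_transform(phrases)

        # One binary classifier per class; each new phrase gets a ranked set
        # of candidate classes for a nurse to confirm, as in the workflow above.
        clf = OneVsRestClassifier(LogisticRegression()).fit(X, Y)
        probs = clf.predict_proba(vec.transform(["weak hand grip"]))
        for cls, p in sorted(zip(mlb.classes_, probs[0]), key=lambda t: -t[1]):
            print(f"{cls}: {p:.2f}")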

  6. Transport Traffic Analysis for Abusive Infrastructure Characterization

    DTIC Science & Technology

    2012-12-14

    Abusive traffic abounds on the Internet, in the form of email, malware, vulnerability scanners, worms, denial-of-service, drive-by-downloads, scam ... The insight is two-fold. First, attackers have a basic requirement to source large amounts of data, be it denial-of-service, scam-hosting, spam, or other ... the network core. This paper explores the power of transport-layer traffic analysis to detect and characterize scam-hosting infrastructure, including ...

  7. Comparative LCA of decentralized wastewater treatment alternatives for non-potable urban reuse.

    PubMed

    Opher, Tamar; Friedler, Eran

    2016-11-01

    Municipal wastewater (WW) effluent represents a reliable and significant source of reclaimed water, much needed nowadays. Water reclamation and reuse have become attractive options for conserving and extending available water sources. The decentralized approach to domestic WW treatment benefits from the advantages of source separation, which makes simple small-scale systems and on-site reuse available; such systems can be constructed on a short time schedule and occasionally upgraded with new technological developments. In this study we perform a Life Cycle Assessment to compare the environmental impacts of four alternatives for a hypothetical city's water-wastewater service system. The baseline alternative is the most common, centralized approach to WW treatment, in which WW is conveyed to and treated in a large wastewater treatment plant (WWTP) and is then discharged to a stream. The three alternatives represent different scales of distribution of the WW treatment phase, along with urban irrigation and domestic non-potable water reuse (toilet flushing). The first alternative includes centralized treatment at a WWTP, with part of the reclaimed WW (RWW) supplied back to the urban consumers. The second and third alternatives implement decentralized greywater (GW) treatment with local reuse, one at cluster level (320 households) and one at building level (40 households). Life cycle impact assessment results show a consistent disadvantage of the prevailing centralized approach under local conditions in Israel, where seawater desalination is the marginal source of water supply. The alternative of source separation and GW reuse at cluster level appears to be the most preferable, though its environmental performance is only slightly better than GW reuse at building level. Centralized WW treatment with urban reuse of WWTP effluents is not advantageous over decentralized treatment of GW because the supply of RWW back to consumers is very costly in materials and energy. Electricity is a major driver of the impacts in most categories, pertaining mostly to potable water production and supply. Infrastructure was found to have a notable effect on metal depletion, human toxicity, and freshwater and marine ecotoxicity. Sensitivity to major model parameters was analyzed. A shift to a larger share of renewable energy sources in the electricity mix results in a dramatic improvement in most impact categories. Switching to a mix of water sources, rather than the marginal source, leads to a significant reduction in most impacts. It is concluded that under the conditions tested, a decentralized approach to urban wastewater management is environmentally preferable to the common centralized system. It is worth exploring such options under different conditions as well, in cases in which new urban infrastructure is planned or replacement of old infrastructure is required. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Mesoscale carbon sequestration site screening and CCS infrastructure analysis.

    PubMed

    Keating, Gordon N; Middleton, Richard S; Stauffer, Philip H; Viswanathan, Hari S; Letellier, Bruce C; Pasqualini, Donatella; Pawar, Rajesh J; Wolfsberg, Andrew V

    2011-01-01

    We explore carbon capture and sequestration (CCS) at the meso-scale, a level of study between regional carbon accounting and highly detailed reservoir models for individual sites. We develop an approach to CO2 sequestration site screening for industries or energy development policies that involves identification of an appropriate sequestration basin, analysis of geologic formations, definition of surface sites, design of infrastructure, and analysis of CO2 transport and storage costs. Our case study involves carbon management for potential oil shale development in the Piceance-Uinta Basin, CO and UT. This study uses new capabilities of the CO2-PENS model for site screening, including reservoir capacity, injectivity, and cost calculations for simple reservoirs at multiple sites. We couple this with a model of optimized source-sink-network infrastructure (SimCCS) to design pipeline networks and minimize CCS cost for a given industry or region. The CLEARuff dynamical assessment model calculates the CO2 source term for various oil production levels. Nine sites in a 13,300 km2 area have the capacity to store 6.5 GtCO2, corresponding to shale-oil production of 1.3 Mbbl/day for 50 years (about 1/4 of U.S. crude oil production). Our results highlight the complex, nonlinear relationship between the spatial deployment of CCS infrastructure and the oil-shale production rate.
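
    At its core, the source-sink pipeline-network design problem assigned to SimCCS above is a cost-minimizing network flow; a toy sketch with networkx, where all nodes, capacities, and unit costs are invented for illustration and do not reflect SimCCS's actual formulation:

        import networkx as nx

        # Toy CO2 source-sink network: negative demand = supply (MtCO2/yr).
        G = nx.DiGraph()
        G.add_node("plant_A", demand=-3)   # CO2 source
        G.add_node("plant_B", demand=-2)   # CO2 source
        G.add_node("junction", demand=0)
        G.add_node("reservoir", demand=5)  # storage site absorbs all CO2

        # Edges: candidate pipeline legs with capacity (MtCO2/yr) and unit cost.
        G.add_edge("plant_A", "junction", capacity=4, weight=7)
        G.add_edge("plant_B", "junction", capacity=3, weight=5)
        G.add_edge("junction", "reservoir", capacity=6, weight=2)
        G.add_edge("plant_A", "reservoir", capacity=2, weight=12)  # direct route

        cost, flows = nx.network_simplex(G)
        print(f"Minimum transport cost: {cost}")
        for u, targets in flows.items():
            for v, f in targets.items():
                if f:
                    print(f"{u} -> {v}: {f} MtCO2/yr")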

  9. Lessons learned from Khartoum flash flood impacts: An integrated assessment.

    PubMed

    Mahmood, Mohamad Ibrahim; Elagib, Nadir Ahmed; Horn, Finlay; Saad, Suhair A G

    2017-12-01

    This study aims to compile key lessons for decision makers and urban planners in rapidly urbanizing cities regarding the identification of the chief natural and human causal factors behind the increased level of flash flood risk. To achieve this, the impacts of flash flood events of 2013 and 2014 in the capital of Sudan, Khartoum, were assessed using seven integrated approaches: rainfall data analysis, document analysis of affected people and houses, observational fieldwork in the worst flood-affected areas, people's perception of causes and mitigation measures through household interviews, reported drinking water quality, reported water-related diseases, and social risk assessment. Several lessons emerged. Urban planners must recognize the devastating risks of building within natural pathways of ephemeral watercourses. They must also ensure effective drainage infrastructure and physio-geographical investigations prior to developing urban areas. The existing urban drainage systems become ineffective due to blockage by urban waste. Unauthorized drainage and embankment structures built by locals often cause greater flood problems than normal. Urban runoff is especially problematic for residential areas built within low-lying areas having naturally low infiltration capacity, as surface water can rapidly collect within hollows and depressions, or beside elevated roads that preclude the free flow of floodwater. Weak housing and infrastructure quality are especially vulnerable to flash flooding and even to rainfall directly. Establishment of services infrastructure is imperative for flash flood disaster risk reduction. Water supply should be drawn from lower aquifers to avoid contaminated groundwater. Regular monitoring of water quality and archiving of its indicators help identify water-related diseases and sources of water contamination in the event of environmental disasters such as floods. Although locals' risk perception is an important input to decision making and planning, it must be well enough developed to support proper awareness. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Odors and Air Pollution: A Bibliography with Abstracts.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Research Triangle Park, NC. Office of Air Programs.

    The annotated bibliography presents a compilation of abstracts which deal with odors as they relate to air pollution. The abstracts are arranged within the following categories: Emission sources; Control methods; Measurement methods; Air quality measurements; Atmospheric interaction; Basic science and technology; Effects-human health;…

  11. Religious Education Forum: Legitimizing Your Value List.

    ERIC Educational Resources Information Center

    McBride, Alfred

    1979-01-01

    Addressing the problem that educators have in compiling a legitimate list of values to teach, the author examines the Bible as a source of value legitimacy and details the values expressed in the Hebrew and Christian covenants, the Ten Commandments, and the six beatitudes. (SJL)

  12. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    USDA-ARS?s Scientific Manuscript database

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  13. PUBLIC SOURCES OF MUTAGENICITY AND CARCINOGENICITY DATA: USE IN STRUCTURE-ACTIVITY RELATIONSHIP MODELS

    EPA Science Inventory

    No Abstract - first paragraph of INTRODUCTION

    Publicly supported compilations of mutagenicity and carcinogenicity data are available for a significant number and variety of environmental and industrial chemicals and, to a lesser extent, pharmaceutical chemicals. T...

  14. Multisource Data Integration in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1991-01-01

    Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled, with the full text of each paper included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.

  15. The Paucity Problem: Where Have All the Space Reactor Experiments Gone?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Marshall, Margaret A.

    2016-10-01

    The Handbooks of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) together contain a plethora of documented and evaluated experiments essential in the validation of nuclear data, neutronics codes, and modeling of various nuclear systems. Unfortunately, only a minute selection of handbook data (twelve evaluations) are of actual experimental facilities and mockups designed specifically for space nuclear research. There is a paucity problem, such that the multitude of space nuclear experimental activities performed in the past several decades have yet to be recovered and made available in such detail that the international community could benefit from these valuable historical research efforts. Those experiments represent extensive investments in infrastructure, expertise, and cost, as well as constitute significantly valuable resources of data supporting past, present, and future research activities. The ICSBEP and IRPhEP were established to identify and verify comprehensive sets of benchmark data; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data. See full abstract in attached document.

  16. Integrating Emerging Data Sources Into Operational Practice

    DOT National Transportation Integrated Search

    2018-05-15

    Agencies have the potential to collect, use, and share data from connected and automated vehicles (CAV), connected travelers, and connected infrastructure elements to improve the performance of their traffic management systems and traffic management ...

  17. 'Anyone can edit', not everyone does: Wikipedia's infrastructure and the gender gap.

    PubMed

    Ford, Heather; Wajcman, Judy

    2017-08-01

    Feminist STS has long established that science's provenance as a male domain continues to define what counts as knowledge and expertise. Wikipedia, arguably one of the most powerful sources of information today, was initially lauded as providing the opportunity to rebuild knowledge institutions by providing greater representation of multiple groups. However, less than ten percent of Wikipedia editors are women. At one level, this imbalance in contributions and therefore content is yet another case of the masculine culture of technoscience. This is an important argument and, in this article, we examine the empirical research that highlights these issues. Our main objective, however, is to extend current accounts by demonstrating that Wikipedia's infrastructure introduces new and less visible sources of gender disparity. In sum, our aim here is to present a consolidated analysis of the gendering of Wikipedia.

  18. Watershed Characteristics and Land Management in the Nonpoint-Source Evaluation Monitoring Watersheds in Wisconsin

    USGS Publications Warehouse

    Rappold, K.F.; Wierl, J.A.; Amerson, F.U.

    1997-01-01

    In 1992, the Wisconsin Department of Natural Resources, in cooperation with the U.S. Geological Survey, began a land-use inventory to identify sources of contaminants and track the land-management changes for eight evaluation monitoring watersheds in Wisconsin. An important component of the land-use inventory has been developing descriptions and preliminary assessments for the eight watersheds. These descriptions establish a baseline for future data analysis. The watershed descriptions include sections on location, reference watersheds, climate, land use, soils and topography, and surface-water resources. The land-management descriptions include sections on objectives, sources of nonpoint contamination and goals of contaminant reduction, and implementation of best-management practices. This information was compiled primarily from the nonpoint-source control plans, county soil surveys, farm conservation plans, Federal and State agency data reports, and data collected through the land-use inventory.

  19. VizieR Online Data Catalog: NIR proper motion catalogue from UKIDSS-LAS (Smith+, 2014)

    NASA Astrophysics Data System (ADS)

    Smith, L.; Lucas, P. W.; Burningham, B.; Jones, H. R. A.; Smart, R. L.; Andrei, A. H.; Catalan, S.; Pinfield, D. J.

    2015-07-01

    We constructed two-epoch catalogues for each pointing by matching sources within the pairs of multiframes using the Starlink Tables Infrastructure Library Tool Set (STILTS; Taylor 2006, ASP Conf. Ser. 351, 666). We required pairs of sources to be uniquely paired to their closest match within 6 arcsec, and we required the J-band magnitudes for the two epochs to agree within 0.5 mag, to minimize mismatches. (1 data file).
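
    The pairing criterion is straightforward to sketch: nearest-neighbour matching within 6 arcsec plus a 0.5 mag J-band consistency cut. An illustration using astropy in place of STILTS, with invented coordinates; the uniqueness requirement is omitted for brevity:

        import numpy as np
        from astropy.coordinates import SkyCoord
        import astropy.units as u

        # Invented two-epoch source lists: positions (deg) and J magnitudes.
        ra1, dec1, j1 = np.array([10.0, 10.01]), np.array([20.0, 20.02]), np.array([15.2, 16.8])
        ra2, dec2, j2 = np.array([10.0001, 10.0102]), np.array([20.0001, 20.0201]), np.array([15.3, 18.0])

        epoch1 = SkyCoord(ra=ra1 * u.deg, dec=dec1 * u.deg)
        epoch2 = SkyCoord(ra=ra2 * u.deg, dec=dec2 * u.deg)

        # Nearest-neighbour match of epoch-1 sources against epoch 2.
        idx, sep, _ = epoch1.match_to_catalog_sky(epoch2)

        # Keep pairs separated by < 6 arcsec with |dJ| < 0.5 mag, as in the catalogue.
        good = (sep < 6 * u.arcsec) & (np.abs(j1 - j2[idx]) < 0.5)
        for i in np.where(good)[0]:
            print(f"source {i} -> match {idx[i]}, sep = {sep[i].to(u.arcsec):.2f}")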

  20. Design, implementation, and initial results from a water-quality monitoring network for Atlanta, Georgia, USA

    USGS Publications Warehouse

    Horowitz, A.J.; Elrick, K.A.; Smith, J.J.

    2005-01-01

    In cooperation with the City of Atlanta, Georgia, the US Geological Survey has designed and implemented a water-quantity and quality monitoring network that measures a variety of biological and chemical constituents in water and suspended sediment. The network consists of 20 long-term monitoring sites and is intended to assess water-quality trends in response to planned infrastructural improvements. Initial results from the network indicate that nonpoint-source contributions may be more significant than point-source contributions for selected sediment associated trace elements and nutrients. There also are indications of short-term discontinuous point-source contributions of these same constituents during baseflow.

  1. Interference with a quantum dot single-photon source and a laser at telecom wavelength

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felle, M.; Huwer, J., E-mail: jan.huwer@crl.toshiba.co.uk

    The interference of photons emitted by dissimilar sources is an essential requirement for a wide range of photonic quantum information applications. Many of these applications are in quantum communications and need to operate at standard telecommunication wavelengths to minimize the impact of photon losses and be compatible with existing infrastructure. Here, we demonstrate for the first time the quantum interference of telecom-wavelength photons from an InAs/GaAs quantum dot single-photon source and a laser; an important step towards such applications. The results are in good agreement with a theoretical model, indicating a high degree of indistinguishability for the interfering photons.

  2. Optimization of urban water supply portfolios combining infrastructure capacity expansion and water use decisions

    NASA Astrophysics Data System (ADS)

    Medellin-Azuara, J.; Fraga, C. C. S.; Marques, G.; Mendes, C. A.

    2015-12-01

    The expansion and operation of urban water supply systems under rapidly growing demands, hydrologic uncertainty, and scarce water supplies require a strategic combination of various supply sources for added reliability, reduced costs, and improved operational flexibility. The design and operation of such a portfolio of water supply sources involve decisions on what and when to expand, and how much to use of each available source, accounting for interest rates, economies of scale, and hydrologic variability. The present research provides a framework and an integrated methodology that optimizes the expansion of various water supply alternatives using dynamic programming, combining both short- and long-term optimization of water use with simulation of water allocation. A case study in Bahia Do Rio Dos Sinos in Southern Brazil is presented. The framework couples a quadratic programming optimization model in GAMS with WEAP, a rainfall-runoff simulation model that hosts the water supply infrastructure features and hydrologic conditions. Results allow (a) identification of trade-offs between cost and reliability of different expansion paths and water use decisions and (b) evaluation of potential gains from reducing water system losses as a portfolio component. The latter is critical in several developing countries where water supply system losses are high and often neglected in favor of further system expansion. Results also highlight the potential of various water supply alternatives, including conservation, groundwater, and infrastructural enhancements over time. The framework proves useful for planning and is transferable to similarly urbanized systems.
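
    A toy sketch of the expansion-timing logic: a dynamic program that chooses when to build discrete supply increments so that growing demand is met at minimum discounted capital cost (all capacities, costs, and the discount rate are invented; the study's actual model couples GAMS and WEAP):

        from functools import lru_cache

        # Invented data: planning periods, demand growth, and two expansion options.
        DEMAND = [10, 12, 15, 19]              # Mm^3/yr required in each period
        BASE_CAPACITY = 10                     # existing system capacity
        OPTIONS = {"wellfield": (4, 30.0),     # (added capacity, capital cost)
                   "reservoir": (8, 50.0)}
        RATE = 0.05                            # annual discount rate, 5-yr periods

        @lru_cache(maxsize=None)
        def best_cost(t, built):
            """Minimum discounted capital cost to stay feasible from period t on."""
            if t == len(DEMAND):
                return 0.0
            capacity = BASE_CAPACITY + sum(OPTIONS[b][0] for b in built)
            discount = 1.0 / (1.0 + RATE) ** (5 * t)
            choices = []
            if capacity >= DEMAND[t]:          # feasible without building now
                choices.append(best_cost(t + 1, built))
            for name, (cap, cost) in OPTIONS.items():
                if name not in built and capacity + cap >= DEMAND[t]:
                    choices.append(cost * discount + best_cost(t + 1, built | {name}))
            return min(choices) if choices else float("inf")

        print(f"Minimum discounted expansion cost: {best_cost(0, frozenset()):.1f}")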

  3. Highly efficient entanglement swapping and teleportation at telecom wavelength

    PubMed Central

    Jin, Rui-Bo; Takeoka, Masahiro; Takagi, Utako; Shimizu, Ryosuke; Sasaki, Masahide

    2015-01-01

    Entanglement swapping at telecom wavelengths is at the heart of quantum networking in optical fiber infrastructures. Although entanglement swapping has been demonstrated experimentally so far using various types of entangled photon sources both in near-infrared and telecom wavelength regions, the rate of swapping operation has been too low to be applied to practical quantum protocols, due to limited efficiency of entangled photon sources and photon detectors. Here we demonstrate drastic improvement of the efficiency at telecom wavelength by using two ultra-bright entangled photon sources and four highly efficient superconducting nanowire single photon detectors. We have attained a four-fold coincidence count rate of 108 counts per second, which is three orders higher than the previous experiments at telecom wavelengths. A raw (net) visibility in a Hong-Ou-Mandel interference between the two independent entangled sources was 73.3 ± 1.0% (85.1 ± 0.8%). We performed the teleportation and entanglement swapping, and obtained a fidelity of 76.3% in the swapping test. Our results on the coincidence count rates are comparable with the ones ever recorded in teleportation/swapping and multi-photon entanglement generation experiments at around 800 nm wavelengths. Our setup opens the way to practical implementation of device-independent quantum key distribution and its distance extension by the entanglement swapping as well as multi-photon entangled state generation in telecom band infrastructures with both space and fiber links. PMID:25791212

  4. Highly efficient entanglement swapping and teleportation at telecom wavelength.

    PubMed

    Jin, Rui-Bo; Takeoka, Masahiro; Takagi, Utako; Shimizu, Ryosuke; Sasaki, Masahide

    2015-03-20

    Entanglement swapping at telecom wavelengths is at the heart of quantum networking in optical fiber infrastructures. Although entanglement swapping has been demonstrated experimentally so far using various types of entangled photon sources both in near-infrared and telecom wavelength regions, the rate of swapping operation has been too low to be applied to practical quantum protocols, due to limited efficiency of entangled photon sources and photon detectors. Here we demonstrate drastic improvement of the efficiency at telecom wavelength by using two ultra-bright entangled photon sources and four highly efficient superconducting nanowire single photon detectors. We have attained a four-fold coincidence count rate of 108 counts per second, which is three orders higher than the previous experiments at telecom wavelengths. A raw (net) visibility in a Hong-Ou-Mandel interference between the two independent entangled sources was 73.3 ± 1.0% (85.1 ± 0.8%). We performed the teleportation and entanglement swapping, and obtained a fidelity of 76.3% in the swapping test. Our results on the coincidence count rates are comparable with the ones ever recorded in teleportation/swapping and multi-photon entanglement generation experiments at around 800 nm wavelengths. Our setup opens the way to practical implementation of device-independent quantum key distribution and its distance extension by the entanglement swapping as well as multi-photon entangled state generation in telecom band infrastructures with both space and fiber links.

  5. Thermal Expansion Properties of Aerospace Materials

    NASA Technical Reports Server (NTRS)

    Green, E. F.

    1969-01-01

    Thermal expansion properties of materials used in aerospace systems are compiled into a single handbook. The data, derived from experimental measurements supplemented by information from literature sources, are presented in charts and tables arranged in two sections, covering cryogenic and elevated temperatures.

  6. Quick Information Sheets.

    ERIC Educational Resources Information Center

    Wisconsin Univ., Madison. Trace Center.

    This compilation of "Trace Quick Sheets" provides descriptions, prices, and ordering information for products and services that assist with communication, control, and computer access for disabled individuals. Product descriptions or product sources are included for: adaptive toys and toy modifications; head pointers, light pointers, and…

  7. RADIOLOGICAL SEALED SOURCE LIBRARY: A NUCLEAR FORENSICS TOOL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canaday, Jodi; Chamberlain, David; Finck, Martha

    If a terrorist were to obtain and possibly detonate a device that contained radiological material, radiological forensic analysis of the material and source capsule could provide law enforcement with valuable clues about the origin of the radiological material; this information could then provide further leads on where the material and sealed source were obtained and the point at which control was lost. This information could potentially be utilized for attribution and prosecution. Analyses of nuclear forensic signatures for radiological materials are generally understood to include isotopic ratios, trace element concentrations, the time since irradiation or purification, and morphology. Radiological forensic signatures for sealed sources provide additional information that leverages the physical design and chemical composition of the source capsule and containers, and physical markings indicative of an owner or manufacturer. Argonne National Laboratory (Argonne), in collaboration with Idaho National Laboratory (INL), has been working since 2003 to understand signatures that could be used to identify specific source manufacturers. These signatures include the materials from which the capsule is constructed, dimensions, weld details, elemental composition, and isotopic abundances of the radioactive material. These signatures have been compiled in a library known as the Argonne/INL Radiological Sealed Source Library. Data collected for the library have included open-source information from vendor catalogs and web pages; discussions with source manufacturers and tours of production facilities (both protected through non-disclosure agreements); technical publications; and government registries such as the U.S. Nuclear Regulatory Commission’s Sealed Source and Device Registry.

  8. An Open Source Model for Open Access Journal Publication

    PubMed Central

    Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry

    2005-01-01

    We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML-based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased, independent open access publication. PMID:16779183

  9. Cyberinfrastructure for Airborne Sensor Webs

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.

    2009-01-01

    Since 2004 the NASA Airborne Science Program has been prototyping and using infrastructure that enables researchers to interact with each other and with their instruments via network communications. This infrastructure uses satellite links and an evolving suite of applications and services that leverage open-source software. The use of these tools has increased near-real-time situational awareness during field operations, resulting in productivity improvements and the collection of better data. This paper describes the high-level system architecture and major components, with example highlights from the use of the infrastructure. The paper concludes with a discussion of ongoing efforts to transition to operational status.

  10. Assessing Survivability Using Software Fault Injection

    DTIC Science & Technology

    2001-04-01

    DTIC Compilation Part Notice ADP010875: Assessing Survivability Using Software Fault Injection, by Jeffrey Voas, Reliable Software Technologies, 21351 Ridgetop Circle, #400, Dulles, VA 20166 (jmvoas@rstcorp.com).

  11. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    PubMed

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
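
    The contrast drawn above between parsimonious heuristics and data-driven prediction can be made concrete with a toy sketch. The variables, thresholds, and weights below are invented for illustration and carry no clinical validity; the commented lines indicate how a big-data analogue might replace hand-picked criteria with a trained model.

    ```python
    # Toy sketch only: a parsimonious, easily computable risk score of the
    # kind described above. All criteria and weights are invented.
    def toy_risk_score(age, heart_rate, systolic_bp, abnormal_ecg):
        """Return a small integer score; higher suggests higher predicted risk."""
        score = 0
        score += 1 if age >= 65 else 0
        score += 1 if heart_rate > 110 else 0
        score += 1 if systolic_bp < 90 else 0
        score += 2 if abnormal_ecg else 0
        return score

    print(toy_risk_score(age=72, heart_rate=118, systolic_bp=85, abnormal_ecg=True))  # -> 5

    # A big-data analogue would replace the hand-picked criteria with a model
    # trained on many EHR-derived features (hypothetical arrays X, y, X_new):
    # from sklearn.linear_model import LogisticRegression
    # risk = LogisticRegression().fit(X, y).predict_proba(X_new)[:, 1]
    ```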

  12. Landscape evolution in south-central Minnesota and the role of geomorphic history on modern erosional processes

    USGS Publications Warehouse

    Gran, K.B.; Belmont, P.; Day, S.S.; Finnegan, N.; Jennings, C.; Lauer, J.W.; Wilcock, P.R.

    2011-01-01

    The Minnesota River Valley was carved during catastrophic drainage of glacial Lake Agassiz at the end of the late Pleistocene. The ensuing base-level drop on tributaries created knickpoints that excavated deep valleys as they migrated upstream. A sediment budget compiled in one of these tributaries, the Le Sueur River, shows that these deep valleys are now the primary source of sediment to the Minnesota River. To compare modern sediment loads with pre-European settlement erosion rates, we analyzed incision history using fluvial terrace ages to constrain a valley incision model. Results indicate that even though the dominant sediment sources are natural ones (bluffs, ravines, and streambanks), erosion rates have increased substantially, due in part to pervasive changes in watershed hydrology.

  13. PTB’s radiometric scales for UV and VUV source calibration based on synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Klein, Roman; Kroth, Simone; Paustian, Wolfgang; Richter, Mathias; Thornagel, Reiner

    2018-06-01

    The radiant intensity of synchrotron radiation can be accurately calculated with classical electrodynamics. This primary realization of the spectral radiant intensity has been used by PTB at several electron storage rings which have been optimized to be operated as primary source standards for the calibration of transfer sources in the spectral range of UV and VUV for almost 30 years. The transfer sources are compared to the primary source standard by means of suitable wavelength-dispersive transfer stations. The spectral range covered by deuterium lamps, which represent transfer sources that are easy to handle, is of particular relevance in practice. Here, we report on developments in the realization and preservation of the radiometric scales for spectral radiant intensity and spectral radiance in the wavelength region from 116 nm to 400 nm, based on a set of deuterium reference lamps, over the last few decades. An inside view and recommendations on the operation of the D2 lamps used for the realization of the radiometric scale are presented. The data has been recently compiled to illustrate the chronological behaviour at various wavelengths. Moreover, an overview of the internal and external validation measurements and intercomparisons is given.

  14. A program to generate a Fortran interface for a C++ library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Lee

    Shroud is a utility to create a Fortran and C interface for a C++ library. An existing C++ library API is described in an input file. Shroud reads the file and creates source files which can be compiled to provide a Fortran API for the library.

  15. Exposure Factors Resources: Contrasting EPA’s Exposure Factors Handbook with International Sources (Journal Article)

    EPA Science Inventory

    Efforts to compile and standardize human exposure factors have resulted in the development of a variety of resources available to the scientific community. For example, the U.S. EPA developed the Exposure Factors Handbook and Child-specific Exposure Factors Handbook to promote c...

  16. 10 CFR 1303.104 - Board records exempt from public disclosure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... leaves no discretion on the issue; or (2) Establishes criteria for withholding or refers to particular... personal privacy. (g) Records or information compiled for law enforcement purposes. But only to the extent... privacy; (4) Could reasonably be expected to disclose the identity of any confidential source, including a...

  17. NASA World Wind: Infrastructure for Spatial Data

    NASA Technical Reports Server (NTRS)

    Hogan, Patrick

    2011-01-01

    The world has great need for analysis of Earth observation data, be it climate change, carbon monitoring, disaster response, national defense or simply local resource management. To best provide for spatial and time-dependent information analysis, the world benefits from an open standards and open source infrastructure for spatial data. In the spirit of NASA's motto "for the benefit of all" NASA invites the world community to collaboratively advance this core technology. The World Wind infrastructure for spatial data both unites and challenges the world for innovative solutions analyzing spatial data while also allowing absolute command and control over any respective information exchange medium.

  18. Chemotion ELN: an Open Source electronic lab notebook for chemists in academia.

    PubMed

    Tremouilhac, Pierre; Nguyen, An; Huang, Yu-Chieh; Kotov, Serhii; Lütjohann, Dominic Sebastian; Hübsch, Florian; Jung, Nicole; Bräse, Stefan

    2017-09-25

    The development of an electronic lab notebook (ELN) for researchers working in the field of chemical sciences is presented. The web-based application is available as Open Source software that offers modern solutions for chemical researchers. The Chemotion ELN is equipped with the basic functionalities necessary for the acquisition and processing of chemical data, in particular the work with molecular structures and calculations based on molecular properties. The ELN supports planning, description, storage, and management for the routine work of organic chemists. It also provides tools for communicating and sharing the recorded research data among colleagues. Meeting the requirements of a state-of-the-art research infrastructure, the ELN allows the search for molecules and reactions not only within the user's data but also in conventional external sources as provided by SciFinder and PubChem. The presented development makes allowance for the growing dependency of scientific activity on the availability of digital information by providing Open Source instruments to record and reuse research data. The current version of the ELN has been in use for over half a year in our chemistry research group; it serves as a common infrastructure for chemistry research and enables chemistry researchers to build their own databases of digital information as a prerequisite for the detailed, systematic investigation and evaluation of chemical reactions and mechanisms.

  19. ACOUSTICS IN ARCHITECTURAL DESIGN, AN ANNOTATED BIBLIOGRAPHY ON ARCHITECTURAL ACOUSTICS.

    ERIC Educational Resources Information Center

    DOELLE, LESLIE L.

    THE PURPOSE OF THIS ANNOTATED BIBLIOGRAPHY ON ARCHITECTURAL ACOUSTICS WAS--(1) TO COMPILE A CLASSIFIED BIBLIOGRAPHY, INCLUDING MOST OF THOSE PUBLICATIONS ON ARCHITECTURAL ACOUSTICS, PUBLISHED IN ENGLISH, FRENCH, AND GERMAN WHICH CAN SUPPLY A USEFUL AND UP-TO-DATE SOURCE OF INFORMATION FOR THOSE ENCOUNTERING ANY ARCHITECTURAL-ACOUSTIC DESIGN…

  20. 77 FR 39519 - Records Schedules; Availability and Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... compiled from various sources to track and monitor the effects of past nuclear tests on Armed Forces personnel. 3. Department of Energy, Federal Energy Regulatory Commission (N1- 138-12-2, 1 temporary item... temporary item). Project files, including working papers relating to product reports. 7. Department of Labor...

  1. Learning to Live: A Manual of Environmental Education Activities.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Natural Resources, St. Paul. Bureau of Information and Education.

    Contributions from a variety of sources are compiled in this manual to provide both students and teachers with environmental study activities. Several activities are suggested under each of the following topics: Ecology and Esthetics (emphasizing awareness); The Decision-Making Process (resource management problems); A Plea for an Alternative…

  2. The German Speaking Countries of Europe: A Selective Bibliography.

    ERIC Educational Resources Information Center

    Krewson, Margrit B.

    This bibliography was compiled to provide researchers and students with a current guide to sources on the German-speaking countries of Europe: Austria, the Federal Republic of Germany, the German Democratic Republic, Liechtenstein, and Switzerland. The following subject areas are included under each country: (1) bibliographies and reference works;…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Lynn; Arquit Niederberger, Anne

    Lighting systems have the ability to transform the economic and educational infrastructure of disadvantaged communities, and eradicating “light poverty” has become one of the primary goals of the International Year of Light 2015. Solid-state lighting (SSL) technology, based on light-emitting diode (LED) light sources, has emerged as the next generation of lighting technology, with a current global market penetration of roughly 5%. This paper will report on recent research on understanding SSL lighting system reliability (failure modes, environmental stressors, electrical power quality); discuss the implications of SSL technology reliability for providing lighting services; and suggest practical approaches to ensure SSL reliability to benefit humanity. Among the key findings from this work is that LED sources can be extremely reliable, withstanding a broad range of environmental stresses without failure. Nonetheless, SSL lighting systems can have a negative impact on electrical power reliability, as well as on the affordability of lighting services, without attention to the quality of the accompanying power infrastructure. It is therefore critical to ensure that the performance of the power supply electronics used in lighting systems is matched to the quality of the power source when evaluating energy efficient lighting choices.

  4. Marshall Space Flight Center 1990 annual chronology of events

    NASA Technical Reports Server (NTRS)

    Wright, Michael

    1991-01-01

    A chronological listing is provided of the major events for the Marshall Space Flight Center for the calendar year 1990. The MSFC Historian, Management Operations Office, compiled the chronology from various sources and from supplemental information provided by the major MSFC organizations.

  5. Marshall Space Flight Center 1989 annual chronology of events

    NASA Technical Reports Server (NTRS)

    Wright, Michael

    1990-01-01

    A chronological listing of the major events for the Marshall Space Flight Center for the calendar year 1989 is provided. The MSFC Historian, Management Operations Office, compiled the chronology from various sources and from supplemental information provided by the major MSFC organizations.

  6. Teaching Tips.

    ERIC Educational Resources Information Center

    Journalism Educator, 1985

    1985-01-01

    Presents a variety of journalism classroom techniques, including an editing scavenger hunt, a discovery method for compiling news sources, intense instruction in news technology, criteria for evaluating ad copy, a course combining print and broadcast journalism, use of the Teletext, and teaching ad forms for the new media. (HTH)

  7. Post-disaster supply chain interdependent critical infrastructure system restoration: A review of data necessary and available for modeling

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Hector J.

    2016-01-01

    The majority of restoration strategies in the wake of large-scale disasters have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies to reconnect urban areas to national supply chain interdependent critical infrastructure systems (SCICI). These SCICI promote the effective flow of goods, services, and information vital to the economic vitality of an urban environment. To re-establish the connectivity between the different SCICI that is broken during a disaster, relationships between these systems must be identified, formulated, and added to a common framework to form a system-level restoration plan. To accomplish this goal, a considerable collection of SCICI data is necessary. The aim of this paper is to review what data are required for model construction, the accessibility of these data, and their integration with each other. While a review of publicly available data reveals a dearth of real-time data to assist modeling of long-term recovery following an extreme event, a significant amount of static data does exist, and these data can be used to model the complex interdependencies needed. For the sake of illustration, a particular SCICI (transportation) is used to highlight the challenges of determining the interdependencies and creating models capable of describing the complexity of an urban environment with publicly available data. Integration of data derived from public-domain sources is readily achieved in a geospatial environment; after all, geospatial infrastructure data are the most abundant data source. However, while significant quantities of data can be acquired through public sources, a significant effort is still required to gather, develop, and integrate these data from multiple sources to build a complete model. Therefore, while continued availability of high-quality public information is essential for modeling efforts in academic as well as government communities, a more streamlined approach to real-time acquisition and integration of these data is needed.

  8. Antarctic and Sub-Antarctic Asteroidea database.

    PubMed

    Moreau, Camille; Mah, Christopher; Agüera, Antonio; Améziane, Nadia; Barnes, David; Crokaert, Guillaume; Eléaume, Marc; Griffiths, Huw; Guillaumot, Charlène; Hemery, Lenaïg G.; Jażdżewska, Anna; Jossart, Quentin; Laptikhovsky, Vladimir; Linse, Katrin; Neill, Kate; Sands, Chester; Saucède, Thomas; Schiaparelli, Stefano; Siciński, Jacek; Vasset, Noémie; Danis, Bruno

    2018-01-01

    The present dataset is a compilation of georeferenced occurrences of asteroids (Echinodermata: Asteroidea) in the Southern Ocean. Occurrence data south of 45°S latitude were mined from various sources together with information regarding the taxonomy, the sampling source and sampling sites when available. Records from 1872 to 2016 were thoroughly checked to ensure the quality of a dataset that reaches a total of 13,840 occurrences from 4,580 unique sampling events. Information regarding the reproductive strategy (brooders vs. broadcasters) of 63 species is also made available. This dataset represents the most exhaustive occurrence database on Antarctic and Sub-Antarctic asteroids.
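
    A minimal sketch of the kind of filtering such a compilation implies is shown below, assuming a Darwin Core-style occurrence table; the file name and column names are assumptions, not the dataset's actual schema.

    ```python
    # Hypothetical sketch: keep georeferenced records south of 45 deg S within
    # the stated 1872-2016 window, then count unique sampling events.
    import pandas as pd

    df = pd.read_csv("occurrences.csv")  # placeholder file name
    south = df[
        df["decimalLatitude"].notna()
        & df["decimalLongitude"].notna()
        & (df["decimalLatitude"] <= -45.0)      # south of 45 deg S
        & df["year"].between(1872, 2016)
    ]
    # One row per unique sampling event (date + position), as in the dataset.
    events = south.drop_duplicates(
        subset=["eventDate", "decimalLatitude", "decimalLongitude"]
    )
    print(len(south), "occurrences from", len(events), "sampling events")
    ```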

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liscom, W.L.

    This book presents a complete graphic and statistical portrait of the dramatic shifts in global energy flows during the 1970s and the resultant transfer of economic and political power from the industrial nations to the oil-producing states. The information was extracted from government-source documents and compiled in a computer data base. Computer graphics were combined with the data base to produce over 400 full-color graphs. The energy commodities covered are oil, natural gas, coal, nuclear, and conventional electric-power generation. Also included are data on hydroelectric and geothermal power, oil shale, tar sands, and other alternative energy sources. 72 references.

  10. Study of nonpoint source nutrient loading in the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Preston, S.D.

    1997-01-01

    Study of nonpoint-source (NPS) nutrient loading in Maryland has focused on the Patuxent watershed because of its importance and representativeness of conditions in the State. Evaluation of NPS nutrient loading has been comprehensive and has included long-term monitoring, detailed watershed modeling, and synoptic sampling studies. A large amount of information has been compiled for the watershed, and that information is being used to identify primary controls and efficient management strategies for NPS nutrient loading. Results of the Patuxent NPS study have identified spatial trends in water quality that appear to be related to basin characteristics such as land use, physiography, and geology. Evaluation of the data compiled by the study components is continuing and is expected to provide more detailed assessments of the reasons for spatial trends. In particular, ongoing evaluation of the watershed model output is expected to provide detailed information on the relative importance of nutrient sources and transport pathways across the entire watershed. Planned future directions of NPS evaluation in the State of Maryland include continued study of water quality in the Patuxent watershed and a shift in emphasis to a statewide approach. Eventually, the statewide approach will become the primary approach used by the State to evaluate NPS loading. The information gained in the Patuxent study and the tools developed will represent valuable assets in developing the statewide NPS assessment program.

  11. Intense Pulsed Neutron Source: Progress report 1991--1996. 15. Anniversary edition -- Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-05-01

    The 15th Anniversary Edition of the IPNS Progress Report is being published in recognition of the Intense Pulsed Neutron Source's first 15 years of successful operation as a user facility. To emphasize the importance of this milestone, the authors have made the design and organization of the report significantly different from previous IPNS Progress Reports. This report consists of two volumes. For Volume 1, authors were asked to prepare articles that highlighted recent scientific accomplishments at IPNS, from 1991 to present; to focus on and illustrate the scientific advances achieved through the unique capabilities of neutron studies performed by IPNS users; to report on specific activities or results from an instrument; or to focus on a body of work encompassing different neutron-scattering techniques. Articles were also included on the accelerator system, instrumentation, computing, target, and moderators. A list of published and 'in press' articles in journals, books, and conference proceedings, resulting from work done at IPNS since 1991, was compiled. This list is arranged alphabetically according to first author. Publication references in the articles are listed by last name of first author and year of publication. The IPNS experimental reports received since 1991 are compiled in Volume 2. Experimental reports referenced in the articles are listed by last name of first author, instrument designation, and experiment number.

  12. Intense Pulsed Neutron Source: Progress report 1991--1996. 15. Anniversary edition -- Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzec, B.

    1996-05-01

    The 15th Anniversary Edition of the IPNS Progress Report is being published in recognition of the Intense Pulsed Neutron Source's first 15 years of successful operation as a user facility. To emphasize the importance of this milestone, the authors have made the design and organization of the report significantly different from previous IPNS Progress Reports. This report consists of two volumes. For Volume 1, authors were asked to prepare articles that highlighted recent scientific accomplishments at IPNS, from 1991 to present; to focus on and illustrate the scientific advances achieved through the unique capabilities of neutron studies performed by IPNS users; to report on specific activities or results from an instrument; or to focus on a body of work encompassing different neutron-scattering techniques. Articles were also included on the accelerator system, instrumentation, computing, target, and moderators. A list of published and 'in press' articles in journals, books, and conference proceedings, resulting from work done at IPNS since 1991, was compiled. This list is arranged alphabetically according to first author. Publication references in the articles are listed by last name of first author and year of publication. The IPNS experimental reports received since 1991 are compiled in Volume 2. Experimental reports referenced in the articles are listed by last name of first author, instrument designation, and experiment number.

  13. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    NASA Astrophysics Data System (ADS)

    Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.

    2015-05-01

    To date, the most common way to deal with geographical information and processing still appears to be consuming local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.
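
    As a concrete illustration of the OGC-standards-based access pattern such an SDI exposes, the sketch below requests a map image from a WMS endpoint using the OWSLib Python library; the service URL, layer name, and bounding box are placeholders, not GéoSAS endpoints.

    ```python
    # Hedged sketch: fetching a map from a WMS service with OWSLib.
    # URL, layer, and bbox are invented for illustration.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/ows", version="1.1.1")
    print(list(wms.contents))  # layers advertised in the capabilities document

    img = wms.getmap(
        layers=["catchments"],            # hypothetical layer name
        srs="EPSG:4326",
        bbox=(-5.2, 47.2, -0.9, 49.1),    # roughly Brittany, for illustration
        size=(800, 600),
        format="image/png",
    )
    with open("map.png", "wb") as f:
        f.write(img.read())
    ```

    The same pattern extends to WFS for vector features and WCS for coverages, which is what lets models consume remote data in place rather than local copies.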

  14. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
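
    A rough sketch of the kind of I/O measurement such a comparison involves is shown below: timing an object upload and download against an S3-compatible endpoint using boto3. The bucket name is a placeholder, and a private Eucalyptus/Walrus endpoint would be selected via endpoint_url; this is an assumed setup, not the paper's actual benchmark.

    ```python
    # Assumed setup: valid credentials and an existing bucket. For a private
    # cloud, pass endpoint_url=... to boto3.client to target its S3 API.
    import time
    import boto3

    s3 = boto3.client("s3")
    payload = b"x" * (64 * 1024 * 1024)  # 64 MiB test object

    t0 = time.perf_counter()
    s3.put_object(Bucket="io-benchmark-bucket", Key="probe", Body=payload)
    t1 = time.perf_counter()
    s3.get_object(Bucket="io-benchmark-bucket", Key="probe")["Body"].read()
    t2 = time.perf_counter()

    print(f"write: {len(payload) / (t1 - t0) / 1e6:.1f} MB/s, "
          f"read: {len(payload) / (t2 - t1) / 1e6:.1f} MB/s")
    ```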

  15. Digest of Education Statistics 2006. NCES 2007-017

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Dillow, Sally A.

    2007-01-01

    The primary purpose of the "Digest of Education Statistics" is to provide a compilation of statistical information covering the broad field of American education from prekindergarten through graduate school. The "Digest" includes a selection of data from both government and private sources, and draws especially on the results…

  16. TECHNOLOGIES FOR UPGRADING EXISTING OR DESIGNING NEW DRINKING WATER TREATMENT FACILITIES

    EPA Science Inventory

    The publication compiles material presented at a series of workshops and helps to focus attention on the many treatment and disinfection decisions that will be facing both ground water and surface source systems over the next several years, as implementation of the 1986 Safe Drin...

  17. Monitoring Knowledge Base (MKB)

    EPA Pesticide Factsheets

    The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial process descriptions, and permitting techniques, including flexible permit development. Using MKB, one can gain a comprehensive understanding of emissions sources, control devices, and monitoring techniques, enabling one to determine appropriate permit terms and conditions.

  18. Pokeweed (Phytolacca americana): possible source of a molluscicide

    Treesearch

    Arnold Krochmal; P.W. LeQuesne; P.W. LeQuesne

    1970-01-01

    Pokeweed, a plant abundant in Appalachia, exhibits some chemical similarities to a related species that has shown molluscicidal properties. Because this suggests that pokeweed, Phytolacca americana L. (P. decandra L.), has potential for controlling fresh-water snails, we have compiled this report of its chemical composition, uses, propagation methods, and other...

  19. Integrating Technology into the Curriculum for "At-Risk" Youth

    ERIC Educational Resources Information Center

    McCall, Denise

    2009-01-01

    This Independent Learning Project (ILP) discusses the best practices in educational technology to improve the behavior, instruction, and learning of at-risk youth, for whom technology offers unique opportunities. Research is compiled from numerous scholarly print and online sources. A guide for teachers provides detailed strategies, software…

  20. Appalachian Children and Their Families. A Statistical Profile.

    ERIC Educational Resources Information Center

    CSR, Inc., Arlington, VA.

    A statistical profile of Appalachia's young children, from birth to 9 years, was compiled from federal and state data sources. The profile provides information important in making immediate and long range plans for improving the status of Appalachian children and their families. An examination of family living conditions suggests that Appalachian…

  1. Digest of Education Statistics, 2008. NCES 2009-020

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Dillow, Sally A.; Hoffman, Charlene M.

    2009-01-01

    The primary purpose of the "Digest of Education Statistics" is to provide a compilation of statistical information covering the broad field of American education from prekindergarten through graduate school. The "Digest" includes a selection of data from both government and private sources, and draws especially on the results…

  2. Digest of Education Statistics, 2007. NCES 2008-022

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Dillow, Sally A.; Hoffman, Charlene M.

    2008-01-01

    The primary purpose of the "Digest of Education Statistics" is to provide a compilation of statistical information covering the broad field of American education from prekindergarten through graduate school. The "Digest" includes a selection of data from both government and private sources, and draws especially on the results…

  3. The Big Book of Library Grant Money.

    ERIC Educational Resources Information Center

    Taft Group, Rockville, MD.

    Libraries facing diminishing budgets and increasing demand for services must explore all funding sources, especially the more than $6 billion available in annual foundation and corporate giving. The easier and greater access to information on prospective givers provided here simplifies this task. It profiles 1,471 library grant givers, compiled from…

  4. A Military Guide to Terrorism in the Twenty-First Century

    DTIC Science & Technology

    2007-08-15

    Compiled from open source materials, this handbook promotes a “Threats” perspective and enemy situational awareness of U.S. strategies and operations… Syria’s political and material support of…

  5. Household Production.

    ERIC Educational Resources Information Center

    Scholl, Kathleen K.; And Others

    1982-01-01

    Compiled to give readers information on current research in household production, this special issue focuses on the family as a provider of goods and services. It includes five feature articles, a summary of a survey of American farm women, and a brief analysis of sources of time-use data for estimating the value of household production. Covered…

  6. Real-Time Analysis of African Political Violence, November 2012

    DTIC Science & Technology

    2012-11-01

    An internationalised AQIM will be the real test of the ECOWAS force; their forces, along with the affiliated MUJAO, have been more actively involved in armed… Data are drawn from a variety of sources, including ACLED data compiled from news reports, NGO reports, and Africa-focused news reports.

  7. Writing Assessment: Issues and Strategies. Longman Series in College Composition and Communication.

    ERIC Educational Resources Information Center

    Greenberg, Karen L., Ed.; And Others

    Data compiled from more than 2,000 member institutions of the National Testing Network in Writing were the source of this guide to writing assessment. Using an interdisciplinary approach, with insights from cognitive psychology, sociology, linguistics, educational measurement, rhetoric, and English education, the book gives suggestions on…

  8. Making a Difference Through Engineer Capacity Building in Africa

    DTIC Science & Technology

    2014-05-22

    …expense is 10 percent in Africa whereas in China it is only 3 percent. A lack of transportation infrastructure in Africa means that businesses… In 2012, China committed to provide $20 billion in loans for agriculture and infrastructure development… using wind, solar, hydropower, natural gas, and geothermal sources; this program is being executed in six sub-Saharan African nations.

  9. Surface water data and geographic relation to Tertiary age intrusions and hydrothermal alteration in the Grand Mesa, Uncompahgre, and Gunnison National Forests (GMUG) and intervening Bureau of Land Management (BLM) lands

    USGS Publications Warehouse

    Bove, Dana J.; Knepper, Daniel H.

    2000-01-01

    This data set covering the western part of Colorado includes water quality data from eight different sources (points), nine U.S. Geological Survey Digital Raster Graph (DRG) files for topographic bases, a compilation of Tertiary age intrusions (polygons and lines), and two geotiff files showing areas of hydrothermally altered rock. These data were compiled for use with an ongoing mineral resource assessment of the Grand Mesa, Uncompahgre, and Gunnison National Forests (GMUG) and intervening Bureau of Land Management (BLM) lands. This compilation was assembled to give federal land managers a preliminary view of water within sub-basinal areas, and to show possible relationships to Tertiary age intrusions and areas of hydrothermal alteration.

  10. Evaluation of Urban Drainage Infrastructure: New York City Case Study

    NASA Astrophysics Data System (ADS)

    Hamidi, A.; Grossberg, M.; Khanbilvardi, R.

    2017-12-01

    Flood response in an urban area is the product of interactions between spatially and temporally varying rainfall and infrastructure. In urban areas, however, the complex sub-surface networks of tunnels and waste- and storm-water drainage systems are often inaccessible, which poses challenges for modeling and predicting drainage infrastructure performance. The increased availability of open data in cities is an emerging information asset for better understanding the dynamics of urban water drainage infrastructure. This includes crowd-sourced data and community reporting. A well-known source of this type of data is the non-emergency hotline "311", which is available in many US cities and may contain information pertaining to the performance of physical facilities, the condition of the environment, or residents' experience, comfort and well-being. In this study, seven years of New York City 311 (NYC311) calls during 2010-2016 are employed as an alternative approach for identifying the areas of the city most prone to sewer backup flooding. These zones are compared with a hydrologic analysis of runoff flooding zones to provide a predictive model for the City. The proposed methodology is an example of urban system phenomenology using crowd-sourced, open data. A novel algorithm for calculating the spatial distribution of flooding complaints across NYC's five boroughs is presented in this study. In this approach, the features that represent reporting bias are separated from those that relate to actual infrastructure system performance. The sewer backup results are assessed against the spatial distribution of runoff in NYC during 2010-2016. With advances in radar technologies, a high spatial-temporal resolution precipitation data set is available for most of the United States and can be used in hydrologic analysis of dense urban environments. High-resolution gridded Stage IV radar rainfall data, along with high-resolution spatially distributed land cover data, are employed to investigate urban pluvial flooding. The monthly results of excess runoff are compared with sewer backups in NYC to build a predictive model of flood zones according to the 311 phone calls.
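
    A minimal sketch of the complaint-aggregation step described above is given below, assuming the column names used in NYC Open Data's 311 export; treat the file name and field names as assumptions rather than the study's pipeline.

    ```python
    # Assumed schema: "Created Date", "Descriptor", and "Borough" columns as
    # in NYC Open Data's 311 service-request export.
    import pandas as pd

    calls = pd.read_csv("311_service_requests.csv", parse_dates=["Created Date"])
    sbu = calls[calls["Descriptor"].str.contains("Sewer Backup", case=False, na=False)]

    # Tally sewer-backup complaints by month and borough.
    monthly = (
        sbu.groupby([sbu["Created Date"].dt.to_period("M"), "Borough"])
           .size()
           .rename("complaints")
           .reset_index()
    )
    print(monthly.head())
    ```

    Separating reporting bias from true system performance would then require normalizing such counts, for example by population density or historical reporting rates per zone.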

  11. Communications: Critical Infrastructure and Key Resources Sector-Specific Plan as Input to the National Infrastructure Protection Plan

    DTIC Science & Technology

    2007-05-01

    The Commission maintains an expert staff of engineers and statisticians to analyze this data in an attempt to reveal troublesome trends in network reliability…

  12. The Icelandic volcanological data node and data service

    NASA Astrophysics Data System (ADS)

    Vogfjord, Kristin; Sigmundsson, Freysteinn; Futurevolc Team

    2013-04-01

    Through funding from the European FP7 programme, the International Civil Aviation Organization (ICAO), and the local Icelandic government and RANNÍS research fund, the establishment of the Icelandic volcano observatory (VO) as a cross-disciplinary, international volcanological data node and data service is starting to materialize. At the core of this entity is the close collaboration between the Icelandic Meteorological Office (IMO), a natural hazard monitoring and research institution, and researchers at the Earth Science Institute of the University of Iceland, ensuring long-term sustainable access to research-quality data and products. Existing Icelandic Earth science monitoring and research infrastructures are being prepared for integration with the European EPOS infrastructure. Because the VO is located at a Met Office, this infrastructure also includes meteorological infrastructures relevant to volcanology. Furthermore, the FP7 supersite project FUTUREVOLC cuts across disciplines to bring together European researchers from Earth science, atmospheric science, remote sensing and space science, focussed on combined processing of the different data sources and results to generate a multiparametric volcano monitoring and early warning system. Integration with atmospheric and space science is intended to meet the need for better estimates of the volcanic eruption source term and dispersion, which depend not only on the magma flow rate and composition, but also on atmosphere-plume interaction and dispersion. This should lead to better estimates of the distribution of ash in the atmosphere. FUTUREVOLC will significantly expand the existing Icelandic EPOS infrastructure to an even more multidisciplinary volcanological infrastructure. A central and sustainable part of the project is the establishment of a research-quality data centre at the VO. This data centre will be able to serve as a volcanological data node within EPOS, making multidisciplinary data accessible to scientists and stakeholders, and enabling the generation of products and services useful for civil protection, societal infrastructure and international aviation. The 2010 Eyjafjallajökull eruption demonstrated that eruption and dispersion of volcanic ash in the atmosphere can have far-reaching detrimental effects on aviation. The aviation community is therefore an important stakeholder in volcano monitoring, but interaction between the two communities is not well established. Traditionally, Met Offices provide services vital to aviation safety and therefore have strong ties to the aviation community, with internationally established protocols for interaction. The co-habitation of a Met Office with a VO establishes a firm connection between these communities and allows adaptation of already established protocols to facilitate access to information and development of services for aviation, as well as sources of support for the VO.

  13. Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is about research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between Sensor Web and SDI, and develop case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, Metadata and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). Hence, in conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  14. A Methodology for Assessing the Impact of Sea Level Rise on Representative Military Installation in the Southwestern United States (RC-1703)

    DTIC Science & Technology

    2015-04-01

    …escarpments, relic sediment fans off river mouths, and submarine canyons (courtesy of the Coastal Data Information Program, http://cdip.ucsd.edu)… with the Source-Pathway-Receptor model. In other words, the question should specify the source of the vulnerability and the receptor that is impacted… works and other infrastructure, renewable and subsistence resources, tourism, recreation, transportation functions, cultural resources, agriculture…

  15. Mercury in western North America: A synthesis of environmental contamination, fluxes, bioaccumulation, and risk to fish and wildlife

    USGS Publications Warehouse

    Eagles-Smith, Collin A.; Wiener, James G.; Eckley, Chris S.; Willacker, James J.; Evers, David C.; Marvin-DiPasquale, Mark C.; Obrist, Daniel; Fleck, Jacob; Aiken, George R.; Lepak, Jesse M.; Jackson, Allyson K.; Webster, Jackson; Stewart, Robin; Davis, Jay; Alpers, Charles N.; Ackerman, Joshua T.

    2016-10-15

    Western North America is a region defined by extreme gradients in geomorphology and climate, which support a diverse array of ecological communities and natural resources. The region also has extreme gradients in mercury (Hg) contamination due to a broad distribution of inorganic Hg sources. These diverse Hg sources and a varied landscape create a unique and complex mosaic of ecological risk from Hg impairment associated with differential methylmercury (MeHg) production and bioaccumulation. Understanding the landscape-scale variation in the magnitude and relative importance of processes associated with Hg transport, methylation, and MeHg bioaccumulation requires a multidisciplinary synthesis that transcends small-scale variability. The Western North America Mercury Synthesis compiled, analyzed, and interpreted spatial and temporal patterns and drivers of Hg and MeHg in air, soil, vegetation, sediments, fish, and wildlife across western North America. This collaboration evaluated the potential risk from Hg to fish and wildlife health and to human exposure, and examined resource management activities that influenced the risk of Hg contamination. This paper integrates the key information presented across the individual papers that comprise the synthesis. The compiled information indicates that Hg contamination is widespread, but heterogeneous, across western North America. The storage and transport of inorganic Hg across landscape gradients are largely regulated by climate and land-cover factors such as plant productivity and precipitation. Importantly, there was a striking lack of concordance between pools and sources of inorganic Hg and MeHg in aquatic food webs. Additionally, water management had a widespread influence on MeHg bioaccumulation in aquatic ecosystems, whereas mining impacts were relatively localized. These results highlight the decoupling of inorganic Hg sources from MeHg production and bioaccumulation. Together the findings indicate that efforts to control MeHg production in the West may be particularly beneficial for reducing food web exposure, rather than efforts to simply control inorganic Hg sources.

  16. Mercury in western North America: A synthesis of environmental contamination, fluxes, bioaccumulation, and risk to fish and wildlife

    USGS Publications Warehouse

    Eagles-Smith, Collin A.; Wiener, James G.; Eckley, Chris S.; Willacker, James J.; Evers, David C.; Marvin-DiPasquale, Mark C.; Obrist, Daniel; Fleck, Jacob; Aiken, George R.; Lepak, Jesse M.; Jackson, Allyson K.; Webster, Jackson; Stewart, Robin; Davis, Jay; Alpers, Charles N.; Ackerman, Joshua T.

    2016-01-01

    Western North America is a region defined by extreme gradients in geomorphology and climate, which support a diverse array of ecological communities and natural resources. The region also has extreme gradients in mercury (Hg) contamination due to a broad distribution of inorganic Hg sources. These diverse Hg sources and a varied landscape create a unique and complex mosaic of ecological risk from Hg impairment associated with differential methylmercury (MeHg) production and bioaccumulation. Understanding the landscape-scale variation in the magnitude and relative importance of processes associated with Hg transport, methylation, and MeHg bioaccumulation requires a multidisciplinary synthesis that transcends small-scale variability. The Western North America Mercury Synthesis compiled, analyzed, and interpreted spatial and temporal patterns and drivers of Hg and MeHg in air, soil, vegetation, sediments, fish, and wildlife across western North America. This collaboration evaluated the potential risk from Hg to fish and wildlife health and to human exposure, and examined resource management activities that influenced the risk of Hg contamination. This paper integrates the key information presented across the individual papers that comprise the synthesis. The compiled information indicates that Hg contamination is widespread, but heterogeneous, across western North America. The storage and transport of inorganic Hg across landscape gradients are largely regulated by climate and land-cover factors such as plant productivity and precipitation. Importantly, there was a striking lack of concordance between pools and sources of inorganic Hg and MeHg in aquatic food webs. Additionally, water management had a widespread influence on MeHg bioaccumulation in aquatic ecosystems, whereas mining impacts were relatively localized. These results highlight the decoupling of inorganic Hg sources from MeHg production and bioaccumulation. Together the findings indicate that efforts to control MeHg production in the West may be particularly beneficial for reducing food web exposure, rather than efforts to simply control inorganic Hg sources.

  17. Alternative Fuels Data Center

    Science.gov Websites

    …programs that help meet the requirements of the Clean Air Act by reducing mobile source emissions, diesel retrofit projects, and alternative fuel vehicles and infrastructure. Projects supported with CMAQ…

  18. Data needs and data bases for climate studies

    NASA Technical Reports Server (NTRS)

    Matthews, Elaine

    1986-01-01

    Two complementary global digital data bases of vegetation and land use, compiled at 1 deg resolution from published sources for use in climate studies, are discussed. The data bases were implemented, in several individually tailored formulations, in a series of climate related applications including: land-surface prescriptions in three-dimensional general circulation models, global biogeochemical cycles (CO2, methane), critical-area mapping for satellite monitoring of land-cover change, and large-scale remote sensing of surface reflectance. The climate applications are discussed with reference to data needs, and data availability from traditional and remote sensing sources.

  19. National Register of Research Projects, 1986/87. Part 2A: Natural sciences. Physical, engineering and related sciences (modified projects)

    NASA Astrophysics Data System (ADS)

    1988-08-01

    This Register is intended to serve as a source of information on research which is being conducted in all fields (both natural and human sciences) in the Republic of South Africa. New and current research projects that were commenced or modified during 1986 and 1987, on which information was received by the compilers until January 1988, are included, with the exception of confidential projects. Project titles and keywords are presented in the language as supplied, and the classifications are based on those provided by the primary sources.

  20. National register of research projects, 1986/87. Part 4: Human sciences. Economic, political, language, art and other sciences

    NASA Astrophysics Data System (ADS)

    1988-08-01

    This Register is intended to serve as a source of information on research which is being conducted in all fields (both natural and human sciences) in the Republic of South Africa. New and current research projects that were commenced or modified during 1986 and 1987, on which information was received by the compilers until January 1988, are included, with the exception of confidential projects. Project titles and keywords are presented in the language as supplied, and the classifications are based on those provided by the primary sources.

  1. Tritium-powered radiation sensor network

    NASA Astrophysics Data System (ADS)

    Litz, Marc S.; Russo, Johnny A.; Katsis, Dimos

    2016-05-01

    Isotope power supplies offer long-lived (100 years using 63Ni), low-power energy sources, enabling sensors or communications nodes for the lifetime of infrastructure. A tritium beta-source (12.5-year half-life) encapsulated in a phosphor-lined vial couples directly to a photovoltaic (PV) to generate a trickle current into an electrical load. An inexpensive design is described using commercial-off-the-shelf (COTS) components that generate 100 μWe for next-generation compact electronics/sensors. A matched radiation sensor has been built for long-duration missions utilizing microprocessor-controlled sleep modes, low-power electronic components, and a passive interrupt-driven environmental wake-up. The low-power early-warning radiation detector network and isotope power source enable no-maintenance mission lifetimes.
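
    A quick back-of-the-envelope check of how such a trickle source fades over a long mission is shown below, applying exponential decay to the roughly 100 μWe output using the 12.5-year half-life quoted in the abstract; the constants come from the abstract, the rest is illustrative.

    ```python
    # Exponential decay of an isotope-powered trickle source. P0 and the
    # half-life are the values quoted in the abstract above.
    import math

    P0 = 100e-6          # initial electrical output, watts
    HALF_LIFE = 12.5     # years (tritium half-life as quoted)

    def power(t_years):
        """Output power after t_years of decay."""
        return P0 * 0.5 ** (t_years / HALF_LIFE)

    for t in (0, 5, 12.5, 25):
        print(f"t = {t:5.1f} y  ->  {power(t) * 1e6:6.1f} uW")
    # One half-life (12.5 y) leaves 50 uW; two half-lives (25 y) leave 25 uW,
    # which is why duty-cycled, interrupt-driven electronics matter here.
    ```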

  2. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
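
    In the same spirit as such an example set, the sketch below uses libclang's Python bindings to report where variables are referenced in a source file, a building block for the variable-use analysis mentioned above. It assumes the clang Python package and a libclang shared library are installed; 'example.c' is a placeholder input, and this is not code from the released example set.

    ```python
    # Walk a translation unit's AST and print every variable reference.
    # Requires: pip install clang, plus a libclang shared library.
    import clang.cindex

    index = clang.cindex.Index.create()
    tu = index.parse("example.c")  # placeholder source file

    def walk(node):
        # DECL_REF_EXPR cursors mark uses of previously declared names.
        if node.kind == clang.cindex.CursorKind.DECL_REF_EXPR:
            loc = node.location
            print(f"{node.spelling} used at {loc.file}:{loc.line}:{loc.column}")
        for child in node.get_children():
            walk(child)

    walk(tu.cursor)
    ```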

  3. 36 CFR 1206.22 - What type of proposal is eligible for a publications grant?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... projects include the production of: (1) Documentary editions that involve collecting, compiling... records; (2) Microfilm editions consisting of organized collections of images of original sources, usually without transcription and annotations; (3) Electronic editions consisting of organized collections of...

  4. 36 CFR 1206.22 - What type of proposal is eligible for a publications grant?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... projects include the production of: (1) Documentary editions that involve collecting, compiling... records; (2) Microfilm editions consisting of organized collections of images of original sources, usually without transcription and annotations; (3) Electronic editions consisting of organized collections of...

  5. 36 CFR § 1206.22 - What type of proposal is eligible for a publications grant?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... projects include the production of: (1) Documentary editions that involve collecting, compiling... records; (2) Microfilm editions consisting of organized collections of images of original sources, usually without transcription and annotations; (3) Electronic editions consisting of organized collections of...

  6. Information about old growth for selected forest type groups in the eastern United States.

    Treesearch

    Lucy E. Tyrrell; Gregory J. Nowacki; David S. Buckley; Elizabeth A. Nauertz; Jeffrey N. Niese; Jeanette L. Rollinger; John C. Zasada

    1998-01-01

    Compiles information about old-growth attributes for nine forest type groups that occur in the eastern United States. A range of values for each old-growth attribute for each forest type is summarized regionally from published and unpublished sources.

  7. Is There Scientific Consensus on Acid Rain? -- Excerpts from Six Governmental Reports.

    ERIC Educational Resources Information Center

    Environmental Education Report and Newsletter, 1986

    1986-01-01

    Compiles a series of direct quotations from six governmental reports that reflect a scientific consensus on major aspects of acid deposition. Presents the statements in a question and answer format. Also reviews the sources, extent, and effects of acid rain. (ML)

  8. Design and Implementation of a High-Flux Photoneutron Converter for Analysis of Fast Neutron Radiation Damage on Gallium Nitride Transistors

    DTIC Science & Technology

    2017-06-01

    ... infrastructure in today's society. Due to the projected depletion of fossil fuel sources in the 21st century, nuclear power is a potential source of ...

  9. Challenges and opportunities in the design and construction of a GIS-based emission inventory infrastructure for the Niger Delta region of Nigeria.

    PubMed

    Fagbeja, Mofoluso A; Hill, Jennifer L; Chatterton, Tim J; Longhurst, James W S; Akpokodje, Joseph E; Agbaje, Ganiy I; Halilu, Shaba A

    2017-03-01

    Environmental monitoring in middle- and low-income countries is hampered by many factors, including weak enactment and enforcement of legislation; deficiencies in environmental data reporting and documentation; inconsistent, incomplete and unverifiable data; a lack of access to data; and limited technical expertise. This paper describes the processes undertaken and the major challenges encountered in the construction of the first Niger Delta Emission Inventory (NDEI) for criteria air pollutants and CO2 released from anthropogenic activities in the region. This study focused on using publicly available government and research data. The NDEI has been designed to provide a Geographic Information System-based component of an air quality and carbon management framework. The NDEI infrastructure was designed and constructed at 1-, 10- and 20-km grid resolutions for point, line and area sources, using industry-standard processes and emission factors derived from activities similar to those in the Niger Delta. Because data were inadequate, incomplete, potentially inaccurate or unavailable, the infrastructure was populated with data based on a series of best-possible assumptions for key emission sources. This produces outputs with variable levels of certainty, which also highlights the critical challenges in estimating emissions for a developing country. However, the infrastructure is functional and is able to produce spatially resolved emission estimates.

  10. Event heap: a coordination infrastructure for dynamic heterogeneous application interactions in ubiquitous computing environments

    DOEpatents

    Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.

    2010-04-20

    An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
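
    The field matching described above is essentially tuplespace matching over unordered, named fields. The toy sketch below illustrates the routing idea only; it is not the patented system's API, and the field names are invented. A template field set to None acts as a wildcard:

        # Toy event-heap matching over named, unordered fields.
        class EventHeap:
            def __init__(self):
                self._events = []

            def post(self, **fields):
                """Add an event, represented as a set of named fields."""
                self._events.append(fields)

            def take(self, **template):
                """Remove and return the first event matching the template."""
                for i, event in enumerate(self._events):
                    if all(k in event and (v is None or event[k] == v)
                           for k, v in template.items()):
                        return self._events.pop(i)
                return None

        heap = EventHeap()
        heap.post(type="ButtonPress", source="podium", target="projector")
        print(heap.take(type="ButtonPress", source=None))  # matches any source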

  11. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems raises several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. The paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, it demonstrates the efficacy of OpenARC as a research framework for directive-based programming studies by proposing and implementing OpenACC extensions in the OpenARC framework to (1) support hybrid programming of unified and separate memory and (2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.
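
    To make concrete what a directive-based, source-to-source compiler does, the toy sketch below rewrites one canonical OpenACC-annotated loop into a CUDA-style kernel string. It is a deliberately naive, regex-based translation for illustration only; OpenARC's real transformations operate on a full AST and handle far more than this single pattern:

        import re

        SOURCE = """
        #pragma acc parallel loop
        for (int i = 0; i < n; ++i) {
            y[i] = a * x[i] + y[i];
        }
        """

        # Match a pragma followed by a canonical zero-based counting loop.
        PATTERN = re.compile(
            r"#pragma acc parallel loop\s+"
            r"for \(int (\w+) = 0; \1 < (\w+); \+\+\1\) \{\s*\n"
            r"(.*?)\n\s*\}",
            re.DOTALL,
        )

        def translate(src):
            m = PATTERN.search(src)
            if m is None:
                return src
            var, bound, body = m.groups()
            return (
                "__global__ void kernel0(/* params elided */) {\n"
                f"    int {var} = blockIdx.x * blockDim.x + threadIdx.x;\n"
                f"    if ({var} < {bound}) {{ {body.strip()} }}\n"
                "}\n"
            )

        print(translate(SOURCE))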

  12. Global EDGAR v4.1 emissions of air pollutants: analysis of impacts of emissions abatement in industry and road transport on regional and global scale

    NASA Astrophysics Data System (ADS)

    Janssens-Maenhout, G.; Olivier, J. G.; Doering, U. M.; van Aardenne, J.; Monni, S.; Pagliari, V.; Peters, J. A.

    2010-12-01

    The new version v4.1 of the Emission Database for Global Atmospheric Research (EDGAR), compiled by JRC and PBL, provides independent estimates of the global anthropogenic emissions and emission trends of precursors of tropospheric ozone (CO, NMVOC, NOx) and acidifying substances (NOx, NH3, SO2) for the period 1970-2005. All emissions are detailed at the country level, consistently using the same technology-based methodology, which combines activity data (international statistics) from publicly available sources with, to the extent possible, emission factors as recommended by the EMEP/EEA air pollutant emission inventory guidebook. By using high-resolution global grid maps per source category of area and point sources, we also compiled datasets with annual emissions on a 0.1°x0.1° grid as input for atmospheric models (a minimal gridding example is sketched below). We provide full and up-to-date inventories per country, including developing countries. Moreover, the time series extending back to 1970 provides a historical perspective on the trends in official national inventories. As part of our objective to contribute to more reliable inventories, by providing a reference emissions database for emission scenarios, inventory comparisons and atmospheric modellers, we strive to transparently document all data sources used and all assumptions made where data were missing, in particular assumptions on the shares of technologies where relevant. Technology mixes per country or region were taken from other data sources (such as the Platts database) or estimated using other sources or countries as proxies. The evolution in the adoption of technologies world-wide over the 35 years covered by EDGAR v4.1 will be illustrated for the power industry and road transport sectors, in particular for Europe and the US. Similarly, the regional and global impacts of implemented control measures and end-of-pipe abatements will be illustrated by the examples of:
    - NOx and SO2 end-of-pipe abatements, implemented since the late eighties for power plants in Europe and appearing since 2000 in economically emerging countries such as China;
    - EURO3 control measures, a European standard for passenger cars, vehicles compliant with which are now old enough to be exported to African and Latin-American countries.
    An outlook will be given on the current readiness of Europe to meet the challenging goals of the National Emission Ceilings directive.
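
    The 0.1° gridding step mentioned above reduces, for point sources, to a weighted two-dimensional histogram once every source has coordinates and an annual emission. A minimal sketch over a small regional window; the coordinates and emission values are fabricated:

        import numpy as np

        # Hypothetical point sources: longitude, latitude (deg), emission (kt/yr).
        lon  = np.array([4.35, 4.41, 5.12, 4.38])
        lat  = np.array([50.85, 50.88, 51.20, 50.86])
        emis = np.array([12.0, 3.5, 7.2, 1.1])

        # 0.1-degree cell edges covering the window of interest.
        lon_edges = np.arange(4.0, 6.0 + 1e-9, 0.1)
        lat_edges = np.arange(50.0, 52.0 + 1e-9, 0.1)

        # Sum the emissions falling into each cell (weighted 2-D histogram).
        grid, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges],
                                    weights=emis)
        print(grid.sum(), "kt allocated to", np.count_nonzero(grid), "cells")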

  13. Building a Business Case for Compressed Natural Gas in Fleet Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.

    2015-03-19

    Natural gas is a clean-burning, abundant, and domestically produced source of energy. Compressed natural gas (CNG) has recently garnered interest as a transportation fuel because of these attributes and because of its cost savings and price stability compared to conventional petroleum fuels. The National Renewable Energy Laboratory (NREL) developed the Vehicle Infrastructure and Cash-Flow Evaluation (VICE) model to help businesses and fleets evaluate the financial soundness of CNG vehicle and CNG fueling infrastructure projects.
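
    A back-of-envelope version of the cash-flow question that VICE answers is a simple payback calculation. The sketch below is not the VICE model; every price, premium, and mileage figure is a hypothetical placeholder:

        # Simple payback for converting a small fleet to CNG (illustrative only).
        N_VEHICLES       = 20
        INCREMENTAL_COST = 10_000.0   # $ extra per CNG vehicle (assumed)
        STATION_COST     = 600_000.0  # $ for fueling infrastructure (assumed)
        MILES_PER_YEAR   = 25_000.0   # per vehicle (assumed)
        MPDGE            = 8.0        # miles per diesel gallon equivalent (assumed)
        SAVINGS_PER_DGE  = 1.25       # $ fuel saving per DGE (assumed)

        capital = N_VEHICLES * INCREMENTAL_COST + STATION_COST
        annual_savings = N_VEHICLES * (MILES_PER_YEAR / MPDGE) * SAVINGS_PER_DGE
        print(f"simple payback: {capital / annual_savings:.1f} years")

    A real evaluation adds maintenance deltas, fuel-price escalation, incentives, and discounting; automating that bookkeeping is precisely what the VICE model is for.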

  14. Elevated Arsenic and Uranium Concentrations in Unregulated Water Sources on the Navajo Nation, USA.

    PubMed

    Hoover, Joseph; Gonzales, Melissa; Shuey, Chris; Barney, Yolanda; Lewis, Johnnye

    2017-01-01

    Regional water pollution and use of unregulated water sources can be an important mixed-metals exposure pathway for rural populations located in areas with limited water infrastructure and an extensive mining history. Using censored data analysis and mapping techniques, we analyzed the joint geospatial distribution of arsenic and uranium in unregulated water sources throughout the Navajo Nation, where over 500 abandoned uranium mine sites are located in the rural southwestern United States. Results indicated that arsenic and uranium concentrations exceeded national drinking water standards in 15.1% (arsenic) and 12.8% (uranium) of tested water sources. Unregulated sources in close proximity (i.e., within 6 km) to abandoned uranium mines yielded significantly higher concentrations of arsenic or uranium than more distant sources. The demonstrated regional trends for potential co-exposure to these chemicals have implications for public policy and future research. Specifically, to generate solutions that reduce human exposure to water pollution from unregulated sources in rural areas, the potential for co-exposure to arsenic and uranium requires expanded documentation and examination. Recommendations for prioritizing policy and research decisions related to the documentation of existing health exposures and risk-reduction strategies are also provided.
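
    The 6-km proximity comparison reduces to a nearest-neighbor distance test. A minimal sketch with fabricated coordinates (a real analysis would use projected coordinates and the censored-data methods the authors describe):

        import numpy as np

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in km between points given in degrees."""
            lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
            a = (np.sin((lat2 - lat1) / 2) ** 2
                 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * np.arcsin(np.sqrt(a))

        # Hypothetical coordinates (deg): water sources and abandoned mines.
        src_lat  = np.array([36.10, 36.55, 35.90])
        src_lon  = np.array([-109.50, -109.10, -110.20])
        mine_lat = np.array([36.12, 36.00])
        mine_lon = np.array([-109.52, -110.00])

        # Distance from every source to every mine; flag sources within 6 km.
        d = haversine_km(src_lat[:, None], src_lon[:, None],
                         mine_lat[None, :], mine_lon[None, :])
        print(d.min(axis=1) <= 6.0)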

  15. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable

    PubMed Central

    2016-01-01

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome. PMID:27051515

  16. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable.

    PubMed

    Nickerson, David; Atalag, Koray; de Bono, Bernard; Geiger, Jörg; Goble, Carole; Hollmann, Susanne; Lonien, Joachim; Müller, Wolfgang; Regierer, Babette; Stanford, Natalie J; Golebiewski, Martin; Hunter, Peter

    2016-04-06

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome.

  17. Deep challenges for China's war on water pollution.

    PubMed

    Han, Dongmei; Currell, Matthew J; Cao, Guoliang

    2016-11-01

    China's Central government has released an ambitious plan to tackle the nation's water pollution crisis. However, this is inhibited by a lack of data, particularly for groundwater. We compiled and analyzed water quality classification data from publicly available government sources, further revealing the scale and extent of the crisis. We also compiled nitrate data in shallow and deep groundwater from a range of literature sources, covering 52 of China's groundwater systems; this is the most comprehensive national-scale assessment yet. Nitrate pollution at levels exceeding the US EPA's maximum contaminant level (10 mg/L NO3-N) occurs at the 90th percentile in 25 of 36 shallow aquifers and 10 out of 37 deep or karst aquifers. Isotopic compositions of groundwater nitrate (δ15N and δ18O-NO3 values ranging from -14.9‰ to 35.5‰ and -8.1‰ to 51.0‰, respectively) indicate many nitrate sources, including soil nitrogen, agricultural fertilizers, and untreated wastewater and/or manure, and locally show evidence of denitrification. From these data, it is clear that contaminated groundwater is ubiquitous in deep aquifers as well as in shallow groundwater (and surface water). Deep aquifers contain water recharged tens of thousands of years before present, long before widespread anthropogenic nitrate contamination. This groundwater has therefore likely been contaminated by rapid bypass flow along wells or other conduits. Addressing the issue of well condition is urgently needed to stop further pollution of China's deep aquifers, which are among China's most important drinking water sources. China's new 10-point Water Pollution Plan addresses previous shortcomings; however, control and remediation of deep groundwater pollution will take decades of sustained effort.

  18. Workflow with pitfalls to derive a regional airborne magnetic compilation

    NASA Astrophysics Data System (ADS)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions, and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also take a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but still receive little recognition. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated countries like Sweden and Australia (AWAGS) to collect high-altitude, long-distance airborne magnetic data for the entire country in order to homogenize the high-resolution magnetic data before the merger with satellite data. We present the compilation of a regional magnetic map for an area in northern Europe and discuss the problems and pitfalls of the commonly applied workflow.

  19. Space Station Freedom Workshop Opportunities for Commercial Users and Providers: Issues and Recommendations

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The responses to issues and questions raised at the Space Station Freedom Workshops are compiled. The findings are presented under broad divisions of general, materials processing in space, commercial earth and ocean observations, life sciences, infrastructure services, and infrastructure policy. The responses represent the best answers available at this time and future modifications may be expected. Contact names, telephone numbers, and organizations are included.

  20. An empirical assessment of taxic paleobiology.

    PubMed

    Adrain, J M; Westrop, S R

    2000-07-07

    The analysis of major changes in faunal diversity through time is a central theme of analytical paleobiology. The most important sources of data are literature-based compilations of stratigraphic ranges of fossil taxa. The levels of error in these compilations and the possible effects of such error have often been discussed but never directly assessed. We compared our comprehensive database of trilobites to the equivalent portion of J. J. Sepkoski Jr.'s widely used global genus database. More than 70% of entries in the global database are inaccurate; however, as predicted, the error is randomly distributed and does not introduce bias.

  1. Radiometric age file for Alaska: A section in The United States Geological Survey in Alaska: Accomplishments during 1980

    USGS Publications Warehouse

    Shew, Nora B.; Wilson, Frederic H.

    1982-01-01

    The Alaska radiometric age file of the Branch of Alaskan Geology is a computer-based compilation of radiometric dates from the state of Alaska and the western parts of the Yukon Territory and British Columbia. More than 1800 age determinations from over 250 references have been entered in the file. References date back to 1958 and include both published and unpublished sources. The file is the outgrowth of an original radiometric age file compiled by Don Grybeck and students at the University of Alaska-Fairbanks (Turner and others, 1975).

  2. Integrating Emerging Data Sources into Operational Practice: Capabilities and Limitations of Devices to Collect, Compile, Save, and Share Messages from CAVs and Connected Travelers

    DOT National Transportation Integrated Search

    2018-03-01

    Connected and automated vehicles (CAVs) and connected travelers will be providing substantially increased levels of data which will be available for agencies to consider using to improve the management and operation of the surface transportation syst...

  3. Technology in Education

    ERIC Educational Resources Information Center

    Roden, Kasi

    2011-01-01

    This paper was written to support a position on using technology in education. The purpose of this study was to support the use of technology in education by synthesizing previous research. A variety of sources including books and journal articles were studied in order to compile an overview of the benefits of using technology in elementary,…

  4. A comparative study of programming languages for next-generation astrodynamics systems

    NASA Astrophysics Data System (ADS)

    Eichhorn, Helge; Cano, Juan Luis; McLean, Frazer; Anderl, Reiner

    2018-03-01

    Due to the computationally intensive nature of astrodynamics tasks, astrodynamicists have relied on compiled programming languages such as Fortran for the development of astrodynamics software. Interpreted languages such as Python, on the other hand, offer higher flexibility and development speed thereby increasing the productivity of the programmer. While interpreted languages are generally slower than compiled languages, recent developments such as just-in-time (JIT) compilers or transpilers have been able to close this speed gap significantly. Another important factor for the usefulness of a programming language is its wider ecosystem which consists of the available open-source packages and development tools such as integrated development environments or debuggers. This study compares three compiled languages and three interpreted languages, which were selected based on their popularity within the scientific programming community and technical merit. The three compiled candidate languages are Fortran, C++, and Java. Python, Matlab, and Julia were selected as the interpreted candidate languages. All six languages are assessed and compared to each other based on their features, performance, and ease-of-use through the implementation of idiomatic solutions to classical astrodynamics problems. We show that compiled languages still provide the best performance for astrodynamics applications, but JIT-compiled dynamic languages have reached a competitive level of speed and offer an attractive compromise between numerical performance and programmer productivity.
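
    The claim that JIT compilation closes much of the gap is easy to demonstrate from Python itself. Below, a hedged micro-benchmark: the same explicit-Euler two-body propagation loop, run once interpreted and once compiled with Numba's njit (the step size, step count, and initial state are arbitrary demo values):

        import time
        import numpy as np
        from numba import njit

        def propagate(r, v, mu, dt, n):
            # Explicit-Euler two-body propagation (demonstration only).
            for _ in range(n):
                d3 = (r[0]**2 + r[1]**2 + r[2]**2) ** 1.5
                r, v = r + v * dt, v - (mu / d3) * r * dt
            return r, v

        propagate_jit = njit(propagate)       # same source, JIT-compiled

        mu = 398600.4418                      # km^3/s^2, Earth
        r0 = np.array([7000.0, 0.0, 0.0])     # km
        v0 = np.array([0.0, 7.546, 0.0])      # km/s
        propagate_jit(r0, v0, mu, 0.1, 10)    # warm-up call triggers compilation

        for label, fn in (("interpreted", propagate), ("JIT", propagate_jit)):
            t0 = time.perf_counter()
            fn(r0, v0, mu, 0.1, 200_000)
            print(label, f"{time.perf_counter() - t0:.3f} s")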

  5. An Open Source modular platform for hydrological model implementation

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2010-05-01

    An implementation framework for the setup and evaluation of spatio-temporal models has been developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source licence and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (DLLs). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine (a toy analogue is sketched below). The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters, etc. ENKI is designed to meet three different levels of involvement in model construction:
    - Model application: running and evaluating a given model; regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation; uncertainty analysis directed towards input or parameter uncertainty. Users need not know the model's composition of subroutines, the internal variables in the model, or how method modules are created.
    - Model analysis: linking together different process methods, including parallel setup of alternative methods for solving the same task; investigating the effect of different spatial discretization schemes. Users need not write or compile computer code, or handle file IO for each module.
    - Routine implementation and testing: implementing new process-simulating methods/equations, specialised objective functions or quality-control routines, and testing these in an existing framework. Developers need not implement a user or model interface for the new routine, IO handling, administration of model setup and runs, or calibration and validation routines.
    Having been developed for Norway's largest hydropower producer, Statkraft, ENKI is now being turned into an open-source project. At the time of writing, the licence and the project administration are not yet established, and the application remains to be ported to other compilers and computer platforms. However, we hope that ENKI will prove useful for both academic and operational users.
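
    The narrow-interface idea, in which each compiled routine declares its variables so the framework can wire them into a model, can be sketched compactly. This Python analogue is purely illustrative; ENKI itself is C++ with DLL plug-ins, and the module names, variable roles, and equations here are invented:

        # Toy plug-in framework: each routine declares its variables by role,
        # and the framework threads a shared state dict through the modules.
        class DegreeDaySnow:
            inputs, states, params, outputs = (["temperature", "precipitation"],
                                               ["swe"], ["ddf"], ["melt"])

            def run(self, s):
                melt = min(s["swe"], max(0.0, s["temperature"]) * s["ddf"])
                s["swe"] += s["precipitation"] - melt
                s["melt"] = melt

        class LinearReservoir:
            inputs, states, params, outputs = (["melt"], ["storage"],
                                               ["k"], ["runoff"])

            def run(self, s):
                s["storage"] += s["melt"]
                s["runoff"] = s["k"] * s["storage"]
                s["storage"] -= s["runoff"]

        def simulate(modules, state, forcing):
            for step in forcing:              # one dict of forcing per time step
                state.update(step)
                for m in modules:             # framework calls routines in order
                    m.run(state)
                yield round(state["runoff"], 3)

        state = {"swe": 50.0, "storage": 0.0, "ddf": 3.0, "k": 0.2}
        forcing = [{"temperature": t, "precipitation": 0.0} for t in (2.0, 5.0, 8.0)]
        print(list(simulate([DegreeDaySnow(), LinearReservoir()], state, forcing)))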

  6. Methods to determine long-term durability of Wisconsin aggregates.

    DOT National Transportation Integrated Search

    2013-02-01

    Wisconsin uses approximately 10 to 11 million tons of aggregates annually in transportation infrastructure projects in the state. The quality of aggregates has a tremendous influence on the performance and durability of roadways and bridges. As sourc...

  7. EPA's Safe and Sustainable Water Resources Research Program

    EPA Science Inventory

    Increasing demands for sources of clean water—combined with changing land use practices, population growth, aging infrastructure, and climate change and variability—pose significant threats to our water resources. Failure to manage the Nation’s waters in an inte...

  8. A spatial database for landslides in northern Bavaria: A methodological approach

    NASA Astrophysics Data System (ADS)

    Jäger, Daniel; Kreuzer, Thomas; Wilde, Martina; Bemm, Stefan; Terhorst, Birgit

    2018-04-01

    Landslide databases provide essential information for hazard modeling, for assessing damage to buildings and infrastructure, for mitigation, and for identifying research needs. This study presents the development of a landslide database system named WISL (Würzburg Information System on Landslides), currently storing detailed landslide data for northern Bavaria, Germany, in order to enable scientific queries as well as comparisons with other regional landslide inventories. WISL is based on free open-source software (PostgreSQL, PostGIS), ensuring good interoperability of the system's components and enabling further extensions through specific adaptations of self-developed software. Beyond that, WISL was designed for easy communication with other databases. As a central prerequisite for standardized, homogeneous data acquisition in the field, a customized data sheet for landslide description was compiled. This sheet also serves as the input mask for all data registration procedures in WISL. A variety of "in-database" solutions for landslide analysis provides the necessary scalability for the database, enabling operations at the local server (a toy spatial query is sketched below). In its current state, WISL already enables extensive analyses and queries. This paper presents an example analysis of landslides in Oxfordian limestones in the northeastern Franconian Alb, northern Bavaria. The results reveal landslides that differ widely in geometry and size. Further queries related to landslide activity classify the majority of the landslides as currently inactive; however, they clearly possess a certain potential for remobilization. Along with some active mass movements, a significant percentage of landslides potentially endangers residential areas or infrastructure. Future enhancements of the WISL database will focus on extending the data holdings to increase research possibilities, and on transferring the system to other regions and countries.
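
    An "in-database" hazard query of the kind described, here finding active landslides close to infrastructure, maps naturally onto PostGIS. The sketch below is hypothetical: the connection string, table, and column names are invented, not WISL's actual schema:

        # Hypothetical PostGIS proximity query against a landslide inventory.
        import psycopg2

        conn = psycopg2.connect("dbname=wisl user=analyst")  # placeholder
        with conn, conn.cursor() as cur:
            cur.execute("""
                SELECT ls.id, ls.activity, ST_Area(ls.geom::geography) AS area_m2
                FROM   landslides AS ls
                JOIN   infrastructure AS infra
                  ON   ST_DWithin(ls.geom::geography, infra.geom::geography, 250)
                WHERE  ls.activity <> 'inactive';
            """)
            for slide_id, activity, area_m2 in cur.fetchall():
                print(slide_id, activity, round(area_m2))
        conn.close()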

  9. Catalog of infrared observations. Part 1: Data

    NASA Technical Reports Server (NTRS)

    Gezari, Daniel Y.; Schmitz, Marion; Mead, Jaylee M.

    1987-01-01

    The Catalog of Infrared Observations (CIO) is a compilation of infrared astronomical observational data obtained from an extensive literature search of astronomical journals and major astronomical catalogs and surveys. The literature searches are complete for 1965 through 1986 in this Second Edition. The Catalog is published in two parts, with the observational data (roughly 200,000 observations of 20,000 individual sources) listed in Part I, and supporting appendices in Part II. The expanded Second Edition contains a new feature: complete IRAS 4-band data for all CIO sources detected, listed with the main Catalog observations, as well as in complete detail in the Appendix. The appendices include an atlas of infrared source positions, two bibliographies of infrared literature upon which the search was based, and, keyed to the main Catalog listings (organized alphabetically by author and then chronologically), an atlas of infrared spectral ranges, and IRAS data from the CIO sources. The complete CIO database is available to qualified users in printed microfiche and magnetic tape formats.

  10. Small values in big data: The continuing need for appropriate metadata

    USGS Publications Warehouse

    Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung

    2018-01-01

    Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
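
    One of the "modern quantitative approaches" alluded to is maximum-likelihood estimation, which treats each nondetect as probability mass below its detection limit instead of substituting a fixed value. A minimal sketch for lognormal data; the concentrations and detection limits are fabricated:

        import numpy as np
        from scipy import stats, optimize

        # Fabricated concentrations: detected values, plus the detection
        # limits of the nondetects (the left-censored observations).
        detects = np.array([0.012, 0.034, 0.021, 0.056, 0.018])
        nondetect_dls = np.array([0.010, 0.010, 0.005])

        def neg_loglik(theta):
            mu, log_sigma = theta
            sigma = np.exp(log_sigma)  # keeps sigma positive
            # Detects contribute a density; censored values contribute the
            # probability of falling below their own detection limit.
            ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
            ll += stats.norm.logcdf(np.log(nondetect_dls), mu, sigma).sum()
            return -ll

        res = optimize.minimize(neg_loglik, x0=np.array([np.log(0.02), 0.0]))
        mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
        print("estimated lognormal mean:", np.exp(mu_hat + sigma_hat**2 / 2))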

  11. Who died in the Holocaust? Recovering names from official records.

    PubMed

    Anders, Edward; Dubrovskis, Juris

    2003-01-01

    The names of most Holocaust victims from Central and East Europe have remained unknown, but can be retrieved from official records, as illustrated here for the town of Liepāja, Latvia. Drawing on thirteen different sources, the authors have recovered the names and fates of about 7,000 of the 7,140 Jews once living there. The main source (5,700 names) is a census conducted during August 1941, the second month of the German occupation. Other sources are victims' lists compiled by Yad Vashem and by the Soviet Extraordinary Commission, house books, police and camp records, telephone and business directories, lists of deportees to the USSR, survivor reports, etc. All sources are incomplete and many are error-ridden, but they usually can be reconciled. Coverage of victims from Liepāja thus has increased from 20% to over 95%. For most victims, accurate data are now available on birth dates and places, prewar and wartime addresses, occupations, etc., permitting analysis of demography, survival patterns, and other trends.

  12. Data for Regional Heat flow Studies in and around Japan and its relationship to seismogenic layer

    NASA Astrophysics Data System (ADS)

    Tanaka, A.

    2017-12-01

    Heat flow is a fundamental parameter to constrain the thermal structure of the lithosphere. It also provides a constraint to lithospheric rheology, which is sensitive to temperature. General features of the heat flow distribution in and around Japan had been revealed by the early 1970's, and heat flow data have been continuously updated by further data compilation from mainly published data and investigations. These include additional data, which were not published individually, but were included in site-specific reports. Also, thermal conductivity measurements were conducted on cores from boreholes using a line-source device with a half-space type box probe and an optical scanning device, and previously unpublished thermal conductivities were compiled. It has been more than 10 years since the last published compilation and analysis of heat flow data of Tanaka et al. (2004), which published all of the heat flow data in the northwestern Pacific area (from 0 to 60oN and from 120 to 160oE) and geothermal gradient data in and around Japan. Because these added data and information are drawn from various sources, the updated database is compiled in each datasets: heat flow, geothermal gradient, and thermal conductivity. The updated and improved database represents considerable improvement to past updates and presents an opportunity to revisit the thermal state of the lithosphere along with other geophysical/geochemical constraints on heat flow extrapolation. The spatial distribution of the cut-off depth of shallow seismicity of Japan using relocated hypocentres during the last decade (Omuralieva et al., 2012) and this updated database are used to quantify the concept of temperature as a fundamental parameter for determining the seismogenic thickness.

  13. PACS for Bhutan: a cost effective open source architecture for emerging countries.

    PubMed

    Ratib, Osman; Roduit, Nicolas; Nidup, Dechen; De Geer, Gerard; Rosset, Antoine; Geissbuhler, Antoine

    2016-10-01

    This paper reports the design and implementation of an innovative and cost-effective imaging management infrastructure suitable for radiology centres in emerging countries. It was implemented in the main referring hospital of Bhutan equipped with a CT, an MRI, digital radiology, and a suite of several ultrasound units. They lacked the necessary informatics infrastructure for image archiving and interpretation and needed a system for distribution of images to clinical wards. The solution developed for this project combines several open source software platforms in a robust and versatile archiving and communication system connected to analysis workstations equipped with a FDA-certified version of the highly popular Open-Source software. The whole system was implemented on standard off-the-shelf hardware. The system was installed in three days, and training of the radiologists as well as the technical and IT staff was provided onsite to ensure full ownership of the system by the local team. Radiologists were rapidly capable of reading and interpreting studies on the diagnostic workstations, which had a significant benefit on their workflow and ability to perform diagnostic tasks more efficiently. Furthermore, images were also made available to several clinical units on standard desktop computers through a web-based viewer. • Open source imaging informatics platforms can provide cost-effective alternatives for PACS • Robust and cost-effective open architecture can provide adequate solutions for emerging countries • Imaging informatics is often lacking in hospitals equipped with digital modalities.

  14. Methods for evaluating potential sources of chloride in surface waters and groundwaters of the conterminous United States

    USGS Publications Warehouse

    Granato, Gregory E.; DeSimone, Leslie A.; Barbaro, Jeffrey R.; Jeznach, Lillian C.

    2015-09-04

    Scientists, engineers, regulators, and decisionmakers need information about potential sources of chloride, water and solute budgets, and methods for collecting water-quality data to help identify potential sources. This information is needed to evaluate potential sources of chloride in areas where chloride may have adverse ecological effects or may degrade water supplies used for drinking water, agriculture, or industry. Knowledge of potential sources will help decisionmakers identify the best mitigation measures to reduce the total background chloride load, thereby reducing the potential for water-quality exceedances that occur because of superposition on rising background concentrations. Also, knowledge of potential sources may help decisionmakers identify the potential for the presence of contaminants that have toxic, carcinogenic, mutagenic, or endocrine-disrupting effects at concentrations that are lower by orders of magnitude than the chloride concentrations in the source water. This report is a comprehensive synthesis of relevant information, but it is not the result of an exhaustive search for literature on each topic. The potential adverse effects of chloride on infrastructure and the environment are not discussed in this report because these issues have been extensively documented elsewhere.

  15. Unidentified Flying Objects (UFOs). LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC. Science and Technology Div.

    This guide lists information sources dealing with unidentified flying objects (UFOs). Not meant to be a comprehensive bibliography, this compilation is designed--as the name of the series implies--to put the reader "on target." Included are: (1) subject headings used by the Library of Congress, under which publications on this subject…

  16. Southern Salish Sea Habitat Map Series data catalog

    USGS Publications Warehouse

    Cochrane, Guy R.

    2015-01-01

    This data catalog contains much of the data used to prepare the SIMs in the Southern Salish Sea Habitat Map Series. Other data that were used to prepare the maps were compiled from previously published sources (for example, sediment samples and seismic reflection profiles) and are not included in this data series.

  17. American Revolution: Selections from Secondary School History Books of Other Nations.

    ERIC Educational Resources Information Center

    Barendsen, Robert D., Comp.; And Others

    Selections from the recent history texts of 13 foreign countries are contained in this document as an effort to gather the curricular perceptions of other countries about key events or periods in American history related to the U.S. Revolutionary War. The compilation provides American secondary teachers with contemporary source material not…

  18. Publishing in Educational Psychology Journals: Comments from Editors

    ERIC Educational Resources Information Center

    Nihalani, Priya K.; Mayrath, Michael C.

    2008-01-01

    The following paper emphasizes the importance of publishing and learning how to publish in educational psychology journals. We have compiled a set of recommendations based on advice from editors in the field and several other sources on how to publish. Additionally, this paper provides a step-by-step guide that graduate students and junior faculty…

  19. Emissions from oil and gas operations in the United States and their air quality implications.

    PubMed

    Allen, David T

    2016-06-01

    The energy supply infrastructure in the United States has been changing dramatically over the past decade. Increased production of oil and natural gas, particularly from shale resources using horizontal drilling and hydraulic fracturing, made the United States the world's largest producer of oil and natural gas in 2014. This review examines air quality impacts, specifically, changes in greenhouse gas, criteria air pollutant, and air toxics emissions from oil and gas production activities that are a result of these changes in energy supplies and use. National emission inventories indicate that volatile organic compound (VOC) and nitrogen oxide (NOx) emissions from oil and gas supply chains in the United States have been increasing significantly, whereas emission inventories for greenhouse gases have seen slight declines over the past decade. These emission inventories are based on counts of equipment and operational activities (activity factors), multiplied by average emission factors, and therefore are subject to uncertainties in these factors. Although uncertainties associated with activity data and missing emission source types can be significant, multiple recent measurement studies indicate that the greatest uncertainties are associated with emission factors. In many source categories, small groups of devices or sites, referred to as super-emitters, contribute a large fraction of emissions. When super-emitters are accounted for, multiple measurement approaches, at multiple scales, produce similar results for estimated emissions. Challenges moving forward include identifying super-emitters and reducing their emission magnitudes. Work done to date suggests that both equipment malfunction and operational practices can be important. Finally, although most of this review focuses on emissions from energy supply infrastructures, the regional air quality implications of some coupled energy production and use scenarios are examined. These case studies suggest that both energy production and use should be considered in assessing air quality implications of changes in energy infrastructures, and that impacts are likely to vary among regions. The energy supply infrastructure in the United States has been changing dramatically over the past decade, leading to changes in emissions from oil and natural gas supply chain sources. In many source categories along these supply chains, small groups of devices or sites, referred to as super-emitters, contribute a large fraction of emissions. Effective emission reductions will require technologies for both identifying super-emitters and reducing their emission magnitudes.
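
    The super-emitter observation is a statement about heavy-tailed emission distributions, and its consequence is easy to quantify once site-level emissions are available. A hedged sketch with a synthetic lognormal population; the dispersion parameter is invented, whereas real skew estimates come from the measurement studies the review discusses:

        import numpy as np

        rng = np.random.default_rng(42)
        # Synthetic site-level emissions; sigma controls tail heaviness.
        emissions = rng.lognormal(mean=0.0, sigma=2.0, size=10_000)

        emissions.sort()
        top5 = emissions[int(0.95 * emissions.size):].sum() / emissions.sum()
        print(f"top 5% of sites contribute {top5:.0%} of total emissions")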

  20. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
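
    The skimming pattern described, reducing hundreds of terabytes of low-relevance monitoring data to the small slice worth modelling, is a one-screen job in Spark. A hedged PySpark sketch; the paths, field names, and threshold are invented placeholders, not CERN's actual schema:

        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("metrics-skim").getOrCreate()

        # Read a large volume of semi-structured monitoring records.
        metrics = spark.read.json("hdfs:///monitoring/2017/*")  # placeholder

        # Keep only the relevant slice: one service's slow operations.
        slow_ops = (metrics
                    .filter(F.col("service") == "storage")   # assumed field
                    .filter(F.col("latency_ms") > 1000)      # assumed field
                    .select("timestamp", "host", "latency_ms"))

        # Persist the skimmed subset in a columnar format for analysis.
        slow_ops.write.mode("overwrite").parquet("hdfs:///analysis/slow_ops")
        spark.stop()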
