DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn
Public opinion becomes increasingly salient in the ex post evaluation stage of large infrastructure projects, which have significant impacts on the environment and society. However, traditional survey methods are inefficient for collecting and assessing public opinion because of its large quantity and diversity. Recently, social media platforms have provided a rich data source for monitoring and assessing public opinion on controversial infrastructure projects. This paper proposes an assessment framework to transform unstructured online public opinion on large infrastructure projects into sentiment and topic indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate online public opinion on the Three Gorges Project using China's largest microblogging site, Weibo. Assessment results present spatial-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments, and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applicable as a methodological strategy for assessing public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity, and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework.
• We revealed spatial-temporal patterns of post intensity and sentiment polarity on the TGP. • We drew implications for a more in-depth understanding of the public opinion on large infrastructure projects.
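The pipeline the abstract describes (crawl posts, score sentiment, extract topics) can be sketched in miniature. The toy lexicon, stopword list, and example posts below are illustrative stand-ins, not the authors' trained models or Weibo data:

```python
from collections import Counter
import re

# Toy sentiment lexicon (illustrative; a real system would use a trained model)
POSITIVE = {"benefit", "clean", "great", "support"}
NEGATIVE = {"flood", "damage", "pollution", "oppose"}

def sentiment_polarity(text):
    """Score a post in [-1, 1] by counting lexicon hits."""
    words = re.findall(r"[a-z]+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def major_topics(posts, k=3, stopwords=frozenset({"the", "a", "of", "and", "is", "will"})):
    """Crude topic indicator: the k most frequent content words across posts."""
    counts = Counter(
        w for p in posts for w in re.findall(r"[a-z]+", p.lower())
        if w not in stopwords
    )
    return [w for w, _ in counts.most_common(k)]

posts = [
    "The dam brings clean power, a great benefit",
    "Flood risk and pollution damage downstream villages",
    "The project will flood farmland",
]
print([sentiment_polarity(p) for p in posts])  # [1.0, -1.0, -1.0]
print(major_topics(posts))
```

A production framework would replace the lexicon with a trained sentiment classifier and the word counts with a topic model such as LDA, but the indicator structure (polarity per post, topics per corpus) is the same.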
EPA Recognizes Excellence and Innovation in Clean Water Infrastructure
Today, the U.S. Environmental Protection Agency recognized 28 clean water infrastructure projects for excellence and innovation within the Clean Water State Revolving Fund (CWSRF) program. Honored projects include large wastewater infrastructure projects.
NASA Astrophysics Data System (ADS)
Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.
2016-01-01
This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private cloud computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private cloud infrastructures based on OpenStack.
Skate Genome Project: Cyber-Enabled Bioinformatics Collaboration
Vincent, J.
2011-01-01
The Skate Genome Project, a pilot project of the North East Cyberinfrastructure Consortium (NECC), aims to produce a draft genome sequence of Leucoraja erinacea, the Little Skate. The pilot project was also designed to develop expertise in large-scale collaborations across the NECC region. An overview of the bioinformatics and infrastructure challenges faced during the first year of the project will be presented. Results to date and lessons learned from the perspective of a bioinformatics core will be highlighted.
A Cloud-based Infrastructure and Architecture for Environmental System Research
NASA Astrophysics Data System (ADS)
Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.
2016-12-01
The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provides a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community. It will provide unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data, and even data submission workflows in a straightforward fashion. The infrastructure will minimize disruption to current project-based data submission workflows, for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. The infrastructure will also eliminate the scalability problems of current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.
Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS
NASA Astrophysics Data System (ADS)
Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.
2015-12-01
Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
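A core building block of the GIS runoff modeling described above is deriving flow paths and flow accumulation from an elevation grid. A minimal D8-style sketch follows; the toy elevation grid is illustrative, and production tools operate on full raster DEMs with pit-filling and sewer-network layers:

```python
# Minimal D8 flow-accumulation sketch on a toy elevation grid.
# Each cell drains to its steepest downhill 8-neighbor; accumulation
# counts how many cells (including itself) drain through each cell.

def d8_downstream(elev):
    rows, cols = len(elev), len(elev[0])
    down = {}
    for r in range(rows):
        for c in range(cols):
            best, drop = None, 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                        d = elev[r][c] - elev[rr][cc]
                        if d > drop:
                            best, drop = (rr, cc), d
            down[(r, c)] = best  # None means pit or outlet
    return down

def flow_accumulation(elev):
    down = d8_downstream(elev)
    acc = {cell: 1 for cell in down}  # each cell contributes itself
    # Process from highest to lowest so upstream cells are finished first.
    for cell in sorted(down, key=lambda rc: -elev[rc[0]][rc[1]]):
        if down[cell] is not None:
            acc[down[cell]] += acc[cell]
    return acc

elev = [
    [5.0, 4.0, 3.0],
    [4.0, 3.0, 2.0],
    [3.0, 2.0, 1.0],
]
acc = flow_accumulation(elev)
print(acc[(2, 2)])  # 9: the lowest corner collects all nine cells
```

Hydraulic loading at a sewer inlet or green-infrastructure cell can then be estimated by multiplying its accumulation count by cell area and a runoff coefficient for the rainfall scenario.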
Schweitzer, Peter; Povoroznyuk, Olga; Schiesser, Sigrid
2017-01-01
Abstract Public and academic discourses about the Polar regions typically focus on the so-called natural environment. While these discourses and inquiries continue to be relevant, the current article asks how to conceptualize the ongoing industrial and infrastructural build-up of the Arctic. Acknowledging that the “built environment” is not an invention of modernity, the article nevertheless focuses on large-scale infrastructural projects of the twentieth century, which marks a watershed of industrial and infrastructural development in the north. Given that the Soviet Union was at the vanguard of these developments, the focus will be on Soviet and Russian large-scale projects. We discuss two cases of transportation infrastructure: one based on an ongoing research project being conducted by the authors along the Baikal–Amur Mainline (BAM), and the other focused on the so-called Northern Sea Route, a marine passage with a long history that has recently been regaining public and academic attention. The concluding section argues for increased attention to the interactions between humans and the built environment, serving as a kind of programmatic call for more anthropological attention to infrastructure in the Russian north and other polar regions. PMID:29098112
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
Review of EuCARD project on accelerator infrastructure in Europe
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-01-01
The aim of big infrastructural and research programs (like the pan-European Framework Programs) and of individual projects realized inside these programs is to structure the European Research Area (ERA) so as to be competitive with the leaders of the world. One of these projects is EuCARD (European Coordination of Accelerator Research and Development), whose aim is to structure and modernize accelerator research infrastructure, including accelerators for big free-electron laser machines. This article presents the development of EuCARD that took place between the annual meeting in Warsaw in April 2012 and the SC meeting in Uppsala in December 2012. The background of all these efforts is the achievements of the LHC machine and its associated detectors in the race for new physics. The LHC machine works in the p-p, Pb-p, and Pb-Pb regimes (protons and lead ions). Recently, the LHC discovery of a Higgs-like boson has started vivid debates on the further potential of this machine and its future. The periodic EuCARD conferences, workshops, and meetings concern the building of research infrastructure, including advanced photonic and electronic systems for servicing large high-energy physics experiments. A few basic groups of such systems are debated: measurement and control networks of large geometrical extent, multichannel systems for acquisition of large amounts of metrological data, and precision photonic networks for distribution of reference time, frequency, and phase. The aim of the discussion is not only to summarize the current status but also to make plans and prepare practically for building new infrastructures. Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics, and applications in medicine and industry. Accelerator technology is intensely developed in all developed nations and regions of the world.
The EuCARD project contains many subjects related directly and indirectly to photon physics and photonics, as well as optoelectronics and electronics, and the integration of these with large research infrastructure.
J.D Wickham; Kurt H. Riitters; T.G. Wade; P. Vogt
2010-01-01
Green infrastructure is a popular framework for conservation planning. The main elements of green infrastructure are hubs and links. Hubs tend to be large areas of “natural” vegetation and links tend to be linear features (e.g., streams) that connect hubs. Within the United States, green infrastructure projects can be characterized as: (...
Transportation Infrastructure: Managing the Costs of Large-Dollar Highway Projects
DOT National Transportation Integrated Search
1997-02-01
The General Accounting Office (GAO) was requested to assess the effectiveness of the Federal Highway Administration's (FHWA's) oversight of the costs of large-dollar highway and bridge projects (those with a total estimated cost of over $100 million)...
Comparison of WinSLAMM Modeled Results with Monitored Biofiltration Data
The US EPA’s Green Infrastructure Demonstration project in Kansas City incorporates both small scale individual biofiltration device monitoring, along with large scale watershed monitoring. The test watershed (100 acres) is saturated with green infrastructure components (includin...
Evaluating Green/Gray Infrastructure for CSO/Stormwater Control
The NRMRL is conducting this project to evaluate the water quality and quantity benefits of a large-scale application of green infrastructure (low-impact development/best management practices) retrofits in an entire subcatchment. It will document ORD's effort to demonstrate the e...
Water Infrastructure Finance and Innovation Act
How WIFIA works, program implementation, program guidance, how potential recipients can obtain funding, and project eligibility. WIFIA works with State Revolving Funds to provide subsidized financing for large dollar-value projects.
Mass transit : project management oversight benefits and future funding requirements
DOT National Transportation Integrated Search
2000-09-01
To meet the nation's transportation needs, many states, cities, and localities are building or planning mass transit projects to replace aging infrastructure or add new capacity. These transit projects are very costly and require large investments of...
Transportation of Large Wind Components: A Review of Existing Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mooney, Meghan; Maclaurin, Galen
2016-09-01
This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.
Infrastructure stability surveillance with high resolution InSAR
NASA Astrophysics Data System (ADS)
Balz, Timo; Düring, Ralf
2017-02-01
The construction of new infrastructure in largely unknown and difficult environments, as is necessary for the construction of the New Silk Road, can lead to decreased stability along the construction site, increasing the risk of landslides and of deformation caused by surface motion. This generally requires a thorough pre-analysis and consecutive surveillance of the deformation patterns to ensure the stability and safety of the infrastructure projects. Interferometric SAR (InSAR) and the derived techniques of multi-baseline InSAR are very powerful tools for large-area observation of surface deformation patterns. With InSAR and derived techniques, the topographic height and the surface motion can be estimated for large areas, making them ideal tools for supporting the planning, construction, and safety surveillance of new infrastructure elements in remote areas.
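The basic geometry behind InSAR deformation measurement is compact: a change in interferometric phase maps to line-of-sight displacement as d = lambda * delta_phi / (4*pi), the factor of 4*pi arising because the radar path is two-way. A minimal sketch, assuming a typical C-band wavelength (the default value is illustrative, and sign conventions vary by processor):

```python
import math

def los_displacement(delta_phase_rad, wavelength_m=0.056):
    """Line-of-sight displacement implied by an interferometric phase
    change. Two-way radar path: one full fringe (2*pi) corresponds to
    half a wavelength of motion. Default wavelength is a typical
    C-band value (illustrative)."""
    return wavelength_m * delta_phase_rad / (4 * math.pi)

# One full fringe at C-band is lambda/2 = 28 mm of line-of-sight motion.
print(los_displacement(2 * math.pi) * 1000)  # 28.0 (mm)
```

Multi-baseline techniques fit this relation across stacks of interferograms per scatterer, separating topographic and atmospheric phase from the deformation signal.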
NASA Technical Reports Server (NTRS)
Bush, Harold
1991-01-01
Viewgraphs describing the in-space assembly and construction technology project of the infrastructure operations area of the operations technology program are presented. The objective of the project is to develop and demonstrate an in-space assembly and construction capability for large and/or massive spacecraft. The in-space assembly and construction technology program will support the need to build, in orbit, the full range of spacecraft required for missions to and from planet Earth, including earth-orbiting platforms, lunar transfer vehicles, and Mars transfer vehicles.
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-10-01
Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics, and applications in medicine and industry. The paper presents a digest of research results in the domain of accelerator science and technology in Europe, obtained during the realization of CARE (Coordinated Accelerator R&D) and EuCARD (European Coordination of Accelerator R&D) and during the national annual review meeting of TIARA (Test Infrastructure of European Research Area in Accelerator R&D). The European projects on accelerator technology started in 2003 with CARE. TIARA is a European collaboration on accelerator technology which, by running research, technical, networking, and infrastructural projects, has a duty to integrate the research and technical communities and infrastructures at the European scale. The collaboration gathers all research centers with large accelerator infrastructures; other ones, like universities, are affiliated as associate members. TIARA-PP (preparatory phase) is a European infrastructural project run by this consortium and realized inside EU-FP7. The paper presents a general overview of CARE, EuCARD, and especially TIARA activities, with an introduction containing a portrait of contemporary accelerator technology and a digest of its applications in modern society. CARE, EuCARD, and TIARA activities have integrated the European accelerator community in a very effective way. These projects are very much expected to be continued.
Physicists Get INSPIREd: INSPIRE Project and Grid Applications
NASA Astrophysics Data System (ADS)
Klem, Jukka; Iwaszkiewicz, Jan
2011-12-01
INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.
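Full-text indexing of the kind mentioned above rests on an inverted index mapping terms to the documents that contain them. A toy sketch follows; the documents and functions are illustrative, not INSPIRE records or its actual Invenio implementation:

```python
from collections import defaultdict
import re

def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in set(re.findall(r"[a-z]+", text.lower())):
            index[term].add(doc_id)
    return index

def search(index, query):
    """AND-query: ids of documents containing every query term."""
    terms = re.findall(r"[a-z]+", query.lower())
    if not terms:
        return set()
    result = set(index.get(terms[0], set()))
    for t in terms[1:]:
        result &= index.get(t, set())
    return result

docs = {
    1: "Higgs boson search at the LHC",
    2: "Boson sampling with photons",
    3: "LHC luminosity upgrade",
}
idx = build_index(docs)
print(search(idx, "boson LHC"))  # {1}
```

The grid/cloud angle in the abstract is about scale: building such an index over a million full-text records (plus OCR of scanned articles) parallelizes naturally, with each worker indexing a shard of documents.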
Policy model for space economy infrastructure
NASA Astrophysics Data System (ADS)
Komerath, Narayanan; Nally, James; Zilin Tang, Elizabeth
2007-12-01
Extraterrestrial infrastructure is key to the development of a space economy. Means for accelerating transition from today's isolated projects to a broad-based economy are considered. A large system integration approach is proposed. The beginnings of an economic simulation model are presented, along with examples of how interactions and coordination bring down costs. A global organization focused on space infrastructure and economic expansion is proposed to plan, coordinate, fund and implement infrastructure construction. This entity also opens a way to raise low-cost capital and solve the legal and public policy issues of access to extraterrestrial resources.
NASA Astrophysics Data System (ADS)
Myers, B.; Wiggins, H. V.; Turner-Bogren, E. J.; Warburton, J.
2017-12-01
Project Managers at the Arctic Research Consortium of the U.S. (ARCUS) lead initiatives to convene, communicate with, and connect the Arctic research community across challenging disciplinary, geographic, temporal, and cultural boundaries. They regularly serve as the organizing hubs, archivists and memory-keepers for collaborative projects comprised of many loosely affiliated partners. As leading organizers of large open science meetings and other outreach events, they also monitor the interdisciplinary landscape of community needs, concerns, opportunities, and emerging research directions. However, leveraging the ARCUS Project Manager role to strategically build out the intangible infrastructure necessary to advance Arctic research requires a unique set of knowledge, skills, and experience. Drawing on a range of lessons learned from past and ongoing experiences with collaborative science, education and outreach programming, this presentation will highlight a model of ARCUS project management that we believe works best to support and sustain our community in its long-term effort to conquer the complexities of Arctic research.
NASA Astrophysics Data System (ADS)
McKee, Shawn; Kissel, Ezra; Meekhof, Benjeman; Swany, Martin; Miller, Charles; Gregorowicz, Michael
2017-10-01
We report on the first year of the OSiRIS project (NSF Award #1541335; UM, IU, MSU and WSU), which is targeting the creation of a distributed Ceph storage infrastructure coupled with software-defined networking to provide high-performance access for well-connected locations on any participating campus. The project's goal is to provide a single scalable, distributed storage infrastructure that allows researchers at each campus to read, write, manage and share data directly from their own computing locations. The NSF CC*DNI DIBBs program which funded OSiRIS is seeking solutions to the challenges of multi-institutional collaborations involving large amounts of data, and we are exploring the creative use of Ceph and networking to address those challenges. While OSiRIS will eventually serve a broad range of science domains, its first adopter will be the LHC ATLAS detector project via the ATLAS Great Lakes Tier-2 (AGLT2), jointly located at the University of Michigan and Michigan State University. Part of our presentation will cover how ATLAS is using the OSiRIS infrastructure and our experiences integrating our first user community. The presentation will also review the motivations for and goals of the project, the technical details of the OSiRIS infrastructure, the challenges in providing such an infrastructure, and the technical choices made to address those challenges. We will conclude with our plans for the remaining four years of the project and our vision for what we hope to deliver by the project's end.
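Distributed stores like Ceph locate data with a deterministic placement function rather than a central lookup table (Ceph's is the CRUSH algorithm). The sketch below illustrates the idea with rendezvous hashing, a simpler placement scheme than CRUSH itself; the node names are hypothetical:

```python
import hashlib

def place(object_name, nodes, replicas=3):
    """Rendezvous (highest-random-weight) hashing: rank every node by a
    hash of (object, node) and take the top ones. Any client computes
    the same replica set from the object name alone, with no central
    metadata lookup. Simplified illustration, not Ceph's CRUSH."""
    def weight(node):
        h = hashlib.sha256(f"{object_name}:{node}".encode()).hexdigest()
        return int(h, 16)
    return sorted(nodes, key=weight, reverse=True)[:replicas]

# Hypothetical storage nodes at the participating campuses.
nodes = ["um-storage", "msu-storage", "wsu-storage", "iu-storage"]
print(place("atlas/dataset-001", nodes))
```

A useful property for multi-campus deployments is stability: adding or removing one node only moves the objects that hashed to it, leaving the rest of the placement untouched.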
NASA Astrophysics Data System (ADS)
Calignano, Elisa; Freda, Carmela; Baracchi, Laura
2017-04-01
Women are outnumbered by men in geosciences senior research positions, but what is the situation if we consider large pan-European Research Infrastructures? With this contribution we want to show an analysis of the role of women in the implementation of the European Plate Observing System (EPOS): a planned research infrastructure for European Solid Earth sciences, integrating national and transnational research infrastructures to enable innovative multidisciplinary research. EPOS involves 256 national research infrastructures, 47 partners (universities and research institutes) from 25 European countries and 4 international organizations. The EPOS integrated platform demands significant coordination between diverse solid Earth disciplinary communities, national research infrastructures and the policies and initiatives they drive, geoscientists and information technologists. The EPOS architecture takes into account governance, legal, financial and technical issues and is designed so that the enterprise works as a single, but distributed, sustainable research infrastructure. A solid management structure is vital for the successful implementation and sustainability of EPOS. The internal organization relies on community-specific Working Packages (WPs), Transversal WPs in charge of the overall EPOS integration and implementation, several governing, executive and advisory bodies, a Project Management Office (PMO) and the Project Coordinator. Driven by the timely debate on gender balance and commitment of the European Commission to promote gender equality in research and innovation, we decided to conduct a mapping exercise on a project that crosses European national borders and that brings together diverse geoscience disciplines under one management structure. 
We present an analysis of women's representation in decision-making positions in each EPOS Working Package (WP Leader, proxy, and legal, financial and IT contact persons), in the Boards and Councils, and in the PMO, together with statistics on women's participation based on the project intranet, which counts more than 500 users. The analysis allows us not only to assess the gender balance in decision-making positions in a pan-European research infrastructure, but also to investigate how women's participation varies across different aspects of the project implementation (management, coordination, legal, financial or technical). Most of the women in EPOS are active geoscientists (academic or in national research institutes), or have a scientific background. By interviewing some of them, we also report on how being involved in the project affects their careers. We believe this kind of analysis is an important starting point to promote awareness and achieve gender equality in research and innovation.
Framing the Dialogue: Strategies, Issues and Opportunities
1993-05-01
At issue is the relationship between the declining Federal financing of public works and the Federal interest in providing infrastructure services. Large programs and projects are significant. Improve Infrastructure Management: management improvements closely parallel the issues associated with strategic planning. [Table fragment: GKY & Associates; examination of the linkage between standards and the delivery of goods and services (standards and performance).]
Virtual Civilian Aeromedical Evacuation Sustainment Training Project (V-CAEST)
2015-08-01
evacuation liaison team (AELT), and the mobile aeromedical staging facility (MASF). The content covered in the V-CAEST environment therefore covered the... environment was set up in a large gymnasium building, including a mock military plane and Mobile Aeromedical Staging Facility (MASF) located just... [Remainder is extraction residue from a keyword table: staffing, exam, backhoe, scenarios, infrastructure, interface, tsunami, commander, telecommunication.]
Environmental impacts of dispersed development from federal infrastructure projects.
Southerland, Mark T
2004-06-01
Dispersed development, also referred to as urban growth or sprawl, is a pattern of low-density development spread over previously rural landscapes. Such growth can result in adverse impacts to air quality, water quality, human health, aquatic and terrestrial ecosystems, agricultural land, military training areas, water supply and wastewater treatment, recreational resources, viewscapes, and cultural resources. The U.S. Environmental Protection Agency (U.S. EPA) is charged with protecting public health and the environment, which includes consideration of impacts from dispersed development. Specifically, because federal infrastructure projects can affect the progress of dispersed development, the secondary impacts resulting from it must be assessed in documents prepared under the National Environmental Policy Act (NEPA). The Council on Environmental Quality (CEQ) has oversight for NEPA and Section 309 of the Clean Air Act requires that U.S. EPA review and comment on federal agency NEPA documents. The adverse effects of dispersed development can be induced by federal infrastructure projects including transportation, built infrastructure, modifications in natural infrastructure, public land conversion and redevelopment of properties, construction of federal facilities, and large traffic or major growth generation developments requiring federal permits. This paper presents an approach that U.S. EPA reviewers and NEPA practitioners can use to provide accurate, realistic, and consistent analysis of secondary impacts of dispersed development resulting from federal infrastructure projects. It also presents 24 measures that can be used to mitigate adverse impacts from dispersed development by modifying project location and design, participating in preservation or restoration activities, or informing and supporting local communities in planning.
Front Range Infrastructure Resources Project: water-resources activities
Robson, Stanley G.; Heiny, Janet S.
1998-01-01
Infrastructure, such as roads, buildings, airports, and dams, is built and maintained by use of large quantities of natural resources such as aggregate (sand and gravel), energy, and water. As urban areas expand, local sources of these resources become inaccessible (gravel cannot be mined from under a subdivision, for example), the cost of recovery becomes prohibitive (oil and gas drilling in urban areas is costly), or the resources may become unfit for some use (pollution of ground water may preclude its use as a water supply). Governmental land-use decisions and environmental mandates can further preclude development of natural resources. If infrastructure resources are to remain economically available, current resource information must be available for use in well-reasoned decisions about future land use. Ground water is an infrastructure resource that is present in shallow aquifers and deeper bedrock aquifers that underlie much of the 2,450-square-mile demonstration area of the Colorado Front Range Infrastructure Resources Project. In 1996, mapping of the area's ground-water resources was undertaken as a U.S. Geological Survey project in cooperation with the Colorado Department of Natural Resources, Division of Water Resources, and the Colorado Water Conservation Board.
Front Range Infrastructure Resources Project--Aggregate Resources Activities
1998-01-01
Infrastructure, such as roads, buildings, airports, and dams, is built and maintained by use of large quantities of aggregate—sand, gravel, and stone. As urban areas expand, local sources of these resources become inaccessible. Other competitive land uses have a higher value than aggregate resources. For example, gravel cannot be mined from under a subdivision. The failure to plan for the protection and extraction of infrastructure resources often results in increased consumer cost, environmental damage, and an adversarial relationship between the industry and the community.
Riegman, Peter H J; de Jong, Bas W D; Llombart-Bosch, Antonio
2010-04-01
Today's translational cancer research increasingly depends on international multi-center studies. Biobanking infrastructure or comprehensive sample exchange platforms to enable networking of clinical cancer biobanks are instrumental to facilitate communication, uniform sample quality, and rules for exchange. The Organization of European Cancer Institutes (OECI) Pathobiology Working Group supports European biobanking infrastructure by maintaining the OECI-TuBaFrost exchange platform and organizing regular meetings. This platform originated from a European Commission project and is updated with knowledge from ongoing and new biobanking projects. This overview describes how European biobanking projects that have a large impact on clinical biobanking, including EuroBoNeT, SPIDIA, and BBMRI, contribute to the update of the OECI-TuBaFrost exchange platform. Combining the results of these European projects enabled the creation of an open (upon valid registration only) catalogue view of cancer biobanks and their available samples to initiate research projects. In addition, closed environments supporting active projects could be developed together with the latest views on quality, access rules, ethics, and law. With these contributions, the OECI Pathobiology Working Group contributes to and stimulates a professional attitude within biobanks at the European comprehensive cancer centers. Improving the fundamentals of cancer sample exchange in Europe stimulates the performance of large multi-center studies, yielding results with the desired statistical power. With this approach, future innovation in cancer patient care can be realized faster and more reliably.
Parallel digital forensics infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebrock, Lorie M.; Duggan, David Patrick
2009-10-01
This report documents the architecture and implementation of a Parallel Digital Forensics (PDF) infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets only expected to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics.
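The parallelization idea at the heart of such tools can be illustrated with a small sketch. This is hypothetical and not the project's actual substrate: Python's thread pool stands in for the report's parallel infrastructure, and hashing of in-memory items stands in for analysis of evidence blocks.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def hash_item(data: bytes) -> str:
    # In a real forensics tool each item would be a block of a disk image.
    return hashlib.sha256(data).hexdigest()

def parallel_hash(items, workers=2):
    # Independent hashing tasks fan out across workers; since each task needs
    # nothing from the others, results arrive in input order via map().
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(hash_item, items))
```

A production tool would use processes or a cluster rather than threads, but the structure, many independent analysis tasks dispatched to a pool, is the same.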
Some recent advances of intelligent health monitoring systems for civil infrastructures in HIT
NASA Astrophysics Data System (ADS)
Ou, Jinping
2005-06-01
Intelligent health monitoring systems are increasingly becoming a technique for ensuring the health and safety of civil infrastructures, as well as an important approach for studying the damage accumulation and even disaster-evolution characteristics of civil infrastructures, and they attract prodigious research and development interest from scientists and engineers, since a great number of civil infrastructures are planned and built each year in mainland China. This paper reviews some recent advances in the research, development and implementation of intelligent health monitoring systems for civil infrastructures in mainland China, especially at the Harbin Institute of Technology (HIT), P.R. China. The main contents include smart sensors such as optical fiber Bragg grating (OFBG) and polyvinylidene fluoride (PVDF) sensors, fatigue life gauges, self-sensing mortar and carbon fiber reinforced polymer (CFRP), wireless sensor networks, and their implementation in practical infrastructures such as offshore platform structures, hydraulic engineering structures, large-span bridges and large-space structures. Finally, the related research projects supported by the national foundation agencies of China are briefly introduced.
Yang Yang; Theodore A. Endreny; David J. Nowak
2015-01-01
Impervious land cover was the choice for many urban development projects in order to accelerate runoff and reduce the depth and duration of local flooding; however, this led to increases in downstream runoff characterized by large, flashy peak flows. Urban ecosystem restoration now involves slowing down urban runoff to restore local hydrology with green infrastructure,...
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, that hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
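The "dynamic allocation of resources to any application" can be caricatured with a toy allocator. All names here are invented for illustration; at the actual site this job belongs to the OpenNebula stack, which also weighs priorities, memory and image placement.

```python
def allocate_cores(total_cores, requests):
    """Grant each application's core request in order until capacity runs out.

    requests: list of (application_name, requested_cores) pairs.
    Returns a dict mapping each application to the cores it was granted.
    """
    grants, free = {}, total_cores
    for app, wanted in requests:
        granted = min(wanted, free)   # never grant more than remains free
        grants[app] = granted
        free -= granted
    return grants
```

For example, with 10 cores and requests of 6, 6 and 2, the first application gets its full 6, the second only the remaining 4, and the third nothing until resources are released.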
Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola
2018-05-01
Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
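One of the abstract's efficiency metrics, how well booked resources are actually used, can be sketched as a core-hour ratio. The record layout below is invented for illustration; the published metrics are more detailed.

```python
def booking_efficiency(jobs):
    """Core-hour efficiency: used core-hours divided by booked core-hours.

    jobs: iterable of dicts with 'booked_cores', 'used_cores' and 'hours'.
    Weighting by wall time means long jobs dominate the ratio, as they
    dominate the cluster's actual load.
    """
    booked = sum(j["booked_cores"] * j["hours"] for j in jobs)
    used = sum(j["used_cores"] * j["hours"] for j in jobs)
    return used / booked if booked else 0.0
```

A job that books 8 cores but keeps only 4 busy halves its own efficiency, which is the kind of over-booking the abstract reports as common for NGS workloads.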
2018-01-01
Abstract Background Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792
Joint-operation in water resources project in Indonesia: Integrated or non-integrated
NASA Astrophysics Data System (ADS)
Ophiyandri, Taufika; Istijono, Bambang; Hidayat, Benny
2017-11-01
The construction of large water resources infrastructure projects often involves a joint operation (JO) between two or more construction companies. The forms of JO can be grouped into two categories: an integrated type and a non-integrated type. This paper investigates companies' reasons for forming a JO project. The specific advantages and problems of JO projects are also analysed. To achieve these objectives, three water resources infrastructure projects were selected as case studies. Data were gathered through 11 semi-structured interviews with project owners, contractor managers, and project staff, and were analysed by means of content analysis. It was found that the most fundamental reason to form a JO is to win a competition or tender. An integrated model is favoured because it can reduce overhead costs and has a simple management system, while a non-integrated model is selected because it can avoid a sleeping partner and makes contractors more responsible for their own jobs.
Accelerator science and technology in Europe 2008-2017
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-10-01
European Framework Research Projects have recently contributed substantially to the building of the ERA - the European Research Area. Within this, accelerator technology plays an essential role. Accelerator technology includes large infrastructure and intelligent, modern instrumentation embracing mechatronics, electronics, photonics and ICT. During the realization of the European research and infrastructure project FP6 CARE 2004-2008 (Coordinated Accelerator Research in Europe), concerning the development of large accelerator infrastructure in Europe, it was decided that a scientific editorial series of peer-reviewed monographs from this research area would be published in close relation with the projects. It was a completely new and quite brave idea to combine a strictly research-oriented publication series with a transient project lasting only four or five years; until then, nothing like it had been attempted. The idea turned out to be a real success. The publication series, now known and valued in the accelerator world as the (CERN-WUT) Editorial Series on Accelerator Science and Technology, is successfully continued in already the third European project, EuCARD2, and has logistic guarantees, for the moment, until 2017, when it will reach its first decade. During the realization of the European projects EuCARD (European Coordination for Accelerator R&D) 2009-2013 and TIARA (Test Infrastructure of Accelerator Research Area in Europe), 18 volumes were published in this series. The ambitious plan for the coming years is to publish, hopefully, a few tens of new volumes. Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics and also applications in medicine and industry. The paper presents a digest of the research results in the domain of accelerator science and technology in Europe, published in the monographs of the European Framework Projects (FP) on accelerator technology.
The succession of the CARE, EuCARD and EuCARD2 projects is evidently creating a new quality in European accelerator research. It is consolidating the technical and research communities in a new way, completely different from traditional ones, for example via periodic topical conferences.
Extremely Large Telescope Project Selected in ESFRI Roadmap
NASA Astrophysics Data System (ADS)
2006-10-01
In its first Roadmap, the European Strategy Forum on Research Infrastructures (ESFRI) chose the European Extremely Large Telescope (ELT), for which ESO is presently developing a Reference Design, as one of the large scale projects to be conducted in astronomy, and the only one in optical astronomy. The aim of the ELT project is to build before the end of the next decade an optical/near-infrared telescope with a diameter in the 30-60m range. The ESFRI Roadmap states: "Extremely Large Telescopes are seen world-wide as one of the highest priorities in ground-based astronomy. They will vastly advance astrophysical knowledge allowing detailed studies of inter alia planets around other stars, the first objects in the Universe, super-massive Black Holes, and the nature and distribution of the Dark Matter and Dark Energy which dominate the Universe. The European Extremely Large Telescope project will maintain and reinforce Europe's position at the forefront of astrophysical research." Said Catherine Cesarsky, Director General of ESO: "In 2004, the ESO Council mandated ESO to play a leading role in the development of an ELT for Europe's astronomers. To that end, ESO has undertaken conceptual studies for ELTs and is currently also leading a consortium of European institutes engaged in studying enabling technologies for such a telescope. The inclusion of the ELT in the ESFRI roadmap, together with the comprehensive preparatory work already done, paves the way for the next phase of this exciting project, the design phase." ESO is currently working, in close collaboration with the European astronomical community and the industry, on a baseline design for an Extremely Large Telescope. The plan is a telescope with a primary mirror between 30 and 60 metres in diameter and a financial envelope of about 750 million euros.
It aims at more than a factor ten improvement in overall performance compared to the current leader in ground-based astronomy: the ESO Very Large Telescope at the Paranal Observatory. The draft Baseline Reference Design will be presented to the wider scientific community on 29 - 30 November 2006 at a dedicated ELT Workshop Meeting in Marseille (France) and will subsequently be refined. The design is then to be presented to the ESO Council at the end of 2006. The goal is to start the detailed E-ELT design work by the first half of 2007. Launched in April 2002, the European Strategy Forum on Research Infrastructures was set up following a recommendation of the European Union Council, with the role to support a coherent approach to policy-making on research infrastructures in Europe, and to act as an incubator for international negotiations about concrete initiatives. In particular, ESFRI has prepared a European Roadmap identifying new Research Infrastructure of pan-European interest corresponding to the long term needs of the European research communities, covering all scientific areas, regardless of possible location and likely to be realised in the next 10 to 20 years. The Roadmap was presented on 19 October. It is the result of an intensive two-year consultation and peer review process involving over 1000 high level European and international experts. The Roadmap identifies 35 large scale infrastructure projects, at various stages of development, in seven key research areas including Environmental Sciences; Energy; Materials Sciences; Astrophysics, Astronomy, Particle and Nuclear Physics; Biomedical and Life Sciences; Social Sciences and the Humanities; Computation and Data Treatment.
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images.
The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
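As an analogy only: runtime instrumentation of the kind Dyninst enables for compiled binaries can be mimicked at the language level with a wrapper that records call counts and elapsed time. All names below are invented; this is not the project's API.

```python
import time
from functools import wraps

PROFILE = {}  # function name -> [call_count, total_seconds]

def instrument(fn):
    # Wrap a function so every call updates the profile table; binary
    # instrumentation achieves the same effect by patching executable
    # images in place, without touching source code.
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            entry = PROFILE.setdefault(fn.__name__, [0, 0.0])
            entry[0] += 1
            entry[1] += time.perf_counter() - start
    return wrapper
```

Applying `@instrument` to a function leaves its behaviour unchanged while accumulating the data a front-end tool would later query for bottleneck analysis.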
Consortium for materials development in space interaction with Space Station Freedom
NASA Technical Reports Server (NTRS)
Lundquist, Charles A.; Seaquist, Valerie
1992-01-01
The Consortium for Materials Development in Space (CMDS) is one of seventeen Centers for the Commercial Development of Space (CCDS) sponsored by the Office of Commercial Programs of NASA. The CMDS formed at the University of Alabama in Huntsville in the fall of 1985. The Consortium activities therefore will have progressed for over a decade by the time Space Station Freedom (SSF) begins operation. The topic to be addressed here is: what are the natural, mutually productive relationships between the CMDS and SSF? For management and planning purposes, the Consortium organizes its activities into a number of individual projects. Normally, each project has a team of personnel from industry, university, and often government organizations. This is true for both product-oriented materials projects and for infrastructure projects. For various projects Space Station offers specific mutually productive relationships. First, SSF can provide a site for commercial operations that have evolved as a natural stage in the life cycle of individual projects. Efficiency and associated cost control lead to another important option. With SSF in place, there is the possibility to leave major parts of processing equipment in SSF, and only bring materials to SSF to be processed and return to earth the treated materials. This saves the transportation costs of repeatedly carrying heavy equipment to orbit and back to the ground. Another generic feature of commercial viability can be the general need to accomplish large through-put or large scale operations. The size of SSF lends itself to such needs. Also in addition to processing equipment, some of the other infrastructure capabilities developed in CCDS projects may be applied on SSF to support product activities. The larger SSF program may derive mutual benefits from these infrastructure abilities.
Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale
ERIC Educational Resources Information Center
Roddy, Dermot J.
2008-01-01
An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…
Randhawa, Gurvaneet S
2014-11-01
There are large gaps in our knowledge on the potential impact of diagnostics and therapeutics on outcomes of patients treated in the real world. Comparative effectiveness research aims to fill these gaps to maximize effectiveness of these interventions. Health information technology has the potential to dramatically improve the practice of medicine and of research. This is an overview of about US$100 million of American Recovery and Reinvestment Act investment in 12 projects managed by the Agency for Healthcare Research and Quality to build an electronic clinical data infrastructure that connects research with healthcare delivery. The achievements and lessons learned from these projects provided a foundation for the National Patient-Centered Clinical Research Network (PCORnet)and will help to guide future infrastructure development needed to build an efficient, scalable and sustainable learning health system.
Information about the San Francisco Bay Water Quality Project (SFBWQP) Urban Greening Bay Area, a large-scale effort to re-envision urban landscapes to include green infrastructure (GI) making communities more livable and reducing stormwater runoff.
LAGUNA DESIGN STUDY, Underground infrastructures and engineering
NASA Astrophysics Data System (ADS)
Nuijten, Guido Alexander
2011-07-01
The European Commission has awarded the LAGUNA project a grant of 1.7 million euro for a Design Study from the seventh framework program of research and technology development (FP7-INFRASTRUCTURES - 2007-1) in 2008. The purpose of this two-year work is to study the feasibility of the considered experiments and prepare a conceptual design of the required underground infrastructure. It is due to deliver a report that allows the funding agencies to decide on the realization of the experiment and to select the site and the technology. The result of this work is the first step towards fulfilling the goals of LAGUNA. The work will continue with EU funding to study the possibilities more thoroughly. The LAGUNA project is included in the future plans prepared by European funding organizations (Astroparticle Physics in Europe). It is recommended that a new large European infrastructure be put forward, as a future international multi-purpose facility for improved studies on proton decay and low-energy neutrinos of astrophysical origin. The three detection techniques being studied for such large detectors in Europe, Water-Cherenkov (like MEMPHYS), liquid scintillator (like LENA) and liquid argon (like GLACIER), are evaluated in the context of a common design study which should also address the underground infrastructure and the possibility of an eventual detection of future accelerator neutrino beams. The design study is also to take into account worldwide efforts and converge, on a time scale of 2010, to a common proposal.
Infrastructure Joint Venture Projects in Malaysia: A Preliminary Study
NASA Astrophysics Data System (ADS)
Romeli, Norsyakilah; Muhamad Halil, Faridah; Ismail, Faridah; Sufian Hasim, Muhammad
2018-03-01
As in many developed countries, the function of infrastructure is to connect each region of Malaysia holistically; infrastructure comprises investment network projects such as transportation, water and sewerage, power, communication and irrigation systems. Hence, billions from government income are reserved for infrastructure development. Towards successful infrastructure development, a joint venture approach has been promoted since 2016 in one of the government thrusts in the Construction Industry Transformation Plan, which encourages internationalisation among contractors. However, there is a lack of information on the actual practice of infrastructure joint venture projects in Malaysia. Therefore, this study attempts to explore the real application of joint ventures in Malaysian infrastructure projects. A questionnaire survey was distributed to the targeted respondents. The survey contained three sections: respondent details, organisation background, and project capital in infrastructure joint venture projects. The results were recorded and analysed using SPSS software. The contractors stated that they have implemented the joint venture practice, mostly with the client, and that the usual construction period of the infrastructure projects is more than 5 years. The study also indicates that there are problems in joint venture projects from the perspective of project capital, and that railway infrastructure should be highlighted in future studies due to its high significance in terms of cost and technical issues.
Working towards a European Geological Data Infrastructure
NASA Astrophysics Data System (ADS)
van der Krogt, Rob; Hughes, Richard; Pedersen, Mikael; Serrano, Jean-Jacques; Lee, Kathryn A.; Tulstrup, Jørgen; Robida, François
2013-04-01
The increasing importance of geological information for policy, regulation and business needs at European and international level has been recognized by the European Parliament and the European Commission, who have called for the development of a common European geological knowledge base. The societal relevance of geoscience data/information is clear from many current issues such as shale gas exploration (including environmental impacts), the availability of critical mineral resources in a global economy, management and security with regard to geohazards (seismic, droughts, floods, ground stability), quality of (ground-)water and soil and societal responses to the impacts of climate change. The EGDI-Scope project responds to this, aiming to prepare an implementation plan for a pan-European Geological Data Infrastructure (EGDI), under the umbrella of the FP7 e- Infrastructures program. It is envisaged that the EGDI will build on geological datasets and models currently held by the European Geological Surveys at national and regional levels, and will also provide a platform for datasets generated by the large number of relevant past, ongoing and future European projects which have geological components. With European policy makers and decision makers from (international) industry as the main target groups (followed by research communities and the general public) stakeholder involvement is imperative to the successful realization and continuity of the EGDI. 
With these ambitions in mind, the presentation will focus on the following issues, also based on the first results and experiences of the EGDI-Scope project that started mid-2012: • The organization of stakeholder input and commitment connected to relevant 'use cases' within different thematic domains; a number of stakeholder representatives are currently involved, but the project is open to more extensive participation; • A large number of European projects relevant for data delivery to EGDI have been reviewed; what can we conclude and what is the way forward? • The project has evaluated relevant existing interoperable infrastructures, revealing a typology of infrastructures that may be useful models for the EGDI; • Planning for the EGDI also needs to be integrated with other relevant international initiatives and programs such as GMES, GEO and EPOS, and with legally binding regulations like INSPIRE. The outcomes of these evaluations and activities will contribute to the implementation plan for the EGDI, including the prioritization of relevant datasets and the most important functional, technical (design, use of standards), legal and organizational requirements.
Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-01-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the web-based Galaxy interface. PMID:22700313
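The web API side of such a workflow can be sketched by assembling (not sending) a tool-execution request. The `/api/tools` endpoint and payload fields below follow the general shape of Galaxy's REST API, but they are an assumption here and should be verified against the Galaxy documentation for a given server version.

```python
import json

def build_tool_run_request(base_url, api_key, tool_id, history_id, inputs):
    # Assemble the pieces of a hypothetical Galaxy tool-run call; a client
    # would POST req["data"] to req["url"] with req["params"] attached.
    return {
        "url": f"{base_url.rstrip('/')}/api/tools",
        "params": {"key": api_key},
        "data": json.dumps({
            "tool_id": tool_id,
            "history_id": history_id,
            "inputs": inputs,
        }),
    }
```

Separating request construction from transmission keeps the sketch testable without a live Galaxy server.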
Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-06-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.
The 3D Elevation Program and America's infrastructure
Lukas, Vicki; Carswell, Jr., William J.
2016-11-07
Infrastructure—the physical framework of transportation, energy, communications, water supply, and other systems—and construction management—the overall planning, coordination, and control of a project from beginning to end—are critical to the Nation’s prosperity. The American Society of Civil Engineers has warned that, despite the importance of the Nation’s infrastructure, it is in fair to poor condition and needs sizable and urgent investments to maintain and modernize it, and to ensure that it is sustainable and resilient. Three-dimensional (3D) light detection and ranging (lidar) elevation data provide valuable productivity, safety, and cost-saving benefits to infrastructure improvement projects and associated construction management. By providing data to users, the 3D Elevation Program (3DEP) of the U.S. Geological Survey reduces users’ costs and risks and allows them to concentrate on their mission objectives. 3DEP includes (1) data acquisition partnerships that leverage funding, (2) contracts with experienced private mapping firms, (3) technical expertise, lidar data standards, and specifications, and (4) most important, public access to high-quality 3D elevation data. The size and breadth of improvements for the Nation’s infrastructure and construction management needs call for an efficient, systematic approach to acquiring foundational 3D elevation data. The 3DEP approach to national data coverage will yield large cost savings over individual project-by-project acquisitions and will ensure that data are accessible for other critical applications.
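One reason lidar elevation data delivers the cost savings described above is that engineering products such as slope maps can be derived directly from a digital elevation model (DEM). A minimal sketch of the standard central-difference slope calculation follows; the 3×3 elevation window and 10 m cell size are invented for illustration.

```python
import math

def slope_degrees(dem, cell_size):
    """Slope at the center cell of a 3x3 DEM window, in degrees,
    using central differences (a common GIS formulation)."""
    dz_dx = (dem[1][2] - dem[1][0]) / (2 * cell_size)  # east-west gradient
    dz_dy = (dem[2][1] - dem[0][1]) / (2 * cell_size)  # north-south gradient
    return math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))

# Elevations in meters on a 10 m grid (illustrative values):
# a uniform 1 m rise per 10 m cell from west to east.
window = [[100.0, 101.0, 102.0],
          [100.0, 101.0, 102.0],
          [100.0, 101.0, 102.0]]
print(round(slope_degrees(window, 10.0), 2))  # 5.71
```

Production tools apply the same window calculation across millions of lidar-derived cells; the arithmetic per cell is no more than this.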
The U.S. EPA and other organizations have projected that a large portion of the United States’ aging water conveyance infrastructure will reach the end of its service life in the next several decades. EPA has identified asset management as a critical factor in efficiently addre...
The data access infrastructure of the Wadden Sea Long Term Ecosystem Research (WaLTER) project
NASA Astrophysics Data System (ADS)
De Bruin, T.
2011-12-01
The Wadden Sea, north of The Netherlands, Germany and Denmark, is one of the most important tidal areas in the world. In 2009, the Wadden Sea was added to the UNESCO World Heritage List. The area is noted for its ecological diversity and value, being a stopover for large numbers of migrating birds. The Wadden Sea is also used intensively for economic activities by inhabitants of the surrounding coasts and islands, as well as by the many tourists visiting the area every year. A whole series of monitoring programmes is carried out by a range of governmental bodies and institutes to study the natural processes occurring in the Wadden Sea ecosystems as well as the influence of human activities on those ecosystems. Yet the monitoring programmes are scattered, and it is difficult to get an overview of those monitoring activities or to get access to the data resulting from them. The Wadden Sea Long Term Ecosystem Research (WaLTER) project aims: 1. To provide a base set of consistent, standardized, long-term data on changes in the Wadden Sea ecological and socio-economic system in order to model and understand interrelationships with human use, climate variation and possible other drivers. 2. To provide a research infrastructure, open access to commonly shared databases, educational facilities and one or more field sites in which experimental, innovative and process-driven research can be carried out. This presentation will introduce the WaLTER project and explain its rationale. The presentation will focus on the data access infrastructure which will be used for WaLTER. This infrastructure is part of the existing and operational infrastructure of the National Oceanographic Data Committee (NODC) in the Netherlands. The NODC forms the Dutch node in the European SeaDataNet consortium, which has built a European, distributed data access infrastructure.
WaLTER, NODC and SeaDataNet all use the same technology, developed within the SeaDataNet-project, resulting in a high level of standardization across Europe. Benefits and pitfalls of using this infrastructure will be addressed.
Urban Security Initiative: Earthquake impacts on the urban "system of systems"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maheshwari, S.; Jones, E.; Rasmussen, S.
1999-06-01
This paper discusses how to address the problems of disasters in a large city, through a project titled the Urban Security Initiative undertaken by Los Alamos National Laboratory. The paper first discusses the need to address the problems of disasters in large cities and then provides a framework suitable for addressing this problem. The paper then provides an overview of the module of the project that deals with assessment of earthquake damage to urban infrastructure in large cities and an internet-based approach for consensus building leading to better coordination in the post-disaster period. Finally, the paper discusses the future direction of the project.
Alternative Fuels Data Center: Electric Vehicle Infrastructure Projection Tool (EVI-Pro) Lite
ERIC Educational Resources Information Center
Perz, Stephen G.; Cabrera, Liliana; Carvalho, Lucas Araujo; Castillo, Jorge; Barnes, Grenville
2010-01-01
Recent years have witnessed an expansion in international investment in large-scale infrastructure projects with the goal of achieving global economic integration. We focus on one such project, the Inter-Oceanic Highway in the "MAP" region, a trinational frontier where Bolivia, Brazil, and Peru meet in the southwestern Amazon. We adopt a…
Infrastructure and the Virtual Observatory
NASA Astrophysics Data System (ADS)
Dowler, P.; Gaudet, S.; Schade, D.
2011-07-01
The modern data center is faced with architectural and software engineering challenges that grow along with the challenges facing observatories: massive data flow, distributed computing environments, and distributed teams collaborating on large and small projects. By using VO standards as key components of the infrastructure, projects can take advantage of a decade of intellectual investment by the IVOA community. By their nature, these standards are proven and tested designs that already exist. Adopting VO standards saves considerable design effort, allows projects to take advantage of open-source software and test suites to speed development, and enables the use of third-party tools that understand the VO protocols. The evolving CADC architecture now makes heavy use of VO standards. We show examples of how these standards may be used directly, coupled with non-VO standards, or extended with custom capabilities to solve real problems and provide value to our users. In the end, we use VO services as major parts of the core infrastructure to reduce cost rather than as an extra layer with additional cost, and we can deliver more general-purpose and robust services to our user community.
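A concrete example of what adopting a VO standard buys: any compliant Table Access Protocol (TAP) service accepts the same few request parameters, so one small client works everywhere. The sketch below builds a synchronous TAP query URL per the IVOA TAP specification; the service base URL is a placeholder, not a real endpoint.

```python
from urllib.parse import urlencode

def tap_sync_url(base_url, adql_query):
    """Build a synchronous TAP query URL. Per the IVOA TAP standard,
    the sync endpoint takes REQUEST, LANG, and QUERY parameters."""
    params = {
        "REQUEST": "doQuery",
        "LANG": "ADQL",
        "QUERY": adql_query,
    }
    return base_url.rstrip("/") + "/sync?" + urlencode(params)

url = tap_sync_url("https://example.org/tap",
                   "SELECT TOP 10 * FROM ivoa.obscore")
print(url)
```

Because the parameter names come from the standard rather than from any one data center, the same function could target CADC, or any other TAP-compliant service, by changing only the base URL.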
NASA Astrophysics Data System (ADS)
Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.
2017-12-01
Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme "Collaborative Research Center" (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, a RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) "Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation" (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 "Earth - Evolution at the Dry Limit" (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data at field campaigns or by modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016.
Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download, statistics as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.
NASA Astrophysics Data System (ADS)
Meertens, C. M.; Boler, F. M.; Ertz, D. J.; Mencin, D.; Phillips, D.; Baker, S.
2017-12-01
UNAVCO, in its role as an NSF facility for geodetic infrastructure and data, has succeeded for over two decades using on-premises infrastructure, and while the promise of cloud-based infrastructure is well established, significant questions about the suitability of such infrastructure for facility-scale services remain. Primarily through the GeoSciCloud award from NSF EarthCube, UNAVCO is investigating the costs, advantages, and disadvantages of providing its geodetic data and services in the cloud versus using UNAVCO's on-premises infrastructure (IRIS is a collaborator on the project and is performing its own suite of investigations). In contrast to the 2-3 year time scale of the research cycle, the time scale of operation and planning for NSF facilities is a minimum of five years and for some services extends to a decade or more. Planning for on-premises infrastructure is deliberate, and migrations typically take months to years to fully implement. Migrations to a cloud environment can only go forward with similarly deliberate planning and understanding of all costs and benefits. The EarthCube GeoSciCloud project is intended to address the uncertainties of facility-level operations in the cloud. Investigations are being performed in a commercial cloud environment (Amazon AWS) during the first year of the project and in a private cloud environment (NSF XSEDE resource at the Texas Advanced Computing Center) during the second year. These investigations are expected to illuminate the potential as well as the limitations of running facility-scale production services in the cloud.
The work includes running parallel equivalent cloud-based services to on premises services and includes: data serving via ftp from a large data store, operation of a metadata database, production scale processing of multiple months of geodetic data, web services delivery of quality checked data and products, large-scale compute services for event post-processing, and serving real time data from a network of 700-plus GPS stations. The evaluation is based on a suite of metrics that we have developed to elucidate the effectiveness of cloud-based services in price, performance, and management. Services are currently running in AWS and evaluation is underway.
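A first-order version of the price comparison described above can be written down directly: monthly cost as storage plus egress plus compute. All rates and usage figures below are placeholder numbers for illustration, not actual AWS, XSEDE, or UNAVCO figures.

```python
def monthly_cost(storage_gb, egress_gb, compute_hours,
                 storage_rate, egress_rate, compute_rate):
    """First-order monthly cost model for a deployment (USD):
    storage + data egress + compute, each at a flat rate."""
    return (storage_gb * storage_rate
            + egress_gb * egress_rate
            + compute_hours * compute_rate)

# Placeholder rates (USD): $0.02/GB-month storage, $0.09/GB egress,
# $0.10 per compute hour; usage figures are likewise invented.
cloud = monthly_cost(50_000, 10_000, 2_000, 0.02, 0.09, 0.10)
print(round(cloud, 2))  # 2100.0
```

A real evaluation like GeoSciCloud's also has to weigh management effort and performance, which is why the project's metrics go beyond raw price, but a model of this shape is where the comparison starts.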
Review of CERN Data Centre Infrastructure
NASA Astrophysics Data System (ADS)
Andrade, P.; Bell, T.; van Eldik, J.; McCance, G.; Panzer-Steindel, B.; Coelho dos Santos, M.; Traylen, S.; Schwickerath, U.
2012-12-01
The CERN Data Centre is reviewing strategies for optimizing the use of the existing infrastructure and expanding to a new data centre by studying how other large sites are being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote data centres. This paper gives the details on the project's motivations, current status and areas for future investigation.
Surface infrastructure : cost, financing and schedules for large-dollar transportation projects
DOT National Transportation Integrated Search
1998-02-01
In fiscal year 1998, the federal government will distribute nearly $26 billion to states and localities for the construction and repair of the nation's surface transportation systems. To meet the nation's transportation needs, states and localities a...
NASA Astrophysics Data System (ADS)
Hoffmann, Carsten; Schulz, Sina; Svoboda, Nikolai; Zoarder, Muqit; Eberhardt, Einar; Russell, David; Heinrich, Uwe
2017-04-01
Within the research project BonaRes ("Soil as a sustainable resource for the bioeconomy"), an infrastructure is being developed to upload, manage, store, and provide the increasing amount of soil and agricultural research data, raw data, and metadata in Germany. Large joint research projects such as BonaRes require rules for data handling. The application and designation of standards, standard methods and widely disseminated and accepted data formats for all stages of the data life cycle (from acquisition to provision) brings a number of advantages for data providers, managers and users. Standards enable, e.g., easy data exchange and provision for data re-use and communication with other disciplines, and improve the visibility and accessibility of research activities and results. To harmonize national with international data infrastructures, standards used in the scope of BonaRes should either meet international requirements or be transformable by derivation tools. In the first project phase an overview of standards was compiled, including more than 600 relevant norms, directives, exchange formats and code lists. With the collaboration of an international expert consortium we then developed a "Recommendation list Standards" for all project partners and other soil/agricultural data providers. We present and discuss selected recommendations and possible implementations of standards to be used in the BonaRes data infrastructure for data acquisition (e.g. soil description, agronomy), data management (e.g. exchange languages, derivation tools), and data provision (e.g. licenses, geo-data services).
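In practice, "meeting international requirements" for metadata comes down to checks like the one sketched below: does a record carry the fields a standard mandates? The field list is illustrative, loosely modeled on DataCite-style mandatory elements, and is not the actual BonaRes schema.

```python
# Illustrative mandatory fields (loosely DataCite-style, not BonaRes').
REQUIRED = ("identifier", "creator", "title", "publisher", "publication_year")

def missing_fields(record, required=REQUIRED):
    """Return the mandatory metadata fields absent or empty in a record."""
    return [f for f in required if not record.get(f)]

record = {"identifier": "doi:10.0000/example",
          "title": "Soil moisture, plot A",
          "creator": "Doe, J.",
          "publisher": ""}
print(missing_fields(record))  # ['publisher', 'publication_year']
```

A data repository would run such a check at upload time, rejecting or flagging records before they enter the shared infrastructure.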
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sperling, Joshua; Fisher, Stephen; Reiner, Mark B.
The term 'leapfrogging' has been applied to cities and nations that have adopted a new form of infrastructure by bypassing the traditional progression of development, e.g., from no phones to cell phones, bypassing landlines altogether. However, leapfrogging from unreliable infrastructure systems to 'smart' cities is too large a jump, resulting in unsustainable and unhealthy infrastructure systems. In the Global South, a baseline of unreliable infrastructure is a prevalent problem. The push for sustainable and 'smart' [re]development tends to ignore many of those already living with failing, unreliable infrastructure. Without awareness of baseline conditions, uninformed projects run the risk of returning conditions to the status quo, keeping many urban populations below targets of the United Nations' Sustainable Development Goals. A key part of understanding the baseline is to identify how citizens have long learned to adjust their expectations of basic services. To compensate for poor infrastructure, most residents in the Global South invest in remedial secondary infrastructure (RSI) at the household and business levels. The authors explore three key 'smart' city transformations that address RSI within a hierarchical planning pyramid known as the comprehensive resilient and reliable infrastructure systems (CRISP) planning framework.
ERIC Educational Resources Information Center
Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen
2010-01-01
Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…
The GILDA t-Infrastructure: grid training activities in Africa and future opportunities
NASA Astrophysics Data System (ADS)
Ardizzone, V.; Barbera, R.; Ciuffo, L.; Giorgio, E.
2009-04-01
Scientists, educators, and students from many parts of the world are not able to take advantage of ICT because the digital divide is growing and prevents less developed countries from exploiting its benefits. Instead of becoming more empowered and involved in worldwide developments, they are becoming increasingly marginalised as the world of education and science becomes increasingly Internet-dependent. For almost five years, the Grid INFN Laboratory for Dissemination Activities (GILDA) has spread awareness of Grid technology to a large audience, training new communities and fostering new organisations to provide resources. The knowledge dissemination process guided by the training activities is a key factor in ensuring that all users can fully understand the characteristics of the Grid services offered by large existing e-Infrastructures. GILDA is becoming a "de facto" standard among training infrastructures (t-Infrastructures) and is adopted by many grid projects worldwide. In this contribution we report on the latest status of GILDA services and on the training activities recently carried out in sub-Saharan Africa (Malawi and South Africa). Particular care will be devoted to showing how GILDA can be "cloned" to satisfy both the education and research demands of African organisations. The opportunities to benefit from GILDA in the framework of the EPIKH project, as well as the plans of the European Commission on grid training and education for the 2010-2011 calls of its 7th Framework Programme, will be presented and discussed.
MFC Communications Infrastructure Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Cannon; Terry Barney; Gary Cook
2012-01-01
Unprecedented growth in required telecommunications services and applications is changing the way the INL does business today. High-speed connectivity combined with a high demand for telephony and network services requires a robust communications infrastructure. The current state of the MFC communication infrastructure limits growth opportunities for current and future communication infrastructure services. This limitation is largely due to equipment capacity issues, aging cabling infrastructure (external/internal fiber and copper cable) and inadequate space for telecommunication equipment. While some communication infrastructure improvements have been implemented over time, they have been completed without a clear overall plan and technology standard. This document identifies critical deficiencies in the current state of the communication infrastructure in operation at the MFC facilities and provides an analysis of the needs and deficiencies to be addressed in order to achieve target architectural standards as defined in STD-170. The intent of STD-170 is to provide a robust, flexible, long-term solution that aligns communications capabilities with the INL mission and fits the various programmatic growth and expansion needs.
NASA Astrophysics Data System (ADS)
Shirley, Rebekah Grace
This dissertation focuses on an integration of energy modeling tools to explore energy transition pathways for emerging economies. The spate of growth in the global South has led to a global energy transition, evidenced in part by a surge in the development of large-scale energy infrastructure projects for the provision of reliable electricity service. The rationale of energy security and exigency often ushers these large-scale projects through to implementation with minimal analysis of costs: social and environmental impact, ecological risk, or opportunity costs of alternative energy transition pathways foregone. Furthermore, development of energy infrastructure is inherently characterized by the involvement of a number of state and non-state actors, with varying interests, objectives and access to authority. Being woven through and into social institutions necessarily impacts the design, control and functionality of infrastructure. In this dissertation I therefore conceptualize energy infrastructure as lying at the intersection, or nexus, of people, the environment and energy security. I argue that energy infrastructure plans and policy should, and can, be informed by each of these fields of influence in order to appropriately satisfy local development needs. This case study explores the socio-techno-environmental context of contemporary mega-dam development in northern Borneo. I describe the key actors of an ongoing mega-dam debate and the constellation of their interaction. This highlights the role that information may play in public discourse and lends insight into how inertia in the established system may stymie technological evolution. I then use a combination of power system simulation, ecological modeling and spatial analysis to analyze the potential for, and costs and tradeoffs of, future energy scenarios. In this way I demonstrate reproducible methods that can support energy infrastructure decision making by directly addressing data limitation barriers.
I offer a platform for integrated analysis that considers cost perspectives across the nexus. The management of energy transitions is a growing field, critically important to low carbon futures. With the broader implications of my study I hope to contribute to a paradigm shift away from the dominant large-scale energy infrastructure as a means of energy security discourse, to a more encompassing security agenda that considers distributed and localized solutions.
Colling, D.; Britton, D.; Gordon, J.; Lloyd, S.; Doyle, A.; Gronbech, P.; Coles, J.; Sansum, A.; Patrick, G.; Jones, R.; Middleton, R.; Kelsey, D.; Cass, A.; Geddes, N.; Clark, P.; Barnby, L.
2013-01-01
The Large Hadron Collider (LHC) is one of the greatest scientific endeavours to date. The construction of the collider itself and the experiments that collect data from it represent a huge investment, both financially and in terms of human effort, in our hope to understand the way the Universe works at a deeper level. Yet the volumes of data produced are so large that they cannot be analysed at any single computing centre. Instead, the experiments have all adopted distributed computing models based on the LHC Computing Grid. Without the correct functioning of this grid infrastructure the experiments would not be able to understand the data that they have collected. Within the UK, the Grid infrastructure needed by the experiments is provided by the GridPP project. We report on the operations, performance and contributions made to the experiments by the GridPP project during the years of 2010 and 2011—the first two significant years of the running of the LHC. PMID:23230163
Green Infrastructure Checklists and Renderings
Materials and checklists for Denver, CO, to review development project plans for green infrastructure components, and best practices for inspecting and maintaining installed green infrastructure. Also includes renderings of streetscape projects.
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.
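The kind of interactive exploration GABBs enables typically starts with a spatial filter over point data. A minimal sketch of a bounding-box query follows; the station coordinates and the box are invented for the example, and real GABBs tools of course operate through the HUBzero web framework rather than a local list.

```python
def in_bbox(points, min_lon, min_lat, max_lon, max_lat):
    """Return the (lon, lat, label) points falling inside a bounding box."""
    return [p for p in points
            if min_lon <= p[0] <= max_lon and min_lat <= p[1] <= max_lat]

# Illustrative point data: (longitude, latitude, label).
stations = [(-86.9, 40.4, "Purdue"),
            (-87.6, 41.9, "Chicago"),
            (-105.3, 40.0, "Boulder")]

# Bounding box roughly covering Indiana/Illinois.
print(in_bbox(stations, -89.0, 37.0, -84.0, 43.0))
```

Spatial data infrastructures layer indexing (e.g., R-trees) and map rendering on top, but the core question a user asks through the map interface reduces to this predicate.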
2005-05-01
…Standish Group 1995a; 1995b). In general, the risk of failure for large software projects is significantly greater than for small projects (Humphrey…). …learning, geographical dispersion, and team experience. Various weighting schemes can be developed and applied to these parameters. [Garbled table residue listing project data systems: Fadtool, DbCAS/WebCAS, COPS, MDMS, PADDS, EDA, CAPS.]
Infrastructure for Training and Partnershipes: California Water and Coastal Ocean Resources
NASA Technical Reports Server (NTRS)
Siegel, David A.; Dozier, Jeffrey; Gautier, Catherine; Davis, Frank; Dickey, Tommy; Dunne, Thomas; Frew, James; Keller, Arturo; MacIntyre, Sally; Melack, John
2000-01-01
The purpose of this project was to advance the existing ICESS/Bren School computing infrastructure to allow scientists, students, and research trainees the opportunity to interact with environmental data and simulations in near-real time. Improvements made with the funding from this project have helped to strengthen the research efforts within both units, fostered graduate research training, and helped fortify partnerships with government and industry. With this funding, we were able to expand our computational environment in which computer resources, software, and data sets are shared by ICESS/Bren School faculty researchers in all areas of Earth system science. All of the graduate and undergraduate students associated with the Donald Bren School of Environmental Science and Management and the Institute for Computational Earth System Science have benefited from the infrastructure upgrades accomplished by this project. Additionally, the upgrades fostered a significant number of research projects (attached is a list of the projects that benefited from the upgrades). As originally proposed, funding for this project provided the following infrastructure upgrades: 1) a modern file management system capable of interoperating UNIX and NT file systems that can scale to 6.7 TB, 2) a Qualstar 40-slot tape library with two AIT tape drives and Legato Networker backup/archive software, 3) previously unavailable import/export capability for data sets on Zip, Jaz, DAT, 8mm, CD, and DLT media in addition to a 622 Mb/s Internet 2 connection, 4) network switches capable of 100 Mbps to 128 desktop workstations, 5) a Portable Batch System (PBS) computational task scheduler, and 6) two Compaq/Digital Alpha XP1000 compute servers, each with 1.5 GB of RAM, along with an SGI Origin 2000 (purchased partially using funds from this project along with funding from various other sources) to be used for very large computations, as required for simulation of mesoscale meteorology or climate.
NASA Astrophysics Data System (ADS)
Low, W. W.; Wong, K. S.; Lee, J. L.
2018-04-01
With the growth of economy and population, there is an increase in infrastructure construction projects. As such, it is unavoidable to have construction projects on soft soil. Without a proper risk management plan, construction projects are vulnerable to different types of risks which will have a negative impact on a project's time, cost and quality. A literature review showed that little or no research has focused on risk assessment for infrastructure projects on soft soil. Hence, the aim of this research is to propose a risk assessment framework for infrastructure projects on soft soil during the construction stage. This research focused on the impact of risks on project time and on internal risk factors. The research method was the Analytical Hierarchy Process, and the sample population was experienced industry experts with experience in infrastructure projects. Analysis showed that, for internal factors, the five most significant risks for the time element are lack of special equipment, potential contractual disputes and claims, shortage of skilled workers, delay/lack of materials supply, and insolvency of contractor/sub-contractor. Results indicated that resource risk factors play a critical role in the project time frame of infrastructure projects on soft soil during the construction stage.
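The Analytical Hierarchy Process used in studies like the one above derives risk weights from pairwise comparison matrices. A minimal sketch using the column-normalization approximation of the principal eigenvector follows; the 3×3 judgments are invented for illustration and are not this study's data.

```python
def ahp_weights(matrix):
    """Approximate AHP priority weights: normalize each column of the
    pairwise comparison matrix, then average across each row (the
    standard approximation of the principal eigenvector)."""
    n = len(matrix)
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]
    return [sum(matrix[r][c] / col_sums[c] for c in range(n)) / n
            for r in range(n)]

# Illustrative pairwise judgments for three risk factors on Saaty's scale:
# factor 0 judged 3x as important as factor 1, 5x as important as factor 2.
judgments = [[1.0, 3.0, 5.0],
             [1/3, 1.0, 3.0],
             [1/5, 1/3, 1.0]]
weights = ahp_weights(judgments)
print([round(w, 3) for w in weights])
```

A full AHP analysis would also compute the consistency ratio of each judgment matrix before trusting the weights; that check is omitted here for brevity.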
Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf
2014-01-01
CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB PMID:25281234
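The comparative filtering canvasDB performs, finding variants present in one group of samples but absent in another, can be sketched in a few lines. The variant records below are invented, and canvasDB itself works through R commands against a SQL backend rather than Python sets; this is only the set logic.

```python
def candidate_variants(calls, cases, controls):
    """Return variant IDs called in every case sample and in no control.

    calls: dict mapping sample name -> set of variant IDs called there.
    """
    in_all_cases = set.intersection(*(calls[s] for s in cases))
    in_any_control = set.union(*(calls[s] for s in controls))
    return in_all_cases - in_any_control

# Illustrative per-sample variant calls (chrom:pos ref>alt).
calls = {
    "case1":    {"chr1:1000A>G", "chr2:500C>T", "chr7:42G>A"},
    "case2":    {"chr1:1000A>G", "chr7:42G>A"},
    "control1": {"chr2:500C>T"},
    "control2": {"chr3:77T>C"},
}
print(candidate_variants(calls, ["case1", "case2"], ["control1", "control2"]))
```

On real whole-genome data the sets hold millions of entries per sample, which is why canvasDB pushes this operation into an indexed database instead of in-memory sets, but the comparison being computed is the same.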
INTEGRATED WASTE AND WATER MANAGEMENT PROJECT (IWWMP) – BATANGAS, PHILIPPINES
Mass evacuations of rural residents in the Philippines to large urban areas overburden an already strained infrastructure. There have been investments by non-profit groups to develop housing to attract the evacuees back to their regions. However, there remains a great need for...
DOT National Transportation Integrated Search
2011-02-01
An understanding of traffic flow in time and space is fundamental to the development of strategies for the efficient use of the existing transportation infrastructure in large metropolitan areas. Thus, this project involved developing the methods...
DOT National Transportation Integrated Search
1999-11-01
The FAST-TRAC (Faster and Safer Travel through Traffic Routing and Advanced Controls) Operational Field Test (OFT) is an Intelligent Transportation Systems (ITS) project being conducted in Southeast Michigan, largely within Oakland County. The projec...
Landlord project multi-year program plan, fiscal year 1999, WBS 1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dallas, M.D.
The MYWP technical baseline describes the work to be accomplished by the Project and the technical standards which govern that work. The mission of the Landlord Project is to provide maintenance and replacement of general infrastructure facilities and systems to facilitate the Hanford Site cleanup mission. Once an infrastructure facility or system is no longer needed, the Landlord Project transitions it to final closure/removal through excess, salvage, or demolition. Landlord Project activities will be performed in an environmentally sound, safe, economical, prudent, and reliable manner. The Landlord Project consists of the following facility systems: steam, water, liquid sanitary waste, electrical distribution, telecommunications, sanitary landfill, emergency services, general purpose offices, general purpose shops, general purpose warehouses, environmental support facilities, roads, railroads, and the site land. The objectives for general infrastructure support are reflected in two specific areas: (1) Core Infrastructure Maintenance, and (2) Infrastructure Risk Mitigation.
Vision for a 21st Century Information Infrastructure.
ERIC Educational Resources Information Center
Council on Competitiveness, Washington, DC.
In order to ensure that the United States maintains an advanced information infrastructure, the Council on Competitiveness has started a project on the 21st century infrastructure. Participating in this project are the many different parties who are providing and using the infrastructure, including cable companies, regional Bell companies, long…
Castro, Marcia C; Krieger, Gary R; Balge, Marci Z; Tanner, Marcel; Utzinger, Jürg; Whittaker, Maxine; Singer, Burton H
2016-12-20
Large-scale corporate projects, particularly those in extractive industries or hydropower development, have a history from early in the twentieth century of creating negative environmental, social, and health impacts on communities proximal to their operations. In many instances, especially for hydropower projects, the forced resettlement of entire communities was a feature in which local cultures and core human rights were severely impacted. These projects triggered an activist opposition that progressively expanded and became influential at both the host community level and with multilateral financial institutions. In parallel to, and spurred by, this activism, a shift occurred in 1969 with the passage of the National Environmental Policy Act in the United States, which required Environmental Impact Assessment (EIA) for certain types of industrial and infrastructure projects. Over the last four decades, there has been a global movement to develop a formal legal/regulatory EIA process for large industrial and infrastructure projects. In addition, social, health, and human rights impact assessments, with associated mitigation plans, were sequentially initiated and have increasingly influenced project design and relations among companies, host governments, and locally impacted communities. Often, beneficial community-level social, economic, and health programs have voluntarily been put in place by companies. These flagship programs can serve as benchmarks for community-corporate-government partnerships in the future. Here, we present examples of such positive phenomena and also focus attention on a myriad of challenges that still lie ahead.
Critical success factors in infrastructure projects
NASA Astrophysics Data System (ADS)
Zakaria, Siti Fairus; Zin, Rosli Mohamad; Mohamad, Ismail; Balubaid, Saeed; Mydin, Shaik Hussein; Mohd Rahim, E. M. Roodienyanto
2017-11-01
Construction of infrastructure projects differs from that of buildings. The main difference is the project site: an infrastructure project typically has to command a long stretch of land, while a building is mostly confined to a limited area. As such, factors that are critical to infrastructure projects may not be significant to building projects, and vice versa. Flood mitigation works can be classified as infrastructure projects; their development is planned by the government with the specific objective of reducing or avoiding the negative effects of floods on the environment and livelihoods. One indicator of project success is delay. The impact of project delay in the construction industry is so significant that it decelerates project implementation, particularly for government projects. This study attempted to identify and compare the success factors between infrastructure and building projects, as such a comparison is rarely found in the current literature. A model of flood mitigation projects' success factors was developed by merging experts' views with reports from the existing literature. The experts' views were obtained from responses to open-ended questions on the fundamentals required to achieve successful completion of flood mitigation projects. An affinity analysis was applied to these responses to develop the model. The developed model was then compared to the established success factors found in building projects, extracted from previous studies, to identify the similarities and differences between the two models. This study would assist the government and construction players in becoming more effective at delivering successful flood mitigation projects, for future practice in a flood-prone country like Malaysia.
NASA Astrophysics Data System (ADS)
Sucipto, Katoningsih, Sri; Ratnaningrum, Anggry
2017-03-01
With a large number of schools and many components of school infrastructure to support from limited funds, school infrastructure development cannot be carried out simultaneously; implementation must be prioritized according to need. The first step is to record all existing needs and identify the condition of the school infrastructure, so that the recorded data are valid and cover all of the school's infrastructure needs. SIPIS is very helpful in recording all of these needs. Next, projections of school development are made, from student enrollment through human resources, and the needs are ordered by their level of importance, with the most important addressed first. By using SIPIS, all of this can be arranged correctly, so that decisions about what to construct first are not driven by like-and-dislike factors. Finally, funds are allocated in detail, so that the budget requested matches the actual demand.
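The prioritization logic described above (order needs by importance, then fund them while the budget lasts) can be sketched as a simple greedy allocation. The data model and figures below are invented for illustration; the abstract does not describe SIPIS's actual implementation:

```python
# Hedged sketch of priority-based funding under a limited budget.
def allocate(needs, budget):
    """Order needs by importance (1 = most important) and fund them greedily."""
    funded = []
    remaining = budget
    for name, importance, cost in sorted(needs, key=lambda n: n[1]):
        if cost <= remaining:          # fund only what the budget still covers
            funded.append(name)
            remaining -= cost
    return funded, remaining

needs = [
    ("paint classrooms", 3, 40),   # (item, importance rank, cost)
    ("repair roof",      1, 120),
    ("new toilets",      2, 80),
]
print(allocate(needs, 210))  # roof and toilets fit; painting must wait
```

The point of recording every need first, as the abstract stresses, is that a ranking like this is only meaningful when the list of candidates is complete and valid.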
He, Guizhen; Zhang, Lei; Lu, Yonglong
2009-09-01
Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.
A framework to support human factors of automation in railway intelligent infrastructure.
Dadashi, Nastaran; Wilson, John R; Golightly, David; Sharples, Sarah
2014-01-01
Technological and organisational advances have increased the potential for remote access and proactive monitoring of the infrastructure in various domains and sectors: water and sewage, oil and gas, and transport. Intelligent Infrastructure (II) is an architecture that potentially enables the generation of timely and relevant information about the state of any type of infrastructure asset, providing a basis for reliable decision-making. This paper reports an exploratory study to understand the concepts and human factors associated with II in the railway, largely drawing from structured interviews with key industry decision-makers and attachment to pilot projects. Outputs from the study include a data-processing framework defining the key human factors at different levels of the data structure within a railway II system and a system-level representation. The framework and other study findings will form a basis for human factors contributions to systems design elements such as information interfaces and role specifications.
DOT National Transportation Integrated Search
2012-12-01
Fully operational highways are necessary for efficient freight movements by the trucking industry. Yet, the combination of limited funding and aging infrastructure creates a grim scenario for states, which are dependent upon the economic benefits of ...
High Fidelity Simulations of Large-Scale Wireless Networks (Plus-Up)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onunkwo, Uzoma
Sandia has built a strong reputation in scalable network simulation and emulation for cyber security studies to protect our nation's critical information infrastructures. Georgia Tech has a preeminent reputation in academia for excellence in scalable discrete event simulations, with a strong emphasis on simulating cyber networks. Many of the experts in this field, such as Dr. Richard Fujimoto, Dr. George Riley, and Dr. Chris Carothers, have strong affiliations with Georgia Tech. The collaborative relationship that we intend to immediately pursue is in high fidelity simulations of practical large-scale wireless networks using the ns-3 simulator, via Dr. George Riley. This project will have mutual benefits in bolstering both institutions' expertise and reputation in the field of scalable simulation for cyber-security studies. This project promises to address high fidelity simulations of large-scale wireless networks. The proposed collaboration is directly in line with Georgia Tech's goals for developing and expanding the Communications Systems Center, the Georgia Tech Broadband Institute, and the Georgia Tech Information Security Center, along with its yearly Emerging Cyber Threats Report. At Sandia, this work benefits the defense systems and assessment area, with promise for large-scale assessment of the cyber security needs and vulnerabilities of our nation's critical cyber infrastructures exposed to wireless communications.
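At the core of scalable network simulators such as ns-3 is a discrete event simulation loop: a time-ordered queue of pending events, executed in timestamp order. The following is a minimal, generic kernel sketch of that technique, not ns-3's API; the event names and the 1.5 time-unit propagation delay are invented:

```python
import heapq

# Minimal discrete-event simulation kernel: a priority queue of timed events.
class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker so events at equal times keep insertion order

    def schedule(self, delay, fn, *args):
        heapq.heappush(self._queue, (self.now + delay, self._seq, fn, args))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, fn, args = heapq.heappop(self._queue)
            fn(*args)

sim = Simulator()
log = []

def send(pkt):
    log.append((sim.now, "send", pkt))
    sim.schedule(1.5, recv, pkt)  # model a 1.5 time-unit propagation delay

def recv(pkt):
    log.append((sim.now, "recv", pkt))

sim.schedule(0.0, send, "hello")
sim.schedule(2.0, send, "world")
sim.run()
print(log)
```

Scaling this loop to large wireless networks, the subject of the project above, is precisely what requires the parallel and distributed event-scheduling research the collaboration targets; a single sequential queue like this one becomes the bottleneck.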
Public-private partnerships potential for Arizona-Mexico border infrastructure projects.
DOT National Transportation Integrated Search
2009-09-01
This study of the Public-Private Partnership Potential for Arizona-Mexico Border Infrastructure Projects originated as an action item of the Transportation, Infrastructure, and Ports Committee of the Arizona-Mexico Commission. The purpose of th...
Battery Electric Vehicle Driving and Charging Behavior Observed Early in The EV Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Smart; Stephen Schey
2012-04-01
As concern about society's dependence on petroleum-based transportation fuels increases, many see plug-in electric vehicles (PEV) as enablers to diversifying transportation energy sources. These vehicles, which include plug-in hybrid electric vehicles (PHEV), range-extended electric vehicles (EREV), and battery electric vehicles (BEV), draw some or all of their power from electricity stored in batteries, which are charged by the electric grid. In order for PEVs to be accepted by the mass market, electric charging infrastructure must also be deployed. Charging infrastructure must be safe, convenient, and financially sustainable. Additionally, electric utilities must be able to manage PEV charging demand on the electric grid. In the Fall of 2009, a large scale PEV infrastructure demonstration was launched to deploy an unprecedented number of PEVs and charging infrastructure. This demonstration, called The EV Project, is led by Electric Transportation Engineering Corporation (eTec) and funded by the U.S. Department of Energy. eTec is partnering with Nissan North America to deploy up to 4,700 Nissan Leaf BEVs and 11,210 charging units in five market areas in Arizona, California, Oregon, Tennessee, and Washington. With the assistance of the Idaho National Laboratory, eTec will collect and analyze data to characterize vehicle consumer driving and charging behavior, evaluate the effectiveness of charging infrastructure, and understand the impact of PEV charging on the electric grid. Trials of various revenue systems for commercial and public charging infrastructure will also be conducted. The ultimate goal of The EV Project is to capture lessons learned to enable the mass deployment of PEVs. This paper is the first in a series of papers documenting the progress and findings of The EV Project.
This paper describes key research objectives of The EV Project and establishes the project background, including lessons learned from previous infrastructure deployment and PEV demonstrations. One such previous study was a PHEV demonstration conducted by the U.S. Department of Energy's Advanced Vehicle Testing Activity (AVTA), led by the Idaho National Laboratory (INL). AVTA's PHEV demonstration involved over 250 vehicles in the United States, Canada, and Finland. This paper summarizes driving and charging behavior observed in that demonstration, including the distribution of distance driven between charging events, charging frequency, and the resulting proportion of operation in charge-depleting mode. Charging demand relative to time of day and day of the week will also be shown. Conclusions from the PHEV demonstration will be given which highlight the need for expanded analysis in The EV Project. For example, the AVTA PHEV demonstration showed that in the absence of controlled charging by the vehicle owner or electric utility, the majority of vehicles were charged in the evening hours, coincident with typical utility peak demand. Given this baseline, The EV Project will demonstrate the effects of consumer charge control and grid-side charge management on electricity demand. This paper will also outline further analyses to be performed by eTec and INL to document the driving and charging behavior of vehicles operated in an infrastructure-rich environment.
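The time-of-day demand analysis described above reduces, in its simplest form, to binning charging-session starts by hour of day. A hedged sketch with invented event records (The EV Project's actual data schema is not given in the paper):

```python
from collections import Counter

# Invented charging-session records: (start hour of day, energy in kWh).
events = [(18, 9.2), (19, 7.5), (18, 11.0), (7, 4.1), (22, 6.3)]

def hourly_demand(charge_events):
    """Count charging-session starts per hour of day (0-23)."""
    return Counter(start_hour for start_hour, kwh in charge_events)

demand = hourly_demand(events)
print(demand.most_common(1))  # → [(18, 2)]: the evening peak stands out
```

Even this trivial aggregation reproduces the AVTA finding cited above: with uncontrolled charging, session starts cluster in the evening, coincident with utility peak demand, which is why The EV Project studies charge control and grid-side management.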
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw; Staszek, Monika; Olszewska, Dorota; Urban, Pawel; Jaroslawski, Janusz; Cielesta, Szymon; Mirek, Janusz; Wiszniowski, Jan; Picozzi, Matteo; Solaro, Giuseppe; Pringle, Jamie; Toon, Sam; Cesca, Simone; Kuehn, Daniela; Ruigrok, Elmer; Gunning, Andrew; Isherwood, Catherine
2017-04-01
The main objective of the "Shale gas exploration and exploitation induced risks - SHEER" project (Horizon 2020, call LCE 16-2014) is to develop a probabilistic methodology to assess and mitigate the short- and long-term environmental risks associated with the exploration and exploitation of shale gas. To this end, the SHEER project makes use of a large amount of heterogeneous data of various types. These data, from different disciplines of science (e.g. geophysical, geochemical, geological, and technological), must be homogenized, harmonized, and made accessible to all project participants. This requires developing an over-arching structure for high-level multidisciplinary data integration. The bespoke solution is provided by the Thematic Core Service Anthropogenic Hazards (TCS AH) developed in the framework of the European Plate Observing System Program (https://tcs.ah-epos.eu/, infrastructural projects IS-EPOS, POIG.02.03.00-14-090/13-00 and EPOS IP, H2020-INFRADEV-1-2015-1). TCS AH provides virtual access to a comprehensive, wide-scale and high-quality research infrastructure in the field of induced seismicity and other anthropogenic hazards evoked by the exploration and exploitation of geo-resources. TCS AH is designed as a functional e-research environment that gives researchers the maximum possible freedom for experimentation by providing a virtual laboratory flexible enough to create their own workspaces for processing streams.
The data-management process promotes the use of the research infrastructure in novel ways, providing access to (i) data gathered in so-called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure, and the environment; (ii) problem-oriented, specific services, with particular attention devoted to methods analyzing correlations between technology, geophysical response, and resulting hazards; and (iii) intercommunity social functions, e.g. brokering of projects, a common workspace shared by project members, upload/download of data and codes to the common workspace, and tools for communication among project members. The SHEER project uses the TCS AH e-infrastructure to manage interdisciplinary data from seven independent episodes and data products from its own research. Since more than 500 users from 21 countries have registered to TCS AH to date, SHEER's use of TCS AH increases the visibility of the project by leaps and bounds. This work was supported within SHEER: "Shale Gas Exploration and Exploitation Induced Risks", a project funded from the Horizon 2020 R&I Framework Programme, call H2020-LCE 16-2014-1, and within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
EuCARD2: enhanced accelerator research and development in Europe
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-10-01
Accelerator science and technology is one of the key enablers of developments in particle physics and photon physics, as well as of applications in medicine and industry. EuCARD2 is a European research project that will be realized during 2013-2017 inside the EC FP7 framework. The project concerns the development and coordination of European accelerator research and development. The project is particularly important to a number of domestic laboratories, due to plans to build large accelerator infrastructure in Poland. Large accelerator infrastructure of fundamental and applied research character stimulates around it the development of industrial and biomedical applications of advanced accelerators, materials research and engineering, cryo-technology, mechatronics, robotics, and in particular electronics, such as networked measurement and control systems, sensors, computer systems, and automation and control systems. The paper presents a digest of the European project EuCARD2, Enhanced European Coordination for Accelerator Research and Development. It summarizes the research results and assumptions in the domain of accelerator science and technology in Europe, shown during the final fourth annual meeting of EuCARD (European Coordination of Accelerator R&D) and the kick-off meeting of EuCARD2. A few basic groups of accelerator system components are debated, including: measurement and control networks of large geometrical extent, multichannel systems for acquisition of large amounts of metrological data, precision photonic networks for distribution of reference time, frequency, and phase, high-field magnets, superconducting cavities, and novel beam collimators. The paper is based on the following materials: Internet and Intranet documents connected with EuCARD2; the Description of Work FP7 EuCARD-2 DoW-312453, 2013-02-13; and discussions and preparatory materials worked on by EuCARD2 initiators.
NASA Astrophysics Data System (ADS)
Nidziy, Elena
2017-10-01
The dependence of regional economic development on the efficiency of financing for the construction of transport infrastructure is analyzed and demonstrated in this article. Public-private partnership is identified as an effective mechanism for financing infrastructure projects, and its concrete forms are formulated. An optimal scenario for financing transport infrastructure, which can lead to positive transformations in the economy, is proposed. The paper considers the advantages and risks of public-private partnership for the parties to contractual relations. Components for assessing the economic effect of implementing infrastructure projects are also proposed, together with conditions for minimizing risks. The results of the research could be used to solve persistent problems in the development of transport infrastructure and issues of financial support for the construction of infrastructure projects at the regional level.
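The article does not reproduce its assessment formulas, but the economic effect of alternative financing scenarios is conventionally compared via discounted cash flows. A minimal net-present-value sketch, with all cash flows and the discount rate invented for illustration:

```python
# Hedged sketch: compare a fully public financing scenario with a
# public-private partnership (PPP) where a partner carries most of the outlay.
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

public_only = [-100, 25, 25, 25, 25, 25]  # state funds everything, keeps all returns
ppp         = [-40, 12, 12, 12, 12, 12]   # partner carries 60% of cost, shares returns
rate = 0.05                               # assumed discount rate

print(round(npv(rate, public_only), 2), round(npv(rate, ppp), 2))
```

This is only the cash-flow side; the risk allocation between the public and private parties, which the article emphasizes, would enter such an assessment through the discount rate or through scenario-weighted cash flows.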
Cloud Infrastructure & Applications - CloudIA
NASA Astrophysics Data System (ADS)
Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank
The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.
Geels, Mark J; Thøgersen, Regitze L; Guzman, Carlos A; Ho, Mei Mei; Verreck, Frank; Collin, Nicolas; Robertson, James S; McConkey, Samuel J; Kaufmann, Stefan H E; Leroy, Odile
2015-10-05
TRANSVAC was a collaborative infrastructure project aimed at enhancing European translational vaccine research and training. The objective of this four-year project (2009-2013), funded under the European Commission's (EC) seventh framework programme (FP7), was to support European collaboration in the vaccine field, principally through the provision of transnational access (TNA) to critical vaccine research and development (R&D) infrastructures, as well as by improving and harmonising the services provided by these infrastructures through joint research activities (JRA). The project provided all available services to advance 29 projects and, through engaging all vaccine stakeholders, successfully laid down the blueprint for the implementation of a permanent research infrastructure for early vaccine R&D in Europe. Copyright © 2015. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability of application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools.
This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
NASA Astrophysics Data System (ADS)
Gold, D.; Walter, M. T.; Watkins, L.; Kaufman, Z.; Meyer, A.; Mahaney, M.
2016-12-01
The concurrent threats posed by climate change and aging infrastructure have become of increasing concern in recent years. In the Northeastern US, storms such as Hurricane Irene and Super Storm Sandy have highlighted the vulnerability of infrastructure to extreme weather events, which are projected to become more frequent under future climate change scenarios. Road culverts are one type of infrastructure that is particularly vulnerable to such threats. Culverts allow roads to safely traverse small streams or drainage ditches, and their proper design is critical to ensuring a safe and reliable transportation network. Much of the responsibility for designing and maintaining road culverts lies at the local level, but many local governments lack the resources to quantify the vulnerability of their culverts to major storms. This study contributes a model designed to assist local governments in rapidly assessing the vulnerability of large numbers of culverts and identifies common characteristics of vulnerable culverts. Model inputs include culvert geometry and location data collected by trained local field teams. The model uses custom tools created in ArcGIS and Python to determine the maximum return period storm that each culvert can safely convey under current and projected future rainfall regimes. As a demonstration, over 1000 culverts in New York State were modeled. It was found that a significant percentage of modeled culverts failed to convey the current 5 year return period storm event (deemed a failure) and this percentage increased under projected future rainfall conditions. The model results were analyzed to determine correlations between culvert characteristics and failure. Characteristics investigated included watershed size, road type (state, county or local), affluence of the surrounding area and suitability for aquatic organism passage. 
Results from this study can be used by local governments to quantify and characterize the vulnerability of current infrastructure and prioritize future infrastructure investment.
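The study's ArcGIS/Python tooling is not reproduced here; as a sketch under stated assumptions, a culvert's maximum safely conveyed return-period storm can be approximated by comparing full-barrel Manning capacity against rational-method peak flows. All geometry, runoff, and intensity values below are invented, and real culvert hydraulics (inlet control, headwater depth) are more involved:

```python
import math

# Hedged sketch of the capacity check described above.
def manning_capacity(diameter_m, slope, n=0.013):
    """Full-flow capacity (m^3/s) of a circular culvert via Manning's equation."""
    area = math.pi * diameter_m ** 2 / 4
    hydraulic_radius = diameter_m / 4          # A/P for a full circular pipe
    return (1.0 / n) * area * hydraulic_radius ** (2 / 3) * slope ** 0.5

def peak_flow(c_runoff, intensity_mm_hr, area_km2):
    """Rational-method peak flow Q = C i A, converted to m^3/s."""
    return c_runoff * (intensity_mm_hr / 1000 / 3600) * (area_km2 * 1e6)

# Assumed design-storm rainfall intensities by return period (mm/hr).
intensities = {2: 30, 5: 42, 10: 51, 25: 64, 50: 74, 100: 85}

def max_safe_return_period(diameter_m, slope, c_runoff, area_km2):
    """Largest return period whose peak flow the culvert conveys (0 = none)."""
    cap = manning_capacity(diameter_m, slope)
    safe = [rp for rp, i in sorted(intensities.items())
            if peak_flow(c_runoff, i, area_km2) <= cap]
    return max(safe) if safe else 0

print(max_safe_return_period(diameter_m=0.9, slope=0.01, c_runoff=0.35, area_km2=0.4))
```

Running the same check under a projected future intensity table (e.g. each value scaled up by a climate factor) is how a vulnerability screen like the study's flags culverts whose safe return period shrinks below the 5-year design storm.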
Reeves, Lilith; Dunn‐Jensen, Linda M.; Baldwin, Timothy T.; Tatikonda, Mohan V.
2013-01-01
Biomedical research enterprises require a large number of core facilities and resources to supply the infrastructure necessary for translational research. Maintaining the financial viability and promoting efficiency in an academic environment can be particularly challenging for medical schools and universities. The Indiana Clinical and Translational Sciences Institute sought to improve core and service programs through a partnership with the Indiana University Kelley School of Business. The program paired teams of Masters of Business Administration students with cores and programs that self-identified the need for assistance in project management, financial management, marketing, or resource efficiency. The projects were developed by CTSI project managers and business school faculty using service-learning principles to ensure learning for students who also received course credit for their participation. With three years of experience, the program demonstrates a successful partnership that improves clinical research infrastructure by promoting business best practices and providing a valued learning experience for business students. PMID:23919365
Reeves, Lilith; Dunn-Jensen, Linda M; Baldwin, Timothy T; Tatikonda, Mohan V; Cornetta, Kenneth
2013-08-01
Biomedical research enterprises require a large number of core facilities and resources to supply the infrastructure necessary for translational research. Maintaining the financial viability and promoting efficiency in an academic environment can be particularly challenging for medical schools and universities. The Indiana Clinical and Translational Sciences Institute sought to improve core and service programs through a partnership with the Indiana University Kelley School of Business. The program paired teams of Master of Business Administration students with cores and programs that self-identified the need for assistance in project management, financial management, marketing, or resource efficiency. The projects were developed by CTSI project managers and business school faculty using service-learning principles to ensure learning for students who also received course credit for their participation. With three years of experience, the program demonstrates a successful partnership that improves clinical research infrastructure by promoting business best practices and providing a valued learning experience for business students. © 2013 Wiley Periodicals, Inc.
Evolution of Safeguards over Time: Past, Present, and Projected Facilities, Material, and Budget
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollar, Lenka; Mathews, Caroline E.
This study examines the past trends and evolution of safeguards over time and projects growth through 2030. The report documents the amount of nuclear material and facilities under safeguards from 1970 to the present, along with the corresponding budget. Estimates for the future amount of facilities and material under safeguards are made according to non-nuclear-weapons states' (NNWS) plans to build more nuclear capacity and sustain current nuclear infrastructure. Since nuclear energy is seen as a clean and economic option for base load electric power, many countries are seeking to either expand their current nuclear infrastructure or introduce nuclear power. In order to feed new nuclear power plants and sustain existing ones, more nuclear facilities will need to be built, and thus more nuclear material will be introduced into the safeguards system. The projections in this study conclude that a zero real growth scenario for the IAEA safeguards budget will result in large resource gaps in the near future.
NASA Astrophysics Data System (ADS)
Favali, Paolo; Beranzoli, Laura; Best, Mairi; Franceschini, PierLuigi; Materia, Paola; Peppoloni, Silvia; Picard, John
2014-05-01
EMSO (European Multidisciplinary Seafloor and Water Column Observatory) is a large-scale European Research Infrastructure (RI). It is a geographically distributed infrastructure composed of several deep-seafloor and water-column observatories, which will be deployed at key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean, to the Black Sea, with the basic scientific objective of real-time, long-term monitoring of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere. EMSO is one of the environmental RIs on the ESFRI roadmap. The ESFRI Roadmap identifies new RIs of pan-European importance that correspond to the long-term needs of European research communities. EMSO will be the sub-sea segment of the EU's large-scale Earth Observation program, Copernicus (previously known as GMES - Global Monitoring for Environment and Security) and will significantly enhance the observational capabilities of European member states. An open data policy compliant with the recommendations being developed within the GEOSS initiative (Global Earth Observation System of Systems) will allow for shared use of the infrastructure and the exchange of scientific information and knowledge. The processes that occur in the oceans have a direct impact on human societies, therefore it is crucial to improve our understanding of how they operate and interact.
To encompass the breadth of these major processes, sustained and integrated observations are required that appreciate the interconnectedness of atmospheric, surface ocean, biological pump, deep-sea, and solid-Earth dynamics and that can address: • natural and anthropogenic change; • interactions between ecosystem services, biodiversity, biogeochemistry, physics, and climate; • impacts of exploration and extraction of energy, minerals, and living resources; • geo-hazard early warning capability for earthquakes, tsunamis, gas-hydrate release, and slope instability and failure; • connecting scientific outcomes to stakeholders and policy makers, including government decision-makers. The development of large research infrastructure initiatives like EMSO must continuously take into account wide-reaching environmental and socio-economic implications and objectives. For this reason, an Ethics Committee was established early in EMSO's initial Preparatory Phase with responsibility for overseeing the key ethical and social aspects of the project.
These include: • promoting inclusive science communication and data dissemination services to civil society according to Open Access principles; • guaranteeing top quality scientific information and data as results of top quality research; • promoting the increased adoption of eco-friendly, sustainable technologies through the dissemination of advanced scientific knowledge and best practices to the private sector and to policy makers; • developing Education Strategies in cooperation with academia and industry aimed at informing and sensitizing the general public on the environmental and socio-economic implications and benefits of large research infrastructure initiatives such as EMSO; • carrying out Excellent Science following strict criteria of research integrity, as expressed in the Montreal Statement (2013); • promoting Geo-ethical awareness and innovation by spurring innovative approaches in the management of environmental aspects of large research projects; • supporting technological Innovation by working closely in support of SMEs; • providing a constant, qualified and authoritative one-stop-shopping Reference Point and Advisory for politicians and decision-makers. The paper shows how Geoethics is an essential tool for guiding methodological and operational choices, and the management of a European project with great impact on the environment and society.
DOT National Transportation Integrated Search
2016-10-01
Due to shale oil/gas recovery operations, a large number of truck trips on Louisiana roadways are required for transporting equipment and materials to and from the recovery sites. As a result, roads and bridges that were designed for ...
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
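The lineage query at the heart of such a system can be illustrated with a toy example. The entity names and graph below are hypothetical, and the actual PMS stores provenance as RDF described by the Parasite Experiment ontology and answers queries through materialized provenance views, not an in-memory traversal.

```python
from collections import deque

# Toy provenance graph: each data entity maps to the entities it was derived from.
# Names are invented for illustration of a lineage ("provenance") query.
derived_from = {
    "knockout_plasmid": ["gene_sequence", "cloning_protocol"],
    "transfected_sample": ["knockout_plasmid", "parasite_strain"],
    "result_figure": ["transfected_sample"],
}

def provenance(entity: str) -> set:
    """Return the full lineage (all ancestor entities) of a data entity via BFS."""
    seen, queue = set(), deque([entity])
    while queue:
        for parent in derived_from.get(queue.popleft(), []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen
```

Querying `provenance("result_figure")` walks the derivation chain back to the raw inputs, which is the kind of end-to-end trace needed when publishing scientific results.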
Recent developments in user-job management with Ganga
NASA Astrophysics Data System (ADS)
Currie, R.; Elmsheuser, J.; Fay, R.; Owen, P. H.; Richards, A.; Slater, M.; Sutcliffe, W.; Williams, M.
2015-12-01
The Ganga project was originally developed for use by LHC experiments and has been used extensively throughout Run 1 in both LHCb and ATLAS. This document describes some of the most recent developments within the Ganga project. There have been improvements in the handling of large-scale computational tasks in the form of a new GangaTasks infrastructure. Improvements in file handling through a new IGangaFile interface make handling files largely transparent to the end user. In addition, the performance and usability of Ganga have both been addressed through the development of a new queues system that allows for parallel processing of job-related tasks.
NASA Astrophysics Data System (ADS)
Rauser, Florian; Vamborg, Freja
2016-04-01
The interdisciplinary project on High Definition Clouds and Precipitation for advancing climate prediction, HD(CP)2 (hdcp2.eu), is an example of the trend in fundamental research in Europe to increasingly focus on large national and international research programs that require strong scientific coordination. The current system has traditionally been host-based: project coordination activities and funding are placed at the host institute of the project's central lead PI. This approach is simple and has the advantage of strong collaboration between the project coordinator and the lead PI, while exhibiting a list of strong, inherent disadvantages that are also mentioned in this session's description: no development of community best practices, lack of integration between similar projects, inefficient methodology development and usage, and finally poor career development opportunities for the coordinators. Project coordinators often leave the project before it is finalized, leaving some of the fundamentally important closing processes to the PIs. This systematically prevents the creation of professional science management expertise within academia, leading to an imbalance that hinders the ability of large research programs' outcomes to inform future funding decisions. Project coordinators in academia often do not work in a professional project office environment that could distribute activities and use professional tools and methods across different projects. Instead, every new project manager has to start methodological work anew (communication infrastructure, meetings, reporting), even though the technological needs of large research projects are similar. This decreases the efficiency of the coordination and leads to funding that is effectively misallocated. We propose to challenge this system by creating a permanent, virtual "Centre for Earth System Science Management CESSMA" (cessma.com), and changing the approach from host-based to centre-based.
This should complement the current system, by creating permanent, sustained options for interactions between large research projects in similar fields. In the long run such a centre might improve on the host-based system because the centre-based solution allows multiple projects to be coordinated in conjunction by experienced science managers, using overlap in meeting organization, reporting, infrastructure, travel and so on. To still maintain close cooperation between project managers and lead PIs, we envision a virtual centre that creates extensive collaborative opportunities by organizing yearly retreats, a shared technical data base, et cetera. As "CESSMA" is work in progress (we have applied for funding for 2016-18), we would like to use this opportunity to discuss chances, potential problems, experiences and options for this attempt to institutionalise the very reason for this session: improved, coordinated, effective science coordination; and to create a central focal point for public / academia interactions.
Thomopoulos, N; Grant-Muller, S; Tight, M R
2009-11-01
Interest has re-emerged in how to incorporate equity considerations in the appraisal of transport projects, and large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have presented difficulties to date in incorporating equity concerns in the appraisal of such projects. Initially, an overview of current practice within transport regarding the appraisal of equity considerations in Europe is offered, based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well-established MCA Analytic Hierarchy Process and is also contrasted with the use of a CBA-based approach. The framework outlined here offers an additional support tool to decision makers, who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators as a result of the option to assess predefined equity perspectives of decision makers against both the project objectives and the estimated project impacts. This framework may also be of further value to evaluators outside transport.
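The AHP machinery such a framework builds on can be sketched briefly. The pairwise comparison values below are hypothetical; a common approximation of the AHP priority vector uses the row geometric mean of the comparison matrix:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # geometric mean per row
    total = sum(gm)
    return [g / total for g in gm]                          # normalize to sum to 1

# Hypothetical pairwise comparisons of three equity principles
# (e.g. horizontal vs vertical vs spatial equity), on Saaty's 1-9 scale.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)
```

The resulting weights rank the equity principles by the decision maker's stated preferences, which is how the framework lets evaluators contrast different predefined equity perspectives.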
EIA application in China's expressway infrastructure: clarifying the decision-making hierarchy.
Zhou, Kai-Yi; Sheate, William R
2011-06-01
China's EIA Law came into effect in 2003 and formally requires road transport infrastructure development actions to be subject to Environmental Impact Assessment (EIA). EIAs (including project EIA and plan EIA, or strategic environmental impact assessment, SEA) have been widely applied in expressway infrastructure planning. Among those applications, SEA is applied to provincial level expressway network (PLEI) plans, and project EIA is applied to expressway infrastructure development 'projects' under PLEI plans. Three case studies (one expressway project EIA and two PLEI plan SEAs) were examined to understand how EIAs are currently applied to expressway infrastructure development planning. Through the studies, a number of problems that significantly influence the quality of EIA application in the field were identified. The reasons causing those problems are analyzed and possible solutions are suggested, aimed at enhancing EIA practice, helping deliver better decision-making and ultimately improving the environmental performance of expressway infrastructure. Copyright © 2010 Elsevier Ltd. All rights reserved.
Geophysical methods for road construction and maintenance
NASA Astrophysics Data System (ADS)
Rasul, Hedi; Karlson, Caroline; Jamali, Imran; Earon, Robert; Olofsson, Bo
2015-04-01
Infrastructure such as road transportation is vital to civilized societies and needs to be constructed and maintained regularly. A large part of the project cost is attributed to subsurface conditions, where unsatisfactory conditions could increase either the geotechnical stabilization measures needed or the design cost itself. Using geophysical methods during the planning, construction and maintenance phases is a way to collect information on the subsurface and existing installations that can lead to measures reducing project cost and damage. Moisture in road layers is an important factor, affecting the bearing capacity of the construction as well as its maintenance. Moisture in the road is a key factor for a well-functioning road; excessive moisture, on the other hand, is the main cause of road failure and problems. From a hydrological point of view, geophysical methods could help road planners identify the water table, geological strata, pollution arising from the road, and the movement of that pollution before, during and after construction. Geophysical methods also allow road planners to collect valuable data for a large area without intrusive investigations such as boreholes, i.e. minimizing environmental stresses and costs. However, it is important to specify the investigation site and to choose the most appropriate geophysical method based on the site chosen and the objective of the investigation. Currently, numerous construction and rehabilitation projects are taking place around the world. Many of these projects are focused on infrastructural development, comprising both new projects and expansion of the existing infrastructural network. Geophysical methods can benefit these projects greatly during all phases. During the construction phase, Ground Penetrating Radar (GPR) is very useful in combination with Electrical Resistivity (ER) for detecting soil water content and base course compaction.
However, ER and Electromagnetic (EM) methods can also be used for monitoring changes in water content and pollutant spreading during the maintenance phase. The objective of this study was to describe various geophysical methods which could benefit the road planning, construction and maintenance phases focusing on hydrological impacts.
Landauer, Mia; Komendantova, Nadejda
2018-06-22
Several infrastructure projects are under development or already operational across the Arctic region. Often the deployment of such projects creates benefits at the national, regional, or global scales. However, local communities can experience negative impacts due to the requirements for extensive land areas, which cause pressure on traditional land use. Public participation in environmental planning such as Environmental Impact Assessment (EIA) enables local communities to provide feedback on the environmental, social, and economic challenges of infrastructure projects. Ideally, participation can improve the means of social learning for all involved parties and help to co-develop sustainable solutions. The subject of our research is reindeer herders' participation in EIA procedures of mines and wind farms in Finland because these types of projects affect reindeer husbandry. We study empirically how stakeholders involved in the EIAs perceive the participation of reindeer herders in the planning and implementation of infrastructure projects, and how these differ from the perceptions of the reindeer herders who are affected by the infrastructure projects. Our qualitative data is based on in-depth semi-structured interviews (N = 31) with members of the industry sector, consultants, governmental authorities, and representatives of local communities; in this study, the reindeer herders. The results show that herders' level of participation in the EIAs and the benefits and challenges of participation are perceived differently. Furthermore, the regulatory framework does not adequately ensure that the developer carries social and environmental responsibilities throughout the infrastructure project's lifetime, and that regular communication with herders will also be maintained after the EIAs. Herders' expertise should be used throughout the project lifetime. 
For example, more attention should be paid to both negotiating possible options for compensation and monitoring mechanisms when the infrastructure projects are pre-screened for the EIAs, as well as to co-designing the different project alternatives with herders for the EIAs. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhury, Farhat Jahan; Amin, A.T.M. Nurul
2006-08-15
This paper reports findings from a study on slum improvement projects to show the difference that environmental assessment (EA) can make in such interventions and to suggest mechanisms for its integration into such projects. The findings are based on a field survey that was carried out in two slums of Dhaka where infrastructure projects were implemented. In one slum, the EA process was considered in designing and locating infrastructure and in the other it was not. The survey results traced the severe problems that existed in both slums before the implementation of infrastructure improvement projects and reveal that after the intervention the situation has considerably improved in the slum where EA was conducted. In contrast, some problems still persist in the other slum where EA was not considered. To make it worse, the newly built infrastructures have even given rise to a set of new problems. In order to avoid such negative outcomes from development interventions, the paper finally develops the mechanism for integration of EA into slum improvement projects.
Development of Armenian-Georgian Virtual Observatory
NASA Astrophysics Data System (ADS)
Mickaelian, Areg; Kochiashvili, Nino; Astsatryan, Hrach; Harutyunian, Haik; Magakyan, Tigran; Chargeishvili, Ketevan; Natsvlishvili, Rezo; Kukhianidze, Vasil; Ramishvili, Giorgi; Sargsyan, Lusine; Sinamyan, Parandzem; Kochiashvili, Ia; Mikayelyan, Gor
2009-10-01
The Armenian-Georgian Virtual Observatory (ArGVO) project is the first initiative in the world to create a regional VO infrastructure based on national VO projects and a regional Grid. The Byurakan and Abastumani Astrophysical Observatories have been scientific partners since 1946, following the establishment of the Byurakan Observatory. The Armenian VO project (ArVO) has been under development since 2005 and is a part of the International Virtual Observatory Alliance (IVOA). It is based on the Digitized First Byurakan Survey (DFBS, the digitized version of the famous Markarian survey) and other Armenian archival data. Similarly, the Georgian VO will be created to serve as a research environment to utilize the digitized Georgian plate archives. Therefore, one of the main goals for creation of the regional VO is the digitization of large amounts of plates preserved at the plate stacks of these two observatories. The total amount of plates is more than 100,000 units. Observational programs of high importance have been selected and some 3000 plates will be digitized during the next two years; the priority is defined by the usefulness of the material for future science projects, such as searches for new objects, optical identifications of radio, IR, and X-ray sources, studies of variability and proper motions, etc. Once the digitized material is available in VO standards, a VO database served through the regional Grid infrastructure will be active. This partnership is being carried out in the framework of the ISTC project A-1606 "Development of Armenian-Georgian Grid Infrastructure and Applications in the Fields of High Energy Physics, Astrophysics and Quantum Physics".
Current and future flood risk to railway infrastructure in Europe
NASA Astrophysics Data System (ADS)
Bubeck, Philip; Kellermann, Patric; Alfieri, Lorenzo; Feyen, Luc; Dillenardt, Lisa; Thieken, Annegret H.
2017-04-01
Railway infrastructure plays an important role in the transportation of freight and passengers across the European Union. According to Eurostat, more than four billion passenger-kilometres were travelled on national and international railway lines of the EU28 in 2014. To further strengthen transport infrastructure in Europe, the European Commission will invest another € 24.05 billion in the transnational transport network until 2020 as part of its new transport infrastructure policy (TEN-T), including railway infrastructure. Floods pose a significant risk to infrastructure elements. Damage data of recent flood events in Europe show that infrastructure losses can make up a considerable share of overall losses. For example, damage to state and municipal infrastructure in the federal state of Saxony (Germany) accounted for nearly 60% of overall losses during the large-scale event in June 2013. Especially in mountainous areas with little usable space available, roads and railway lines often follow floodplains or are located along steep and unsteady slopes. In Austria, for instance, the flood of 2013 caused € 75 million of direct damage to railway infrastructure. Despite the importance of railway infrastructure and its exposure to flooding, assessments of potential damage and risk (i.e. probability × damage) are still in their infancy compared with other sectors, such as the residential or industrial sector. Infrastructure-specific assessments at the regional scale are largely lacking. Regional assessment of potential damage to railway infrastructure has been hampered by a lack of infrastructure-specific damage models and data availability. The few available regional approaches have used damage models that assess damage to various infrastructure elements (e.g. roads, railway, airports and harbours) using one aggregated damage function and cost estimate.
Moreover, infrastructure elements are often considerably underrepresented in regional land cover data, such as CORINE, due to their line shapes. To assess current and future damage and risk to railway infrastructure in Europe, we apply the damage model RAIL ('RAilway Infrastructure Loss'), which was specifically developed for railway infrastructure using empirical damage data. To adequately and comprehensively capture the line-shaped features of railway infrastructure, the assessment makes use of the open-access data set of openrailway.org. Current and future flood hazard in Europe is obtained with the LISFLOOD-based pan-European flood hazard mapping procedure combined with ensemble projections of extreme streamflow for the current century based on EURO-CORDEX RCP 8.5 climate scenarios. The presentation shows first results of combining the hazard data and the RAIL model for Europe.
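The risk definition used above (probability × damage) is typically operationalized as expected annual damage, integrating scenario damages over annual exceedance probabilities. Below is a minimal sketch with hypothetical damage figures; the actual RAIL model and EURO-CORDEX-driven hazard chain are far richer.

```python
def expected_annual_damage(damage_by_return_period):
    """Risk as the integral of damage over annual exceedance probability,
    approximated with the trapezoidal rule over discrete flood scenarios.

    damage_by_return_period maps return period T (years) -> damage for that event.
    Annual exceedance probability of the T-year event is 1/T.
    """
    pts = sorted(((1.0 / T, d) for T, d in damage_by_return_period.items()),
                 reverse=True)  # from frequent (high probability) to rare (low)
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pts, pts[1:]):
        ead += (p1 - p2) * (d1 + d2) / 2.0  # trapezoid between adjacent scenarios
    return ead

# Hypothetical damages (million EUR) to a rail segment per flood scenario
damages = {10: 5.0, 50: 20.0, 100: 40.0, 500: 75.0}
```

With these illustrative figures the segment's expected annual damage is about 1.76 million EUR per year, a single number that makes flood risk comparable across segments and scenarios.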
Large Payload Transportation and Test Considerations
NASA Technical Reports Server (NTRS)
Rucker, Michelle A.; Pope, James C.
2011-01-01
Ironically, the limiting factor to a national heavy lift strategy may not be the rocket technology needed to throw a heavy payload, but rather the terrestrial infrastructure - roads, bridges, airframes, and buildings - necessary to transport, acceptance test, and process large spacecraft. Failure to carefully consider how large spacecraft are designed, and where they are manufactured, tested, or launched, could result in unforeseen cost to modify/develop infrastructure, or incur additional risk due to increased handling or elimination of key verifications. During test and verification planning for the Altair project, a number of transportation and test issues related to the large payload diameter were identified. Although the entire Constellation Program - including Altair - was canceled in the 2011 NASA budget, issues identified by the Altair project serve as important lessons learned for future payloads that may be developed to support national "heavy lift" strategies. A feasibility study performed by the Constellation Ground Operations (CxGO) project found that neither the Altair Ascent nor Descent Stage would fit inside available transportation aircraft. Ground transportation of a payload this large over extended distances is generally not permitted by most states, so overland transportation alone would not have been an option. Limited ground transportation to the nearest waterway may be permitted, but water transportation could take as long as 66 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. 
Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA's Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing (which includes hypergolic fuels, pyrotechnic devices, and high-pressure gases).
Quantifying economic benefits for rail infrastructure projects.
DOT National Transportation Integrated Search
2014-10-01
This project identifies metrics for measuring the benefit of rail infrastructure projects for key : stakeholders. It is important that stakeholders with an interest in community economic development play an active : role in the development of the rai...
NASA Astrophysics Data System (ADS)
Dhakal, N.; Jain, S.
2013-12-01
Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and, as a result, the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and consequently on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated based on past rainfall records) for civil infrastructure? To answer these questions, we performed sensitivity analysis of the GEV distribution parameters as well as the return periods to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter with a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
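The sensitivity described above can be reproduced directly from the GEV return-level and return-period formulas. The parameter values below are hypothetical stand-ins for a fitted station (not the study's actual Maine fits), chosen only to show how an outlier-driven increase in the shape parameter shrinks the return period assigned to a fixed large event:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Annual-maximum rainfall depth with return period T under GEV(mu, sigma, xi)."""
    y = -math.log(1.0 - 1.0 / T)
    if xi == 0.0:  # Gumbel limit
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** -xi - 1.0)

def gev_return_period(mu, sigma, xi, z):
    """Return period (years) assigned to an annual-maximum value z (xi != 0)."""
    t = (1.0 + xi * (z - mu) / sigma) ** (-1.0 / xi)
    return 1.0 / (1.0 - math.exp(-t))  # T = 1 / annual exceedance probability

# Hypothetical fits (mm/day): same location/scale, but the shape parameter
# increases after one outlier event is added to the annual-maximum series.
before = dict(mu=60.0, sigma=15.0, xi=0.05)
after = dict(mu=60.0, sigma=15.0, xi=0.20)
```

With these numbers, a 150 mm/day event drops from roughly a 190-year event under the original fit to roughly a 50-year event after the shape parameter increases, which is exactly the kind of shift that matters for culvert and dam design storms.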
SEE-GRID eInfrastructure for Regional eScience
NASA Astrophysics Data System (ADS)
Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel
In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and by Greek infrastructure development actions, have articulated successful regional development actions in South East Europe that can serve as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in the common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and on application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures on the regional level.
The regional vision of establishing an e-Infrastructure compatible with European developments, and of empowering the scientists in the region to participate equally in the use of pan-European infrastructures, is materializing through the above initiatives. This model has a number of concrete operational and organizational guidelines which can be adapted to help e-Infrastructure developments in other world regions. In this paper we review the most important developments and contributions by the SEEGRID-SCI project.
Integrating sea floor observatory data: the EMSO data infrastructure
NASA Astrophysics Data System (ADS)
Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph
2013-04-01
The European research infrastructure EMSO is a European network of fixed-point, deep-seafloor and water-column observatories deployed at key sites of the European continental margin and the Arctic. It aims to provide the technological and scientific framework for investigating environmental processes related to the interaction between the geosphere, biosphere and hydrosphere, and for sustainable management through long-term monitoring, including real-time data transmission. EMSO has been on the ESFRI (European Strategy Forum on Research Infrastructures) roadmap since 2006 and entered its construction phase in 2012. Within this framework, EMSO is contributing to large infrastructure integration projects such as ENVRI and COOPEUS. The EMSO infrastructure is geographically distributed across key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean Sea, to the Black Sea. It presently consists of thirteen sites identified by the scientific community according to their importance with respect to marine ecosystems, climate change and marine geohazards. The data infrastructure for EMSO is being designed as a distributed system. Presently, EMSO data collected during experiments at each EMSO site are locally stored and organized in catalogues or relational databases run by the responsible regional EMSO nodes. Three major institutions and their data centers currently offer access to EMSO data: PANGAEA, INGV and IFREMER. In continuation of the IT activities performed during EMSO's twin project ESONET, EMSO is now implementing the ESONET data architecture within an operational EMSO data infrastructure. EMSO aims to be compliant with relevant marine initiatives such as MyOceans, EUROSITES, EuroARGO, SEADATANET and EMODNET, as well as to meet the requirements of international and interdisciplinary projects such as COOPEUS, ENVRI, EUDAT and iCORDI. 
A major focus is therefore set on standardization and interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch and OAI-PMH, EMSO has chosen to implement core Open Geospatial Consortium (OGC) standards, including the Catalogue Service for the Web (CS-W) and, from the Sensor Web Enablement (SWE) suite, the Sensor Observation Service (SOS) and Observations and Measurements (O&M). Further, strong integration efforts are currently undertaken to harmonize data formats (e.g. NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.
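As a concrete illustration of the SWE service style mentioned above, the sketch below builds an OGC SOS 2.0 GetObservation request as a key-value-pair (KVP) URL. The endpoint, offering and observed-property names are hypothetical placeholders, not real EMSO identifiers; only the standard KVP parameter names come from the SOS 2.0 specification.

```python
# Minimal sketch of an OGC SOS 2.0 GetObservation KVP request, the kind of
# interface SWE-based infrastructures expose. Host/offering names are made up.
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property, t_begin, t_end):
    """Build a KVP GetObservation URL for an SOS 2.0 endpoint."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        # om:phenomenonTime is the standard temporal-filter value reference
        "temporalFilter": f"om:phenomenonTime,{t_begin}/{t_end}",
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation_url(
    "https://example.org/sos",       # hypothetical observatory node endpoint
    "seafloor-observatory-1",
    "sea_water_temperature",
    "2012-01-01T00:00:00Z", "2012-01-02T00:00:00Z",
)
print(url)
```

The resulting URL can be issued with any HTTP client; the service answers with an O&M-encoded observation document.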
Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis
Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...
2008-01-01
Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large-scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.
VLTI: First Light for the Second Generation
NASA Astrophysics Data System (ADS)
Woillez, J.; Gonté, F.; Abad, J. A.; Abadie, S.; Abuter, R.; Accardo, M.; Acuña, M.; Alonso, J.; Andolfato, L.; Avila, G.; Barriga, P. J.; Beltran, J.; Berger, J.-P.; Bollados, C.; Bourget, P.; Brast, R.; Bristow, P.; Caniguante, L.; Castillo, R.; Conzelmann, R.; Cortes, A.; Delplancke, F.; Dell Valle, D.; Derie, F.; Diaz, A.; Donoso, R.; Duhoux, Ph.; Dupuy, C.; Elao, C.; Egner, S.; Fuenteseca, E.; Fernandez, R.; Gaytan, D.; Glindemann, A.; Gonzales, J.; Guisard, S.; Hagenauer, P.; Haimerl, A.; Heinz, V.; Henriquez, J. P.; van der Heyden, P.; Hubin, N.; Huerta, R.; Jochum, L.; Kirchbauer, J.-P.; Leiva, A.; Lévêque, S.; Lizon, J.-P.; Luco, F.; Mardones, P.; Mellado, A.; Mérand, A.; Osorio, J.; Ott, J.; Pallanca, L.; Pavez, M.; Pasquini, L.; Percheron, I.; Pirard, J.-F.; Phan, D. T.; Pineda, J. C.; Pino, A.; Poupar, S.; Ramírez, A.; Reinero, C.; Riquelme, M.; Romero, J.; Rivinius, Th.; Rojas, C.; Rozas, F.; Salgado, F.; Schöller, M.; Schuhler, N.; Siclari, W.; Stephan, C.; Tamblay, R.; Tapia, M.; Tristram, K.; Valdes, G.; de Wit, W.-J.; Wright, A.; Zins, G.
2015-12-01
The Very Large Telescope Interferometer (VLTI) stopped operation on 4 March 2015 with the objective of upgrading its infrastructure in preparation for the second generation VLTI instruments GRAVITY and MATISSE. A brief account of the eight bustling months it took our interferometer to metamorphose into its second generation, under the supervision of the VLTI Facility Project, is presented.
JTEC panel on display technologies in Japan
NASA Technical Reports Server (NTRS)
Tannas, Lawrence E., Jr.; Glenn, William E.; Credelle, Thomas; Doane, J. William; Firester, Arthur H.; Thompson, Malcolm
1992-01-01
This report is one in a series of reports that describes research and development efforts in Japan in the area of display technologies. The following are included in this report: flat panel displays (technical findings, liquid crystal display development and production, large flat panel displays (FPDs), electroluminescent displays and plasma panels, infrastructure in Japan's FPD industry, market and projected sales, and a new a-Si active matrix liquid crystal display (AMLCD) factory); materials for flat panel displays (liquid crystal materials, and light-emissive display materials); manufacturing and infrastructure of active matrix liquid crystal displays (manufacturing logistics and equipment); passive matrix liquid crystal displays (LCD basics, twisted nematic LCDs, supertwisted nematic LCDs, ferroelectric LCDs, and a comparison of passive matrix LCD technology); active matrix technology (basic active matrix technology, investment environment, amorphous silicon, polysilicon, and commercial products and prototypes); and projection displays (comparison of Japanese and U.S. display research, and technical evaluation of work).
The role of ethics in data governance of large neuro-ICT projects.
Stahl, Bernd Carsten; Rainey, Stephen; Harris, Emma; Fothergill, B Tyr
2018-05-14
We describe current practices of ethics-related data governance in large neuro-ICT projects, identify gaps in current practice, and put forward recommendations on how to collaborate ethically in complex regulatory and normative contexts. We undertake a survey of published principles of data governance of large neuro-ICT projects. This grounds an approach to a normative analysis of current data governance approaches. Several ethical issues are well covered in the data governance policies of neuro-ICT projects, notably data protection and attribution of work. Projects use a set of similar policies to ensure users behave appropriately. However, many ethical issues are not covered at all. Implementation and enforcement of policies remain vague. The data governance policies we investigated indicate that the neuro-ICT research community is currently close-knit and that shared assumptions are reflected in infrastructural aspects. This explains why many ethical issues are not explicitly included in data governance policies at present. With neuro-ICT research growing in scale, scope, and international involvement, these shared assumptions should be made explicit and reflected in data governance.
NASA Astrophysics Data System (ADS)
Costa, Luís; Monteiro, José Paulo; Leitão, Teresa; Lobo-Ferreira, João Paulo; Oliveira, Manuel; Martins de Carvalho, José; Martins de Carvalho, Tiago; Agostinho, Rui
2015-04-01
The Campina de Faro (CF) aquifer system, located on the south coast of Portugal, is an important source of groundwater, mostly used for agricultural purposes. In some areas, this multi-layered aquifer is contaminated with high concentrations of nitrate, possibly arising from excessive usage of fertilizers, reaching values as high as 300 mg/L. In order to tackle this problem, Managed Aquifer Recharge (MAR) techniques are being applied at demonstration scale to improve groundwater quality through aquifer recharge, using both infiltration basins in the riverbed of the ephemeral Rio Seco and existing traditional large-diameter wells located in this aquifer. In order to assess the infiltration capacity of the existing infrastructures, in particular the infiltration basins and large-diameter wells of the CF aquifer, infiltration tests were performed, indicating a high infiltration capacity. Concerning the sources of water for recharge, rainwater harvested at greenhouses was identified in the CF aquifer area as one of the main potential sources for aquifer recharge, since these structures occupy a large surface area at the demo site. This potential source of water could, in some cases, be redirected to the large-diameter wells or to the infiltration basins in the riverbed of the Rio Seco. Estimates of rainwater harvested at greenhouses were calculated based on a 32-year average rainfall model and on the location of the greenhouses and their surface areas, the latter based on aerial photographs. The potential annual rainwater intercepted by greenhouses in the CF aquifer is estimated at an average of 1.63 hm3/year. Nonetheless, it is unlikely that the totality of this amount can be harvested, collected and redirected to aquifer recharge infrastructures, for several reasons, such as the lack of appropriate greenhouse infrastructures or conduits, or the distance between greenhouses and the large-diameter wells and infiltration basins. 
Even so, this value is a good indication of the total amount of harvested rainfall that could be considered for future MAR solutions. Given the estimates of greenhouse-harvested rainwater and the infiltration capacity of the infiltration basins and large-diameter wells, it is intended to develop groundwater flow models in order to assess the nitrate washing rate in the CF aquifer. This work is being developed under the scope of the MARSOL Project (MARSOL-GA-2013-619120), in which the Campina de Faro aquifer system is one of several case studies. The project aims to demonstrate that MAR is a sound, safe and sustainable strategy that can be applied with great confidence in finding solutions to water scarcity in Southern Europe.
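The harvested-rainwater estimate is, at its core, a simple area-times-depth calculation. The sketch below shows the arithmetic; the area and rainfall figures are assumed for illustration only, since the study's 1.63 hm3/year was derived from a 32-year rainfall model and mapped greenhouse areas.

```python
# Back-of-envelope check of a greenhouse rainwater-harvesting estimate:
# harvested volume = collection (roof) area x mean annual rainfall depth.
# Input figures below are illustrative assumptions, not the study's data.

def harvested_volume_hm3(area_m2: float, rainfall_mm_per_year: float) -> float:
    """Potential harvested volume in cubic hectometres (1 hm3 = 1e6 m3)."""
    volume_m3 = area_m2 * (rainfall_mm_per_year / 1000.0)  # mm -> m of depth
    return volume_m3 / 1e6

# e.g. about 3.26 km2 of greenhouse cover under 500 mm/year of rain:
vol = harvested_volume_hm3(3.26e6, 500.0)
print(round(vol, 2))  # 1.63
```

This shows how a total on the order of 1.63 hm3/year arises from a few square kilometres of greenhouse cover under typical southern-Portugal rainfall.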
NASA Astrophysics Data System (ADS)
Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.
2016-07-01
The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software to replace the pre-existing in-house solution. A brief introduction to software quality practices is given, followed by a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adopt the new system are described, how these were applied to the current software, and the results obtained. An overview of how the new system may be used in future projects is also presented.
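The essence of what a CI server such as Jenkins automates is a stage-gated flow: run build and test stages in order and stop at the first failure. The stage names and the simulated outcomes below are hypothetical, purely to illustrate the control flow.

```python
# Minimal sketch of a stage-gated build/test flow, the pattern a CI server
# like Jenkins automates. Stages and their outcomes are simulated here.

def run_pipeline(stages):
    """Run (name, step) pairs in order; return (passed_stages, success).
    Each step is a callable returning True on success."""
    passed = []
    for name, step in stages:
        if not step():
            return passed, False   # gate: later stages never run
        passed.append(name)
    return passed, True

stages = [
    ("checkout",   lambda: True),
    ("build",      lambda: True),
    ("unit-tests", lambda: False),  # simulated failing test stage
    ("deploy",     lambda: True),   # unreachable after the failure above
]
done, ok = run_pipeline(stages)
print(done, ok)  # ['checkout', 'build'] False
```

In a real Jenkins setup the stages would be shell or build-tool invocations declared in a pipeline definition, but the fail-fast gating logic is the same.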
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2013-04-01
The second phase of the SeaDataNet project started in October 2011 for another 4 years, with the aim of upgrading the SeaDataNet infrastructure built during previous years. The numbers of the project are quite impressive: 59 institutions from 35 different countries are involved. In particular, 45 data centres are sharing human and financial resources in a common effort to sustain an operationally robust and state-of-the-art pan-European infrastructure for providing up-to-date and high-quality access to ocean and marine metadata, data and data products. The main objective of SeaDataNet II is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via the pan-European oceanographic fleet and the new observation systems, both in real time and in delayed mode. The infrastructure is based on a semi-distributed system that incorporates and enhances the existing NODC network. SeaDataNet aims at serving users from science, environmental management, policy making, and economic sectors. Better integrated data systems are vital for these users to achieve improved scientific research and results, to support marine environmental and integrated coastal zone management, to establish indicators of Good Environmental Status for sea basins, and to support offshore industry developments, shipping, fisheries, and other economic activities. The recent EU communication "MARINE KNOWLEDGE 2020 - marine data and observation for smart and sustainable growth" states that the creation of marine knowledge begins with observation of the seas and oceans. In addition, directives, policies and science programmes require reporting on the state of the seas and oceans in an integrated pan-European manner: of particular note are INSPIRE, MSFD, WISE-Marine and the GMES Marine Core Service. These underpin the importance of a well-functioning marine and ocean data management infrastructure. 
SeaDataNet is now one of the major players in informatics in oceanography, and collaborative relationships have been created with other EU and non-EU projects. In particular, SeaDataNet has recognised roles in the continuous serving of common vocabularies, the provision of tools for data management, and the provision of access to metadata, data sets and data products of importance for society. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users not only with background information about SeaDataNet and the various SeaDataNet standards and tools, but also with a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres. The presentation will describe the present services of the SeaDataNet infrastructure and highlight a number of key achievements in SeaDataNet II so far.
Evaluation of Future Internet Technologies for Processing and Distribution of Satellite Imagery
NASA Astrophysics Data System (ADS)
Becedas, J.; Perez, R.; Gonzalez, G.; Alvarez, J.; Garcia, F.; Maldonado, F.; Sucari, A.; Garcia, J.
2015-04-01
Satellite imagery data centres are designed to operate a defined number of satellites. As a consequence, difficulties appear when new satellites have to be incorporated into the system. This occurs because traditional infrastructures are neither flexible nor scalable. With the appearance of Future Internet technologies, new solutions can be provided to manage large and variable amounts of data on demand. These technologies optimize resources and facilitate the appearance of new applications and services in the traditional Earth Observation (EO) market. The use of Future Internet technologies for the EO sector was validated with the GEO-Cloud experiment, part of the Fed4FIRE FP7 European project. This work presents the final results of the project, in which a constellation of satellites records the whole Earth surface on a daily basis. The satellite imagery is downloaded into a distributed network of ground stations and ingested into a cloud infrastructure, where the data are processed, stored, archived and distributed to the end users. The processing and transfer times inside the cloud, the workload of the processors, automatic cataloguing, and accessibility through the Internet are evaluated to assess whether Future Internet technologies present advantages over traditional methods. The applicability of these technologies to providing high added-value services is also evaluated. Finally, the advantages of using federated testbeds to carry out large-scale, industry-driven experiments are analysed, evaluating the feasibility of an experiment developed in the European infrastructure Fed4FIRE and its migration to a commercial cloud: SoftLayer, an IBM Company.
NASA Astrophysics Data System (ADS)
Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd
2009-05-01
Infrastructure management (and its associated processes) is complex to understand and perform, which makes efficient, effective and informed decisions hard to reach. The management involves a multi-faceted operation that requires robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system, with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry and local and federal government agencies. IRSV is being designed to accommodate essential needs in the following aspects: 1) better understanding and enforcement of the complex inspection process, bridging the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) aggregation, representation and fusion of complex multi-layered heterogeneous data (e.g. infrared imaging, aerial photos and ground-mounted LIDAR) with domain application knowledge to support a machine-understandable recommendation system; 3) robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) integration of these needs through a flexible Service-Oriented Architecture (SOA) framework to compose and provide services on demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring, both periodically (annually, monthly, even daily if needed) and after extreme events.
The Effect of Pixel Size on the Accuracy of Orthophoto Production
NASA Astrophysics Data System (ADS)
Kulur, S.; Yildiz, F.; Selcuk, O.; Yildiz, M. A.
2016-06-01
In our country, orthophoto products are used by the public and private sectors for engineering services and infrastructure projects. Orthophotos are particularly preferred because their production is faster and more economical than vector-based digital photogrammetric production. Today, digital orthophotos provide the accuracy expected for engineering and infrastructure projects. In this study, the accuracy of orthophotos produced at different pixel sizes (sampling intervals) is tested against the expectations of engineering and infrastructure projects.
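The link between sensor pixel size and the pixel size on the ground follows the standard photogrammetric scale relation GSD = p * H / f, where p is the sensor pixel pitch, H the flying height above ground and f the focal length. A small sketch, with illustrative values not taken from the study:

```python
# Ground sample distance (GSD): the ground footprint of one sensor pixel,
# from the standard photogrammetric scale relation GSD = p * H / f.
# Camera parameters below are illustrative assumptions.

def ground_sample_distance(pixel_pitch_m, flying_height_m, focal_length_m):
    """GSD in metres per pixel."""
    return pixel_pitch_m * flying_height_m / focal_length_m

# 6-micron pixels, 1200 m above ground, 120 mm lens -> 6 cm on the ground
gsd = ground_sample_distance(6e-6, 1200.0, 0.12)
print(round(gsd, 3))  # 0.06
```

Halving the GSD (e.g. by flying lower or using a finer-pitch sensor) correspondingly tightens the planimetric accuracy an orthophoto can support, which is why pixel size drives its suitability for engineering projects.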
Mapping (un)certainties in the sign of hydrological projections
NASA Astrophysics Data System (ADS)
Melsen, Lieke; Addor, Nans; Mizukami, Naoki; Newman, Andrew; Torfs, Paul; Clark, Martyn; Uijlenhoet, Remko; Teuling, Ryan
2017-04-01
While hydrological projections are of vital importance, particularly for water infrastructure design and food production, they are also prone to different sources of uncertainty. Using a multi-model set-up we investigated the uncertainty in hydrological projections for the period 2070-2100 associated with the parameterization of hydrological models, hydrological model structure, and General Circulation Models (GCMs) needed to force the hydrological model, for 605 basins throughout the contiguous United States. The use of such a large sample of basins gave us the opportunity to recognize spatial patterns in the results, and to attribute the uncertainty to particular hydrological processes. We investigated the sign of the projected change in mean annual runoff. The parameterization influenced the sign of change in 5 to 34% of the basins, depending on the hydrological model and GCM forcing. The hydrological model structure led to uncertainty in the sign of the change in 13 to 26% of the basins, depending on GCM forcing. This uncertainty could largely be attributed to the conceptualization of snow processes in the hydrological models. In 14% of the basins, none of the hydrological models was behavioural, which could be related to catchments with high aridity and intermittent flow behaviour. In 41 to 69% of the basins, the sign of the change was uncertain due to GCM forcing, which could be attributed to disagreement among the climate models regarding the projected change in precipitation. The results demonstrate that even the sign of change in mean annual runoff is highly uncertain in the majority of the investigated basins. If we want to use hydrological projections for water management purposes, including the design of water infrastructure, we clearly need to increase our understanding of climate and hydrological processes and their feedbacks.
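The core diagnostic in the study above is a sign-of-change test: for each basin, a basin is "uncertain" when ensemble members disagree on whether mean annual runoff increases or decreases. A minimal sketch of that bookkeeping, with made-up numbers:

```python
# Sketch of the sign-agreement diagnostic used in multi-model runoff studies:
# flag a basin as uncertain when ensemble members disagree on the sign of the
# projected change. All values below are fabricated for illustration.

def sign_uncertain_fraction(changes_by_basin):
    """changes_by_basin: one list per basin, one projected-change value per
    ensemble member. Returns the fraction of basins with sign disagreement."""
    uncertain = 0
    for changes in changes_by_basin:
        signs = {c > 0 for c in changes if c != 0}
        if len(signs) > 1:       # both wetter and drier members present
            uncertain += 1
    return uncertain / len(changes_by_basin)

# three basins: agreement (wetter), agreement (drier), disagreement
basins = [[0.12, 0.30, 0.05], [-0.2, -0.1, -0.4], [0.1, -0.3, 0.2]]
frac = sign_uncertain_fraction(basins)
print(frac)  # one basin of three is uncertain
```

Applying this per source of spread (parameter sets, model structures, GCM forcings) yields percentages like the 5-34%, 13-26% and 41-69% ranges reported above.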
Strengthening Software Authentication with the ROSE Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2006-06-15
Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure, developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99 and C++ languages, with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements and apply it to software authentication for nonproliferation and arms control projects.
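ROSE itself is a C/C++ source-to-source framework, but the core idea — automated rules that walk a program's syntax tree and flag suspicious constructs to guide visual inspection — can be sketched compactly with Python's standard `ast` module. The "suspicious" rule here (calls to `eval`/`exec`) is purely illustrative, not one of ROSE's rules.

```python
# Toy AST-based rule in the spirit of compiler-based authentication tooling:
# parse source, walk the tree, and report calls to flagged functions.
import ast

SUSPICIOUS_CALLS = {"eval", "exec"}  # illustrative rule set

def flag_suspicious(source: str):
    """Return (line_number, function_name) for each flagged call."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in SUSPICIOUS_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

code = "x = 1\ny = eval('x + 1')\n"
print(flag_suspicious(code))  # [(2, 'eval')]
```

A real authentication tool adds many such rules over a full compiler front end, so that every language construct — not just a parseable subset — is visible to the analysis.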
2005-06-01
Keywords: Logistics, BA-5590, BB-390, BB-2590, PVPC, Iraq, Power Grid, Infrastructure, Cost Estimate, Photovoltaic Power Conversion (PVPC), MPPT. This project examines the cost and feasibility of using photovoltaic (PV) solar power to assist in the rebuilding of the Iraqi electrical infrastructure.
Use of NARCCAP results for extremes: British Columbia case studies
NASA Astrophysics Data System (ADS)
Murdock, T. Q.; Eckstrand, H.; Buerger, G.; Hiebert, J.
2011-12-01
Demand for projections of extremes has arisen out of local infrastructure vulnerability assessments and adaptation planning. Four preliminary analyses of extremes have been undertaken in British Columbia in the past two years in collaboration with users: the BC Ministry of Transportation and Infrastructure, Engineers Canada, the City of Castlegar, and the Columbia Basin Trust. Projects have included analysis of extremes for stormwater management, highways, and community adaptation in different areas of the province. This need for projections of extremes has been met using an ensemble of Regional Climate Model (RCM) results from NARCCAP, in some cases supplemented by and compared to statistical downscaling. Before assessing indices of extremes, each RCM simulation in the NARCCAP ensemble driven by reanalysis (NCEP) was compared to historical observations to assess RCM skill. Next, the anomalies according to each RCM future projection were compared to those of their driving GCM to determine the "value added" by the RCMs. Selected results will be shown for several indices of extremes, including the Climdex set of indices that has been widely used elsewhere (e.g., Stardex) and specific parameters of interest defined by users. Finally, the need for threshold scaling of some indices and for the use of as large an ensemble as possible will be illustrated.
Navajo-Hopi Land Commission Renewable Energy Development Project (NREP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas Benally, Deputy Director,
2012-05-15
The Navajo Hopi Land Commission Office (NHLCO), a Navajo Nation executive branch agency, has conducted capacity-building, institution-building, outreach and management activities to initiate the development of large-scale renewable energy generating projects - 100 megawatts (MW) or larger - on land in Northwestern New Mexico in the first year of a multi-year program. The Navajo Hopi Land Commission Renewable Energy Development Project (NREP) is a one-year program that will develop and market a strategic business plan; form multi-agency and public-private project partnerships; compile site-specific solar, wind and infrastructure data; and develop and use project communication and marketing tools to support outreach efforts targeting the public, vendors, investors and government audiences.
NASA Astrophysics Data System (ADS)
Bulega, T.; Kyeyune, A.; Onek, P.; Sseguya, R.; Mbabazi, D.; Katwiremu, E.
2011-10-01
Several publications have identified technical challenges facing Uganda's National Transmission Backbone Infrastructure project. This research addresses the technical limitations of the National Transmission Backbone Infrastructure project, evaluates the goals of the project, and compares the results against the technical capability of the backbone. The findings of the study indicate a bandwidth deficit, which will be addressed by using dense wavelength division multiplexing repeaters and by leasing bandwidth from private companies. Microwave links for redundancy, a Network Operation Center for operation and maintenance, and the deployment of Worldwide Interoperability for Microwave Access (WiMAX) as a last-mile solution are also suggested.
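A bandwidth deficit of the kind reported above is simple arithmetic: aggregate fibre capacity is channel count times per-channel rate, compared against offered load. The channel plan and demand figure below are assumed for illustration, not taken from the study.

```python
# Illustrative DWDM capacity check motivating a "bandwidth deficit" finding.
# Channel plan and demand are assumptions, not figures from the study.

def dwdm_capacity_gbps(channels: int, rate_per_channel_gbps: float) -> float:
    """Aggregate fibre capacity = channel count x per-channel line rate."""
    return channels * rate_per_channel_gbps

capacity = dwdm_capacity_gbps(40, 10.0)   # a common 40 x 10 Gb/s channel plan
demand = 500.0                            # assumed offered load in Gb/s
deficit = max(0.0, demand - capacity)
print(capacity, deficit)  # 400.0 100.0
```

Adding DWDM repeaters (to light more channels over longer spans) and leasing third-party capacity are two ways of closing such a gap, as the study suggests.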
Towards a single seismological service infrastructure in Europe
NASA Astrophysics Data System (ADS)
Spinuso, A.; Trani, L.; Frobert, L.; Van Eck, T.
2012-04-01
In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating access to their archives towards a Service Oriented Architecture (SOA). This process pragmatically follows the technological trends and available solutions, aiming to effectively improve all data stewardship activities. These advancements are possible thanks to the cooperation and follow-ups of several EC infrastructural projects that, by looking at general-purpose techniques, combine their developments envisioning a multidisciplinary platform for earth observation as the final common objective (EPOS, Earth Plate Observation System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets, such as seismic waveforms, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects. Here we will address initiatives in two of those projects. The NERA (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) project will implement a Common Services Architecture based on OGC service APIs, in order to provide resource-oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe). 
VERCE aims to enable seismologists to exploit the wealth of seismic data within a data-intensive computation framework tailored to the specific needs of the community. It will provide a new interoperable infrastructure as the computational backbone lying behind the publicly available interfaces. VERCE will have to face the challenges of implementing a service-oriented architecture that provides an efficient layer between the Data and Grid infrastructures, coupling HPC data analysis and HPC data modeling applications through the execution of workflows and data sharing mechanisms. Online registries of interoperable workflow components, storage of intermediate results, and data provenance are the aspects currently under investigation to make the VERCE facilities usable by a large range of users, data providers and service providers. For such purposes, the adoption of a Digital Object Architecture, to create online catalogs referencing and semantically describing all these distributed resources (datasets, computational processes and derivative products), is seen as one of the viable solutions to monitor and steer the usage of the infrastructure, increasing its efficiency and the cooperation within the community.
ERIC Educational Resources Information Center
Sands, Ashley Elizabeth
2017-01-01
Ground-based astronomy sky surveys are massive, decades-long investments in scientific data collection. Stakeholders expect these datasets to retain scientific value well beyond the lifetime of the sky survey. However, the necessary investments in knowledge infrastructures for managing sky survey data are not yet in place to ensure the long-term…
Stereoscopic applications for design visualization
NASA Astrophysics Data System (ADS)
Gilson, Kevin J.
2007-02-01
Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB uses commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic views of the proposed designs. This paper presents the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content, and discusses several architectural and engineering design visualizations we have produced.
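Stereoscopic rendering of the kind described above is commonly implemented with off-axis (asymmetric-frustum) projection, one frustum per eye. As a hedged illustration (the abstract does not detail PB's actual implementation; the function and parameter names are hypothetical), the horizontal frustum bounds for a viewer-centered screen can be computed as:

```python
def stereo_frusta(screen_w, screen_dist, near, eye_sep):
    """Horizontal near-plane frustum bounds (l, r) for an off-axis stereo pair.

    screen_w    -- physical screen width
    screen_dist -- viewer-to-screen distance
    near        -- near clip plane distance
    eye_sep     -- interocular separation
    """
    half_w = screen_w / 2.0
    frusta = {}
    for name, eye_x in (("left", -eye_sep / 2.0), ("right", +eye_sep / 2.0)):
        # Project the screen edges, shifted by the eye offset, onto the near plane.
        l = (-half_w - eye_x) * near / screen_dist
        r = (+half_w - eye_x) * near / screen_dist
        frusta[name] = (l, r)
    return frusta
```

The two bounds can be fed to a glFrustum-style projection call; because both frusta converge on the physical screen plane, objects at screen depth render with zero parallax.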
Eco-logical successes : third edition, September 2012
DOT National Transportation Integrated Search
2012-09-01
Eco-Logical: An Ecosystem Approach to Developing Infrastructure Projects outlines an ecosystem-scale approach to prioritizing, developing, and delivering infrastructure projects. Eco-Logical emphasizes interagency collaboration in order to create inf...
More Bang for the Buck: Integrating Green Infrastructure into Existing Public Works Projects
shares lessons learned from municipal and county officials experienced in coordinating green infrastructure applications with scheduled street maintenance, park improvements, and projects on public sites.
The role of assessment infrastructures in crafting project-based science classrooms
NASA Astrophysics Data System (ADS)
D'Amico, Laura Marie
In project-based science teaching, teachers engage students in the practice of conducting meaningful investigations and explanations of natural phenomena, often in collaboration with fellow students or adults. Reformers suggest that this approach can provide students with more profitable learning experiences; but for many teachers, a shift to such instruction can be difficult to manage. As some reform-minded teachers have discovered, classroom assessment can serve as a vital tool for meeting the challenges associated with project science activity. In this research, classroom assessment was viewed as an infrastructure that both students and teachers rely upon as a mediational tool for classroom activity and communications. The study explored the classroom assessment infrastructures created by three teachers involved in the Learning through Collaborative Visualization (CoVis) Project from 1993--94 to 1995--96. Each of the three teachers under study either created a new course or radically reformulated an old one in an effort to incorporate project-based science pedagogy and supporting technologies. Data in the form of interviews, classroom observations, surveys, student work, and teacher records was collected. From these data, an interpretive case study was developed for each course and its accompanying assessment infrastructure. A set of cross-case analyses was also constructed, based upon common themes that emerged from all three cases. These themes included: the assessment challenges based on the nature of project activity, the role of technology in the teachers' assessment infrastructure designs, and the influence of the wider assessment infrastructure on their course and assessment designs. In combination, the case studies and cross-case analyses describe the synergistic relationship between the design of pedagogical reforms and classroom assessment infrastructures, as well as the effectiveness of all three assessment designs. 
This work contributes to research and practice associated with assessment and pedagogical reform in three ways. First, it provides a theoretical frame for the relationship between assessment and pedagogical reform. Second, it provides a set of taxonomies which outline both the challenges of project-based science activity and typical assessment strategies to meet them. Finally, it provides a set of cautions and recommendations for designing classroom assessment infrastructures in support of project-based science.
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of its greatest advantages for scientific research is independence from access to a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.
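The claim that cloud resources shorten experiment wall-clock time rests on ensemble experiments being embarrassingly parallel: each member runs independently, so elastic provisioning lets all members run at once. A minimal sketch of this pattern, with the model run replaced by a placeholder (the paper's actual setup is not specified here):

```python
from concurrent.futures import ThreadPoolExecutor

def run_member(perturbation):
    # Placeholder for one ensemble member, e.g. a model integration with
    # perturbed initial conditions; returns a mock forecast value.
    return 15.0 + perturbation

def run_ensemble(perturbations, workers=4):
    """Run all ensemble members concurrently and summarize the spread."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        forecasts = list(pool.map(run_member, perturbations))
    mean = sum(forecasts) / len(forecasts)
    spread = max(forecasts) - min(forecasts)  # crude uncertainty measure
    return mean, spread
```

With enough cloud instances, wall-clock time approaches that of a single member regardless of ensemble size; the trade-off is the per-instance monetary cost the paper analyzes.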
The Australian Replacement Research Reactor
NASA Astrophysics Data System (ADS)
Kennedy, Shane; Robinson, Robert
2004-03-01
The 20-MW Australian Replacement Research Reactor represents possibly the greatest single research infrastructure investment in Australia's history. Construction of the facility has commenced, following award of the construction contract in July 2000 and of the construction licence in April 2002. The project includes a large state-of-the-art liquid-deuterium cold-neutron source and supermirror guides feeding a large modern guide hall, in which most of the instruments are placed. Alongside the guide hall there is good provision of laboratory and office space, and space for support activities. While the facility has room for up to 18 instruments, the project has funding for an initial set of 8 instruments, which will be ready when the reactor is fully operational in July 2006. Instrument performance will be competitive with the best research-reactor facilities anywhere, and our goal is to be among the top 3 such facilities worldwide. Staff to lead the design effort and man these instruments have been hired on the international market from leading overseas facilities, and from within Australia, and 7 out of 8 instruments have been specified and costed. At present the instrumentation project carries a 10% contingency. An extensive dialogue has taken place with the domestic user community and our international peers via various means, including a series of workshops over the last 2 years covering all 8 instruments, emerging areas of application such as biology and the earth sciences, and computing infrastructure for the instruments.
IT Infrastructure Components for Biobanking
Prokosch, H.U.; Beck, A.; Ganslandt, T.; Hummel, M.; Kiehntopf, M.; Sax, U.; Ückert, F.; Semler, S.
2010-01-01
Objective Within translational research projects in recent years, large biobanks have been established, mostly supported by homegrown, proprietary software solutions. No general requirements for biobanking IT infrastructures have been published yet. This paper presents an exemplary biobanking IT architecture, a requirements specification for a biorepository management tool and exemplary illustrations of three major types of requirements. Methods We pursued a comprehensive literature review of biobanking IT solutions and established an interdisciplinary expert panel for creating the requirements specification. The exemplary illustrations were derived from a requirements analysis within two university hospitals. Results The requirements specification comprises a catalog with more than 130 detailed requirements grouped into 3 major categories and 20 subcategories. Special attention is given to multitenancy capabilities in order to support the project-specific definition of varying research and biobanking contexts, the definition of workflows to track sample processing, transportation and storage, and the automated integration of preanalytic handling and storage robots. Conclusion IT support for biobanking projects can be based on a federated architectural framework comprising primary data sources for clinical annotations, a pseudonymization service, a clinical data warehouse with a flexible and user-friendly query interface, and a biorepository management system. Flexibility and scalability of all such components are vital, since large medical facilities such as university hospitals will have to support biobanking for varying monocentric and multicentric research scenarios and multiple medical clients. PMID:23616851
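One component named in the federated framework above, the pseudonymization service, is commonly built on a keyed hash: the same patient identifier always maps to the same pseudonym within a research context, while different contexts yield unlinkable pseudonyms. The sketch below is an illustrative assumption, not the architecture the paper specifies:

```python
import hmac
import hashlib

def pseudonymize(patient_id: str, secret_key: bytes, context: str) -> str:
    """Derive a stable, context-specific pseudonym from a patient identifier.

    The secret key must be held only by the pseudonymization service;
    without it, pseudonyms cannot be linked back to identifiers or to
    pseudonyms issued for other research contexts.
    """
    message = f"{context}:{patient_id}".encode()
    return hmac.new(secret_key, message, hashlib.sha256).hexdigest()[:16]
```

Stability within a context lets the clinical data warehouse join annotations to samples, while unlinkability across contexts supports the multitenancy requirement of separate research projects.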
IT Infrastructure Components for Biobanking.
Prokosch, H U; Beck, A; Ganslandt, T; Hummel, M; Kiehntopf, M; Sax, U; Uckert, F; Semler, S
2010-01-01
Within translational research projects in recent years, large biobanks have been established, mostly supported by homegrown, proprietary software solutions. No general requirements for biobanking IT infrastructures have been published yet. This paper presents an exemplary biobanking IT architecture, a requirements specification for a biorepository management tool and exemplary illustrations of three major types of requirements. We pursued a comprehensive literature review of biobanking IT solutions and established an interdisciplinary expert panel for creating the requirements specification. The exemplary illustrations were derived from a requirements analysis within two university hospitals. The requirements specification comprises a catalog with more than 130 detailed requirements grouped into 3 major categories and 20 subcategories. Special attention is given to multitenancy capabilities in order to support the project-specific definition of varying research and biobanking contexts, the definition of workflows to track sample processing, transportation and storage, and the automated integration of preanalytic handling and storage robots. IT support for biobanking projects can be based on a federated architectural framework comprising primary data sources for clinical annotations, a pseudonymization service, a clinical data warehouse with a flexible and user-friendly query interface, and a biorepository management system. Flexibility and scalability of all such components are vital, since large medical facilities such as university hospitals will have to support biobanking for varying monocentric and multicentric research scenarios and multiple medical clients.
Lovis, Christian; Colaert, Dirk; Stroetmann, Veli N
2008-01-01
The concepts and architecture underlying a large-scale integrating project funded within the 7th EU Framework Programme (FP7) are discussed. The main objective of the project is to build a tool that will have a significant impact on the monitoring and control of infectious diseases and antimicrobial resistances in Europe. This will be realized by building a technical and semantic infrastructure able to share heterogeneous clinical data sets from different hospitals in different countries, with different languages and legislations; to analyze large amounts of this clinical data with advanced multimedia data mining; and finally to apply the obtained knowledge to clinical decisions and outcome monitoring. There are numerous challenges in this project at all levels, technical, semantic, legal and ethical, that will have to be addressed.
NASA Astrophysics Data System (ADS)
Maffioletti, Sergio; Dawes, Nicholas; Bavay, Mathias; Sarni, Sofiane; Lehning, Michael
2013-04-01
The Swiss Experiment platform (SwissEx: http://www.swiss-experiment.ch) provides a distributed storage and processing infrastructure for environmental research experiments. The aim of the second-phase project (the Open Support Platform for Environmental Research, OSPER, 2012-2015) is to develop the existing infrastructure to provide scientists with an improved workflow, including pre-defined, documented and connected processing routines. A large-scale computing and data facility is required to provide reliable and scalable access to data for analysis, and it is desirable that such an infrastructure be free of traditional data-handling methods. Such an infrastructure has been developed using the cloud-based part of the Swiss national infrastructure SMSCG (http://www.smscg.ch) and Academic Cloud. The infrastructure under construction supports two main usage models: 1) Ad-hoc data analysis scripts: simple processing scripts, written by the environmental researchers themselves, which can be applied to large data sets via the high-power infrastructure. Examples of this type are spatial statistical analysis scripts (R-based), mostly computed on raw meteorological and/or soil moisture data, providing processed output in the form of a grid, a plot, or a KML file. 2) Complex models: a more intensive data analysis pipeline centered (initially) around the physical process model Alpine3D and the MeteoIO plugin; depending on the data set, this may require a tightly coupled infrastructure. SMSCG already supports Alpine3D executions both as regular grid jobs and as virtual software appliances. A dedicated appliance with the Alpine3D-specific libraries has been created and made available through the SMSCG infrastructure.
The analysis pipelines are activated and supervised by simple control scripts that, depending on the data fetched from the meteorological stations, launch new instances of the Alpine3D appliance, execute location-based subroutines at each grid point and store the results back into the central repository for post-processing. An optional extension of this infrastructure will be to provide a 'ring buffer'-type database, such that model results (e.g. test runs made to check parameter dependency or for development) can be visualised and downloaded after completion without being committed to permanent storage. Data organization: data collected from sensors are archived and classified in distributed sites connected with an open-source software middleware, GSN. Publicly available data are accessible through common web services and via a cloud storage server (based on Swift); collocation of the data and processing in the cloud would eventually eliminate data transfer requirements. Execution control logic: execution of the data analysis pipelines (for both the R-based analysis and the Alpine3D simulations) has been implemented using the GC3Pie framework developed by UZH (https://code.google.com/p/gc3pie/). This allows large-scale, fault-tolerant execution of the pipelines, described in terms of software appliances, and supervision of large campaigns of appliances as a single simulation. This poster presents the fundamental architectural components of the data analysis pipelines together with initial experimental results.
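The fault-tolerant supervision that GC3Pie provides can be illustrated, in schematic form only (this is not the GC3Pie API; names are invented for the sketch), as a control loop that retries failed pipeline runs and reports the ones that never succeed:

```python
def supervise(tasks, run, max_retries=2):
    """Run each task via `run`, retrying failures up to `max_retries` times.

    Returns (results, failed): `results` maps task -> output for tasks that
    eventually succeeded; `failed` lists tasks that exhausted their retries.
    """
    results, failed = {}, []
    for task in tasks:
        for attempt in range(max_retries + 1):
            try:
                results[task] = run(task)
                break
            except Exception:
                if attempt == max_retries:
                    failed.append(task)
    return results, failed
```

In the real infrastructure, `run` would submit an Alpine3D appliance instance for one station or grid tile; the supervisor's bookkeeping is what lets a whole campaign be treated as a single simulation.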
Why You Should Consider Green Stormwater Infrastructure for Your Community
This page provides an overview of the nation's infrastructure needs and cost and the benefits of integrating green infrastructure into projects that typically use grey infrastructure, such as roadways, sidewalks and parking lots.
PKI security in large-scale healthcare networks.
Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos
2012-06-01
During the past few years, many Public Key Infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multiple PKI domains.
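The multi-domain trust issue mentioned above reduces, in cross-certified PKIs, to finding a certification path from a relying party's trust anchor to a certificate issued in another domain. A minimal sketch (graph search only; real path validation also checks signatures, validity periods, revocation and policy constraints, and the CA names below are hypothetical):

```python
from collections import deque

def trust_path(cross_certs, start_ca, target_ca):
    """Find a chain of cross-certifications linking two domain CAs.

    cross_certs -- iterable of (issuer, subject) pairs, one per
                   cross-certificate issued between CAs.
    Returns the CA chain as a list, or None if no trust path exists.
    """
    graph = {}
    for issuer, subject in cross_certs:
        graph.setdefault(issuer, []).append(subject)
    queue, seen = deque([[start_ca]]), {start_ca}
    while queue:  # breadth-first search yields a shortest chain
        path = queue.popleft()
        if path[-1] == target_ca:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

Shortest chains matter in practice because every extra cross-certificate in the path is one more signature to verify and one more CA whose compromise breaks the trust relationship.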
European environmental research infrastructures are going for common 30 years strategy
NASA Astrophysics Data System (ADS)
Asmi, Ari; Konjin, Jacco; Pursula, Antti
2014-05-01
Environmental research infrastructures are facilities, resources, systems and related services used by research communities to conduct top-level research. Environmental research addresses processes at very different time scales, and supporting research infrastructures must be designed as long-term facilities in order to meet the requirements of continuous environmental observation, measurement and analysis. This longevity makes environmental research infrastructures ideal structures to support long-term development in the environmental sciences. The ENVRI project is a collaborative action of the major European (ESFRI) environmental research infrastructures working towards increased cooperation and interoperability between the infrastructures. One of the key products of the ENVRI project is to combine the long-term plans of the individual infrastructures into a common strategy describing the vision and planned actions. The envisaged vision for environmental research infrastructures toward 2030 is to support a holistic understanding of our planet and its behavior. The development of a 'Standard Model of the Planet' is a common ambition: a challenge to define an environmental standard model, a framework of all interactions within the Earth system, from the solid earth to near space. Indeed, scientists feel challenged to contribute to a 'Standard Model of the Planet' with data, models, algorithms and discoveries. Understanding the Earth system as an interlinked system requires a systems approach, and the environmental sciences are rapidly moving to become a system-level science, mainly because modern science, engineering and society increasingly face complex problems that can only be understood in the context of the full overall system. The strategy of the supporting, collaborating research infrastructures is based on developing three key factors for the environmental sciences: technological, cultural and human capital.
Technological capital development concentrates on improving the capacities to measure, observe, preserve and compute. This requires staff, technologies, sensors, satellites, floats, and software for integration, analysis and modeling, including data storage, computing platforms and networks. Cultural capital development addresses issues such as open access to data, rules, licenses, citation agreements, IPR agreements, technologies for machine-machine interaction, workflows, metadata, and the RI community at the policy level. Human capital actions are based on the anticipated need for specialists, including data scientists and 'generalists' who oversee more than just their own discipline. Developing these as interrelated services should help the scientific community to undertake innovative and large projects contributing to a 'Standard Model of the Planet'. To achieve the overall goal, ENVRI will publish a set of action items containing intermediate aims and larger and smaller steps towards the development of the 'Standard Model of the Planet' approach. This timeline of actions can be used as a reference and 'common denominator' in defining new projects and research programs, whether within the various environmental scientific disciplines, when cooperating among these disciplines, or when reaching out to other disciplines such as the social sciences, physics/chemistry and the medical/life sciences.
Private participation in infrastructure: A risk analysis of long-term contracts in power sector
NASA Astrophysics Data System (ADS)
Ceran, Nisangul
The objective of this dissertation is to assess whether private participation in the energy sector through long-term contracting, such as Build-Operate-Transfer (BOT) type investments, is an efficient way of promoting efficiency in the economy. To this end, the theoretical literature on the issue is discussed, the experiences of several developing countries are examined, and a BOT project undertaken by the Enron company in Turkey is studied in depth as a case study. Different risk analysis techniques, including sensitivity analysis and probabilistic risk analysis with the Monte Carlo Simulation (MCS) method, have been applied to assess the financial feasibility and risks of the case study project, and to shed light on the level of rent-seeking in BOT agreements. Although data on rent-seeking and corruption are difficult to obtain, the analysis of the case study investment using the sensitivity and MCS methods provided some information that can be used in assessing the level of rent-seeking in BOT projects. The risk analysis enabled testing of the sustainability of long-term BOT contracts through analysis of the project's financial feasibility with and without government guarantees. Testing the sustainability of the project under different scenarios helps in understanding the potential costs and contingent liabilities for the government and the project's impact on a country's overall economy. The results of the MCS risk analysis for the case study BOT project strongly suggest that the project does not serve the interest of society and transfers a substantial amount of public money to private companies, implying severe governance problems.
It is found that not only the government but also the private sector may be reluctant about full privatization of infrastructure due to several factors, such as the involvement of large sunk costs, the very long time period over which returns are received, political and macroeconomic uncertainties, and an insufficient institutional and regulatory environment. It is concluded that BOT-type infrastructure projects are not an efficient way of promoting private-sector participation in infrastructure: they tend to serve the interest of rent-seekers rather than the interest of society. Since concession contracts and Treasury guarantees shift the commercial risk to the government, the private sector has no incentive to be efficient, and the concession agreements distort market conditions by preventing free competition in the market.
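The Monte Carlo analysis described above can be sketched as follows. All figures (capital cost, demand distribution, discount rate, guarantee level) are illustrative assumptions, not the dissertation's data; the point demonstrated is that a government revenue guarantee raises the investor's mean NPV and removes downside risk, shifting it to the state:

```python
import random

def npv(cash_flows, rate):
    """Net present value of a cash-flow sequence starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_bot_npv(n_runs=5000, capex=100.0, years=15,
                     revenue_floor=None, rate=0.08, seed=42):
    """Monte Carlo NPV of a stylized BOT power project.

    revenue_floor, if set, models a government purchase guarantee:
    annual revenue never falls below the floor, whatever demand does.
    Returns (mean NPV, probability that NPV < 0).
    """
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_runs):
        flows = [-capex]  # year 0: construction outlay
        for _ in range(years):
            revenue = 12.0 * max(rng.gauss(1.0, 0.3), 0.0)  # uncertain demand
            if revenue_floor is not None:
                revenue = max(revenue, revenue_floor)
            flows.append(revenue)
        npvs.append(npv(flows, rate))
    mean_npv = sum(npvs) / len(npvs)
    p_loss = sum(v < 0 for v in npvs) / len(npvs)
    return mean_npv, p_loss
```

Running the same demand scenarios with and without the floor makes the transfer explicit: the guarantee never lowers any scenario's cash flow to the investor, so it can only raise the mean NPV and shrink the loss probability, with the difference borne as a contingent liability by the government.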
GSDC: A Unique Data Center in Korea for HEP research
NASA Astrophysics Data System (ADS)
Ahn, Sang-Un
2017-04-01
The Global Science experimental Data hub Center (GSDC) at the Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea, established to promote fundamental research fields by supporting them with expertise in Information and Communication Technology (ICT) and infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC) and networking. GSDC has supported various research fields in South Korea dealing with large-scale data, e.g. the RENO experiment for neutrino research, the LIGO experiment for gravitational-wave detection, genome sequencing projects for biomedicine, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for the ALICE experiment at the LHC at CERN since 2013. In this talk, we present an overview of the computing infrastructure that GSDC runs for these research fields and discuss the data center infrastructure management system deployed at GSDC.
Achievable steps toward building a National Health Information infrastructure in the United States.
Stead, William W; Kelly, Brian J; Kolodner, Robert M
2005-01-01
Consensus is growing that a health care information and communication infrastructure is one key to fixing the crisis in the United States in health care quality, cost, and access. The National Health Information Infrastructure (NHII) is an initiative of the Department of Health and Human Services receiving bipartisan support. There are many possible courses toward its objective. Decision makers need to reflect carefully on which approaches are likely to work on a large enough scale to have the intended beneficial national impacts and which are better left to smaller projects within the boundaries of health care organizations. This report provides a primer for use by informatics professionals as they explain aspects of that dividing line to policy makers and to health care leaders and front-line providers. It then identifies short-term, intermediate, and long-term steps that might be taken by the NHII initiative.
Achievable Steps Toward Building a National Health Information Infrastructure in the United States
Stead, William W.; Kelly, Brian J.; Kolodner, Robert M.
2005-01-01
Consensus is growing that a health care information and communication infrastructure is one key to fixing the crisis in the United States in health care quality, cost, and access. The National Health Information Infrastructure (NHII) is an initiative of the Department of Health and Human Services receiving bipartisan support. There are many possible courses toward its objective. Decision makers need to reflect carefully on which approaches are likely to work on a large enough scale to have the intended beneficial national impacts and which are better left to smaller projects within the boundaries of health care organizations. This report provides a primer for use by informatics professionals as they explain aspects of that dividing line to policy makers and to health care leaders and front-line providers. It then identifies short-term, intermediate, and long-term steps that might be taken by the NHII initiative. PMID:15561783
Green Infrastructure Projects and State Activities: CWSRF Innovations
This report highlights several projects funded by ARRA that illustrate the importance of building partnerships among various stakeholders and how different green infrastructure technologies and practices can be applied in different settings.
NASA Astrophysics Data System (ADS)
Purwanggono, Bambang; Margarette, Anastasia
2017-12-01
The completion time of highway construction is critical for smooth transportation, especially as the number of motor vehicles owned is expected to increase each year. This study was therefore conducted to analyze the constraints encountered in an infrastructure development project. The research was conducted on the Jatingaleh Underpass Project in Semarang while the project was running; during implementation, the project experienced delays. The aim is to identify the constraints that occur in the execution of a road infrastructure project, in particular those causing delays. The method used to find the root causes is the fishbone diagram, from which possible means of mitigation are obtained. This is coupled with the RFMEA method, used to determine the critical risks that must be addressed immediately on a road infrastructure project. The tabulated data in this study indicate that the most feasible mitigation measure is a set of Standard Operating Procedure (SOP) recommendations for handling utilities that disrupt project implementation. The risk assessment process was carried out systematically based on ISO 31000:2009 on risk management, and for the determination of delay variables, the process-group requirements of ISO 21500:2013 on project management were used.
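The RFMEA step can be illustrated with the standard risk priority number (RPN = likelihood x impact x detection difficulty, each scored 1-10): risks are ranked by RPN and those above a threshold are flagged as critical. The risk names, scores and threshold below are hypothetical, not the study's data:

```python
def rfmea_rank(risks, rpn_threshold=200):
    """Rank project risks by Risk Priority Number and flag critical ones.

    risks -- mapping of risk name -> (likelihood, impact, detection),
             each scored on a 1-10 scale.
    Returns (ranked list of (name, RPN), list of critical risk names).
    """
    scored = sorted(((name, l * i * d) for name, (l, i, d) in risks.items()),
                    key=lambda item: item[1], reverse=True)
    critical = [name for name, rpn in scored if rpn >= rpn_threshold]
    return scored, critical
```

In the study's terms, a risk like utility conflicts would surface at the top of the ranking, which is what motivates directing the SOP-based mitigation at it first.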
Rapid Processing of Radio Interferometer Data for Transient Surveys
NASA Astrophysics Data System (ADS)
Bourke, S.; Mooley, K.; Hallinan, G.
2014-05-01
We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster; the pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-band data archive, imaging each integration to search for short-duration transients.
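Searching each integration for short-duration transients amounts to flagging integrations whose flux deviates strongly from the typical level in that field. A hedged sketch using a robust median/MAD threshold (an illustrative detection criterion, not the pipeline's actual algorithm):

```python
def flag_transient_integrations(fluxes, n_sigma=5.0):
    """Return indices of integrations whose flux deviates from the median
    by more than n_sigma robust standard deviations (MAD-based).

    fluxes -- per-integration peak flux at a source position.
    """
    ordered = sorted(fluxes)
    median = ordered[len(ordered) // 2]
    deviations = sorted(abs(f - median) for f in fluxes)
    mad = deviations[len(deviations) // 2]       # median absolute deviation
    sigma = 1.4826 * mad if mad > 0 else 1e-12   # MAD -> Gaussian sigma
    return [i for i, f in enumerate(fluxes)
            if abs(f - median) / sigma > n_sigma]
```

A robust estimator is the natural choice here: a bright few-integration transient would badly bias an ordinary mean and standard deviation, hiding the very event being searched for.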
The Da Vinci European BioBank: A Metabolomics-Driven Infrastructure
Carotenuto, Dario; Luchinat, Claudio; Marcon, Giordana; Rosato, Antonio; Turano, Paola
2015-01-01
We present here the organization of the recently constituted da Vinci European BioBank (daVEB, https://www.davincieuropeanbiobank.org/it). The biobank was created as an infrastructure to support the activities of the Fiorgen Foundation (http://www.fiorgen.net/), a nonprofit organization that promotes research in the field of pharmacogenomics and personalized medicine. The operating procedures concerning samples and data developed at daVEB largely stem from the strong metabolomics connotation of Fiorgen and from the involvement of the foundation's scientific collaborators in international and European projects aimed at standardizing pre-analytical procedures and promoting data standards in metabolomics. PMID:25913579
On the use of IT investment assessment methods in the area of spatial data infrastructure
NASA Astrophysics Data System (ADS)
Zwirowicz-Rutkowska, Agnieszka
2016-06-01
One of the important issues in the development of spatial data infrastructures (SDIs) is the carrying out of economic and financial analysis. It is essential to determine expenses and to assess the effects resulting from the development and use of infrastructures. Cost and benefit assessment can be associated with assessing the infrastructure's effectiveness and efficiency as well as its value, understood as its impact on the economic performance both of an organisation realising an SDI project and of all users of the infrastructure. The aim of this paper is an overview of various investment assessment methods and an analysis of the different types of costs and benefits used for information technology (IT) projects. Based on the literature, examples of the use of these methods in the area of spatial data infrastructures are also analysed, and the issues of SDI projects and investments are outlined. The results indicate the usefulness of financial methods from different fields of management in SDI building, development and use. In addition to the financial methods, the author proposes adapting the various techniques used for IT investments, taking into consideration the specificity of SDIs, for the purpose of assessing different types of costs and benefits and integrating financial aspects with non-financial ones. Among the challenges are the identification and quantification of costs and benefits, and the establishment of measures that fit the characteristics of the SDI project and the artefacts resulting from its realisation. Moreover, aspects of subjectivity and variability in time should be taken into account, as consequences of the definite goals and policies and the business context of the organisation undertaking the project or using its artefacts.
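Three of the classic financial methods surveyed in such IT-investment analyses (net present value, benefit-cost ratio, payback period) can be sketched together; the cash flows below are illustrative, not drawn from any SDI case in the paper:

```python
def assessment_metrics(costs, benefits, rate=0.05):
    """Classic IT-investment measures for a project appraisal.

    costs, benefits -- per-year amounts, year 0 first.
    Returns (NPV, benefit-cost ratio, undiscounted payback year or None).
    """
    pv_costs = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_benefits = sum(b / (1 + rate) ** t for t, b in enumerate(benefits))
    net_present_value = pv_benefits - pv_costs
    benefit_cost_ratio = pv_benefits / pv_costs
    cumulative, payback = 0.0, None
    for t, (c, b) in enumerate(zip(costs, benefits)):
        cumulative += b - c
        if payback is None and cumulative >= 0:
            payback = t  # first year the cumulative net flow turns non-negative
    return net_present_value, benefit_cost_ratio, payback
```

As the paper notes, the hard part for SDIs is not the arithmetic but feeding it: many benefits (wider data reuse, better decisions) resist quantification, which is why the author argues for combining these financial measures with non-financial ones.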
Possible illnesses: assessing the health impacts of the Chad Pipeline Project.
Leonard, Lori
2003-01-01
Health impact assessments associated with large-scale infrastructure projects, such as the Chad-Cameroon Petroleum Development and Pipeline Project, monitor pre-existing conditions and new diseases associated with particular industries or changes in social organization. This paper suggests that illness self-reports constitute a complementary set of benchmarks to measure the health impacts of these projects, and presents data gathered in ongoing household and health service surveys in Ngalaba, a village near a major oilfield in Chad. In an initial 16-week period of weekly data collection, 363 people reported few of the clinically chronic or asymptomatic conditions expected according to health transition theory, and the overall level of illness reporting was low. Illnesses often were described by symptoms or lay diagnoses. Health care practitioners were consulted rarely; when they were, resources for diagnosis and treatment were limited. Clinically acute, short-duration illnesses (e.g. parasitic infections, toothaches, or hernias) were experienced as chronic conditions and were reported week after week. The low levels of illness reporting and lack of clinically chronic conditions are not taken to mean that rural Chadians are healthy. Rather, the patterns of morbidity reflect a particular local ecology in which health services are organized and care dispensed in ways that limit the possibilities for illness in terms of types of illnesses that can be diagnosed and reported, forms illnesses take, and ways in which illnesses are experienced. Illness self-reports are useful adjuncts to "harder" biological measures in HIAs, particularly in the context of large-scale infrastructure projects with explicit development goals. Rather than providing data on the extent to which harm has been mitigated by corporate, state, and donor activities, self-reports show the possibilities of illness in local contexts. PMID:12894327
NASA Astrophysics Data System (ADS)
Asmi, A.; Sorvari, S.; Kutsch, W. L.; Laj, P.
2017-12-01
European long-term environmental research infrastructures (often referred to as ESFRI RIs) are the core facilities providing services to scientists in their quest to understand and predict the complex Earth system and its functioning, which requires long-term efforts to identify environmental changes (trends, thresholds and resilience, interactions and feedbacks). Many of these research infrastructures were originally developed to respond to the needs of their specific research communities; however, strong collaboration among research infrastructures is needed to serve trans-boundary research, which requires exploring scientific questions at the intersection of different scientific fields, conducting joint research projects, and developing concepts, devices and methods that can be used to integrate knowledge. European environmental research infrastructures have already worked together successfully for many years and have established a cluster - the ENVRI cluster - for their collaborative work. The ENVRI cluster acts as a collaborative platform where the RIs can jointly agree on common solutions for their operations, draft strategies and policies, and share best practices and knowledge. The cluster's supporting project, ENVRIplus, brings together 21 European research infrastructures and infrastructure networks to work on joint technical solutions, data interoperability, access management, training, strategies and dissemination efforts. The ENVRI cluster acts as a one-stop shop for multidisciplinary RI users and other collaborative initiatives, projects and programmes, and coordinates and implements jointly agreed RI strategies.
Scalable collaborative risk management technology for complex critical systems
NASA Technical Reports Server (NTRS)
Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.
2004-01-01
We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.
Developing A Large-Scale, Collaborative, Productive Geoscience Education Network
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.
2012-12-01
Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development . Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. 
Building Strong Geoscience Departments sought to create the same type of shared information base that was supporting individual faculty for departments. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.
Error begat error: design error analysis and prevention in social infrastructure projects.
Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M
2012-09-01
Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research addressing error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospital, education, and law-and-order buildings). A systemic model of error causation is proposed and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in combination to prevent design errors from occurring and so ensure that safety and project performance are improved. Copyright © 2011. Published by Elsevier Ltd.
The City of Kansas City, Mo., Water Services Department is implementing a pilot project to measure and evaluate the performance of green infrastructure. Information obtained through this pilot project will be used to guide the design of green solutions throughout Kansas City und...
Webinar November 18: An Overview of the Hydrogen Fueling Infrastructure Research and Station Technology (H2FIRST) Project
November 12, 2014
The Energy Department will present a live webinar entitled "An Overview of the Hydrogen Fueling Infrastructure Research and Station Technology (H2FIRST) Project."
EVALUATING MACROINVERTEBRATE COMMUNITY ...
Since 2010, new construction in California is required to include stormwater detention and infiltration that is designed to capture rainfall from the 85th percentile of storm events in the region, preferably through green infrastructure. This study used recent macroinvertebrate community monitoring data to determine the ecological threshold for percent impervious cover prior to large scale adoption of green infrastructure using Threshold Indicator Taxa Analysis (TITAN). TITAN uses an environmental gradient and biological community data to determine individual taxa change points with respect to changes in taxa abundance and frequency across that gradient. Individual taxa change points are then aggregated to calculate the ecological threshold. This study used impervious cover data from National Land Cover Datasets and macroinvertebrate community data from California Environmental Data Exchange Network and Southern California Coastal Water Research Project. Preliminary TITAN runs for California’s Chaparral region indicated that both increasing and decreasing taxa had ecological thresholds of <1% watershed impervious cover. Next, TITAN will be used to determine shifts in the ecological threshold after the implementation of green infrastructure on a large scale. This presentation for the Society for Freshwater Scientists will discuss initial evaluation of community and taxa-specific thresholds of impairment for macroinvertebrates in California streams along
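The aggregation step described above can be illustrated with a short sketch. This is not the TITAN algorithm itself (TITAN uses indicator-value z-scores, bootstrapping, and purity/reliability filters); it is a deliberately simplified illustration of the same idea: find a change point along the impervious-cover gradient for each taxon, then aggregate the per-taxon change points into one community-level threshold. Function names and the median aggregation are assumptions for illustration only.

```python
import numpy as np

def taxon_change_point(gradient, abundance):
    """Simplified change-point finder: choose the split of the
    environmental gradient (e.g. % impervious cover) that maximizes
    the difference in mean abundance between the two sides.
    (TITAN instead maximizes an indicator-value z-score.)"""
    order = np.argsort(gradient)
    g, a = gradient[order], abundance[order]
    best_split, best_score = g[0], -np.inf
    for i in range(1, len(g)):
        score = abs(a[:i].mean() - a[i:].mean())
        if score > best_score:
            best_score, best_split = score, (g[i - 1] + g[i]) / 2
    return best_split

def community_threshold(gradient, community):
    """Aggregate per-taxon change points (here simply their median)
    into a single community-level threshold, analogous in spirit to
    TITAN's sum(z) aggregation across taxa."""
    points = [taxon_change_point(gradient, community[:, j])
              for j in range(community.shape[1])]
    return float(np.median(points))
```

With taxa whose abundances drop sharply near 1% impervious cover, the recovered community threshold lands near 1%, mirroring the <1% result reported for the Chaparral region.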
Large rainfall changes consistently projected over substantial areas of tropical land
NASA Astrophysics Data System (ADS)
Chadwick, Robin; Good, Peter; Martin, Gill; Rowell, David P.
2016-02-01
Many tropical countries are exceptionally vulnerable to changes in rainfall patterns, with floods or droughts often severely affecting human life and health, food and water supplies, ecosystems and infrastructure. There is widespread disagreement among climate model projections of how and where rainfall will change over tropical land at the regional scales relevant to impacts, with different models predicting the position of current tropical wet and dry regions to shift in different ways. Here we show that despite uncertainty in the location of future rainfall shifts, climate models consistently project that large rainfall changes will occur for a considerable proportion of tropical land over the twenty-first century. The area of semi-arid land affected by large changes under a higher emissions scenario is likely to be greater than during even the most extreme regional wet or dry periods of the twentieth century, such as the Sahel drought of the late 1960s to 1990s. Substantial changes are projected to occur by mid-century--earlier than previously expected--and to intensify in line with global temperature rise. Therefore, current climate projections contain quantitative, decision-relevant information on future regional rainfall changes, particularly with regard to climate change mitigation policy.
Lessons learned from first year cistern monitoring in Camden ...
Invited panelist for Webinar 08/16/2016 by Office of Water: Lessons Learned from Past Green Infrastructure Projects
Eco-logical successes : second edition, January 2012
DOT National Transportation Integrated Search
2012-01-01
In 2006, leaders from eight Federal agencies signed the interagency document EcoLogical: An Ecosystem Approach to Developing Infrastructure Projects. Eco-Logical is a document that outlines a shared vision of how to develop infrastructure projects in...
Personal Devices in Public Settings: Lessons Learned from an iPod Touch/iPad Project
ERIC Educational Resources Information Center
Crichton, Susan; Pegler, Karen; White, Duncan
2012-01-01
Our paper reports findings from a two-phase deployment of iPod Touch and iPad devices in a large, urban Canadian school board. The purpose of the study was to gain an understanding of the infrastructure required to support handheld devices in classrooms; the opportunities and challenges teachers face as they begin to use handheld devices for…
Assessing Terrorist Motivations for Attacking Critical Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackerman, G; Abhayaratne, P; Bale, J
Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at "operationalizing" its findings for use by analysts who work on issues of critical infrastructure protection.
Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist organization will attack critical infrastructure. In other words, this research investigates: (1) why terrorists choose to attack critical infrastructure rather than other targets; (2) how groups make such decisions; (3) what, if any, types of groups are most inclined to attack critical infrastructure targets; and (4) which types of critical infrastructure terrorists prefer to attack and why. In an effort to address the above questions as comprehensively as possible, the project team employed four discrete investigative approaches in its research design. These include: (1) a review of existing terrorism and threat assessment literature to glean expert consensus regarding terrorist target selection, as well as to identify theoretical approaches that might be valuable to analysts and decision-makers who are seeking to understand such terrorist group decision-making processes; (2) the preparation of several concise case studies to help identify internal group factors and contextual influences that have played significant roles in leading some terrorist groups to attack critical infrastructure; (3) the creation of a new database--the Critical Infrastructure Terrorist Incident Catalog (CrITC)--to capture a large sample of empirical CI attack data that might be used to illuminate the nature of such attacks to date; and (4) the development of a new analytical framework--the Determinants Effecting Critical Infrastructure Decisions (DECIDe) Framework--designed to make the factors and dynamics identified by the study more ''usable'' in any future efforts to assess terrorist intentions to target critical infrastructure. 
Although each is addressed separately in the following chapters, none of the four aspects of this study were developed in isolation. Rather, all the constituent elements of the project informed--and were informed by--the others. For example, the review of the available literature on terrorist target selection made possible the identification of several target selection factors that were both important in the development of the analytical framework and subsequently validated by the case studies. Similarly, statistical analysis of the CrITC data yielded measurable evidence that supported hypotheses derived from the framework, the case studies, and the writings of various experts. Besides providing an important mechanism of self-reinforcement and validation, the project's multifaceted nature made it possible to discern aspects of CI attack motivations that would likely have been missed if any single approach had been adopted.
Power monitoring and control for large scale projects: SKA, a case study
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis
2016-07-01
Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs, improving system power monitoring, and managing the generation and use of electricity at the system level are paramount to avoiding future inefficiencies and higher costs, and to enabling fulfilment of the Key Science Cases. Here we outline the major characteristics of, and innovative approaches to, power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.
EPA Provides State of Vermont $14.7 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $14.7 million to the State of Vermont to help finance improvements to water infrastructure projects that are essential to protecting public health and the environment.
Green Infrastructure Models and Tools
The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...
Centre for Research Infrastructure of Polish GNSS Data - response and possible contribution to EPOS
NASA Astrophysics Data System (ADS)
Araszkiewicz, Andrzej; Rohm, Witold; Bosy, Jaroslaw; Szolucha, Marcin; Kaplon, Jan; Kroszczynski, Krzysztof
2017-04-01
In late 2016, the "EPOS-PL" project was launched in the frame of the first call under Action 4.2 (Development of modern research infrastructure of the science sector) of the Smart Growth Operational Programme 2014-2020. The following institutes are responsible for the implementation of this project: the Institute of Geophysics, Polish Academy of Sciences (Project Leader); the Academic Computer Centre Cyfronet, AGH University of Science and Technology; the Central Mining Institute; the Institute of Geodesy and Cartography; Wrocław University of Environmental and Life Sciences; and the Military University of Technology. In addition, resources constituting the entrepreneur's own contribution will come from the Polish Mining Group. The EPOS-PL Research Infrastructure will integrate both existing and newly built National Research Infrastructures (Theme Centres for Research Infrastructures), which, under the premise of the EPOS programme, are financed exclusively from national funds. In addition, an e-science platform will be developed. The Centre for Research Infrastructure of GNSS Data (CIBDG; Task 5) will be built on the experience and facilities of two institutions: the Military University of Technology and Wrocław University of Environmental and Life Sciences. The project includes the construction of the National GNSS Repository with data QC procedures and the adaptation of two Regional GNSS Analysis Centres for rapid and long-term geodynamic monitoring.
LANL: Weapons Infrastructure Briefing to Naval Reactors, July 18, 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chadwick, Frances
Presentation slides address: The Laboratory infrastructure supports hundreds of high hazard, complex operations daily; LANL’s unique science and engineering infrastructure is critical to delivering on our mission; LANL FY17 Budget & Workforce; Direct-Funded Infrastructure Accounts; LANL Org Chart; Weapons Infrastructure Program Office; The Laboratory’s infrastructure relies on both Direct and Indirect funding; NA-50’s Operating, Maintenance & Recapitalization funding is critical to the execution of the mission; Los Alamos is currently executing several concurrent Line Item projects; Maintenance @ LANL; NA-50 is helping us to address D&D needs; We are executing a CHAMP Pilot Project at LANL; G2 = Main Tool for Program Management; MDI: Future Investments are centered on facilities with a high Mission Dependency Index; Los Alamos hosted first “Deep Dive” in November 2016; Safety, Infrastructure & Operations is one of the most important programs at LANL, and is foundational for our mission success.
Implementation of an Intelligent Tutorial System for Socioenvironmental Management Projects
ERIC Educational Resources Information Center
Vera, Gil; Daniel, Víctor; Awad, Gabriel
2015-01-01
The agents responsible for the execution of physical infrastructure projects of the Government of Antioquia must know the theories and concepts related to the socio-environmental management of such projects. Given the absence of tools and the scarcity of information on the subject, it is necessary to build an m-learning tool to facilitate to…
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space and high-performance processing capability. The processing and distribution of these data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes incorporating sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes.
Cloud infrastructure service providers were evaluated by taking these locally tested processing functions, and then applying them to a given cloud-enabled infrastructure to assesses and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
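The band-math steps named in the abstract above (NDVI and NDMI) are standard normalized-difference indices and can be sketched in a few lines. This is a generic illustration, not the project's actual pipeline code; the array-based interface and the epsilon guard are assumptions.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    The small eps guards against division by zero in dark pixels."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

def ndmi(nir, swir, eps=1e-9):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir = np.asarray(nir, dtype=np.float64)
    swir = np.asarray(swir, dtype=np.float64)
    return (nir - swir) / (nir + swir + eps)
```

Both functions operate elementwise over whole image bands, which is exactly the kind of embarrassingly parallel workload that transitions cleanly from a local prototype to a memory- and processor-enhanced cloud platform.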
NASA Technical Reports Server (NTRS)
Moore, A. W.; Neilan, R. E.; Springer, T. A.; Reigber, Ch.
2000-01-01
A strong multipurpose aspect of the International GPS Service (IGS) is revealed by a glance at the titles of current projects and working groups within the IGS: IGS/BIPM Time Transfer Project; Ionosphere Working Group; Troposphere Working Group; International GLONASS Experiment; Working Group on Low-Earth Orbiter Missions; and Tide Gauges, CGPS, and the IGS. The IGS network infrastructure, in large part originally commissioned for geodynamical investigations, has proved to be a valuable asset in developing application-oriented subnetworks whose requirements overlap the characteristics of existing IGS stations and future station upgrades. Issues encountered thus far in the development of multipurpose or multitechnique IGS projects as well as future possibilities will be reviewed.
NASA Astrophysics Data System (ADS)
Karlsson, Caroline; Miliutenko, Sofiia; Björklund, Anna; Mörtberg, Ulla; Olofsson, Bo; Toller, Susanna
2017-04-01
Environmental impacts during the life cycle stages of transport infrastructure are substantial, including, among others, greenhouse gas (GHG) emissions as well as resource and energy use. For transport infrastructure to be sustainable, such issues need to be integrated into the planning process. Environmental Impact Assessment (EIA) is required by the European Union (EU) in order to ensure that all environmental aspects are considered during the planning of road infrastructure projects. As part of this process, the European Commission has suggested the use of life cycle assessment (LCA) for assessing life cycle energy use and GHG emissions. When analyzing the life cycle impacts of the road infrastructure itself, it was shown that earthworks and the materials used for road construction have a large share of the total energy use and GHG emissions. Those aspects are largely determined by the geological conditions at the construction site: parameters such as soil thickness, slope, bedrock quality and soil type. The geological parameters determine the amounts of earthworks (i.e. the volumes of soil and rock that will be excavated and blasted), the transportation needs for excavated materials, and the availability of building materials. The study presents a new geographic information system (GIS)-based approach for utilizing spatial geological data in three dimensions (i.e. length, width and depth) in order to improve estimates of earthworks during the early stages of road infrastructure planning. Three main methodological steps were undertaken: mass balance calculation, life cycle inventory analysis, and spatial mapping of GHG emissions and energy use. The proposed GIS-based approach was then evaluated by comparison with the actual quantities of extracted material from a real road construction project.
The results showed that the estimate of filling material was the most accurate, while the estimate for excavated soil and blasted rock had a wide variation from the actual values. It was also found that the total volume of excavated and ripped soils did not change when accounting for geological stratigraphy. The proposed GIS-based approach shows promising results for usage in LCA at an early stage of road infrastructure planning, and by providing better data quality, GIS in combination with LCA can enable planning for a more sustainable transport infrastructure.
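The first two methodological steps named above (mass balance calculation and life cycle inventory analysis) reduce, at their core, to simple arithmetic over volumes and emission factors. The sketch below shows that skeleton only; the emission-factor values and the function names are hypothetical placeholders, since a real study would take factors from an LCA inventory database and compute volumes from the 3D geological model.

```python
# Hypothetical emission factors in kg CO2e per m3 of material handled;
# real values would come from a life cycle inventory database.
EMISSION_FACTORS = {
    "excavated_soil": 2.5,
    "blasted_rock": 6.0,
    "filling_material": 4.0,
}

def mass_balance(cut_volume, fill_volume):
    """Cut/fill mass balance for a road segment (volumes in m3):
    surplus material must be hauled away, any deficit imported."""
    surplus = max(cut_volume - fill_volume, 0.0)
    deficit = max(fill_volume - cut_volume, 0.0)
    return surplus, deficit

def ghg_emissions(volumes):
    """Life cycle inventory step: multiply each material volume (m3)
    by its emission factor and sum to a GHG total (kg CO2e)."""
    return sum(v * EMISSION_FACTORS[material] for material, v in volumes.items())
```

Running the inventory per segment of the road corridor, rather than once for the whole project, is what enables the third step, spatial mapping of GHG emissions and energy use in GIS.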
NASA Astrophysics Data System (ADS)
Glaves, Helen; Graham, Colin
2010-05-01
Geo-Seas - a pan-European infrastructure for the management of marine geological and geophysical data. Helen Glaves and Colin Graham, on behalf of the Geo-Seas consortium. The Geo-Seas project will create a network of twenty-six European marine geoscience data centres from seventeen coastal countries, including six from the Baltic Sea area. This will be achieved through the development of a pan-European infrastructure for the exchange of marine geoscientific data. Researchers will be able to locate and access harmonised and federated marine geological and geophysical datasets and data products held by the data centres through the Geo-Seas data portal, using a common data catalogue. The new infrastructure, an expansion of the existing SeaDataNet, will create an infrastructure covering oceanographic and marine geoscientific data. New data products and services will be developed following consultations with users on their current and future research requirements. Common data standards will be implemented across all of the data centres, and other geological and geophysical organisations will be encouraged to adopt the protocols, standards and tools developed as part of the Geo-Seas project. Oceanographic and marine data include a wide range of variables, an important category of which are the geological and geophysical data sets. These data include raw observational and analytical data as well as derived data products from seabed sediment samples, boreholes, geophysical surveys (seismic, gravity etc.) and sidescan sonar surveys, all of which are essential in order to produce a complete interpretation of seabed geology.
Despite there being a large volume of geological and geophysical data available for the marine environment it is currently very difficult to use these datasets in an integrated way between organisations due to different nomenclatures, formats, scales and coordinate systems being used within different organisations and also within different countries. This makes the direct use of primary data in an integrated way very difficult and also hampers use of the data sets in a harmonised way to produce multidisciplinary data products and services. To ensure interoperability with other marine environmental data types Geo-Seas ISO19115 metadata, OGC and GeoSciML standards will be used as the basis for the metadata profiles for the geological and geophysical data. This will be largely achieved by modifying the SeaDataNet metadata standard profile (Common Data Index or CDI), which is itself based upon the ISO19115 standard, to accommodate the requirements of the Geo-Seas project. The overall objective of Geo-Seas project is to build and deploy a unified marine geoscientific data infrastructure within Europe which will in effect provide a data grid for the sharing of marine geological and geophysical data. This will result in a major improvement in the locating, accessing and delivery of federated marine geological and geophysical data and data products from national geological surveys and research institutes across Europe. There is an emphasis on interoperability both with other disciplines as well as with other key framework projects including the European Marine Observation and Data Network (EMODNet) and One Geology - Europe. 
In addition, a key objective of the Geo-Seas project is to underpin European directives such as INSPIRE as well as recent framework programmes at both the global and European scale, for example the Global Earth Observation System of Systems (GEOSS) and Global Monitoring for Environment and Security (GMES), all of which are intended to encourage the exchange of data and information. Geo-Seas consortium partners: NERC-BGS (United Kingdom), NERC-BODC (United Kingdom), NERC-NOCS (United Kingdom), MARIS (Netherlands), IFREMER (France), BRGM (France), TNO (Netherlands), BSH (Germany), IGME (Spain), INETI (Portugal), IGME (Greece), GSI (Ireland), BGR (Germany), OGS (Italy), GEUS (Denmark), NGU (Norway), PGI (Poland), EGK (Estonia), LIGG (Lithuania), IO-BAS (Bulgaria), NOA (Greece), CIRIA (United Kingdom), MUMM (Belgium), UB (Spain), UCC (Ireland), EU-Consult (Netherlands), CNRS (France), SHOM (France), CEFAS (United Kingdom), and LU (Latvia). The project is coordinated by the British Geological Survey (BGS), while the technical coordination is performed by the Marine Information Service (MARIS). The Geo-Seas project is an Integrated Infrastructure Initiative (I3) of the Research Infrastructures programme within EU FP7, contract number RI-238952. It has a duration of 42 months, from 1st May 2009 until 31st October 2012. 1 British Geological Survey, Keyworth, Nottingham, NG12 5GG, UK. e-mail: hmg@bgs.ac.uk 2 British Geological Survey, Murchison House, West Mains Road, Edinburgh, EH9 3LA, UK. e-mail: ccg@bgs.ac.uk
An assessment of autonomous vehicles : traffic impacts and infrastructure needs : final report.
DOT National Transportation Integrated Search
2017-03-01
The project began by understanding the current state of practice and trends. NHTSA's four-level taxonomy for automated vehicles was used to classify smart driving technologies and infrastructure needs. The project used surveys to analyze and gain a...
15 CFR 292.4 - Information infrastructure projects.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 15 Commerce and Foreign Trade 1 2013-01-01 2013-01-01 false Information infrastructure projects. 292.4 Section 292.4 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS...
15 CFR 292.4 - Information infrastructure projects.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 15 Commerce and Foreign Trade 1 2012-01-01 2012-01-01 false Information infrastructure projects. 292.4 Section 292.4 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS...
15 CFR 292.4 - Information infrastructure projects.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 15 Commerce and Foreign Trade 1 2014-01-01 2014-01-01 false Information infrastructure projects. 292.4 Section 292.4 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS...
Measuring Systemic Impacts of Bike Infrastructure Projects
DOT National Transportation Integrated Search
2018-05-01
This paper qualitatively identifies the impacts of bicycle infrastructure on all roadway users, including safety, operations, and travel route choice. Bicycle infrastructure includes shared lanes, conventional bike lanes, and separated bike lanes. Th...
Building sustainable multi-functional prospective electronic clinical data systems.
Randhawa, Gurvaneet S; Slutsky, Jean R
2012-07-01
A better alignment in the goals of the biomedical research enterprise and the health care delivery system can help fill the large gaps in our knowledge of the impact of clinical interventions on patient outcomes in the real world. There are several initiatives underway to align the research priorities of patients, providers, researchers, and policy makers. These include Agency for Healthcare Research and Quality (AHRQ)-supported projects to build flexible prospective clinical electronic data infrastructure that meet the needs of these diverse users. AHRQ has previously supported the creation of 2 distributed research networks as a new approach to conduct comparative effectiveness research (CER) while protecting a patient's confidential information and the proprietary needs of a clinical organization. It has applied its experience in building these networks in directing the American Recovery and Reinvestment Act funds for CER to support new clinical electronic infrastructure projects that can be used for several purposes including CER, quality improvement, clinical decision support, and disease surveillance. In addition, AHRQ has funded a new Electronic Data Methods forum to advance the methods in clinical informatics, research analytics, and governance by actively engaging investigators from the American Recovery and Reinvestment Act-funded projects and external stakeholders.
A Framework for Debugging Geoscience Projects in a High Performance Computing Environment
NASA Astrophysics Data System (ADS)
Baxter, C.; Matott, L.
2012-12-01
High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.
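The tiered triage idea described in this abstract can be sketched as a check of each layer in order, from the scheduler down to the model code. The tier names, record fields, and failure checks below are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Tier:
    """One layer of the experiment stack and a predicate that flags it as the failure cause."""
    name: str
    failed: Callable[[dict], bool]

# Checked in order: a scheduler failure masks everything below it, and so on.
TIERS = [
    Tier("scheduler", lambda r: r.get("exit_code") is None),                   # job never ran
    Tier("permissions", lambda r: "Permission denied" in r.get("stderr", "")),
    Tier("search_algorithm", lambda r: r.get("best_objective") is None),       # optimizer produced nothing
    Tier("model_code", lambda r: r.get("model_rc", 0) != 0),                   # simulator crashed
]

def diagnose(run_record: dict) -> Optional[str]:
    """Return the first tier whose check flags the failure, or None if the run looks healthy."""
    for tier in TIERS:
        if tier.failed(run_record):
            return tier.name
    return None

print(diagnose({"exit_code": None}))  # scheduler
print(diagnose({"exit_code": 0, "stderr": "Permission denied: /scratch",
                "best_objective": 1.0, "model_rc": 0}))  # permissions
```

Applied over a log of hundreds of thousands of runs, such a classifier turns ad hoc inspection into a batch summary of failure counts per tier.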
Code of Federal Regulations, 2010 CFR
2010-10-01
... facilities may be included in an eligible rural water supply project? 404.9 Section 404.9 Public Lands... RURAL WATER SUPPLY PROGRAM Overview § 404.9 What types of infrastructure and facilities may be included in an eligible rural water supply project? A rural water supply project may include, but is not...
Stereoscopic display of 3D models for design visualization
NASA Astrophysics Data System (ADS)
Gilson, Kevin J.
2006-02-01
Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools since their inception and has helped pioneer the application of those tools to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic presentations of the proposed designs. This paper will present the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content. The paper will discuss several architectural and engineering design visualizations we have produced.
Developments in damage assessment by Marie Skłodowska-Curie TRUSS ITN project
NASA Astrophysics Data System (ADS)
González, A.
2017-05-01
The growth of cities, the impacts of climate change and the massive cost of providing new infrastructure provide the impetus for TRUSS (Training in Reducing Uncertainty in Structural Safety), a €3.7 million Marie Skłodowska-Curie Action Innovative Training Network project funded by EU's Horizon 2020 programme, which aims to maximize the potential of infrastructure that already exists (http://trussitn.eu). For that purpose, TRUSS brings together an international, inter-sectoral and multidisciplinary collaboration between five academic and eleven industry institutions from five European countries. The project covers rail and road infrastructure, buildings, and energy and marine infrastructure. This paper reports progress in fields such as advanced sensor-based structural health monitoring solutions (unmanned aerial vehicles, optical backscatter reflectometry, monitoring sensors mounted on vehicles, and more) and innovative algorithms for structural designs and short- and long-term assessments of buildings, bridges, pavements, ships, ship unloaders, nuclear components and wind turbine towers that will support infrastructure operators and owners in managing their assets.
NASA Astrophysics Data System (ADS)
Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.
2017-12-01
Recent studies indicate that the frequency and intensity of extreme precipitation will increase in a future climate due to global warming. In this study, we compare annual maxima precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at a 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at a 1° resolution. The third ensemble is at the regional scale over both Eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at a 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble produced by the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation between the historical (1980-2010) and future (2070-2100) periods are investigated. This results in 1,500 (30 years x 50 members for CanESM2-LE and CRCM5-LE) and 1,200 (30 years x 40 members for CESM1-LE) simulated years over each of the historical and future periods. Using these large datasets, the empirical daily (and sub-daily for CRCM5-LE) extreme precipitation quantiles for large return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to all three large ensembles. For the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that in daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for the service levels of water infrastructure and for public safety.
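The core calculation described here, empirical return levels from a pooled set of ensemble annual maxima, can be sketched as follows. The Gumbel-distributed synthetic data stand in for the real model output, and the ensemble size mirrors the 50-member by 30-year pooling described above; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for a pooled annual-maxima series at one grid point:
# 50 members x 30 years of simulated daily precipitation maxima (mm/day).
annual_maxima = rng.gumbel(loc=40.0, scale=10.0, size=50 * 30)

def empirical_return_level(maxima, return_period_years):
    """Empirical quantile for a T-year return period: the value exceeded
    on average once every T years, i.e. the (1 - 1/T) empirical quantile."""
    p = 1.0 - 1.0 / return_period_years
    return float(np.quantile(maxima, p))

for T in (2, 10, 50, 100):
    print(f"{T:>3}-year return level: {empirical_return_level(annual_maxima, T):.1f} mm/day")
```

Pooling members is what makes the empirical estimate of the 100-year level feasible: 1,500 simulated years contain roughly 15 exceedances of it, whereas a single 30-year member would contain none in most realizations.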
IT Infrastructure Projects: A Framework for Analysis. ECAR Research Bulletin
ERIC Educational Resources Information Center
Grochow, Jerrold M.
2014-01-01
Just as maintaining a healthy infrastructure of water delivery and roads is essential to the functioning of cities and towns, maintaining a healthy infrastructure of information technology is essential to the functioning of universities. Deterioration in IT infrastructure can lead to deterioration in research, teaching, and administration. Given…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, F.V.; Valentine, G.A.; Crowe, B.M.
This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The objective of this project was to determine whether isotopic techniques can be used to assess the eruption potential and eruption volume of continental stratovolcanoes. Large-volume eruptions from stratovolcanoes pose significant hazards to population and infrastructure in many parts of the world. We are testing whether this technique will allow a short- to medium-term (decades to millennia) probabilistic hazard assessment of large-volume eruptions. If successful, the technique will be useful to countries or regions that must consider medium- to long-term volcanic hazards (e.g., for nuclear waste facilities). We have begun sample acquisition and isotopic measurements at two stratovolcanoes, Pico de Orizaba in eastern Mexico and Daisen in western Japan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smart, John Galloway; Salisbury, Shawn Douglas
2015-07-01
This report summarizes key findings in two national plug-in electric vehicle charging infrastructure demonstrations: The EV Project and ChargePoint America. It will be published to the INL/AVTA website for the general public.
State Transmission Infrastructure Authorities: The Story So Far; December 2007 - December 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, K.; Fink, S.
2008-05-01
This report examines the status and future direction of state transmission infrastructure authorities. It summarizes common characteristics, discusses current transmission projects, and outlines common issues the state infrastructure authorities have faced.
Information Infrastructure Technology and Applications (IITA) Program: Annual K-12 Workshop
NASA Technical Reports Server (NTRS)
Hunter, Paul; Likens, William; Leon, Mark
1995-01-01
The purpose of the K-12 workshop is to stimulate a cross-pollination of inter-center activity and introduce the regional centers to cutting-edge K-12 activities. The format of the workshop consists of project presentations, working groups, and working group reports, all contained in a three-day period. The agenda is aggressive and demanding. The K-12 Education Project is a multi-center activity managed by the Information Infrastructure Technology and Applications (IITA)/K-12 Project Office at the NASA Ames Research Center (ARC). This workshop is conducted in support of executing the K-12 Education element of the IITA Project. The IITA/K-12 Project funds activities that use the National Information Infrastructure (NII) (e.g., the Internet) to foster reform and restructuring in mathematics, science, computing, engineering, and technical education.
Wilcox, S.; Andreas, A.
2010-03-16
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Stoffel, T.; Andreas, A.
2010-04-26
Wilcox, S.; Andreas, A.
2010-07-13
Wilcox, S.; Andreas, A.
2012-11-03
Solar Resource & Meteorological Assessment Project (SOLRMAP): Sun Spot Two; Swink, Colorado (Data)
Wilcox, S.; Andreas, A.
2010-11-10
Wilcox, S.; Andreas, A.
2010-07-14
Wilcox, S.; Andreas, A.
2009-07-22
Wilcox, S.; Andreas, A.
2010-11-03
Sovereign cat bonds and infrastructure project financing.
Croson, David; Richter, Andreas
2003-06-01
We examine the opportunities for using catastrophe-linked securities (or equivalent forms of nondebt contingent capital) to reduce the total costs of funding infrastructure projects in emerging economies. Our objective is to elaborate on methods to reduce the necessity for unanticipated (emergency) project funding immediately after a natural disaster. We also place the existing explanations of sovereign-level contingent capital into a catastrophic risk management framework. In doing so, we address the following questions. (1) Why might catastrophe-linked securities be useful to a sovereign nation, over and above their usefulness for insurers and reinsurers? (2) Why are such financial instruments ideally suited for protecting infrastructure projects in emerging economies, under third-party sponsorship, from low-probability, high-consequence events caused by natural disasters? (3) How can the willingness of a sovereign government in an emerging economy (or its external project sponsor) that values timely completion of infrastructure projects to pay for such instruments be calculated? To supplement our treatment of these questions, we use a multilayer spreadsheet-based model (in Microsoft Excel format) to calculate the overall cost reductions possible through the judicious use of catastrophe-based financial tools. We also report numerical comparative statics on the value of contingent-capital financing to avoid project disruption, based on varying costs of capital, the probability and consequences of disasters, the feasibility of strategies for mid-stage project abandonment, and the timing of capital commitments to the infrastructure investment. We use these results to identify high-priority applications of catastrophe-linked securities so that maximal protection can be realized if the total number of catastrophe instruments is initially limited. The article concludes with potential extensions to our model and opportunities for future research.
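The spreadsheet comparison the article describes can be illustrated with a back-of-envelope expected-cost calculation. The model structure, parameter names, and all numbers below are hypothetical, chosen only to show the shape of the trade-off between paying a cat-bond spread every period and facing distressed emergency re-funding after a disaster:

```python
def expected_cost(project_cost, disaster_prob, emergency_premium,
                  cat_bond_spread=0.0, covered=False):
    """One-period expected funding cost of a project (illustrative model).

    emergency_premium: extra cost multiplier on replacement funding raised
                       post-disaster (distressed borrowing).
    cat_bond_spread:   spread paid to cat-bond investors, as a fraction of
                       the covered amount.
    """
    if covered:
        # Sponsor pays the spread regardless; disaster losses fall on investors.
        return project_cost * (1.0 + cat_bond_spread)
    # Uncovered: with probability p the sponsor must re-fund at a distressed premium.
    return project_cost * (1.0 + disaster_prob * (1.0 + emergency_premium))

base = expected_cost(100e6, disaster_prob=0.05, emergency_premium=0.60)
hedged = expected_cost(100e6, disaster_prob=0.05, emergency_premium=0.60,
                       cat_bond_spread=0.04, covered=True)
print(f"uncovered: ${base / 1e6:.1f}M  covered: ${hedged / 1e6:.1f}M")
```

Under these assumed numbers coverage is cheaper in expectation; sweeping `disaster_prob` and `cat_bond_spread` over a grid reproduces the kind of comparative statics the article reports.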
NASA Astrophysics Data System (ADS)
Luo, Y.; Huang, Y.; Jiang, J.; MA, S.; Saruta, V.; Liang, G.; Hanson, P. J.; Ricciuto, D. M.; Milcu, A.; Roy, J.
2017-12-01
The past two decades have witnessed rapid development in sensor technology. Built upon this sensor development, large research infrastructure facilities, such as the National Ecological Observatory Network (NEON) and FLUXNET, have been established. By networking different kinds of sensors and other data collections at many locations all over the world, these facilities generate large volumes of ecological data every day. The big data from these facilities offer an unprecedented opportunity for advancing our understanding of ecological processes, educating teachers and students, supporting decision-making, and testing ecological theory. They also provide a foundation for developing predictive ecology. Indeed, the capability to predict future changes in our living environment and natural resources is critical to decision making in a world where the past is no longer a clear guide to the future. We are living in a period marked by rapid climate change, profound alteration of biogeochemical cycles, unsustainable depletion of natural resources, and deterioration of air and water quality. Projecting changes in future ecosystem services to society thus becomes essential not only for science but also for policy making. We will use this panel format to outline major opportunities and challenges in integrating research infrastructure and ecosystem models toward developing predictive ecology. We will also show results from an interactive model-experiment system, the Ecological Platform for Assimilating Data into models (EcoPAD), that has been implemented at the Spruce and Peatland Responses Under Climatic and Environmental change (SPRUCE) experiment in Northern Minnesota and at the Montpellier Ecotron, France. EcoPAD is developed by integrating web technology, eco-informatics, data assimilation techniques, and ecosystem modeling. 
EcoPAD is designed to streamline data transfer seamlessly from research infrastructure facilities to model simulation, data assimilation, and ecological forecasting.
Zanutto, Alberto
2017-06-01
One of the most significant changes in the healthcare field in the past 10 years has been the large-scale digitalization of patients' healthcare data and an increasing emphasis on the importance of patients' roles in cooperating with healthcare professionals through digital infrastructures. A project carried out in the North of Italy with the aim of creating a personal health record has been evaluated over the course of 5 years by means of mixed-methods fieldwork. Two years after the infrastructure was put into regular service, the way in which patients are represented in the system, as well as patient practices, were studied using surveys and qualitative interviews. The data show that, first, patients have become co-actors in describing their clinical histories; second, they have become co-actors in the diagnosis process; and finally, they have become co-actors in the management of time and space with regard to their specific state of health.
Winstein, Carolee; Pate, Patricia; Ge, Tingting; Ervin, Carolyn; Baurley, James; Sullivan, Katherine J; Underwood, Samantha J; Fowler, Eileen G; Mulroy, Sara; Brown, David A; Kulig, Kornelia; Gordon, James; Azen, Stanley P
2008-11-01
This article describes the vision, methods, and implementation strategies used in building the infrastructure for PTClinResNet, a clinical research network designed to assess outcomes for health-related mobility associated with evidence-based physical therapy interventions across and within four different disability groups. Specific aims were to (1) create the infrastructure necessary to develop and sustain clinical trials research in rehabilitation, (2) generate evidence to evaluate the efficacy of resistance exercise-based physical interventions designed to improve muscle performance and movement skills, and (3) provide education and training opportunities for present and future clinician-researchers and for the rehabilitation community at large in its support of evidence-based practice. We present the network's infrastructure, development, and several examples that highlight the benefits of a clinical research network. We suggest that the network structure is ideal for building research capacity and fostering multisite, multi-investigator clinical research projects designed to generate evidence for the efficacy of rehabilitation interventions.
SFB754 - data management in large interdisciplinary collaborative research projects: what matters?
NASA Astrophysics Data System (ADS)
Mehrtens, Hela; Springer, Pina; Schirnick, Carsten; Schelten, Christiane K.
2016-04-01
Data management for the SFB 754 is an integral part of the joint data management team at GEOMAR Helmholtz Centre for Ocean Research Kiel, a cooperation of the Cluster of Excellence "Future Ocean", the SFB 754 and other current and former nationally and EU-funded projects. The coalition has successfully established one common data management infrastructure for the marine sciences in Kiel. It aims to help researchers better document the data lifecycle from acquisition to publication and to share their results already during the project phase. The infrastructure is continuously improved by integrating standard tools and developing extensions in close cooperation with scientists, data centres and other research institutions. Open and frequent discussion of data management topics during SFB 754 meetings and seminars, together with efficient cooperation with its coordination office, allowed the gradual establishment of better data management practices. Furthermore, a data policy was agreed upon that ensures proper usage of data sets, even unpublished ones, schedules data upload and dissemination, and enforces long-term public availability of the research outcome. Acceptance of the infrastructure is also backed by the easy usage of the web-based platform for data set documentation and exchange among all research disciplines of the SFB 754. Members of the data management team act as data curators and assist in data publication in World Data Centres (e.g. PANGAEA). Cooperation with World Data Centres makes the research data globally searchable and accessible, while links to the data producers ensure citability and provide points of contact for the scientific community. A complete record of SFB 754 publications is maintained within the institutional repository for full-text print publications by the GEOMAR library. 
This repository is strongly linked with the data management information system, providing dynamic and up-to-date overviews of the various ties between publications and available data sets, expeditions and projects. Such views are also frequently used for the website and for reports by the SFB 754 scientific community. The concept of a joint approach, initiated by large-scale projects and participating institutions in order to establish a single data management infrastructure, has proven to be very successful. We have experienced a snowball-like propagation among marine researchers at GEOMAR and Kiel University, who continue to use the data management services they came to know through collaboration with the SFB 754. But we also observe an ongoing demand for training of new junior (and senior) scientists and a continuous need for adaptation to new methods and techniques. Only standardized and consistent data management guarantees the completeness and integrity of published research data in relation to their peer-reviewed journal publications in the long run. Based on our daily experience, this is best achieved, and perhaps only achieved, by skilled and experienced staff in a dedicated data management team that persists beyond the funding period of research projects. Such a team can carry this work forward through continuous personal contact, consultation and training of researchers on-site. (This poster is linked to the presentation by Dr. Christiane K. Schelten)
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2014-05-01
The second phase of the SeaDataNet project has been well underway since October 2011 and is making good progress. The main objective is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via research cruises and monitoring activities in European marine waters and global oceans. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users with a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres, as well as to the various SeaDataNet standards and tools. Recently the 1st Innovation Cycle has been completed, including upgrading of the CDI Data Discovery and Access service to ISO 19139 and making it fully INSPIRE compliant. The extensive SeaDataNet Vocabularies have been upgraded too and implemented for all SeaDataNet European metadata directories. SeaDataNet is setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of ISO (19115, 19139), OGC (WMS, WFS, CS-W and SWE), and OpenSearch standards. The population of directories has also increased considerably through cooperation with and involvement in associated EU projects and initiatives. SeaDataNet now gives an overview of and access to more than 1.4 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 90 connected data centres in 30 countries bordering European seas. Access to marine data is also a key issue for the implementation of the EU Marine Strategy Framework Directive (MSFD). The EU communication 'Marine Knowledge 2020' underpins the importance of data availability and harmonised access to marine data from different sources.
SeaDataNet qualified itself to lead the data management component of EMODNet (European Marine Observation and Data Network), which is promoted in the EU Communication. In the past 4 years EMODNet portals have been initiated for the marine data themes digital bathymetry, chemistry, physical oceanography, geology, biology, and seabed habitat mapping. These portals are now being expanded to all European seas in successor projects, which started in mid-2013 under EU DG MARE. EMODNet encourages more data providers to come forward to share data and to participate in the process of making complete overviews and homogeneous data products. The EMODNet Bathymetry project illustrates the synergy with SeaDataNet and the added value of generating public data products. The project develops and publishes Digital Terrain Models (DTM) for the European seas, produced from survey and aggregated data sets. The portal provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. A further refinement is taking place in the new phase. The presentation will describe the present services of the SeaDataNet infrastructure, highlight key achievements in SeaDataNet II so far, and give further insights into the EMODNet Bathymetry progress.
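The ISO 19139 upgrade mentioned above means that every CDI record is an XML document in the `gmd`/`gco` namespaces. As a minimal sketch, the snippet below parses an invented metadata fragment with Python's standard library; the namespace URIs are the standard ISO 19139 ones, while the record content is purely illustrative.

```python
import xml.etree.ElementTree as ET

# Standard ISO 19139 namespaces.
NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
}

# A minimal, hypothetical fragment standing in for a CDI discovery record.
record = """<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                             xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:identificationInfo>
    <gmd:MD_DataIdentification>
      <gmd:citation>
        <gmd:CI_Citation>
          <gmd:title><gco:CharacterString>CTD cast, Baltic Sea</gco:CharacterString></gmd:title>
        </gmd:CI_Citation>
      </gmd:citation>
      <gmd:abstract><gco:CharacterString>Temperature and salinity profile.</gco:CharacterString></gmd:abstract>
    </gmd:MD_DataIdentification>
  </gmd:identificationInfo>
</gmd:MD_Metadata>"""

root = ET.fromstring(record)
title = root.find(".//gmd:title/gco:CharacterString", NS).text
abstract = root.find(".//gmd:abstract/gco:CharacterString", NS).text
print(title)     # CTD cast, Baltic Sea
print(abstract)  # Temperature and salinity profile.
```

Because the element paths and namespaces are standardized, the same extraction logic works across all connected data centres, which is what makes a unified portal overview feasible.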
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
Code of Federal Regulations, 2010 CFR
2010-10-01
... § 404.10 (Bureau of Reclamation Rural Water Supply Program, Overview): Are there certain types of infrastructure and facilities that may not be included in a rural water supply project? Yes. A rural water supply project may...
2016-09-01
DEFENSE INFRASTRUCTURE: Actions Needed to Enhance Oversight of Construction Projects Supporting Military Contingency Operations (GAO-16-406). Why GAO Did This Study: For about 15 years, DOD ... unneeded to support U.S. forces in the CENTCOM area of responsibility and in future contingencies worldwide.
COMMUNITY-ORIENTED DESIGN AND EVALUATION PROCESS FOR SUSTAINABLE INFRASTRUCTURE
We met our first objective by completing the physical infrastructure of the La Fortuna-Tule water and sanitation project using the CODE-PSI method. This physical component of the project was important in providing a real, relevant, community-scale test case for the methods ...
Multi-Scale Infrastructure Assessment
The U.S. Environmental Protection Agency’s (EPA) multi-scale infrastructure assessment project supports both water resource adaptation to climate change and the rehabilitation of the nation’s aging water infrastructure by providing tools, scientific data and information to progra...
Volden, Gro Holst
2018-08-01
Infrastructure projects in developed countries are rarely evaluated ex-post. Despite their number and scope, our knowledge about their various impacts is surprisingly limited. The paper argues that such projects must be assessed from a broad perspective that includes operational, tactical and strategic aspects, and unintended as well as intended effects. A generic six-criteria evaluation framework is suggested, inspired by a framework frequently used to evaluate development assistance projects. It is tested on 20 Norwegian projects from various sectors (transport, defence, ICT, buildings). The results indicate that the majority of projects were successful, especially in operational terms, possibly because they underwent external quality assurance up-front. It is argued that applying this type of standardized framework provides a good basis for comparison and learning across sectors. It is suggested that evaluations should be conducted with the aim of promoting accountability, building knowledge about infrastructure projects, and continuously improving the tools, methods and governance arrangements used in the front-end of project development. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.
Integration of a neuroimaging processing pipeline into a pan-canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
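A pipeline framework like the one described amounts to executing a job-dependency graph in a valid order. As a schematic illustration only, the sketch below resolves a small graph with Python's standard `graphlib`; the step names are invented and are not PSOM's actual job identifiers.

```python
from graphlib import TopologicalSorter

# Hypothetical fMRI preprocessing steps and their dependencies
# (illustrative names; PSOM defines its own job structures).
jobs = {
    "motion_correct": set(),
    "coregister": {"motion_correct"},
    "normalize": {"coregister"},
    "smooth": {"normalize"},
    "quality_report": {"motion_correct", "smooth"},
}

# A valid execution order: each job runs only after all its dependencies.
order = list(TopologicalSorter(jobs).static_order())
print(order)
```

A grid back end such as CBRAIN could additionally dispatch independent branches in parallel; `TopologicalSorter.prepare()` plus `get_ready()` exposes exactly that batching of jobs whose dependencies are already satisfied.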
Team building: electronic management-clinical translational research (eM-CTR) systems.
Cecchetti, Alfred A; Parmanto, Bambang; Vecchio, Marcella L; Ahmad, Sjarif; Buch, Shama; Zgheib, Nathalie K; Groark, Stephen J; Vemuganti, Anupama; Romkes, Marjorie; Sciurba, Frank; Donahoe, Michael P; Branch, Robert A
2009-12-01
Classical drug exposure-response studies in clinical pharmacology represent the quintessential prototype for bench-to-bedside clinical translational research. A fundamental premise of this approach is for a multidisciplinary team of researchers to design and execute complex, in-depth mechanistic studies conducted in relatively small groups of subjects. The infrastructure support for this genre of clinical research is not well handled by scaling down the infrastructure used for large Phase III clinical trials. We describe a novel, integrated strategy whose focus is to support and manage a study using an Information Hub, Communication Hub, and Data Hub design. This design is illustrated by an application to a series of varied projects sponsored by Special Clinical Centers of Research in chronic obstructive pulmonary disease at the University of Pittsburgh. In contrast to classical informatics support, it is readily scalable to large studies. Our experience suggests that the cultural consequences of research-group self-empowerment are not only economically efficient but transformative to the research process.
Data discovery and data processing for environmental research infrastructures
NASA Astrophysics Data System (ADS)
Los, Wouter; Beranzoli, Laura; Corriero, Giuseppe; Cossu, Roberto; Fiore, Nicola; Hardisty, Alex; Legré, Yannick; Pagano, Pasquale; Puglisi, Giuseppe; Sorvari, Sanna; Turunen, Esa
2013-04-01
The European ENVRI project (Common Operations of Environmental Research Infrastructures) is addressing common ICT solutions for the research infrastructures selected in the ESFRI Roadmap. More specifically, the project is looking for solutions that will assist interdisciplinary users who want to benefit from the data and other services of more than a single research infrastructure. However, the infrastructure architectures, the data, data formats, scales and granularity are very different. Indeed, the infrastructures deal with diverse scientific disciplines, from plate tectonics, the deep sea, sea and land surface up to atmosphere and troposphere, from the dead to the living environment, and with a variety of instruments producing increasingly larger amounts of data. One of the approaches in the ENVRI project is to design a common Reference Model that will serve to promote infrastructure interoperability at the data, technical and service levels. The analysis of the characteristics of the environmental research infrastructures assisted in developing the Reference Model, which can also serve as an example for comparable infrastructures worldwide. Still, it is already important for users to have facilities available for multi-disciplinary data discovery and data processing. The rise of systems research, addressing Earth as a single complex and coupled system, requires such capabilities. Therefore, another approach in the project is to adapt existing ICT solutions to short-term applications. This is being tested in a few case studies. One is looking for possible coupled processes following a volcano eruption in the vertical column from deep sea to troposphere. Another deals with volcanic or human impacts on atmospheric and sea CO2 pressure and the implications for ocean acidification, marine biodiversity and their ecosystems. A third deals with the variety of sensor and satellite data sensing the area around a volcano cone.
Preliminary results on these studies will be reported. The common results will assist in shaping more generic solutions to be adopted by the appropriate research infrastructures.
NASA Astrophysics Data System (ADS)
The CHAIN-REDS Project is organising a workshop on "e-Infrastructures for e-Sciences" focusing on Cloud Computing and Data Repositories under the aegis of the European Commission and in co-location with the International Conference on e-Science 2013 (IEEE2013) that will be held in Beijing, P.R. of China on October 17-22, 2013. The core objective of the CHAIN-REDS project is to promote, coordinate and support the effort of a critical mass of non-European e-Infrastructures for Research and Education to collaborate with Europe addressing interoperability and interoperation of Grids and other Distributed Computing Infrastructures (DCI). From this perspective, CHAIN-REDS will optimise the interoperation of European infrastructures with those present in 6 other regions of the world, both from a development and use point of view, and catering to different communities. Overall, CHAIN-REDS will provide input for future strategies and decision-making regarding collaboration with other regions on e-Infrastructure deployment and availability of related data; it will raise the visibility of e-Infrastructures towards intercontinental audiences, covering most of the world and will provide support to establish globally connected and interoperable infrastructures, in particular between the EU and the developing regions. Organised by IHEP, INFN and Sigma Orionis with the support of all project partners, this workshop will aim at: - Presenting the state of the art of Cloud computing in Europe and in China and discussing the opportunities offered by having interoperable and federated e-Infrastructures; - Exploring the existing initiatives of Data Infrastructures in Europe and China, and highlighting the Data Repositories of interest for the Virtual Research Communities in several domains such as Health, Agriculture, Climate, etc.
Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stottler, Gary
General Motors, LLC and energy partner Shell Hydrogen, LLC, deployed a system of hydrogen fuel cell electric vehicles integrated with a hydrogen fueling station infrastructure to operate under real world conditions as part of the U.S. Department of Energy's Controlled Hydrogen Fleet and Infrastructure Validation and Demonstration Project. This technical report documents the performance and describes the learnings from progressive generations of vehicle fuel cell system technology and multiple approaches to hydrogen generation and delivery for vehicle fueling.
Prototype Software Assurance Framework (SAF): Introduction and Overview
2017-04-05
Introduction; 1 Process Management (Category 1): 1.1 Process Definition (Area 1.1), 1.2 Infrastructure Standards (Area 1.2), 1.3 Resources (Area 1.3), 1.4 Training (Area 1.4); 2 Project Management (Category 2): 2.1 Project Plans (Area 2.1), 2.2 Project Infrastructure (Area 2.2), 2.3 Project Monitoring (Area 2.3), 2.4 Project Risk Management (Area 2.4), 2.5 Supplier Management (Area 2.5); 3 Engineering (Category 3): 3.1 Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, R.L.; Hamilton, V.A.; Istrail, G.G.
1997-11-01
This report describes the results of a Sandia-funded laboratory-directed research and development project titled "Integrated and Robust Security Infrastructure" (IRSI). IRSI was intended to provide a broad range of commercial-grade security services to any software application. IRSI has two primary goals: application transparency and a manageable public key infrastructure. IRSI must provide its security services to any application without the need to modify the application to invoke the security services. Public key mechanisms are well suited for a network with many end users and systems. There are many issues that make it difficult to deploy and manage a public key infrastructure. IRSI addressed some of these issues to create a more manageable public key infrastructure.
The Impact of Airport Performance towards Construction and Infrastructure Expansion in Indonesia
NASA Astrophysics Data System (ADS)
Laksono, T. D.; Kurniasih, N.; Hasyim, C.; Setiawan, M. I.; Ahmar, A. S.
2018-01-01
Development generated from airport areas includes construction and infrastructure development. This research examines how materials management is implemented in construction projects and the relationship between development, especially construction and infrastructure development, and airport performance. The research uses a mixed-methods approach. The population comprises the 297 airports in Indonesia; from these, the 148 airports with the most complete construction-project data were selected. The correlation coefficient (R) test shows that construction and infrastructure development has a relatively strong relationship with the airport performance variable, although other factors also exert a substantial influence on construction and infrastructure development.
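A correlation-coefficient (R) test of the kind reported can be sketched in a few lines; the data pairs below are invented purely to illustrate the computation and are not taken from the study.

```python
import math

# Hypothetical values for six airports: an infrastructure development
# index (x) versus an airport performance score (y). Illustrative only.
x = [2.1, 3.4, 1.8, 4.0, 3.1, 2.7]
y = [55.0, 71.0, 50.0, 80.0, 66.0, 62.0]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
# Pearson correlation coefficient: covariance over the product of spreads.
r = cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
print(round(r, 3))
```

An R close to 1 indicates a strong positive linear relationship; in practice one would also test significance (e.g. a t-test on R) before drawing conclusions from so few observations.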
Smart City Pilot Projects Using LoRa and IEEE802.15.4 Technologies.
Pasolini, Gianni; Buratti, Chiara; Feltrin, Luca; Zabini, Flavio; De Castro, Cristina; Verdone, Roberto; Andrisano, Oreste
2018-04-06
Information and Communication Technologies (ICTs), through wireless communications and the Internet of Things (IoT) paradigm, are the enabling keys for transforming traditional cities into smart cities, since they provide the core infrastructure behind public utilities and services. However, to be effective, IoT-based services could require different technologies and network topologies, even when addressing the same urban scenario. In this paper, we highlight this aspect and present two smart city testbeds developed in Italy. The first one concerns a smart infrastructure for public lighting and relies on a heterogeneous network using the IEEE 802.15.4 short-range communication technology, whereas the second one addresses smart-building applications and is based on the LoRa low-rate, long-range communication technology. The smart lighting scenario is discussed providing the technical details and the economic benefits of a large-scale (around 3000 light poles) flexible and modular implementation of a public lighting infrastructure, while the smart-building testbed is investigated, through measurement campaigns and simulations, assessing the coverage and the performance of the LoRa technology in a real urban scenario. Results show that a proper parameter setting is needed to cover large urban areas while maintaining the airtime sufficiently low to keep packet losses at satisfactory levels.
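The airtime trade-off behind the "proper parameter setting" conclusion can be made concrete with the standard LoRa time-on-air formula from the Semtech SX127x documentation. The sketch below applies it; the chosen payload size and defaults are illustrative, not values from the testbed.

```python
import math

def lora_airtime_ms(payload_bytes, sf, bw_hz=125_000, cr=1,
                    preamble_syms=8, explicit_header=True, crc=True):
    """Time-on-air of a LoRa packet (Semtech SX127x formula).

    sf: spreading factor 7..12; bw_hz: bandwidth; cr: coding rate 1..4
    (i.e. 4/5 .. 4/8). Low data-rate optimisation is assumed on for
    SF11/SF12 at 125 kHz, as is common in LoRaWAN deployments.
    """
    t_sym = (2 ** sf) / bw_hz * 1000.0            # symbol duration in ms
    de = 1 if (sf >= 11 and bw_hz == 125_000) else 0
    ih = 0 if explicit_header else 1
    crc_bits = 16 if crc else 0
    n_payload = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + crc_bits - 20 * ih)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble_syms + 4.25 + n_payload) * t_sym

# A 20-byte payload at SF7 vs SF12 (125 kHz, CR 4/5):
print(round(lora_airtime_ms(20, 7), 1))   # ≈ 56.6 ms
print(round(lora_airtime_ms(20, 12), 1))  # ≈ 1318.9 ms
```

The roughly 23x jump in airtime between SF7 and SF12 is why spreading-factor choice dominates both coverage and collision probability in dense urban deployments.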
ATLAS EventIndex general dataflow and monitoring infrastructure
NASA Astrophysics Data System (ADS)
Fernández Casaní, Á.; Barberis, D.; Favareto, A.; García Montoro, C.; González de la Hoz, S.; Hřivnáč, J.; Prokoshin, F.; Salt, J.; Sánchez, J.; Többicke, R.; Yuan, R.; ATLAS Collaboration
2017-10-01
The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing it in a central Hadoop infrastructure at CERN. A subset of this information is copied to an Oracle relational database for fast dataset discovery, event picking, cross-checks with other ATLAS systems and checks for event duplication. The system design and its optimization serve event picking, from requests for a few events up to tens of thousands of events; in addition, data consistency checks are performed for large production campaigns. Detecting duplicate events within the scope of physics collections has recently arisen as an important use case. This paper describes the general architecture of the project and the data flow and operational issues, which are addressed by recent developments to improve the throughput of the overall system. In this direction, the data collection system is reducing its usage of the messaging infrastructure to overcome the performance shortcomings detected during production peaks; an object storage approach is used instead to convey the event index information, with messages signalling its location and status. Recent changes in the Producer/Consumer architecture are also presented in detail, as well as the monitoring infrastructure.
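At its core, the duplicate-detection use case reduces to keyed lookups over (run number, event number) pairs across datasets. A minimal sketch, with invented record values (real EventIndex records carry far more metadata and run at Hadoop scale):

```python
# Hypothetical event records: (run_number, event_number) identifies an
# event; the same event may reappear in another dataset after reprocessing.
events = [
    (358031, 14001, "data18.AOD.r1"),
    (358031, 14002, "data18.AOD.r1"),
    (358031, 14001, "data18.AOD.r2"),  # duplicate of the first event
]

seen = {}        # (run, event) -> dataset where first seen
duplicates = []  # (key, first dataset, duplicating dataset)
for run, evt, dataset in events:
    key = (run, evt)
    if key in seen:
        duplicates.append((key, seen[key], dataset))
    else:
        seen[key] = dataset

print(duplicates)
```

The production system performs the equivalent join over billions of events, which is why it lives on Hadoop rather than in a single in-memory dictionary; the logic, however, is the same keyed comparison.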
Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.
2015-01-01
With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... Loan Programs Office into the process. The FRN also identifies the principles Western will continue using to ensure (1) that the Program is separate and distinct from Western's power marketing functions... obtain project funding. Table of Contents I. Definitions II. Principles III. Project Evaluation Criteria...
Demonstration of Green/Gray Infrastructure for Combined Sewer Overflow Control
This project is a major national demonstration of the integration of green and gray infrastructure for combined sewer overflow (CSO) control in a cost-effective and environmentally friendly manner. It will use Kansas City, MO, as a case example. The project will have a major in...
Benefits and Challenges of Linking Green Infrastructure and Highway Planning in the United States
NASA Astrophysics Data System (ADS)
Marcucci, Daniel J.; Jordan, Lauren M.
2013-01-01
Landscape-level green infrastructure creates a network of natural and semi-natural areas that protects and enhances ecosystem services, regenerative capacities, and ecological dynamism over long timeframes. It can also enhance quality of life and certain economic activity. Highways create a network for moving goods and services efficiently, enabling commerce, and improving mobility. A fundamentally profound conflict exists between transportation planning and green infrastructure planning because they both seek to create connected, functioning networks across the same landscapes and regions, but transportation networks, especially in the form of highways, fragment and disconnect green infrastructure networks. A key opportunity has emerged in the United States during the last ten years with the promotion of measures to link transportation and environmental concerns. In this article we examined the potential benefits and challenges of linking landscape-level green infrastructure planning and implementation with integrated transportation planning and highway project development in the United States policy context. This was done by establishing a conceptual model that identified logical flow lines from planning to implementation as well as the potential interconnectors between green infrastructure and highway infrastructure. We analyzed the relationship of these activities through literature review, policy analysis, and a case study of a suburban Maryland, USA landscape. We found that regionally developed and adopted green infrastructure plans can be instrumental in creating more responsive regional transportation plans and streamlining the project environmental review process while enabling better outcomes by enabling more targeted mitigation. In order for benefits to occur, however, landscape-scale green infrastructure assessments and plans must be in place before integrated transportation planning and highway project development occurs. 
It is in the transportation community's interests to actively facilitate green infrastructure planning because it creates a more predictable environmental review context. On the other hand, for landscape-level green infrastructure, transportation planning and development is much more established and better funded and can provide a means of supporting green infrastructure planning and implementation, thereby enhancing conservation of ecological function.
Support Process Development for Assessing Green Infrastructure in Omaha, NE
Evaluates Omaha’s current process for assessing green infrastructure projects and recommends improvements for comparing green and gray infrastructure. Compares Omaha’s design criteria to other cities. Reviews other US programs with rights-of-way criteria.
Alternative Fuel Infrastructure Grants The Maryland Energy Administration administers the Maryland Alternative Fuel Infrastructure Program (AFIP), which provides grants to develop public access alternative fueling and charging infrastructure. Only Maryland-based private businesses are eligible, and projects
ERIC Educational Resources Information Center
Maule, R. William
1994-01-01
Discusses prototype information infrastructure projects in northern California's Silicon Valley. The strategies of the public and private telecommunications carriers vying for backbone services and industries developing end-user infrastructure technologies via office networks, set-top box networks, Internet multimedia, and "smart homes"…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pardo-Bosch, Francesc, E-mail: francesc.pardo@upc.edu; Political Science Department, University of California - Berkeley; Aguado, Antonio, E-mail: antonio.aguado@upc.edu
Infrastructure construction, one of the biggest driving forces of the economy today, requires thorough analysis and clear transparency to decide which projects should be executed with the few resources available. With the aim of providing public administrations a tool to make their decisions easier, the Sustainability Index of Infrastructure Projects (SIIP) has been defined, using a multi-criteria decision system called MIVES, in order to classify non-uniform investments. This index evaluates, in two inseparable stages, the contribution of each infrastructure project to sustainable development, analyzing its social, environmental and economic impact. The result of the SIIP makes it possible to decide the order in which projects will be prioritized. The case study developed proves the adaptability and utility of this tool for ordinary budget management.
Sustainability considerations for health research and analytic data infrastructures.
Wilcox, Adam; Randhawa, Gurvaneet; Embi, Peter; Cao, Hui; Kuperman, Gilad J
2014-01-01
The United States has made recent large investments in creating data infrastructures to support the important goals of patient-centered outcomes research (PCOR) and comparative effectiveness research (CER), with still more investment planned. These initial investments, while critical to the creation of the infrastructures, are not expected to sustain them much beyond the initial development. To provide the maximum benefit, the infrastructures need to be sustained through innovative financing models while providing value to PCOR and CER researchers. Based on our experience with creating flexible sustainability strategies (i.e., strategies that are adaptive to the different characteristics and opportunities of a resource or infrastructure), we define specific factors that are important considerations in developing a sustainability strategy. These factors include assets, expansion, complexity, and stakeholders. Each factor is described, with examples of how it is applied. These factors are dimensions of variation in different resources, to which a sustainability strategy should adapt. We also identify specific important considerations for maintaining an infrastructure, so that the long-term intended benefits can be realized. These observations are presented as lessons learned, to be applied to other sustainability efforts. We define the lessons learned, relating them to the defined sustainability factors as interactions between factors. Using perspectives and experiences from a diverse group of experts, we define broad characteristics of sustainability strategies and important observations, which can vary for different projects. Other descriptions of adaptive, flexible, and successful models of collaboration between stakeholders and data infrastructures can expand this framework by identifying other factors for sustainability, and give more concrete directions on how sustainability can be best achieved.
Reducing construction waste: A study of urban infrastructure projects.
de Magalhães, Ruane Fernandes; Danilevicz, Ângela de Moura Ferreira; Saurin, Tarcisio Abreu
2017-09-01
The construction industry is well known for producing waste detrimental to the environment, and its impacts have increased with the development of cities. Although there are several studies focused on the environmental impact of residential and commercial buildings, less knowledge is available regarding reducing construction waste (CW) generation in urban infrastructure projects. This study presents best practices to reduce waste in such projects, stressing the role of decision-making in the design stage and the effective management of construction processes in the public sector. The best practices were identified from a literature review, document analysis of 14 urban infrastructure projects, and both qualitative and quantitative surveys of 18 experts (architects and engineers) playing different roles on those projects. The contributions of this research are: (i) the identification of the main building techniques related to the urban design typologies analyzed; (ii) the identification of cause-effect relationships between design choices and the CW generation diagnosis; and (iii) the proposal of a checklist to support the decision-making process, which can be used as a control and evaluation instrument when developing urban infrastructure designs focused on construction waste minimization (CWM). Copyright © 2017 Elsevier Ltd. All rights reserved.
NGNP Infrastructure Readiness Assessment: Consolidation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian K Castle
2011-02-01
The Next Generation Nuclear Plant (NGNP) project supports the development, demonstration, and deployment of high temperature gas-cooled reactors (HTGRs). The NGNP project is being reviewed by the Nuclear Energy Advisory Council (NEAC) to provide input to the DOE, which will make a recommendation to the Secretary of Energy on whether or not to continue with Phase 2 of the NGNP project. The NEAC review will be based, in part, on the infrastructure readiness assessment: an assessment of industry's current ability to provide specified components for the FOAK NGNP, meet quality assurance requirements, transport components, have the necessary workforce in place, and have the necessary construction capabilities. AREVA and Westinghouse were contracted to perform independent assessments of industry's capabilities because of their experience with nuclear supply chains, a result of their experiences with the EPR and AP-1000 reactors. Both vendors produced infrastructure readiness assessment reports that identified key components and categorized these components into three groups based on their ability to be deployed in the FOAK plant. The NGNP project has several programs that are developing key components and capabilities. For these components, the NGNP project has provided input to properly assess infrastructure readiness.
NASA Astrophysics Data System (ADS)
Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.
2012-06-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims at reducing the manpower needed and at assuring a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular, it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system.
All components work in a loosely coupled, event-based architecture, with a message broker to centralize all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data and provide real-time feedback to human experts, who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as an automated monitoring assistant for the ATLAS data taking infrastructure.
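The core idea the abstract describes, correlating a stream of messages over a time window rather than inspecting single messages, can be illustrated with a minimal sketch. This is not the AAL system or its API; the `Event` fields, the 10-second window, and the error threshold are all hypothetical, chosen only to show the windowed-correlation pattern.

```python
# Illustrative sketch only: a minimal sliding-window event correlator in the
# spirit of the CEP approach described above. All names (Event, Correlator,
# the window length, the threshold) are hypothetical, not the AAL API.
from collections import deque, namedtuple

Event = namedtuple("Event", ["timestamp", "source", "severity"])

class Correlator:
    """Flags a source when it emits too many ERROR events within a time window."""

    def __init__(self, window_s=10.0, threshold=3):
        self.window_s = window_s
        self.threshold = threshold
        self.buffers = {}  # source -> deque of recent ERROR timestamps

    def feed(self, event):
        """Process one event; return an alert string, or None if all is well."""
        if event.severity != "ERROR":
            return None
        buf = self.buffers.setdefault(event.source, deque())
        buf.append(event.timestamp)
        # Discard timestamps that have fallen out of the correlation window.
        while buf and event.timestamp - buf[0] > self.window_s:
            buf.popleft()
        if len(buf) >= self.threshold:
            return f"ALERT: {event.source} raised {len(buf)} errors in {self.window_s}s"
        return None
```

A single error passes silently; only the aggregated behavior (three errors from the same source inside the window) triggers an alert, which mirrors the abstract's point that the meaningful signal lives in the time-line, not the individual message.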
Advanced operations focused on connected vehicles/infrastructure (CVI-UTC).
DOT National Transportation Integrated Search
2015-12-01
The goal of the Infrastructure Safety Assessment in a Connected Vehicle (CV) Environment : project was to develop a method to identify infrastructure safety hot spots using CV data. : Using these basic safety messages to detect hot spots may al...
DOT National Transportation Integrated Search
2009-09-13
The development of infrastructure facilities can negatively impact critical habitat and essential ecosystems. There are a variety of techniques available to avoid, minimize, and mitigate negative impacts of existing infrastructure as well as future i...
Implementation status of the extreme light infrastructure - nuclear physics (ELI-NP) project
NASA Astrophysics Data System (ADS)
Gales, S.; Zamfir, N. V.
2015-02-01
The Project Extreme Light Infrastructure (ELI) is part of the European Strategic Forum for Research Infrastructures (ESFRI) Roadmap. ELI will be built as a network of three complementary pillars at the frontier of laser technologies. The ELI-NP pillar (NP for Nuclear Physics) is under construction near Bucharest (Romania) and will develop a scientific program using two 10 PW lasers and a Compton back-scattering high-brilliance and intense gamma beam, a marriage of laser and accelerator technology at the frontier of knowledge. In the present paper, the technical description of the facility, the present status of the project as well as the science, applications and future perspectives will be discussed.
Global patterns of current and future road infrastructure
NASA Astrophysics Data System (ADS)
Meijer, Johan R.; Huijbregts, Mark A. J.; Schotten, Kees C. G. J.; Schipper, Aafke M.
2018-06-01
Georeferenced information on road infrastructure is essential for spatial planning, socio-economic assessments and environmental impact analyses. Yet current global road maps are typically outdated or characterized by spatial bias in coverage. In the Global Roads Inventory Project we gathered, harmonized and integrated nearly 60 geospatial datasets on road infrastructure into a global roads dataset. The resulting dataset covers 222 countries and includes over 21 million km of roads, which is two to three times the total length in the currently best available country-based global roads datasets. We then related total road length per country to country area, population density, GDP and OECD membership, resulting in a regression model with an adjusted R² of 0.90, and found that the highest road densities are associated with densely populated and wealthier countries. Applying our regression model to future population densities and GDP estimates from the Shared Socioeconomic Pathway (SSP) scenarios, we obtained a tentative estimate of 3.0–4.7 million km of additional road length for the year 2050. Large increases in road length were projected for developing nations in some of the world's last remaining wilderness areas, such as the Amazon, the Congo basin and New Guinea. This highlights the need for accurate spatial road datasets to underpin strategic spatial planning and reduce the impacts of roads in remaining pristine ecosystems.
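The fit-then-project workflow the abstract describes (regress road length on country covariates, then apply the model to future covariates) can be sketched as follows. The abstract does not give the model's functional form, so the log-linear specification, the synthetic data, and all coefficient values here are assumptions for illustration, not the paper's actual model.

```python
# Hedged sketch of a fit-then-project regression workflow, in the spirit of
# the study described above. Data are synthetic; the log-linear form and all
# coefficients are illustrative assumptions, not the published model.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical countries

log_area = rng.normal(12, 2, n)      # log country area
log_popdens = rng.normal(4, 1, n)    # log population density
log_gdp = rng.normal(24, 2, n)       # log GDP
oecd = rng.integers(0, 2, n).astype(float)  # OECD membership dummy

# Generate road length from a known linear rule plus noise, then recover it.
true_beta = np.array([1.5, 0.8, 0.4, 0.3, 0.5])  # intercept + 4 slopes
X = np.column_stack([np.ones(n), log_area, log_popdens, log_gdp, oecd])
log_road = X @ true_beta + rng.normal(0, 0.3, n)

# Ordinary least squares fit.
beta_hat, *_ = np.linalg.lstsq(X, log_road, rcond=None)

# "Projection" step: apply the fitted model to hypothetical 2050 covariates.
x_2050 = np.array([1.0, 13.0, 4.5, 25.0, 1.0])
log_road_2050 = x_2050 @ beta_hat
```

Because the synthetic data were generated from a known rule, the recovered `beta_hat` lands close to `true_beta`; the projection line then shows how future covariate estimates (e.g. from SSP scenarios) feed through the same fitted coefficients.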
Fredette, Thomas J; Foran, Christy M; Brasfield, Sandra M; Suedel, Burton C
2012-01-01
Navigation infrastructure such as channels, jetties, river training structures, and lock-and-dam facilities are primary components of a safe and efficient water transportation system. Planning for such infrastructure has until recently involved efforts to minimize impacts on the environment through a standardized environmental assessment process. More recently, consistent with environmental sustainability concepts, planners have begun to consider how such projects can also be constructed with environmental enhancements. This study examined the existing institutional conditions within the US Army Corps of Engineers and cooperating federal agencies relative to incorporating environmental enhancements into navigation infrastructure projects. The study sought to (1) investigate institutional attitudes towards the environmental enhancement of navigation infrastructure (EENI) concept, (2) identify potential impediments to implementation and solutions to such impediments, (3) identify existing navigation projects designed with the express intent of enhancing environmental benefit in addition to the primary project purpose, (4) identify innovative ideas for increasing environmental benefits for navigation projects, (5) identify needs for additional technical information or research, and (6) identify laws, regulations, and policies that both support and hinder such design features. The principal investigation tool was an Internet-based survey with 53 questions. The survey captured a wide range of perspectives on the EENI concept including ideas, concerns, research needs, and relevant laws and policies. Study recommendations included further promotion of the concept of EENI to planners and designers, documentation of existing projects, initiation of pilot studies on some of the innovative ideas provided through the survey, and development of national goals and interagency agreements to facilitate implementation. Copyright © 2011 SETAC.
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
iCollections methodology: workflow, results and lessons learned
Penn, Malcolm; Sadka, Mike; Hine, Adrian; Brooks, Stephen; Siebert, Darrell J.; Sleep, Chris; Cafferty, Steve; Cane, Elisa; Martin, Geoff; Toloni, Flavia; Wing, Peter; Chainey, John; Duffell, Liz; Huxley, Rob; Ledger, Sophie; McLaughlin, Caitlin; Mazzetta, Gerardo; Perera, Jasmin; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Scialabba, Elisabetta; Honey, Martin; Huertas, Blanca; Howard, Theresa; Carter, Victoria; Albuquerque, Sara; Paterson, Gordon; Kitching, Ian J.
2017-01-01
Abstract The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme was to undertake a series of pilot projects to develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects - iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies: 181,545 specimens representing 89 species from the British Isles and Ireland. The data digitised include species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. A previous paper explained the way the data were obtained and the background to the collections that made up the project. The present paper describes the technical, logistical, and economic aspects of managing the project. PMID:29104442
iCollections methodology: workflow, results and lessons learned.
Blagoderov, Vladimir; Penn, Malcolm; Sadka, Mike; Hine, Adrian; Brooks, Stephen; Siebert, Darrell J; Sleep, Chris; Cafferty, Steve; Cane, Elisa; Martin, Geoff; Toloni, Flavia; Wing, Peter; Chainey, John; Duffell, Liz; Huxley, Rob; Ledger, Sophie; McLaughlin, Caitlin; Mazzetta, Gerardo; Perera, Jasmin; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Honey, Martin; Huertas, Blanca; Howard, Theresa; Carter, Victoria; Albuquerque, Sara; Paterson, Gordon; Kitching, Ian J
2017-01-01
The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme was to undertake a series of pilot projects to develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of the pilot projects - iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies: 181,545 specimens representing 89 species from the British Isles and Ireland. The data digitised include species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. A previous paper explained the way the data were obtained and the background to the collections that made up the project. The present paper describes the technical, logistical, and economic aspects of managing the project.
The Roland Maze Project — Cosmic Ray Registration at Schools
NASA Astrophysics Data System (ADS)
Feder, J.; Jędrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Tokarski, P.; Wibig, T.
Experimental studies of cosmic rays at the highest energies (above 10^18 eV) are the main scientific goal of the projected large-area network of extensive air shower detectors. Placing the detectors on the roofs of high school buildings will lower the cost by using existing urban infrastructure (INTERNET, power supply, etc.), and engaging high school students in the research program can be a very efficient way of popularising science. 30 high schools in Łódź are already involved in the project. The project has recently obtained some financial support from the City Council of Łódź. The donation enabled us to start experimental work on detector construction details. A cycle of lectures and seminars devoted to different aspects of project realization (detector construction, on-line data acquisition system, C++ programming) has been organized for students at our Institute and at schools.
The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications
NASA Astrophysics Data System (ADS)
Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.
2010-01-01
The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large-scale operations, such as GRID Computing, and conceivably in the years ahead in the banking sector and other security-tight communications. In connection with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers—INFLPR) and the Physics Dept. (University Polytechnica—UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket-level communications under AES (Advanced Encryption Std.), both in proprietary C++ code. An outline of the planned undertaking of the project is communicated, highlighting its impact in quantum physics, coherent optics and information technology.
DOT National Transportation Integrated Search
2007-03-02
This report presents the case study and lessons learned for the national evaluation of the Great Lakes Intelligent Transportation Systems (GLITS) Airport ITS Integration and Road Infrastructure Management System (RIMS) projects. The Airport ITS Integ...
The United States Environmental Protection Agency evaluated the performance of a hybrid green-gray infrastructure pilot project installed into the Marlborough Neighborhood by the Kansas City Water Services Department. Kansas City installed 135 vegetated SCMs, 24,290 square feet o...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
...- design and to develop additional alternatives for analysis. These two water infrastructure projects are... carbon, and reverse osmosis. The facility would be designed in modular form for ease of expandability... lighting, asphalt pavement, and pavement marking and signs. The project includes ``100-year storm'' flood...
Investing in the Improvement of Education: Lessons to be Learned from the National Writing Project
ERIC Educational Resources Information Center
St. John, Mark; Stokes, Laura
2008-01-01
This paper defines the concept of "improvement infrastructure" and "educational capital" for education, and it uses the case of the National Writing Project to develop an extended, data-based illustration of the design and generativeness of an improvement infrastructure. Since 1983 there have been multiple "waves" of…
UAS Integration in the NAS Project: Integrated Test and LVC Infrastructure
NASA Technical Reports Server (NTRS)
Murphy, Jim; Hoang, Ty
2015-01-01
Overview presentation of the Integrated Test and Evaluation sub-project of the Unmanned Aircraft System (UAS) in the National Airspace System (NAS). The emphasis of the presentation is the Live, Virtual, and Constructive (LVC) system (a broadly used name for classifying modeling and simulation) infrastructure and use of external assets and connection.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-16
... DEPARTMENT OF TRANSPORTATION Office of the Secretary Transportation Infrastructure Financing and... 2013 and $1 billion in FY 2014 for the Transportation Infrastructure Financing and Innovation Act... eligible surface transportation projects. This information collection relates to the collection of...
The Grand Ethiopian Renaissance Dam: Source of cooperation or contention?
Teferi Taye, Meron; Tadesse, Tsegaye; Senay, Gabriel; Block, Paul
2016-01-01
This paper discusses the challenges and benefits of the Grand Ethiopian Renaissance Dam (GERD), which is under construction and expected to be operational on the Blue Nile River in Ethiopia in a few years. Like many large-scale projects on transboundary rivers, the GERD has been criticized for potentially jeopardizing downstream water security and livelihoods through upstream unilateral decision making. In spite of the contentious nature of the project, the authors argue that this project can provide substantial benefits for regional development. The GERD, like any major river infrastructure project, will undeniably bring about social, environmental, and economic change, and in this unique case has, on balance, the potential to achieve success on all fronts. It must be stressed, however, that strong partnerships between riparian countries are essential. National success is contingent on regional cooperation.
Wilcox, S.; Andreas, A.
2010-09-27
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
The Roland Maze Project school-based extensive air shower network
NASA Astrophysics Data System (ADS)
Feder, J.; Jȩdrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Wibig, T.
2006-01-01
We plan to construct a large-area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Łódź. Detection points will be connected by INTERNET to the central server and their work will be synchronized by GPS. The main scientific goal of the project is the study of ultra-high-energy cosmic rays. Using existing town infrastructure (INTERNET, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of popularising science. We performed simulations of the projected network's capabilities for registering extensive air showers and reconstructing the energies of primary particles. Results of the simulations and the current status of project realisation will be presented.
NASA Astrophysics Data System (ADS)
Wilcox, Steve; Myers, Daryl
2009-08-01
The U.S. Department of Energy's National Renewable Energy Laboratory has embarked on a collaborative effort with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of concentrating solar thermal power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result will be high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Integrating TRENCADIS components in gLite to share DICOM medical images and structured reports.
Blanquer, Ignacio; Hernández, Vicente; Salavert, José; Segrelles, Damià
2010-01-01
The problem of sharing medical information among different centres has been tackled by many projects. Several of them target the specific problem of sharing DICOM images and structured reports (DICOM-SR), such as the TRENCADIS project. In this paper we propose sharing and organizing DICOM data and DICOM-SR metadata by benefiting from existing deployed Grid infrastructures compliant with gLite, such as EGEE or the Spanish NGI. These infrastructures contribute a large amount of storage resources for creating knowledge databases and also provide metadata storage resources (such as AMGA) to semantically organize reports in a tree structure. First, in this paper, we present the extension of the TRENCADIS architecture to use gLite components (LFC, AMGA, SE) for the sake of increasing interoperability. Using the metadata from DICOM-SR, and maintaining its tree structure, enables federating different but compatible diagnostic structures and simplifies the definition of complex queries. This article describes how to do this in AMGA and shows an approach to efficiently coding radiology reports to enable the multi-centre federation of data resources.
INcreasing Security and Protection through Infrastructure REsilience: The INSPIRE Project
NASA Astrophysics Data System (ADS)
D'Antonio, Salvatore; Romano, Luigi; Khelil, Abdelmajid; Suri, Neeraj
The INSPIRE project aims at enhancing the European potential in the field of security by ensuring the protection of critical information infrastructures through (a) the identification of their vulnerabilities and (b) the development of innovative techniques for securing networked process control systems. To increase the resilience of such systems, INSPIRE will develop traffic engineering algorithms, diagnostic processes and self-reconfigurable architectures along with recovery techniques. Hence, the core idea of the INSPIRE project is to protect critical information infrastructures by appropriately configuring, managing, and securing the communication network which interconnects the distributed control systems. A working prototype will be implemented as a final demonstrator of selected scenarios. Controls/communication experts will support project partners in the validation and demonstration activities. INSPIRE will also contribute to the standardization process in order to foster multi-operator interoperability and coordinated strategies for securing lifeline systems.
Trans-Pacific Astronomy Experiment Project Status
NASA Technical Reports Server (NTRS)
Hsu, Eddie
2000-01-01
The Trans-Pacific Astronomy Experiment is Phase 2 of the Trans-Pacific High Data Rate Satcom Experiments, following the Trans-Pacific High Definition Video Experiment. It is part of the Global Information Infrastructure-Global Interoperability for Broadband Networks (GII-GIBN) Project, which aims to provide a global information infrastructure involving broadband satellites and terrestrial networks, with access to information by anyone, anywhere, at any time. A collaboration of government, industry, and academic organizations demonstrates the use of broadband satellite links in a global information infrastructure, with emphasis on astronomical observations, collaborative discussions, and distance learning.
Global assessment of water policy vulnerability under uncertainty in water scarcity projections
NASA Astrophysics Data System (ADS)
Greve, Peter; Kahil, Taher; Satoh, Yusuke; Burek, Peter; Fischer, Günther; Tramberend, Sylvia; Byers, Edward; Flörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Langan, Simon; Wada, Yoshihide
2017-04-01
Water scarcity is a critical environmental issue worldwide, driven by the significant increase in water extractions during the last century. In the coming decades, climate change is projected to further exacerbate water scarcity conditions in many regions around the world. At present, one important question for policy debate is the identification of water policy interventions that could address the mounting water scarcity problems. Main interventions include investing in water storage infrastructures, water transfer canals, efficient irrigation systems, and desalination plants, among many others. These interventions involve long-term planning, long-lived investments and some irreversibility in choices, which can shape the development of countries for decades. Making decisions on these water infrastructures requires anticipating the long-term environmental conditions, needs and constraints under which they will function. This brings large uncertainty into the decision-making process, for instance from demographic or economic projections. But today, climate change is adding another layer of uncertainty that makes decisions even more complex. In this study, we use a probabilistic approach to assess the uncertainty in global water scarcity projections following different socioeconomic pathways (SSPs) and climate scenarios (RCPs) within the first half of the 21st century. By utilizing an ensemble of 45 future water scarcity projections based on (i) three state-of-the-art global hydrological models (PCR-GLOBWB, H08, and WaterGAP), (ii) five climate models, and (iii) three water scenarios, we have assessed changes in water scarcity and the associated uncertainty distribution worldwide. The water scenarios used here were developed by IIASA's Water Futures and Solutions (WFaS) Initiative.
The main objective of this study is to improve the contribution of hydro-climatic information to effective policymaking by identifying spatial and temporal policy vulnerabilities under large uncertainty about the future socio-economic and climatic changes and to guide policymakers in charting a more sustainable pathway and avoiding maladaptive development pathways. The results show that water scarcity is increasing in up to 83% of all land area under a high-emission scenario (RCP 6.0-SSP3). Importantly, the range of uncertainty in projected water scarcity is increasing; in some regions by several orders of magnitude (e.g. sub-Saharan Africa, eastern Europe, Central Asia). This is further illustrated by focusing on a set of large river basins that will be subject both to substantial changes in basin-wide water scarcity and to strong increases in the overall range of uncertainty (e.g. the Niger, Indus, Yangtze). These conditions pose a significant challenge for water management options in those vulnerable basins, complicating decisions on needed investments in water supply infrastructure and other system improvements, and leading to the degradation of valuable resources such as non-renewable groundwater resources and water-dependent ecosystems. The results of this study call for careful and deliberative design of water policy interventions under a wide range of socio-economic and climate conditions.
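The ensemble arithmetic in the abstract (3 hydrological models × 5 climate models × 3 scenarios = 45 members) reduces, per region, to a central estimate plus an uncertainty range. A minimal sketch with synthetic data, not the study's gridded model output:

```python
# Sketch of the ensemble-spread idea: 45 projections collapsed to a
# median and a 10-90 percentile uncertainty range per region.
# Data are synthetic; region count and percentiles are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_regions = 4
# axes: (hydrological models, climate models, scenarios, regions)
ensemble = rng.normal(loc=1.0, scale=0.3, size=(3, 5, 3, n_regions))

flat = ensemble.reshape(-1, n_regions)        # 45 members per region
median = np.median(flat, axis=0)              # central estimate
lo, hi = np.percentile(flat, [10, 90], axis=0)
spread = hi - lo                              # per-region uncertainty range
```

A region where `spread` grows between time slices is exactly the "increasing range of uncertainty" the study reports for basins such as the Niger, Indus and Yangtze.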
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach with a corresponding rise in the demand for large quantities of high quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia, having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. 
To encourage implementation of these interoperability solutions by other regional marine data infrastructures an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
21st Century Water Asset Accounting: Implications Report (WERF Report INFR6R12b)
This project is an important first step towards developing water industry standards and accounting protocols for green infrastructure that could be adopted by the Governmental Accounting Standards Board (GASB) to promote green infrastructure investment. Green infrastructure, the ...
Real-Time Optimization and Control of Next-Generation Distribution Infrastructure | Grid Modernization | NREL
This project develops innovative, real-time optimization and control methods for next-generation distribution infrastructure.
NASA Astrophysics Data System (ADS)
Huber, Robert; Beranzoli, Laura; Fiebig, Markus; Gilbert, Olivier; Laj, Paolo; Mazzola, Mauro; Paris, Jean-Daniel; Pedersen, Helle; Stocker, Markus; Vitale, Vito; Waldmann, Christoph
2017-04-01
European Environmental Research Infrastructures (RIs) frequently comprise in situ observatories, from large-scale networks of platforms or sites to local networks of various sensors. Network operation is usually a cumbersome aspect of these RIs, facing specific technological problems related to operations in remote areas, maintenance of the network, transmission of observation values, etc. Robust inter-connection within and across these networks is still in its infancy, and the burden increases with the remoteness of the station, the harshness of environmental conditions, and the unavailability of classic communication systems, which is a common feature here. Although existing RIs have developed ad-hoc solutions to overcome specific problems and innovative technologies are becoming available, no common approach yet exists. Within the European project ENVRIplus, a dedicated work package aims to stimulate common network operation technologies and approaches in terms of power supply and storage, robustness, and data transmission. Major objectives of this task are to review existing technologies and RI requirements, propose innovative solutions and evaluate the standardization potential prior to wider deployment across networks. Focus areas within these efforts are: improving energy production and storage units, testing the robustness of RI equipment under extreme conditions, and methodologies for robust data transmission. We will introduce current project activities, which are coordinated at various levels including the engineering and data management perspectives, and explain how environmental RIs can benefit from these developments.
Self-service for software development projects and HPC activities
NASA Astrophysics Data System (ADS)
Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.
2014-05-01
This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions by both users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user-facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as SourceForge, GitHub and others. Furthermore, the contribution discusses recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN based on affordable hardware.
Future Visions of the Brahmaputra - Establishing Hydrologic Baseline and Water Resources Context
NASA Astrophysics Data System (ADS)
Ray, P. A.; Yang, Y. E.; Wi, S.; Brown, C. M.
2013-12-01
The Brahmaputra River Basin (China-India-Bhutan-Bangladesh) is on the verge of a transition from a largely free flowing and highly variable river to a basin of rapid investment and infrastructure development. This work demonstrates a knowledge platform for the basin that compiles available data, and develops hydrologic and water resources system models of the basin. A Variable Infiltration Capacity (VIC) model of the Brahmaputra basin supplies hydrologic information of major tributaries to a water resources system model, which routes runoff generated via the VIC model through water infrastructure, and accounts for water withdrawals for agriculture, hydropower generation, municipal demand, return flows and other human activities. The system model also simulates agricultural production and the economic value of water in its various uses, including municipal, agricultural, and hydropower. Furthermore, the modeling framework incorporates plausible climate change scenarios based on the latest projections of changes to contributing glaciers (upstream), as well as changes to monsoon behavior (downstream). Water resources projects proposed in the Brahmaputra basin are evaluated based on their distribution of benefits and costs in the absence of well-defined water entitlements, and relative to a complex regional water-energy-food nexus. Results of this project will provide a basis for water sharing negotiation among the four countries and inform trans-national water-energy policy making.
Recovery Act-SmartGrid regional demonstration transmission and distribution (T&D) Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hedges, Edward T.
This document represents the Final Technical Report for the Kansas City Power & Light Company (KCP&L) Green Impact Zone SmartGrid Demonstration Project (SGDP). The KCP&L project is partially funded by Department of Energy (DOE) Regional Smart Grid Demonstration Project cooperative agreement DE-OE0000221 in the Transmission and Distribution Infrastructure application area. This Final Technical Report summarizes the KCP&L SGDP as of April 30, 2015 and includes summaries of the project design, implementation, operations, and analysis performed as of that date.
Gales, Sydney; Tanaka, Kazuo A; Balabanski, D L; Negoita, Florin; Stutman, D; Ur, Calin Alexander; Tesileanu, Ovidiu; Ursescu, Daniel; Ghita, Dan Gabriel; Andrei, I; Ataman, Stefan; Cernaianu, M O; D'Alessi, L; Dancus, I; Diaconescu, B; Djourelov, N; Filipescu, D; Ghenuche, P; Matei, C; Seto Kei, K; Zeng, M; Zamfir, Victor Nicolae
2018-06-28
The European Strategic Forum for Research Infrastructures (ESFRI) selected in 2006 a proposal based on ultra-intense laser fields with intensities reaching up to 10^22-10^23 W/cm^2, called "ELI" for Extreme Light Infrastructure. The construction of a large-scale, laser-centred, distributed pan-European research infrastructure, involving beyond-state-of-the-art ultra-short and ultra-intense laser technologies, received approval for funding in 2011-2012. The three pillars of the ELI facility are being built in the Czech Republic, Hungary and Romania. The Romanian pillar is ELI-Nuclear Physics (ELI-NP). The new facility is intended to serve a broad national, European and international science community. Its mission covers scientific research at the frontier of knowledge involving two domains. The first is laser-driven experiments related to nuclear physics, strong-field quantum electrodynamics and associated vacuum effects. The second is based on a Compton back-scattering, high-brilliance and intense low-energy gamma beam (< 20 MeV), a marriage of laser and accelerator technology which will allow us to investigate nuclear structure and reactions as well as nuclear astrophysics with unprecedented resolution and accuracy. In addition to fundamental themes, a large number of applications with significant societal impact are being developed. The ELI-NP research centre will be located in Magurele near Bucharest, Romania. The project is implemented by the "Horia Hulubei" National Institute for Physics and Nuclear Engineering (IFIN-HH). The project started in January 2013 and the new facility will be fully operational by the end of 2019. After a short introduction to multi-PW lasers and the multi-MeV brilliant gamma beam, a scientific and technical description of the future ELI-NP facility, as well as the present status of its implementation, will be presented. 
The science and examples of societal applications within reach of these new probes will be discussed, with a special focus on day-one experiments and associated novel instrumentation. © 2018 IOP Publishing Ltd.
Geo-Seas - building a unified e-infrastructure for marine geoscientific data management in Europe
NASA Astrophysics Data System (ADS)
Glaves, H.; Schaap, D.
2012-04-01
A significant barrier to marine geoscientific research in Europe is the lack of standardised marine geological and geophysical data and data products which could potentially facilitate multidisciplinary marine research extending across national and international boundaries. Although there are large volumes of geological and geophysical data available for the marine environment, it is currently very difficult to use these datasets in an integrated way due to different nomenclatures, formats, scales and coordinate systems being used within different organisations as well as between countries. This makes the direct use of primary data very difficult and also hampers use of the data to produce integrated multidisciplinary data products and services. The Geo-Seas project, an EU Framework 7 funded initiative, is developing a unified e-infrastructure to facilitate the sharing of marine geoscientific data within Europe. This e-infrastructure is providing on-line access to both discovery metadata and the associated federated data sets from 26 European data centres via a dedicated portal. The implementation of the Geo-Seas portal is allowing a range of end users to locate, assess and access standardised geoscientific data from multiple sources which are interoperable with other marine data types. Geo-Seas is building on the work already done by the existing SeaDataNet project, which currently provides a data management e-infrastructure for oceanographic data and allows users to locate and access federated oceanographic data sets. By adopting and adapting the SeaDataNet methodologies and technologies, the Geo-Seas project has not only avoided unnecessary duplication of effort by reusing existing and proven technologies but also contributed to the development of a multidisciplinary approach to ocean science across Europe through the creation of a joint infrastructure for both marine geoscientific and oceanographic data. 
This approach is also leading to the development of collaborative links with other European projects, including EMODNET, Eurofleets, Genesi-DEC and iMarine, as well as extending to the wider marine geoscientific and oceanographic community, including projects in the USA such as the Rolling Deck to Repository (R2R) initiative and organisations in both the USA and Australia. On behalf of the Geo-Seas consortium partners: NERC-BGS (United Kingdom), NERC-BODC (United Kingdom), NERC-NOCS (United Kingdom), MARIS (Netherlands), IFREMER (France), BRGM (France), TNO (Netherlands), BSH (Germany), IGME (Spain), LNEG (Portugal), GSI (Ireland), BGR (Germany), OGS (Italy), GEUS (Denmark), NGU (Norway), PGI (Poland), EGK (Estonia), NRC-IGG (Lithuania), IO-BAS (Bulgaria), NOA (Greece), CIRIA (United Kingdom), MUMM (Belgium), UB (Spain), UCC (Ireland), EU-Consult (Netherlands), CNRS (France), SHOM (France), CEFAS (United Kingdom), and LU (Latvia).
Dams and Intergovernmental Transfers
NASA Astrophysics Data System (ADS)
Bao, X.
2012-12-01
Gainers and losers are always associated with large-scale hydrological infrastructure construction, such as dams, canals and water treatment facilities. Since most of these projects are public services and public goods, some of their uneven impacts cannot be fully resolved by markets. This paper explores whether governments make any effort to balance the uneven distributional impacts caused by dam construction. It shows that dam construction brought an average 2% decrease in per capita tax revenue in upstream counties, a 30% increase in dam-location counties and an insignificant increase in downstream counties. Similar distributional impacts were observed for other outcome variables, such as rural income and agricultural crop yields, though the impacts differ across crops. The paper also found some balancing effort in inter-governmental transfers to reduce the unevenly distributed impacts caused by dam construction. However, overall the inter-governmental fiscal transfers were not large enough to fully correct those uneven distributions, as reflected in a 2% decrease of per capita GDP in upstream counties and increases of per capita GDP in local and downstream counties. This paper may shed light on governmental considerations in the decision-making process for large hydrological infrastructures.
ERIC Educational Resources Information Center
Tickles, Virginia C.; Li, Yadong; Walters, Wilbur L.
2013-01-01
Much criticism exists concerning a lack of focus on real-world problem-solving in science, technology, engineering and mathematics (STEM) infrastructures. Many of these critics say that current educational infrastructures are incapable of preparing future scientists and engineers to solve the complex and multidisciplinary problems this society…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... Water Infrastructure Project at Marine Corps Base Camp Pendleton, California AGENCY: Department of the... Environmental Policy Act (NEPA) of 1969, 42 United States Code (U.S.C.) Section 4332(2)(c), the regulations of the Council on Environmental Quality for Implementing the Procedural Provisions of NEPA (40 Code of...
Environmentally Responsible Aviation Project: Infrastructure Enhancements and New Capabilities
NASA Technical Reports Server (NTRS)
Bezos-OConnor, Gaudy M.
2015-01-01
This oral presentation highlights the technical investments made during FY10-FY14 by the NASA Environmentally Responsible Aviation (ERA) Project, under the Integrated Systems Research Program within ARMD, to upgrade and enhance NASA infrastructure and testing assets and to add the new capabilities required to mature the ERA N+2 portfolio of airframe and propulsion technologies to TRL 5/6.
76 FR 72449 - Notice of Buy American Waiver Under the American Recovery and Reinvestment Act of 2009
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-23
... manufactured goods used in and incorporated into a project funded through the Academic Research Infrastructure... iron, steel, and manufactured goods used in and incorporated into a project funded through the Academic... to be less than 5% of the total Recovery Act funds awarded under the Academic Research Infrastructure...
Implementation status of the extreme light infrastructure - nuclear physics (ELI-NP) project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gales, S., E-mail: sydney.gales@eli-np.ro; Zamfir, N. V., E-mail: sydney.gales@eli-np.ro
2015-02-24
The Project Extreme Light Infrastructure (ELI) is part of the European Strategic Forum for Research Infrastructures (ESFRI) Roadmap. ELI will be built as a network of three complementary pillars at the frontier of laser technologies. The ELI-NP pillar (NP for Nuclear Physics) is under construction near Bucharest (Romania) and will develop a scientific program using two 10 PW lasers and a Compton back-scattering high-brilliance and intense gamma beam, a marriage of laser and accelerator technology at the frontier of knowledge. In the present paper, the technical description of the facility, the present status of the project as well as the science, applications and future perspectives will be discussed.
Transforming Our Cities: High-Performance Green Infrastructure (WERF Report INFR1R11)
The objective of this project is to demonstrate that the highly distributed real-time control (DRTC) technologies for green infrastructure being developed by the research team can play a critical role in transforming our nation’s urban infrastructure. These technologies include a...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-22
... Heads of Executive Departments and Agencies Reliable, safe, and resilient infrastructure is the backbone... and agencies (agencies) have achieved better outcomes for communities and the environment and realized... major infrastructure projects by half, while also improving outcomes for communities and the environment...
Integral stormwater management master plan and design in an ecological community.
Che, Wu; Zhao, Yang; Yang, Zheng; Li, Junqi; Shi, Man
2014-09-01
Urban stormwater runoff in China discharges almost directly into bodies of water through gray infrastructure, such as sewers, impermeable ditches, and pump stations. As urban flooding, water shortage, and other environmental problems become serious, integrated water environment management is becoming increasingly complex and challenging. At more than 200 ha, the Oriental Sun City community is a large retirement community located on the eastern side of Beijing. At the beginning of its construction, the project faced a series of serious water environment crises such as eutrophication, flood risk, water shortage, and high maintenance costs. To address these issues, an integral stormwater management master plan was developed based on the concept of low impact development (LID). A large number of LID and green stormwater infrastructure (GSI) approaches were designed and applied in the community to completely replace traditional stormwater drainage systems. These approaches mainly included bioretention (which captured nearly the 85th-percentile volume of the annual runoff on the site, nearly 5.4×10^5 m^3 annually), swales (which functioned as a substitute for traditional stormwater pipes), waterscapes, and stormwater wetlands. Finally, a stormwater system plan was proposed by integrating the gray water system, landscape planning, an architectural master plan, and related consultations that supported the entire construction period. After more than 10 years of planning, designing, construction, and operation, Oriental Sun City has become one of the earliest modern large-scale LID communities in China. Moreover, the project not only addressed the crisis efficiently and effectively, but also yielded economic and ecological benefits. Copyright © 2014. Published by Elsevier B.V.
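The sizing logic behind LID facilities like the bioretention cells above is a simple depth-times-area water balance. The sketch below is a hedged back-of-envelope calculation; the design-storm depth and runoff coefficient are illustrative placeholders, not the Oriental Sun City design values (the abstract reports roughly 5.4×10^5 m^3 captured annually).

```python
# Back-of-envelope LID capture-volume sketch; all parameters illustrative.

def capture_volume_m3(area_ha: float, design_depth_mm: float,
                      runoff_coeff: float) -> float:
    """Runoff volume generated by one design storm over the catchment."""
    area_m2 = area_ha * 10_000        # hectares -> square metres
    depth_m = design_depth_mm / 1_000 # millimetres -> metres
    return area_m2 * depth_m * runoff_coeff

# e.g. a 200 ha site, a 25 mm design storm, composite runoff coefficient 0.5
v = capture_volume_m3(200, 25, 0.5)   # -> 25,000 m3 per design event
```

Summing such per-event volumes over a typical year of storms is how an annual capture figure like the one quoted in the abstract would be built up.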
Atlas2 Cloud: a framework for personal genome analysis in the cloud
2012-01-01
Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remains a serious bottleneck. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large-scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663
Atlas2 Cloud: a framework for personal genome analysis in the cloud.
Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli
2012-01-01
Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remains a serious bottleneck. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large-scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
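The cost-projection structure the Atlas2 abstracts describe (storage, compute, I/O terms) can be sketched as a simple additive model. All rates below are hypothetical placeholders, not AWS prices or figures from the Atlas2 study:

```python
# Illustrative cloud cost model with storage, compute and I/O terms.
# Rates are invented placeholders for the sketch, not real AWS pricing.

def monthly_cost(storage_gb: float, cpu_hours: float, io_requests: float,
                 storage_rate: float = 0.023,   # $ per GB-month (hypothetical)
                 cpu_rate: float = 0.096,       # $ per CPU-hour (hypothetical)
                 io_rate: float = 0.0004) -> float:  # $ per 1000 requests
    return (storage_gb * storage_rate
            + cpu_hours * cpu_rate
            + io_requests / 1000 * io_rate)

# e.g. 100 GB stored plus 10 CPU-hours of analysis, no I/O charges
cost = monthly_cost(100, 10, 0)   # -> 3.26 under these placeholder rates
```

Projecting to a large cohort is then a matter of scaling the three inputs per sample, which is exactly why the paper can report per-exome cost estimates.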
Damping characterization in large structures
NASA Technical Reports Server (NTRS)
Eke, Fidelis O.; Eke, Estelle M.
1991-01-01
This research project has as its main goal the development of methods for selecting the damping characteristics of components of a large structure or multibody system, in such a way as to produce some desired system damping characteristics. The main need for such an analytical device is in the simulation of the dynamics of multibody systems consisting, at least partially, of flexible components. The reason for this need is that all existing simulation codes for multibody systems require component-by-component characterization of complex systems, whereas requirements (including damping) often appear at the overall system level. The main goal was met in large part by the development of a method that will in fact synthesize component damping matrices from a given system damping matrix. The restrictions to the method are that the desired system damping matrix must be diagonal (which is almost always the case) and that interbody connections must be by simple hinges. In addition to the technical outcome, this project contributed positively to the educational and research infrastructure of Tuskegee University - a Historically Black Institution.
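The core inverse problem above (recovering damping in one set of coordinates from a desired diagonal system damping matrix) can be illustrated with the standard modal relation. This is a minimal numpy sketch of that relation only, not the paper's component-level synthesis algorithm, which additionally partitions the result among hinge-connected bodies:

```python
# Sketch: given a desired diagonal modal damping matrix D and an invertible
# mode-shape matrix Phi, recover a physical damping matrix C such that
# Phi.T @ C @ Phi = D. Matrices here are random illustrative data.
import numpy as np

rng = np.random.default_rng(1)
phi = rng.normal(size=(3, 3))        # mode shapes (assumed invertible)
D = np.diag([0.1, 0.4, 0.9])         # desired diagonal system damping

phi_inv = np.linalg.inv(phi)
C = phi_inv.T @ D @ phi_inv          # physical-coordinate damping matrix

# the modal projection recovers exactly the prescribed diagonal matrix
recovered = phi.T @ C @ phi
```

The restriction noted in the abstract, that the desired system damping matrix be diagonal, is what makes this inversion well posed mode by mode.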
Scaling the PuNDIT project for wide area deployments
NASA Astrophysics Data System (ADS)
McKee, Shawn; Batista, Jorge; Carcassi, Gabriele; Dovrolis, Constantine; Lee, Danny
2017-10-01
In today’s world of distributed scientific collaborations, there are many challenges to providing reliable inter-domain network infrastructure. Network operators use a combination of active monitoring and trouble tickets to detect problems, but these are often ineffective at identifying issues that impact wide-area network users. Additionally, these approaches do not scale to wide-area inter-domain networks due to the unavailability of data from all the domains along typical network paths. The Pythia Network Diagnostic InfrasTructure (PuNDIT) project aims to create a scalable infrastructure for automating the detection and localization of problems across these networks. The project goal is to gather and analyze metrics from existing perfSONAR monitoring infrastructures to identify the signatures of possible problems, locate affected network links, and report them to the user in an intuitive fashion. Simply put, PuNDIT seeks to convert complex network metrics into easily understood diagnoses in an automated manner. We present our progress in creating the PuNDIT system and our status in developing, testing and deploying it. We report on the project progress to date, describe the current implementation architecture and demonstrate some of the various user interfaces it will support. We close by discussing the remaining challenges, next steps and where we see the project going in the future.
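The "signature" idea PuNDIT describes (turning raw monitoring metrics into a plain diagnosis) can be illustrated with a toy detector. This is a simplified sketch, not the PuNDIT algorithm or the perfSONAR schema; the threshold values and field names are hypothetical:

```python
# Toy congestion-signature detector over one-way delay samples.
# Thresholds and semantics are illustrative, not PuNDIT's actual rules.

def flag_congestion(delays_ms, baseline_ms, factor=2.0, min_fraction=0.3):
    """Flag a window if enough samples exceed factor * baseline delay."""
    elevated = sum(1 for d in delays_ms if d > factor * baseline_ms)
    return elevated / len(delays_ms) >= min_fraction

# a window where ~half the samples sit far above the ~10 ms baseline
window = [10, 11, 35, 40, 12, 38, 41, 10]
congested = flag_congestion(window, baseline_ms=10)   # -> True
```

Running such a detector per link, then intersecting flagged windows across the paths that share a link, is the essence of localizing a problem from end-to-end measurements.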
Economic vulnerability to sea-level rise along the northern U.S. Gulf Coast
Thatcher, Cindy A.; Brock, John C.; Pendleton, Elizabeth A.
2013-01-01
The northern Gulf of Mexico coast of the United States has been identified as highly vulnerable to sea-level rise, based on a combination of physical and societal factors. Vulnerability of human populations and infrastructure to projected increases in sea level is a critical area of uncertainty for communities in the extremely low-lying and flat northern gulf coastal zone. A rapidly growing population along some parts of the northern Gulf of Mexico coastline is further increasing the potential societal and economic impacts of projected sea-level rise in the region, where observed relative rise rates range from 0.75 to 9.95 mm per year on the Gulf coasts of Texas, Louisiana, Mississippi, Alabama, and Florida. A 1-m elevation threshold was chosen as an inclusive designation of the coastal zone vulnerable to relative sea-level rise, because of uncertainty associated with sea-level rise projections. This study applies a Coastal Economic Vulnerability Index (CEVI) to the northern Gulf of Mexico region, which includes both physical and economic factors that contribute to societal risk of impacts from rising sea level. The economic variables incorporated in the CEVI include human population, urban land cover, economic value of key types of infrastructure, and residential and commercial building values. The variables are standardized and combined to produce a quantitative index value for each 1-km coastal segment, highlighting areas where human populations and the built environment are most at risk. This information can be used by coastal managers as they allocate limited resources for ecosystem restoration, beach nourishment, and coastal-protection infrastructure. The study indicates a large amount of variability in index values along the northern Gulf of Mexico coastline, and highlights areas where long-term planning to enhance resiliency is particularly needed.
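The index construction described above, standardizing each economic variable and combining them into one value per 1-km coastal segment, can be sketched as follows. The variable names and the equal-weight averaging rule are illustrative assumptions, not the study's actual formula.

```python
def standardize(values):
    """Min-max scale a list of raw values to the range [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def cevi(segments):
    """Combine standardized variables into one index per coastal segment.
    Equal-weight averaging is an illustrative choice; the paper's actual
    combination rule may differ."""
    keys = list(segments[0])
    cols = {k: standardize([s[k] for s in segments]) for k in keys}
    return [sum(cols[k][i] for k in keys) / len(keys)
            for i in range(len(segments))]

# Hypothetical 1-km segments with the CEVI's economic variables
segments = [
    {"population": 1200, "urban_cover": 0.8, "infra_value": 5.0e6, "building_value": 9.0e6},
    {"population": 300,  "urban_cover": 0.2, "infra_value": 1.0e6, "building_value": 2.0e6},
    {"population": 50,   "urban_cover": 0.0, "infra_value": 0.2e6, "building_value": 0.5e6},
]
index = cevi(segments)  # the highest value flags the most at-risk segment
```

Min-max standardization keeps every variable on a common scale before combining, so no single variable dominates simply because of its units.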
Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround
ERIC Educational Resources Information Center
Peurach, Donald J.; Neumerski, Christine M.
2015-01-01
The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…
Sustainability Considerations for Health Research and Analytic Data Infrastructures
Wilcox, Adam; Randhawa, Gurvaneet; Embi, Peter; Cao, Hui; Kuperman, Gilad J.
2014-01-01
Introduction: The United States has made recent large investments in creating data infrastructures to support the important goals of patient-centered outcomes research (PCOR) and comparative effectiveness research (CER), with still more investment planned. These initial investments, while critical to the creation of the infrastructures, are not expected to sustain them much beyond the initial development. To provide the maximum benefit, the infrastructures need to be sustained through innovative financing models while providing value to PCOR and CER researchers. Sustainability Factors: Based on our experience with creating flexible sustainability strategies (i.e., strategies that are adaptive to the different characteristics and opportunities of a resource or infrastructure), we define specific factors that are important considerations in developing a sustainability strategy. These factors include assets, expansion, complexity, and stakeholders. Each factor is described, with examples of how it is applied. These factors are dimensions of variation in different resources, to which a sustainability strategy should adapt. Summary Observations: We also identify specific important considerations for maintaining an infrastructure, so that the long-term intended benefits can be realized. These observations are presented as lessons learned, to be applied to other sustainability efforts. We define the lessons learned, relating them to the defined sustainability factors as interactions between factors. Conclusion and Next Steps: Using perspectives and experiences from a diverse group of experts, we define broad characteristics of sustainability strategies and important observations, which can vary for different projects. 
Other descriptions of adaptive, flexible, and successful models of collaboration between stakeholders and data infrastructures can expand this framework by identifying other factors for sustainability, and give more concrete directions on how sustainability can be best achieved. PMID:25848610
van der Straeten, Jonas; Hasenöhrl, Ute
2016-12-01
In the academic debate on infrastructures in the Global South, there is a broad consensus that (post)colonial legacies present a major challenge for a transition towards more inclusive, sustainable and adapted modes of providing services. Yet, relatively little is known about the emergence and evolution of infrastructures in former colonies. Until a decade ago, most historical studies followed Daniel Headrick's (1981) "tools of empire" thesis, painting, with broad brush strokes, a picture of infrastructures as instruments for advancing the colonial project of exploitation and subordination of non-European peoples and environments. This paper explores new research perspectives beyond this straightforward, 'diffusionist' perspective on technology transfer. In order to do so, it presents and discusses more recent studies which focus on interactive transfer processes as well as mechanisms of appropriation, and which increasingly combine approaches from imperial history, environmental history, and the history of technology. There is much to gain from unpacking the changing motives and ideologies behind technology transfer; tracing the often contested and negotiated flows of ideas, technologies and knowledge within multilayered global networks; investigating the manifold ways in which infrastructures reflected and (re)produced colonial spaces and identities; critically reflecting on the utility of large (socio)technical systems (LTS) for the Global South; and approaching infrastructures in the (post)colonial world through entangled histories of technology and the environment. Following David Arnold's (2005) plea for a "more interactive, culturally-nuanced, multi-sited debate" on technology in the non-Western world, the paper offers fresh insights for a broader debate about how infrastructures work within specific parameters of time, place and culture.
ERIC Educational Resources Information Center
Chipley, Michael; Lyon, Wesley; Smilowitz, Robert; Williams, Pax; Arnold, Christopher; Blewett, William; Hazen, Lee; Krimgold, Fred
2012-01-01
This publication, part of the new Building and Infrastructure Protection Series (BIPS) published by the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) Infrastructure Protection and Disaster Management Division (IDD), serves to advance high performance and integrated design for buildings and infrastructure. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, G.
This presentation discusses the differences between the original Vehicle and Infrastructure Cash-Flow Evaluation (VICE) Model and the revamped version, VICE 2.0. The enhanced tool can now help assess projects to acquire vehicles and infrastructure, or to acquire vehicles only.
South Africa's School Infrastructure Performance Indicator System
ERIC Educational Resources Information Center
Gibberd, Jeremy
2007-01-01
While some South African schools have excellent infrastructure, others lack basic services such as water and sanitation. This article describes the school infrastructure performance indicator system (SIPIS) in South Africa. The project offers an approach that can address both the urgent provision of basic services as well as support the…
Evolution of precipitation extremes in two large ensembles of climate simulations
NASA Astrophysics Data System (ADS)
Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard
2017-04-01
Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual maxima daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8° x 2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1° x 1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles results in 1,500 (30 years x 50 members) and 1,200 (30 years x 40 members) simulated years, respectively, over both the historical and future periods. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation with return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points and that these increases will also affect the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for the service levels of water infrastructures and for public safety. Estimated increases in precipitation associated with very extreme events (e.g., 100-year return periods) will drastically change the likelihood and extent of flooding in a future climate. These results, although interesting, need to be extended to sub-daily durations, relevant for urban flood protection and urban infrastructure design (e.g., sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.
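The pooled-ensemble approach above, estimating, say, a 100-year daily precipitation quantile empirically from 1,500 simulated years, can be sketched with a simple plotting-position estimator. The Weibull plotting position and the synthetic gamma-distributed maxima are illustrative assumptions, not the study's actual method.

```python
import random

def empirical_return_level(annual_maxima, T):
    """Empirical T-year return level: the value exceeded on average once
    every T years, i.e. the (1 - 1/T) quantile of the pooled annual-maxima
    sample, using the Weibull plotting position i / (n + 1)."""
    xs = sorted(annual_maxima)
    n = len(xs)
    p = 1.0 - 1.0 / T          # target non-exceedance probability
    h = p * (n + 1) - 1        # 0-based fractional rank
    if h <= 0:
        return xs[0]
    if h >= n - 1:
        return xs[-1]
    i = int(h)
    # linear interpolation between adjacent order statistics
    return xs[i] + (h - i) * (xs[i + 1] - xs[i])

# Pool 30 years x 50 members = 1,500 simulated annual maxima (synthetic
# gamma-distributed values stand in for model output here).
random.seed(42)
pooled = [random.gammavariate(2.0, 20.0) for _ in range(30 * 50)]
rl2 = empirical_return_level(pooled, 2)      # 2-year return level
rl100 = empirical_return_level(pooled, 100)  # 100-year return level
```

Pooling members is what makes the empirical 100-year estimate feasible: a single 30-year series cannot resolve a 1-in-100-year quantile without fitting a parametric extreme-value distribution.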
WDM-PON Architecture for FTTx Networks
NASA Astrophysics Data System (ADS)
Iannone, E.; Franco, P.; Santoni, S.
Broadband services for residential users in European countries have until now largely relied on xDSL technologies, while FTTx technologies have been mainly exploited in Asia and North America. The increasing bandwidth demand and the growing penetration of new services are pushing the deployment of optical access networks, and major European operators are now announcing FTTx projects. While FTTH is recognized as the target solution to bring broadband services to residential users, the identification of an FTTx evolutionary path able to seamlessly migrate to FTTH is key to enabling a massive deployment, easing the huge investments needed. WDM-PON architecture is an interesting solution that is able to accommodate the strategic need of building a new fiber-based access infrastructure with the possibility of adapting investments to actual demands and evolving to FTTH without requiring further interventions on fiber infrastructures.
2013-01-01
Analyzing and storing data and results from next-generation sequencing (NGS) experiments is a challenging task, hampered by ever-increasing data volumes and frequent updates of analysis methods and tools. Storage and computation have grown beyond the capacity of personal computers and there is a need for suitable e-infrastructures for processing. Here we describe UPPNEX, an implementation of such an infrastructure, tailored to the needs of data storage and analysis of NGS data in Sweden serving various labs and multiple instruments from the major sequencing technology platforms. UPPNEX comprises resources for high-performance computing, large-scale and high-availability storage, an extensive bioinformatics software suite, up-to-date reference genomes and annotations, a support function with system and application experts as well as a web portal and support ticket system. UPPNEX applications are numerous and diverse, and include whole genome-, de novo- and exome sequencing, targeted resequencing, SNP discovery, RNASeq, and methylation analysis. There are over 300 projects that utilize UPPNEX and include large undertakings such as the sequencing of the flycatcher and Norwegian spruce. We describe the strategic decisions made when investing in hardware, setting up maintenance and support, allocating resources, and illustrate major challenges such as managing data growth. We conclude with summarizing our experiences and observations with UPPNEX to date, providing insights into the successful and less successful decisions made. PMID:23800020
Automatic publishing ISO 19115 metadata with PanMetaDocs using SensorML information
NASA Astrophysics Data System (ADS)
Stender, Vivien; Ulbricht, Damian; Schroeder, Matthias; Klump, Jens
2014-05-01
Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany from the North German lowlands to the Bavarian Alps and is operated by six research centers of the Helmholtz Association. The contribution by the participating research centers is organized as regional observatories. A challenge for TERENO and its observatories is to integrate all aspects of data management, data workflows, data modeling and visualizations into the design of a monitoring infrastructure. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences (GFZ) in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved and on the availability of the captured data. Data discovery and dissemination are facilitated not only through data portals of the regional TERENO observatories but also through a common spatial data infrastructure TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data, provided by the different web services of the single observatories, and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes data available through standard web services. Geographic sensor information and services are described using the ISO 19115 metadata schema. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also published data through DataCite. 
The necessary metadata are created in an automated process by extracting information from the SWE SensorML to create ISO 19115 compliant metadata. The resulting metadata file is stored in the GFZ Potsdam data infrastructure. The publishing workflow for file based research datasets at GFZ Potsdam is based on the eSciDoc infrastructure, using PanMetaDocs (PMD) as the graphical user interface. PMD is a collaborative, metadata based data and information exchange platform [1]. Besides SWE, metadata are also syndicated by PMD through an OAI-PMH interface. In addition, metadata from other observatories, projects or sensors in TERENO can be accessed through the TERENO Northeast data portal. [1] http://meetingorganizer.copernicus.org/EGU2012/EGU2012-7058-2.pdf
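The automated SensorML-to-ISO 19115 step might look roughly like the following sketch. The element names on both sides are heavily abbreviated illustrations, not the real OGC SensorML or ISO 19115 schemas.

```python
import xml.etree.ElementTree as ET

# Minimal SensorML-like fragment; element names are illustrative and
# omit the namespaces and structure of the full OGC SensorML schema.
SENSORML = """<System>
  <identification><Term name="longName">Soil moisture station TERENO-NE-01</Term></identification>
  <position><Point><coordinates>13.05 53.02</coordinates></Point></position>
  <contact><organizationName>GFZ Potsdam</organizationName></contact>
</System>"""

def sensorml_to_iso19115(xml_text):
    """Extract title, point coordinates and organisation from a
    SensorML-like record and emit a skeletal ISO 19115-style metadata
    element (tag names abbreviated for illustration)."""
    src = ET.fromstring(xml_text)
    md = ET.Element("MD_Metadata")
    ET.SubElement(md, "title").text = src.findtext(".//Term")
    ET.SubElement(md, "extent").text = src.findtext(".//coordinates")
    ET.SubElement(md, "organisation").text = src.findtext(".//organizationName")
    return ET.tostring(md, encoding="unicode")

iso = sensorml_to_iso19115(SENSORML)
```

A real pipeline would map many more fields (keywords, temporal extent, contacts with roles) and validate the result against the ISO 19115/19139 schema before storing it.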
Neaimeh, Myriam; Salisbury, Shawn D.; Hill, Graeme A.; ...
2017-06-27
An appropriate charging infrastructure is one of the key aspects needed to support the mass adoption of battery electric vehicles (BEVs), and it is suggested that publicly available fast chargers could play a key role in this infrastructure. As fast charging is a relatively new technology, very little research has been conducted on the topic using real-world datasets, and it is of utmost importance to measure actual usage of this technology and provide evidence on its importance to properly inform infrastructure planning. 90,000 fast charge events collected from the first large-scale roll-outs and evaluation projects of fast charging infrastructure in the UK and the US and 12,700 driving days collected from 35 BEVs in the UK were analysed. Using multiple regression analysis, we examined the relationship between daily driving distance and standard and fast charging and demonstrated that fast chargers are more influential. Fast chargers enabled using BEVs on journeys above their single-charge range that would have been impractical using standard chargers. Fast chargers could help overcome perceived and actual range barriers, making BEVs more attractive to future users. At the current BEV market share, there is a vital need for policy support to accelerate the development of fast charge networks.
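The multiple-regression step, relating daily driving distance to counts of standard and fast charge events, can be sketched on synthetic data. The variable names and coefficients below are invented for illustration; the paper's model specification may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # synthetic driving days (illustrative, not the paper's data)
std_events = rng.poisson(1.0, n)   # standard-charge events per day
fast_events = rng.poisson(0.3, n)  # fast-charge events per day

# Assumed relationship for the sketch: a fast charge extends daily
# range more than a standard charge (coefficients are made up).
distance = 60 + 25 * std_events + 90 * fast_events + rng.normal(0, 10, n)

# Ordinary least squares: distance ~ intercept + std + fast
X = np.column_stack([np.ones(n), std_events, fast_events])
beta, *_ = np.linalg.lstsq(X, distance, rcond=None)
intercept, b_std, b_fast = beta
```

Comparing the fitted coefficients (here `b_fast` versus `b_std`) is one way to quantify the claim that fast chargers are "more influential" on daily driving distance.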
Robustness and Recovery of Lifeline Infrastructure and Ecosystem Networks
NASA Astrophysics Data System (ADS)
Bhatia, U.; Ganguly, A. R.
2015-12-01
Disruptive events, both natural and man-made, can have widespread impacts on natural systems and lifeline infrastructure networks, leading to the loss of biodiversity and essential functionality, respectively. Projected sea-level rise and climate change can further increase the frequency and severity of large-scale floods in urban-coastal megacities. Moreover, failures in infrastructure systems can trigger cascading impacts on dependent ecosystems, and vice versa. An important consideration in the behavior of isolated and inter-connected networks following disruptive events is their resilience, or the ability of the network to "bounce back" to a pre-disaster state. Conventional risk analysis and subsequent risk management frameworks have focused on identifying components' vulnerability and strengthening isolated components to withstand these disruptions. But the high interconnectedness of these systems, and the evolving nature of hazards, particularly in the context of climate extremes, make component-level analysis unrealistic. In this study, we discuss a complex network-based resilience framework to understand fragility and recovery strategies for infrastructure systems impacted by climate-related hazards. We extend the proposed framework to assess the response of ecological networks to multiple species loss and design a restoration management framework to identify the most efficient restoration sequence of species, which can potentially lead to disproportionate gains in biodiversity.
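A minimal sketch of the network-resilience idea, measuring how the giant connected component shrinks as nodes are removed under a targeted attack, is given below; the toy graph, the degree-based removal order and the robustness metric are illustrative assumptions, not the study's framework.

```python
from collections import deque

def largest_component(adj, removed=frozenset()):
    """Size of the largest connected component of an undirected graph
    (adjacency dict), ignoring removed nodes."""
    seen, best = set(), 0
    for start in adj:
        if start in seen or start in removed:
            continue
        q, comp = deque([start]), 0
        seen.add(start)
        while q:
            u = q.popleft()
            comp += 1
            for v in adj[u]:
                if v not in seen and v not in removed:
                    seen.add(v)
                    q.append(v)
        best = max(best, comp)
    return best

def robustness_curve(adj):
    """Fraction of nodes in the giant component as highest-degree nodes
    are removed one by one (a targeted-attack scenario)."""
    order = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
    removed, curve, n = set(), [], len(adj)
    for u in order:
        curve.append(largest_component(adj, frozenset(removed)) / n)
        removed.add(u)
    return curve

# A star network: removing the single hub fragments the whole system,
# a toy analogue of cascading lifeline-infrastructure failure.
curve = robustness_curve({0: [1, 2, 3], 1: [0], 2: [0], 3: [0]})
```

The shape of the curve, how quickly the giant component collapses, is a common way to compare network fragility under different removal (or restoration) orderings.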
Embracing Diversity: The Exploration of User Motivations in Citizen Science Astronomy Projects
NASA Astrophysics Data System (ADS)
Lee, Lo
2018-06-01
Online citizen science projects ask members of the public to donate spare time on their personal computers to process large datasets. A critical challenge for these projects is volunteer recruitment and retention. Many of these projects use the Berkeley Open Infrastructure for Network Computing (BOINC), a piece of middleware, to support their operations. This poster analyzes volunteer motivations in two large, BOINC-based astronomy projects, Einstein@Home and Milkyway@Home. Volunteer opinions are examined to assess whether and how competitive elements, such as credit and ranking systems, motivate volunteers. Findings from a study of project volunteers, comprising surveys (n=2,031) and follow-up interviews (n=21), show that altruism is the main incentive for participation, because volunteers consider scientific research to be critical for humans. Multiple interviewees also revealed extrinsic motivations, i.e. those that involve recognition from other people, such as opportunities to become co-authors of publications or to earn financial benefits. Credit and ranking systems motivate nearly half of the interviewees. By analyzing user motivations in astronomical BOINC projects, this research provides scientists with a deeper understanding of volunteer communities and the various types of volunteers. Building on these findings, scientists can develop different strategies, for example awarding volunteers badges, to recruit and retain diverse volunteers and thus enhance long-term user participation in astronomical BOINC projects.
Code of Federal Regulations, 2011 CFR
2011-10-01
Code of Federal Regulations excerpt (Public Lands, Rural Water Supply Program), § 404.9: What types of infrastructure and facilities may be included in an eligible rural water supply project? ... related facilities required for the rural water supply project; (f) Equipment and management tools for ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finnell, Joshua Eugene; Klein, Martin; Cain, Brian J.
2017-05-09
The proposal is to provide institutional infrastructure that facilitates the management of research projects, research collaboration, and the management, preservation, and discovery of data. Deploying such infrastructure will amplify the effectiveness, efficiency, and impact of research, and will assist researchers with compliance with both data management mandates and LANL security policy. It will also facilitate the discoverability of LANL research both within the lab and external to LANL.
Increasing precipitation volatility in twenty-first-century California
NASA Astrophysics Data System (ADS)
Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex
2018-05-01
Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood, of which California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's 'Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.
The management of large cabling campaigns during the Long Shutdown 1 of LHC
NASA Astrophysics Data System (ADS)
Meroli, S.; Machado, S.; Formenti, F.; Frans, M.; Guillaume, J. C.; Ricci, D.
2014-03-01
The Large Hadron Collider at CERN entered its first 18-month-long shutdown period in February 2013. During this period the entire CERN accelerator complex will undergo major consolidation and upgrade works, preparing the machines for LHC operation at nominal energy (7 TeV/beam). One of the most challenging activities concerns the cabling infrastructure (copper and optical-fibre cables) serving the CERN data acquisition, networking and control systems. About 1000 kilometres of cables, distributed across different machine areas, will be installed, representing an investment of about 15 MCHF. This implies an extraordinary challenge in terms of project management, including resource and activity planning, work execution and quality control. The preparation phase of this project started well before its implementation, by defining technical solutions and setting financial plans for staff recruitment and material supply. Enhanced task coordination was further implemented by deploying selected competences to form a central support team.
Forbes, Barbara; Kepe, Thembela
2015-02-01
Agriculture's large share of Tanzanian GDP and the large percentage of rural poor engaged in the sector make it a focus for many development projects that see it as an area of attention for reducing rural poverty. This paper uses a case of the Kamachumu community, where a dairy cow loan project was implemented using the heifer-in-trust (HIT) model. This study finds that productivity is limited by how the cows are being managed, particularly with many animals not having ad lib access to drinking water. The paper explores reasons why farmers do or do not provide their cows with unlimited access to drinking water. The study concludes that there are many barriers farmers face, including water accessibility, education and training, infrastructure, simple negligence, and security. These results suggest an increase in extension services and national and local livestock policies that consider the specific realities of small-scale dairy farmers.
From ATLASGAL to SEDIGISM: Towards a Complete 3D View of the Dense Galactic Interstellar Medium
NASA Astrophysics Data System (ADS)
Schuller, F.; Urquhart, J.; Bronfman, L.; Csengeri, T.; Bontemps, S.; Duarte-Cabral, A.; Giannetti, A.; Ginsburg, A.; Henning, T.; Immer, K.; Leurini, S.; Mattern, M.; Menten, K.; Molinari, S.; Muller, E.; Sánchez-Monge, A.; Schisano, E.; Suri, S.; Testi, L.; Wang, K.; Wyrowski, F.; Zavagno, A.
2016-09-01
The ATLASGAL survey has provided the first unbiased view of the inner Galactic Plane at sub-millimetre wavelengths. This is the largest ground-based survey of its kind to date, covering 420 square degrees at a wavelength of 870 µm. The reduced data, consisting of images and a catalogue of more than 10^4 compact sources, are available from the ESO Science Archive Facility through the Phase 3 infrastructure. The extremely rich statistics of this survey initiated several follow-up projects, including spectroscopic observations to explore molecular complexity and high-angular-resolution imaging with the Atacama Large Millimeter/submillimeter Array (ALMA), aimed at resolving individual protostars. The most extensive follow-up project is SEDIGISM, a 3D mapping of the dense interstellar medium over a large fraction of the inner Galaxy. Some notable results of these surveys are highlighted.
Spaceport Command and Control System Automation Testing
NASA Technical Reports Server (NTRS)
Plano, Tom
2017-01-01
The goal of automated testing is to create and maintain a cohesive infrastructure of robust tests that could be run independently on a software package in its entirety. To that end, the Spaceport Command and Control System (SCCS) project at the National Aeronautics and Space Administration's (NASA) Kennedy Space Center (KSC) has brought in a large group of interns to work side-by-side with full time employees to do just this work. Thus, our job is to implement the tests that will put SCCS through its paces.
The Current State of Data Transmission Channels from Pushchino to Moscow and Perspectives
NASA Astrophysics Data System (ADS)
Dumsky, D. V.; Isaev, E. A.; Samodurov, V. A.; Shatskaya, M. V.
Since the operation of the unique space radio telescope in the international VLBI project "Radioastron" has been extended to 2017, the transmission and storage of the large volumes of scientific and telemetry data obtained during the experiments remains a pressing task. The project is carried out by the Astro Space Center of the Lebedev Physical Institute in Moscow, Russia. It requires us to keep in operating condition the high-speed link that merges the buffer data center in Pushchino and the scientific information center in Moscow into a single LAN. The monitoring system for the channel equipment and storage systems also remains relevant, along with timely hardware replacement, software upgrades, backups, and documentation of the network infrastructure.
NASA Astrophysics Data System (ADS)
D'Addezio, Giuliana; Marsili, Antonella; Beranzoli, Laura
2017-04-01
ENVRIplus is a Horizon 2020 project bringing together Environmental and Earth System Research Infrastructures, projects and networks with technical specialist partners to create a more coherent, interdisciplinary and interoperable cluster of environmental research. One of the aims of this project is to disseminate knowledge on environmental topics, focusing attention on European secondary schools. We designed actions for an e-Training Platform for the multimedia education of secondary-school teachers and students. The purpose is to support teacher training, and consequently student training, on selected scientific themes addressed within the ENVRIplus Research Infrastructures. In particular, we address major thematic research areas and challenges concerning biodiversity and ecosystem services, the greenhouse effect and Earth warming, ocean acidification, and environmental sustainability. To realize the training platform, we started with a detailed study and analysis of the teaching and multimedia information materials already available. We plan an appealing and usable portal/digital repository to stimulate the learning of STEM topics, which will also include opportunities to develop original content. To better design these actions and capture teachers' needs, we prepared a questionnaire to be administered to a large international school audience to collect input directly from potential users. The first part focuses on objective information about the formal, quantitative and qualitative position of science classes in schools and the content and methods of teaching in different countries. The second part investigates subjective teacher experiences, views and proposals on what could improve the training offer for environmental science lessons and courses.
Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese
2013-05-01
The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. 
Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Global sand trade is paving the way for a tragedy of the sand commons
NASA Astrophysics Data System (ADS)
Torres, A.; Brandt, J.; Lear, K.; Liu, J.
2016-12-01
In the first 40 years of the 21st century, planet Earth is highly likely to experience more urban land expansion than in all of history, an increase in transportation infrastructure by more than a third, and a great variety of land reclamation projects. While scientists are beginning to quantify the deep imprint of human infrastructure on biodiversity at large scales, its off-site impacts and linkages to sand mining and trade have been largely ignored. Sand is the most widely used building material in the world. With an ever-increasing demand for this resource, sand is being extracted at rates that far exceed its replenishment, and is becoming increasingly scarce. This has already led to conflicts around the world and will likely lead to a "tragedy of the sand commons" if sustainable sand mining and trade cannot be achieved. We investigate the environmental and socioeconomic interactions over large distances (telecouplings) of infrastructure development and sand mining and trade across diverse systems through transdisciplinary research and the recently proposed telecoupling framework. Our research is generating a thorough understanding of the telecouplings driven by an increasing demand for sand. In particular, we address three main research questions: 1) Where are the conflicts related to sand mining occurring?; 2) What are the major "sending" and "receiving" systems of sand?; and 3) What are the main components (e.g. causes, effects, agents, etc.) of telecoupled systems involving sand mining and trade? Our results highlight the role of global sand trade as a driver of environmental degradation that threatens the integrity of natural systems and their capacity to deliver key ecosystem services. In addition, infrastructure development and sand mining and trade have important implications for other sustainability challenges such as over-fishing and global warming. 
This knowledge will help to identify opportunities and tools to better promote a more sustainable use of sand, ultimately helping avoid a "tragedy of the sand commons".
Segal, Courtney; Holve, Erin
2014-11-01
The Recovery Act provided a substantial, one-time investment in data infrastructure for comparative effectiveness research (CER). A review of the publications, data, and tools developed as a result of this support has informed understanding of the level of effort undertaken by these projects. Structured search queries, as well as outreach efforts, were conducted to identify and review resources from American Recovery and Reinvestment Act of 2009 CER projects building electronic clinical data infrastructure. The findings from this study reveal a spectrum of productivity across a range of topics and settings. A total of 451 manuscripts published in 192 journals, and 141 data resources and tools, were identified; these address gaps in evidence on priority populations and conditions and in the infrastructure needed to support CER.
A service-based BLAST command tool supported by cloud infrastructures.
Carrión, Abel; Blanquer, Ignacio; Hernández, Vicente
2012-01-01
Notwithstanding the benefits of distributed-computing infrastructures for empowering bioinformatics analysis tools with the needed computing and storage capability, the actual use of these infrastructures is still low. Learning curves and deployment difficulties have reduced their impact on the wider research community. This article presents a porting strategy for BLAST based on a multiplatform client and a service that provides the same interface as sequential BLAST, thus reducing the learning curve and minimizing the impact on integration into existing workflows. The porting has been done using the execution and data access components from the EC project Venus-C and the Windows Azure infrastructure provided in this project. The results obtained demonstrate a low overhead on the global execution framework and reasonable speed-up and cost-efficiency with respect to the sequential version.
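The key design point above is interface preservation: an existing sequential BLAST invocation should work unchanged against the service-backed client. A minimal sketch of what that looks like from the user's side, using standard NCBI BLAST+ style arguments (the filenames and database name below are illustrative assumptions, and the Venus-C service configuration is not shown):

```python
# Sketch: a service-backed client that keeps the sequential BLAST command-line
# interface only needs to parse the same argument vector the user already has.
# Filenames and the database name are illustrative, not from the article.
import shlex

# A typical sequential blastn call (NCBI BLAST+ style arguments):
blast_cmd = "blastn -query sequences.fasta -db nt -evalue 1e-5 -outfmt 6 -out hits.tsv"

# The multiplatform client would accept the same argument vector unchanged
# and forward it to the remote execution service:
argv = shlex.split(blast_cmd)
options = dict(zip(argv[1::2], argv[2::2]))  # flag -> value pairs
```

Because the command line is identical to the sequential tool's, existing pipeline scripts need only swap the executable, which is what keeps the integration cost on existing workflows minimal.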
NASA Astrophysics Data System (ADS)
Carsughi, Flavio; Fonseca, Luis
2017-06-01
NFFA-EUROPE is a European open-access resource for experimental and theoretical nanoscience, and sets out a platform to carry out comprehensive projects for multidisciplinary research at the nanoscale, extending from synthesis to nanocharacterization to theory and numerical simulation. Advanced infrastructures specialized in growth, nano-lithography, nano-characterization, theory and simulation, and fine analysis with synchrotron, FEL and neutron radiation sources are integrated in a multi-site combination to develop frontier research on methods for reproducible nanoscience and to enable European and international researchers from diverse disciplines to carry out advanced proposals impacting science and innovation. NFFA-EUROPE will enable coordinated access to infrastructures covering different aspects of nanoscience research that is not currently available at single specialized facilities, without duplicating their specific scopes. Approved user projects will have access to the best-suited instruments and support competences for performing the research, including access to analytical large-scale facilities, theory and simulation, and high-performance computing facilities. Access is offered free of charge to European users, who will receive a financial contribution towards their travel, accommodation and subsistence costs. User access will cover several "installations" and will be coordinated through a single-entry-point portal that will activate an advanced user-infrastructure dialogue to build up a personalized access programme with an increasing return on science and innovation production. NFFA-EUROPE's own research activity will address key bottlenecks of nanoscience research: nanostructure traceability, protocol reproducibility, in-operando nano-manipulation and analysis, and open data.
Climate simulations and services on HPC, Cloud and Grid infrastructures
NASA Astrophysics Data System (ADS)
Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio
2017-04-01
Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially the climate community. These paradigms are modifying the way climate applications are executed. By using these technologies, the number, variety and complexity of experiments and resources are increasing substantially. But although computational capacity is increasing, the traditional applications and tools used by the community are no longer adequate to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To address those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework to manage a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the 7FP of the European Commission (grant agreement no. 312979); the European Regional Development Fund (ERDF); and the Programa de Personal Investigador en Formación Predoctoral from Universidad de Cantabria and the Government of Cantabria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgartner, Lee J.; Daniel Deng, Z.; Thorncraft, Garry
2014-01-01
Tropical rivers have high annual discharges optimal for hydropower and irrigation development. The Mekong River is one of the largest tropical river systems, supporting a unique mega-diverse fish community. Fish are an important commodity in the Mekong, contributing a large proportion of calcium, protein, and essential nutrients to the diet of the local people and providing a critical source of income for rural households. Many of these fish migrate not only upstream and downstream within main-channel habitats but also laterally into highly productive floodplain habitat to both feed and spawn. Most work to date has focused on providing for upstream fish passage, but downstream movement is an equally important process to protect. Expansion of hydropower and irrigation weirs can disrupt downstream migrations and it is important to ensure that passage through regulators or mini hydro systems is not harmful or fatal. Many new infrastructure projects (<6 m head) are proposed for the thousands of tributary streams throughout the Lower Mekong Basin and it is important that designs incorporate the best available science to protect downstream migrants. Recent advances in technology have provided new techniques which could be applied to Mekong fish species to obtain design criteria that can facilitate safe downstream passage. Obtaining and applying this knowledge to new infrastructure projects is essential in order to produce outcomes that are more favorable to local ecosystems and fisheries.
Mind the Gap: furthering the development of EU-US collaboration in marine geoscience.
NASA Astrophysics Data System (ADS)
Glaves, H.; Miller, S.; Schaap, D.
2012-04-01
There is a large and ever-increasing amount of marine geological and geophysical data available throughout Europe, the USA and beyond. The challenges associated with the acquisition of this data mean that the cost of collecting it is very high, and there is therefore a need to maximise its potential re-use wherever possible. Facilitating this is becoming an increasingly important aspect of marine geoscience data management, as the need for marine data increases at a time when the financial resources for data acquisition are being dramatically reduced. A significant barrier to the re-use of marine geoscience data is the variety of different formats, standards, vocabularies, etc. which have been used by the various organisations engaged in the collection and management of marine geoscience data at regional, national and international scales. This is also proving to be a barrier to the development of interoperability with other data types at a time when there is a need for a more holistic approach to marine research. These challenges are currently being addressed within Europe by a number of EU-funded initiatives, the objectives of which are to improve the discovery of and access to marine data. The Geo-Seas project is one of these initiatives, the focus of which is the development of an e-infrastructure for the delivery of standardised marine geological and geophysical data across Europe. The project is developing this e-infrastructure by adopting and adapting the methodologies of the SeaDataNet project, which currently provides an e-infrastructure for the management of oceanographic data. This re-use of existing technologies has led to the development of a joint multidisciplinary e-infrastructure for the delivery of both geoscientific and oceanographic data.
In order to expand these initiatives further and bridge the gap between these European projects and those being undertaken by colleagues in the US and elsewhere, a number of collaborative relationships have been developed. To further these growing relationships, a new EU initiative has recently been proposed, in parallel with the relevant funding agencies in the USA and Australia, with the objective of developing common standards and methodologies that will allow a common multidisciplinary approach to marine science on an international scale.
Uncertainties in discharge projections in consequence of climate change
NASA Astrophysics Data System (ADS)
Liebert, J.; Düthmann, D.; Berg, P.; Feldmann, H.; Ihringer, J.; Kunstmann, H.; Merz, B.; Ott, I.; Schädler, G.; Wagner, S.
2012-04-01
The fourth assessment report of the IPCC summarizes possible effects of global climate change. For Europe, an increasing variability of temperature and precipitation is expected. While the temperature increase is projected almost uniformly across Europe, for precipitation the models indicate partly heterogeneous tendencies. In order to maintain current safety standards in the infrastructure of our various water management systems, possible future flood discharges are very often a central question. In the planning and operation of water infrastructure systems, uncertainty considerations play an important role. In times of climate change, analyses of measured historical gauge data (normally 30-80 years) are not sufficient, because even significant trends are only valid within the analyzed time period and extrapolations are exceedingly difficult. Therefore, combined climate and hydrological modeling for scenario-based projections is becoming more and more popular. Given that adaptation measures in water infrastructure are in general very time-consuming and cost-intensive, qualified questions about the variability and uncertainty of model-based results are important as well. The CEDIM project "Flood hazards in a changing climate" focuses on both: future changes in flood discharge, and assessing the uncertainties involved in such model-based future predictions. In detail, the study is based on an ensemble of hydrological model (HM) simulations in three representative small to medium-sized German river catchments (Ammer, Mulde and Ruhr). The meteorological input is based on two high-resolution (7 km) regional climate models (RCM) driven by two global climate models (GCM) for the near future (2021-2050), following the A1B emission scenario (SRES). Two of the catchments (Ruhr and Mulde) have sub-mountainous character and one (Ammer) has alpine character.
Besides analyzing the future changes in discharge in the catchments, describing and potentially quantifying the variability of the results, based on the different driving data, regionalization methods, spatial resolutions and model types, is one main goal of the study and is the focus of the poster. The general result is a large variability in the discharge projections. In the annual regime, the identified variabilities are attributable mainly to different causes in the model chain used (GCM-RCM-HM). In winter, the global climate models (GCM) contribute the main uncertainties to the future projection. In summer, the main variability stems from the meteorological downscaling to the regional scale (RCM) in combination with the hydrological modeling (HM). With appropriate ensemble statistics, however, mean future tendencies are detectable despite the large variabilities. The Ruhr catchment shows tendencies towards higher future flood discharges, while no significant changes are expected in the Ammer and Mulde catchments.
Semantic Support for Complex Ecosystem Research Environments
NASA Astrophysics Data System (ADS)
Klawonn, M.; McGuinness, D. L.; Pinheiro, P.; Santos, H. O.; Chastain, K.
2015-12-01
As ecosystems come under increasing stresses from diverse sources, there is growing interest in research efforts aimed at monitoring, modeling, and improving understanding of ecosystems and protection options. We aimed to provide a semantic infrastructure capable of representing data initially related to one large aquatic ecosystem research effort, the Jefferson Project at Lake George. This effort includes significant historical observational data, extensive sensor-based monitoring data, experimental data, as well as model and simulation data covering topics including lake circulation, watershed runoff, lake biome food webs, etc. The initial measurement representation has been centered on monitoring data and related provenance. We developed a human-aware sensor network ontology (HASNetO) that leverages existing ontologies (PROV-O, OBOE, VSTO*) in support of measurement annotations. We explicitly support the human-aware aspects of human sensor deployment and collection activity to help capture key provenance that is often lacking. Our foundational ontology has since been generalized into a family of ontologies and used to create our human-aware data collection infrastructure, which now supports the integration of measurement data along with simulation data. Interestingly, we have also utilized the same infrastructure to work with partners who have more specific needs for specifying the environmental conditions under which measurements occur, for example, knowing that a reading is not an external air temperature but the air temperature when windows are shut and curtains are open. We have also leveraged the same infrastructure to work with partners more interested in modeling smart cities with data feeds more related to people, mobility, environment, and living. We will introduce our human-aware data collection infrastructure, and demonstrate how it uses HASNetO and its supporting SOLR-based search platform to support data integration and semantic browsing.
Further, we will present lessons learned from its use in three relatively diverse large ecosystem research efforts and highlight some benefits and challenges related to our semantically enhanced foundation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francfort, Jim; Bennett, Brion; Carlson, Richard
2015-09-01
Battelle Energy Alliance, LLC, managing and operating contractor for the U.S. Department of Energy’s (DOE) Idaho National Laboratory (INL), is the lead laboratory for the U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA). INL’s conduct of the AVTA resulted in a significant base of knowledge and experience in the area of testing light-duty vehicles that reduce transportation-related petroleum consumption. Due to this experience, INL was tasked by DOE to develop agreements with companies that were recipients of American Recovery and Reinvestment Act of 2009 (ARRA) grants, which would allow INL to collect raw data from light-duty vehicles and charging infrastructure. INL developed non-disclosure agreements (NDAs) with several companies and their partners that resulted in INL being able to receive raw data via server-to-server connections from the partner companies. This raw data allowed INL to independently conduct data quality checks, perform analysis, and report publicly to DOE, partners, and stakeholders how drivers used both new vehicle technologies and the deployed charging infrastructure. The ultimate goal was not the deployment of vehicles and charging infrastructure, but rather to create real-world laboratories of vehicles, charging infrastructure and drivers that would aid in the design of future electric drive transportation systems.
The five projects that INL collected data from, and their partners, are: • ChargePoint America - Plug-in Electric Vehicle Charging Infrastructure Demonstration • Chrysler Ram PHEV Pickup - Vehicle Demonstration • General Motors Chevrolet Volt - Vehicle Demonstration • The EV Project - Plug-in Electric Vehicle Charging Infrastructure Demonstration • EPRI / Via Motors PHEVs - Vehicle Demonstration. This document serves to benchmark the performance science involved in the execution, analysis and reporting for the five projects above, which provided lessons learned based on drivers’ use of the vehicles and the recharging decisions they made. Data is reported for the use of more than 25,000 vehicles and charging units.
Urban underground infrastructure mapping and assessment
NASA Astrophysics Data System (ADS)
Huston, Dryver; Xia, Tian; Zhang, Yu; Fan, Taian; Orfeo, Dan; Razinger, Jonathan
2017-04-01
This paper outlines and discusses a few associated details of a smart cities approach to the mapping and condition assessment of urban underground infrastructure. Underground utilities are critical infrastructure for all modern cities. They carry drinking water, storm water, sewage, natural gas, electric power, telecommunications, steam, etc. In most cities, the underground infrastructure reflects the growth and history of the city. Many components are aging, in unknown locations with congested configurations, and in unknown condition. The technique uses sensing and information technology to determine the state of infrastructure and provide it in an appropriate, timely and secure format for managers, planners and users. The sensors include ground penetrating radar and buried sensors for persistent sensing of localized conditions. Signal processing and pattern recognition techniques convert the data into information-laden databases for use in analytics, graphical presentations, metering and planning. The presented data are from construction of the St. Paul St. CCTA Bus Station Project in Burlington, VT; utility replacement sites in Winooski, VT; and laboratory tests of smart phone position registration and magnetic signaling. The soil conditions encountered are favorable for GPR sensing and make it possible to locate buried pipes and soil layers. The present state of the art is that the data collection and processing procedures are manual and somewhat tedious, but solutions for automating these procedures appear to be viable. Magnetic signaling with moving permanent magnets has the potential for sending low-frequency telemetry signals through soils that are largely impenetrable by other electromagnetic waves.
Efficient On-Demand Operations in Large-Scale Infrastructures
ERIC Educational Resources Information Center
Ko, Steven Y.
2009-01-01
In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…
A National contribution to the GEO Science and Technology roadmap: GIIDA Project
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Mazzetti, Paolo; Guzzetti, Fausto; Oggioni, Alessandro; Pirrone, Nicola; Santolieri, Rosalia; Viola, Angelo; Tartari, Gianni; Santoro, Mattia
2010-05-01
The GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali) project is an initiative of the Italian National Research Council (CNR) launched in 2008 as an inter-departmental project, aiming to design and develop a multidisciplinary e-infrastructure (cyber-infrastructure) for the management, processing, and evaluation of Earth and Environmental resources -i.e. data, services, models, sensors, best practices. GIIDA has been contributing to the implementation of the GEO (Group of Earth Observation) Science and Technology (S&T) roadmap by: (a) linking relevant S&T communities to GEOSS (GEO System of Systems); (b) ensuring that GEOSS is built based on state-of-the-art science and technology. GIIDA co-ordinates the CNR's digital infrastructure development for Earth Observation resources sharing and cooperates with other national agencies and existing projects pursuing the same objective. For the CNR, GIIDA provides an interface to European and international interoperability programmes (e.g. INSPIRE, and GMES). It builds a national network for dialogue and resolution of issues at varying scientific and technical levels. To achieve such goals, GIIDA introduced a set of guidance principles: • To shift from a "traditional" data centric approach to a more advanced service-based solution for Earth System Science and Environmental information. • To shift the focus from Data to Information Spatial Infrastructures in order to support decision-making. • To be interoperable with analogous National (e.g. SINAnet, and the INSPIRE National Infrastructure) and international initiatives (e.g. INSPIRE, GMES, SEIS, and GEOSS). • To reinforce the Italian presence in the European and international programmes concerning digital infrastructures, geospatial information, and the Mega-Science approach. • To apply the National and International Information Technology (IT) standards for achieving multi-disciplinary interoperability in the Earth and Space Sciences (e.g. 
ISO, OGC, CEN, CNIPA). In keeping with GEOSS, the GIIDA infrastructure adopts a System of Systems architectural approach in order to federate the existing systems managed by a set of recognized Thematic Areas (i.e. Risks, Biodiversity, Climate Change, Air Quality, Land and Water Quality, Ocean and Marine Resources, Joint Research and Public Administration infrastructures). The GIIDA system of systems will contribute to developing multidisciplinary teams studying the global Earth systems in order to address the needs arising from the GEO Societal Benefit Areas (SBAs). GIIDA issued a Call for Pilots, receiving more than 20 high-level projects which are contributing to the GIIDA system development. A nationwide environmental research infrastructure must be interconnected with analogous digital infrastructures operated by other important stakeholders, such as public users and private companies. In fact, the long-term sustainability of a "System of Systems" requires synergies between all the involved stakeholder domains: Users, Governance, Capacity provision, and Research. Therefore, in order to increase the effectiveness of the GIIDA contribution process to a national environmental e-infrastructure, collaborations were activated with relevant actors of the other stakeholder domains at the national level (e.g. ISPRA SINAnet).
Fuzzy net present valuation based on risk assessment of Malaysian infrastructure
NASA Astrophysics Data System (ADS)
Shaffie, Siti Salihah; Jaaman, Saiful Hafizah; Mohamad, Daud
2017-04-01
In recent years, build-operate-transfer (BOT) projects have been widely adopted under project financing for infrastructure developments in many countries. Such projects require substantial financing and involve complex, mutual risks. The assessment of these risks is vital to avert huge financial losses. Net present value is widely applied to BOT projects, where the uncertain cash flows are deemed to be deterministic values. This study proposes a fuzzy net present value model that takes into consideration the assessment of risks in a BOT project. The proposed model provides a more flexible net present valuation of the project. It is shown that the improved fuzzy cash flow model provides a valuation close to the real value of the project.
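The core idea of replacing deterministic cash flows with fuzzy ones can be illustrated with a minimal sketch using triangular fuzzy numbers (low, most likely, high) discounted at a crisp rate. The figures and the simple triangular fuzzification below are illustrative assumptions, not the paper's calibrated Malaysian model:

```python
# Minimal sketch of fuzzy NPV: each cash flow is a triangular fuzzy number
# (low, most_likely, high) reflecting risk-assessed uncertainty, discounted
# at a crisp rate. All figures here are hypothetical.

def fuzzy_npv(cash_flows, rate):
    """Discount triangular fuzzy cash flows; returns a triangular fuzzy NPV."""
    return tuple(
        sum(cf[i] / (1 + rate) ** t for t, cf in enumerate(cash_flows))
        for i in range(3)  # i = 0: low, 1: most likely, 2: high
    )

# Year-0 construction outlay (certain) followed by two uncertain annual
# operating inflows, in millions of currency units:
flows = [(-100, -100, -100), (40, 55, 70), (45, 60, 75)]
low, likely, high = fuzzy_npv(flows, rate=0.08)
```

Because the discount factors are positive, discounting each vertex separately preserves the triangular shape of the result, and the spread between the low and high vertices carries the assessed risk directly into the valuation instead of collapsing it to a single deterministic figure.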
Integration of structural health monitoring and asset management.
DOT National Transportation Integrated Search
2012-08-01
This project investigated the feasibility and potential benefits of the integration of infrastructure monitoring systems into enterprise-scale transportation management systems. An infrastructure monitoring system designed for bridges was implemented...
Green Infrastructure Research at EPA's Edison Environmental Center
The presentation outline includes: (1) Green infrastructure research objectives (2) Introduction to ongoing research projects - Aspects of design, construction, and maintenance that affect function - Real-world applications of GI research
Green Infrastructure Barriers and Opportunities in the Macatawa Watershed, Michigan
The project supports MACC outreach and implementation efforts of the watershed management plan by facilitating communication with local municipal staff and educating local decision makers about green infrastructure.
Enhancing the Environmental Legacy of the International Polar Year 2007- 2008
NASA Astrophysics Data System (ADS)
Tin, T.; Roura, R.; Perrault, M.
2006-12-01
The International Geophysical Year (IGY) left a legacy of peace and international cooperation in the form of the 1959 Antarctic Treaty. Since the IGY, the 1991 Protocol on Environmental Protection to the Antarctic Treaty was signed and entered into force. The Protocol establishes that the protection of the environment and the wilderness values of Antarctica "shall be fundamental considerations in the planning and conduct of all activities in the Antarctic Treaty area". Fifty years on, the IPY 2007-08 can, in turn, leave behind a positive environmental legacy - one where the sharing of facilities and logistics is encouraged, the human footprint in Antarctica is minimized, and a future generation of environmentally aware scientists, logisticians and visitors is fostered. Based on an analysis of all Expressions of Interest submitted to the IPY, we found that about three-quarters of IPY's Antarctic projects plan to have fieldwork components. About one-third of these field projects expect to leave physical infrastructure in Antarctica. A number of projects plan to develop large-scale infrastructure, such as stations and observatories, in hitherto pristine areas. Fewer than one percent of Antarctic field projects address the issue of their environmental legacy: four projects indicated that the site will be cleaned up or the equipment will be removed at the end of the project; two projects indicated that their results may be useful for the management of the Antarctic environment, e.g., in the control of invasive species or the setting up of marine protected areas. With the goal of increasing the environmental awareness of Antarctic field scientists, our contribution will review current research on the impacts of human activities - science, tourism, exploitation of marine resources and global climate change - on the Antarctic environment. A preliminary analysis of the cumulative impacts of IPY activities will be presented.
Case studies of scientific projects in Antarctica with a potentially positive environmental legacy will be highlighted, and suggestions of actions that could be taken to increase the environmental friendliness of scientific projects will be discussed.
Mansfield, Theodore J; MacDonald Gibson, Jacqueline
2015-01-01
Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.
NASA Astrophysics Data System (ADS)
Linsen, Max; Mostert, Erik; van der Zaag, Pieter
2015-04-01
Consequences of climate change include an increase in extreme weather events in North-West Europe. The Netherlands is directly affected by these extreme events, in particular in water management practices. Large investments in infrastructure have been made ever since the floods of 1953, leading to a higher level of protection against flooding from the sea and to a managed eco-hydrological Delta. Adaptive water management is presented as an approach to deal with challenges in water allocation and flood protection. One challenge to adaptive water management relates to infrastructure. Large works are often inevitable and essential in flood protection. Hydraulic infrastructure, however, tends to be inflexible by nature and requires a level of robustness to deliver the desired performance over time. In this study, we focus on the relation between the desired performance of infrastructure and adaptation to environmental change and evolving social demands. The objective of this study is to gain an understanding of the evolution of the desired performance of water management infrastructure. This serves two purposes: an increased understanding of the design and construction of existing infrastructure, and potential lessons for future hydraulic infrastructure in the context of adaptive management. A qualitative approach was used to evaluate over 130 reports on all stages of the design, planning and construction of the Haringvliet sluices as part of the realization of the Delta Works. The time frame is set between 1950 and 1970. The main source of information is a set of quarterly reports to the Dutch parliament, published between 1956 and 1988, which provided detailed information on design, construction, maintenance, system behavior, policy needs, social demands and stakeholders.
The original objectives of the infrastructure were reflected in its design: protection against flooding, protection against salt intrusion, and discharge of water and ice - all with a desired ease of operation of the gates. The Haringvliet sluices owe their unique dimensions to the requirement to discharge both ice and water. Upon completion of the Haringvliet sluices in 1970, two main observations can be made. First, environmental issues were hardly considered, while the focus was on protection against flooding and salt from the sea. Second, during the construction phase, experimentation, learning and adaptation were reported. Changes were made during construction, based on extreme weather events and on lessons learned from construction activities elsewhere. These observations prompt the question of whether an experimental approach as applied during the construction of the Haringvliet sluices would be allowed in modern infrastructure projects of comparable impact, size and cost. A second question to be studied is what happened after the completion of the Haringvliet sluices, when this infrastructure had to be operated in a context in which environmental issues gradually became more prominent and were eventually integrated into water management practice.
Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado
NASA Astrophysics Data System (ADS)
Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.
2015-12-01
Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as the urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve prediction of flow and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data needs necessary for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.
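The growth rates cited above compound year over year; a quick check of what a constant annual percentage implies over a planning horizon (function name is illustrative):

```python
def compound_growth(rate_per_year, years):
    """Total growth factor for a constant annual percentage rate."""
    return (1 + rate_per_year) ** years

# 1.84% annual urban population growth sustained for 25 years
factor = compound_growth(0.0184, 25)
print(f"{factor:.2f}")  # ~1.58, i.e. roughly 58% more urban residents
```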
Big Wind Turbines Require Infrastructure Upgrades - Continuum Magazine
NREL has been completing electrical infrastructure upgrades at the NWTC to accommodate utility-scale wind turbines; the installation of large turbines in the fall of 2009 necessitated these upgrades, including major electrical infrastructure work to interconnect the eastern-most row of turbines on site.
NASA Astrophysics Data System (ADS)
Glaves, H. M.; Schaap, D.
2014-12-01
As marine research becomes increasingly multidisciplinary in its approach, there has been a corresponding rise in the demand for large quantities of high-quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery and access of marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements, independently of other regions. To establish a common framework for marine data management on a global scale, there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: (1) the use of brokering services for providing access to the data available in the regional data discovery and access services, including via the GEOSS portal; (2) the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and (3) the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community.
ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia but which is seeking to expand this framework to include other regional marine data infrastructures.
The Cloud Area Padovana: from pilot to production
NASA Astrophysics Data System (ADS)
Andreetto, P.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Sgaravatto, M.; Traldi, S.; Verlato, M.; Zangrando, L.
2017-10-01
The Cloud Area Padovana has been running for almost two years. This is an OpenStack-based scientific cloud, spread across two different sites: the INFN Padova Unit and the INFN Legnaro National Labs. The hardware resources have been scaled horizontally and vertically, by upgrading some hypervisors and by adding new ones: currently it provides about 1100 cores. Some in-house developments were also integrated in the OpenStack dashboard, such as a tool for user and project registrations with direct support for the INFN-AAI Identity Provider as a new option for user authentication. In collaboration with the EU-funded Indigo DataCloud project, integration with Docker-based containers has been tested and will soon be available in production. This computing facility now satisfies the computational and storage demands of more than 70 users affiliated with about 20 research projects. We present here the architecture of this cloud infrastructure and the tools and procedures used to operate it. We also focus on the lessons learnt in these two years, describing the problems that were found and the corrective actions that had to be applied. We also discuss the chosen strategy for upgrades, which balances the need to promptly integrate new OpenStack developments, the demand to reduce infrastructure downtime, and the need to limit the effort required for such updates. We also discuss how this cloud infrastructure is being used. In particular, we focus on two big physics experiments which are intensively exploiting this computing facility: CMS and SPES. CMS deployed on the cloud a complex computational infrastructure, composed of several user interfaces for job submission in the Grid environment/local batch queues or for interactive processes; this is fully integrated with the local Tier-2 facility.
To avoid a static allocation of the resources, an elastic cluster based on cernVM has been configured: it automatically creates and deletes virtual machines according to user needs. SPES, using a client-server system called TraceWin, exploits INFN's virtual resources to perform a very large number of simulations on about a thousand elastically managed nodes.
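An elastic cluster of the kind described grows and shrinks with the job queue. A minimal sketch of the scaling decision only (all names and thresholds hypothetical; this is not the cernVM/TraceWin implementation):

```python
import math

def desired_nodes(pending_jobs, jobs_per_node, min_nodes=1, max_nodes=1000):
    """Worker VMs needed for the current job queue, clamped to a range."""
    needed = math.ceil(pending_jobs / jobs_per_node)
    return max(min_nodes, min(needed, max_nodes))

def scaling_action(current, pending_jobs, jobs_per_node=8):
    """Decide whether to create or delete VMs to match demand."""
    target = desired_nodes(pending_jobs, jobs_per_node)
    if target > current:
        return ("create", target - current)
    if target < current:
        return ("delete", current - target)
    return ("hold", 0)

# 200 queued jobs at 8 jobs per node -> 25 nodes wanted, 15 more than now
print(scaling_action(current=10, pending_jobs=200))  # ('create', 15)
```

In a real deployment the create/delete actions would be issued through the cloud API (e.g., OpenStack), and hysteresis would typically be added so the cluster does not thrash around the threshold.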
Joint Knowledge Generation Between Climate Science and Infrastructure Engineering
NASA Astrophysics Data System (ADS)
Stoner, A. M. K.; Hayhoe, K.; Jacobs, J. M.
2015-12-01
Over the past decade the engineering community has become increasingly aware of the need to incorporate climate projections into the planning and design of sensitive infrastructure. However, this is a task that is easier said than done. This presentation will discuss some of the successes and hurdles experienced over the past year, from a climate scientist's perspective, working with engineers in infrastructure research and applied engineering through the Infrastructure & Climate Network (ICNet). Engineers rely on strict building codes and ordinances, and can be the subject of lawsuits if those codes are not followed. Matters are further complicated by the uncertainty inherent to climate projections, which includes short-term natural variability as well as the influence of scientific uncertainty and even human behavior on the rate and magnitude of change. Climate scientists typically address uncertainty by creating projections based on multiple models following different future scenarios. This uncertainty is difficult to incorporate into engineering projects, however, because engineers cannot build two different bridges, one allowing for a lower amount of change and another for a higher amount. More often than not there is a considerable difference between the costs of two such designs, which means that available funds are often the deciding factor. Discussions of climate science are often well received by engineers who work in the research area of infrastructure; going a step further and implementing it in applied engineering projects, however, can be challenging. This presentation will discuss some of the challenges and opportunities inherent to collaborations between climate scientists and transportation engineers, drawing from a range of studies including truck weight restrictions on roads during the spring thaw and bridge deck performance under environmental forcings.
FOSS Tools for Research Data Management
NASA Astrophysics Data System (ADS)
Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim
2017-04-01
Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing and accessing data, in collaborating and exchanging information, and in publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects provide their own solutions to manage research data. Within two projects - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDIs). In order to fully exploit the potential of these developments for enhancing data management in the Geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the Geosciences developed in two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In future projects, specific RDM platforms can thus be set up much faster, customized to individual needs, and tools can be added at run time.
Pennsylvania Reaches Infrastructure Milestone
With a series of “aye” votes, the Pennsylvania agency that turns EPA funding and state financing into water infrastructure projects crossed a key threshold recently – $8 billion in investment over nearly three decades.
Code of Federal Regulations, 2011 CFR
2011-01-01
... includes land that is improved by the construction of Project infrastructure such as, but not limited to, roads, sewers and water lines that are not situated on or under the land, where the infrastructure...
Greening the Iron Arts District in Scranton PA
The concept designs described in this report are examples of how green infrastructure can be used to reduce the impact of stormwater runoff and catalyze additional green infrastructure projects throughout Scranton.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wescott, K. L.; May, J. E.; Moore, H. R.
The U.S. Forest Service (USFS) Special Uses-Lands Program is in jeopardy. Although this program, authorized in Title 36, Part 251, of the U.S. Code of Federal Regulations (36 CFR Part 251), ranks among the top four revenue-generating programs for use of National Forest System (NFS) lands, along with the Timber, Minerals, and Special Uses-Recreation Programs, the Special Uses-Lands Program is in a state of neglect. Repeated cuts in funding (a decrease of 26% from fiscal years 2010 to 2014) are adversely affecting staffing and training, which in turn is affecting timely permit processing and ultimately the public's ability to use and benefit from NFS lands. In addition, highly experienced staff with valuable institutional knowledge of the program have begun to retire. The ability of the program to function under these dire circumstances can be attributed to the dedication of Special Uses staff to the program and their commitment to the public. The initial focus of this report was to identify opportunities for improving performance of permitting and review for large energy infrastructure-related projects. However, it became clear during this analysis that these projects are generally adequately staffed and managed. This is due in large part to the availability of cost-recovery dollars and the high-profile nature of these projects. However, it also became apparent that larger issues affecting the bulk of the work of the Special Uses-Lands Program need to be addressed immediately. This report is a preliminary examination of the state of the Special Uses-Lands Program and focuses on a few key items requiring immediate attention.
Further investigation through case studies is recommended to dig deeper into the Special Uses-Lands Program business process, to determine the most cost-effective strategies for streamlining the overall process and the metrics by which performance can be evaluated, including for the permitting and tracking of energy infrastructure projects.
Characterizing uncertain sea-level rise projections to support investment decisions.
Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus
2018-01-01
Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. 
Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions.
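The cost-benefit test at the heart of question (1) can be sketched as a scenario sweep: for each plausible sea-level rise scenario, compare the discounted stream of flood damages avoided by hardening against its up-front cost. All figures below are hypothetical, not the Port of Los Angeles analysis:

```python
def npv_avoided_damage(annual_damage_avoided, years, discount_rate):
    """Present value of a constant annual stream of avoided flood damages."""
    return sum(annual_damage_avoided / (1 + discount_rate) ** t
               for t in range(1, years + 1))

def passes_cbt(annual_damage_avoided, hardening_cost, years=50, discount_rate=0.03):
    """Cost-benefit test: do discounted avoided damages exceed the cost?"""
    return npv_avoided_damage(annual_damage_avoided, years, discount_rate) > hardening_cost

# Hypothetical scenarios: larger sea-level rise -> larger annual avoided damages
scenarios = {"low": 1e6, "mid": 5e6, "high": 12e6}      # dollars per year
results = {name: passes_cbt(dmg, hardening_cost=100e6)  # $100M hardening cost
           for name, dmg in scenarios.items()}
print(results)  # {'low': False, 'mid': True, 'high': True}
```

In the Robust Decision Making framing, the analysis stops short of assigning probabilities: it reports which scenarios pass the test and leaves decision makers to judge whether those scenarios are sufficiently likely.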
NASA Astrophysics Data System (ADS)
Gates, Andrew R.; Benfield, Mark C.; Booth, David J.; Fowler, Ashley M.; Skropeta, Danielle; Jones, Daniel O. B.
2017-03-01
The SERPENT Project has been running for over ten years. In this time, scientists from universities and research institutions have made more than 120 visits to oil rigs, drill ships and survey vessels operated by 16 oil companies, in order to work with the industry's Remotely Operated Vehicles (ROVs). Visits have taken place in Europe, North and South America, Africa and Australasia at water depths from 100 m to nearly 3000 m. The project has directly produced >40 peer-reviewed publications, and data from the project's >2600-entry online image and video archive have been used in many others. The aim of this paper is to highlight examples of how valuable data can be obtained through collaboration with hydrocarbon exploration and production companies, using existing industry infrastructure to increase scientific discovery in unexplored areas and augment environmental monitoring of industrial activity. The large number of industry ROVs operating globally increases chance encounters with large, enigmatic marine organisms. SERPENT video observations include the deepest known records of species previously considered epipelagic, such as the scalloped hammerhead (Sphyrna lewini) and southern sunfish (Mola ramsayi), and the first in situ observations of pelagic species such as the oarfish (Regalecus glesne). Such observations enable improvements to distribution records and descriptions of the behaviour of poorly understood species. Specimen collection has been used for taxonomic descriptions, functional studies and natural products chemistry research. Anthropogenic effects have been assessed at the local scale using in situ observations and sample collection at the time of drilling operations, and subsequent visits have enabled study of recovery from drilling.
Future challenges to be addressed using the SERPENT approach include ensuring that unique faunal observations by industry ROV operators are reported, further studying recovery from deep-water drilling activity, and carrying out in situ studies to improve understanding of the potential future decommissioning of obsolete hydrocarbon infrastructure.
Taylor, David I
2010-04-01
Boston Harbor, a bay-estuary in the north-east USA, has recently been the site of one of the largest wastewater infrastructure projects conducted in the USA, the Boston Harbor Project (BHP). The BHP, which was conducted from 1991 to 2000, ended over a century of direct wastewater treatment facility discharges to the harbor. The BHP caused the loadings of total nitrogen (TN), total phosphorus (TP), total suspended solids (TSS) and particulate organic carbon (POC) to the harbor, to decrease by between 80% and 90%. Approximately one-third of the decreases in TSS and POC loadings occurred between 1991 and 1992; the remaining two-thirds, between 1995 and 2000. For TN and TP, the bulk of the decreases occurred between 1997 or 1998, and 2000. (c) 2009 Elsevier Ltd. All rights reserved.
Integrated Diseases Surveillance Project (IDSP) through a consultant's lens.
Suresh, K
2008-01-01
India has long experienced one of the highest burdens of infectious diseases in the world, fueled by factors including a large population, high poverty levels, poor sanitation, and problems with access to health care and preventive services. It has traditionally been difficult to monitor disease burden and trends in India, and even more difficult to detect, diagnose, and control outbreaks until they have become quite large. In an effort to improve the surveillance and response infrastructure in the country, the Integrated Disease Surveillance Project (IDSP) was initiated in November 2004 with funding from the World Bank. Given the surveillance challenges in India, the project seeks to accomplish its goals through a small list of priority conditions, many of which are syndrome-based and easily recognizable at the community and sub-center level and in outpatient and inpatient care at the lowest levels of the health care system; a simplified battery of laboratory tests and rapid test kits; and reporting of largely aggregate data rather than individual case reporting. The project also includes activities that are relatively high-technology, such as computerization, electronic data transmission, and video conferencing links for communication and training. The project is planned to be implemented all over the country in a phased manner, with stress on 14 focus states for intensive follow-up to demonstrate successful implementation of IDSP. The National Institute of Communicable Diseases, chosen to provide national leadership, may have to address five issues immediately.
First, promote surveillance through major hospitals (both public and private) and active surveillance through health system staff and the community. Second, build capacity for data collation, analysis and interpretation, so that warning signals of outbreaks are recognized and public health action is instituted. Third, develop a system which ensures the availability of quality test kits at district and state laboratories and/or culture facilities at identified laboratories, together with a national training program to build capacity for performing tests and obtaining high-quality results. Fourth, establish a process by which an appropriate quality assurance program can be implemented. Fifth, encourage the use of IT infrastructure for data transmission, analysis, routine communication (e-mail etc.) and videoconferencing for troubleshooting, consultations and epidemiological investigations. These five activities must be addressed at the national level and cannot be left up to individual states and districts.
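A warning signal of the kind named in the second issue is often a simple statistical threshold on aggregate weekly counts. A hedged sketch of one common rule (mean plus two standard deviations of the baseline; this is not IDSP's actual algorithm):

```python
from statistics import mean, stdev

def outbreak_alert(history, current, z=2.0):
    """Flag the current weekly case count if it exceeds the historical
    mean by more than z sample standard deviations."""
    threshold = mean(history) + z * stdev(history)
    return current > threshold

baseline = [12, 15, 9, 14, 11, 13, 10, 12]   # hypothetical weekly counts
print(outbreak_alert(baseline, current=30))  # True: well above baseline
print(outbreak_alert(baseline, current=14))  # False: within normal variation
```

Aggregate reporting makes exactly this kind of rule feasible at the district level without individual case records.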
iCollections - Digitising the British and Irish Butterflies in the Natural History Museum, London.
Paterson, Gordon; Albuquerque, Sara; Blagoderov, Vladimir; Brooks, Stephen; Cafferty, Steve; Cane, Elisa; Carter, Victoria; Chainey, John; Crowther, Robyn; Douglas, Lyndsey; Durant, Joanna; Duffell, Liz; Hine, Adrian; Honey, Martin; Huertas, Blanca; Howard, Theresa; Huxley, Rob; Kitching, Ian; Ledger, Sophie; McLaughlin, Caitlin; Martin, Geoff; Mazzetta, Gerardo; Penn, Malcolm; Perera, Jasmin; Sadka, Mike; Scialabba, Elisabetta; Self, Angela; Siebert, Darrell J; Sleep, Chris; Toloni, Flavia; Wing, Peter
2016-01-01
The Natural History Museum, London (NHMUK) has embarked on an ambitious programme to digitise its collections. The first phase of this programme has been to undertake a series of pilot projects that will develop the workflows and infrastructure needed to support mass digitisation of very large scientific collections. This paper presents the results of one of these pilot projects - iCollections. This project digitised all the lepidopteran specimens usually considered as butterflies: 181,545 specimens representing 89 species from Britain and Ireland. The data digitised include species name, georeferenced location, collector and collection date - the what, where, who and when of specimen data. In addition, a digital image of each specimen was taken. This paper explains the way the data were obtained and the background to the collections which made up the project. Specimen-level data associated with British and Irish butterfly specimens have not been available before, and the iCollections project has released this valuable resource through the NHM data portal.
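The "what, where, who and when" of specimen data maps naturally onto a small record structure. A sketch of the digitised fields (field names and sample values are hypothetical, not the NHM schema):

```python
from dataclasses import dataclass, asdict

@dataclass
class SpecimenRecord:
    """One digitised specimen: the what, where, who and when."""
    species: str          # what
    latitude: float       # where (georeferenced)
    longitude: float
    collector: str        # who
    collected: str        # when (ISO date)
    image_uri: str        # link to the specimen photograph

rec = SpecimenRecord("Aglais io", 51.4967, -0.1764,
                     "J. Smith", "1923-07-14", "images/00001.jpg")
print(asdict(rec)["species"])  # Aglais io
```

Keeping records this uniform is what makes a 181,545-specimen dataset queryable through a portal.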
Project Final Report: The Institute for Sustained Performance, Energy, and Resilience (SUPER)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollingsworth, Jeffrey K.
This project concentrated on various aspects of creating and applying tool infrastructure to make it easier to effectively use large-scale parallel computers. This project was collaborative with Argonne National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, U.C. San Diego, University of Maryland, University of North Carolina, University of Oregon, University of Southern California, University of Tennessee, and University of Utah. The research conducted during this project at the University of Maryland is summarized in this report. The complete details of the work are available in the publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (www.dyninst.org/harmony). It also supported the studies of six graduate students, one undergraduate student, and two post-docs. The funding also provided summer support for the PI and part of the salary of a research staff member.
NASA Astrophysics Data System (ADS)
Proto, Monica; Massimo, Bavusi; Francesco, Soldovieri
2010-05-01
The research project "Integrated System for Transport Infrastructure surveillance and Monitoring by Electromagnetic Sensing" (ISTIMES) was approved in the 7th Framework Programme, in the Joint Call ICT and Security, and started on 1st July 2009. The purpose of the ISTIMES project is to design, assess and promote an ICT-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring, in order to make critical transport infrastructures more reliable and safe. The transportation sector's components are susceptible to the consequences of natural disasters and can also be attractive as terrorist targets. The sector's size, its physically dispersed and decentralized nature, the many public and private entities involved in its operations, the critical importance of cost considerations, and the inherent requirement of convenient accessibility to its services by all users make transportation particularly vulnerable to security and safety threats. As is well known, the surface transportation system consists of interconnected infrastructures including highways, transit systems, railroads, airports, waterways, pipelines and ports, and the vehicles, aircraft, and vessels that operate along these networks. Interdependencies thus exist between transportation and nearly every other sector of the economy, and the effective operation of this system is essential to European economic productivity; therefore, transportation sector protection is of paramount importance, since threats to it may impact other industries that rely on it. The system exploits an open network architecture that can accommodate a wide range of sensors, static and mobile, and can be easily scaled up to allow the integration of additional sensors and interfacing with other networks.
It relies on heterogeneous state-of-the-art electromagnetic sensors, enabling a self-organizing, self-healing, ad-hoc network of terrestrial sensors supported by specific satellite measurements. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of critical transport infrastructures. The project also concerns the development of tools for handling, analysing and processing large data volumes (information fusion) and for providing information and performing behaviour prediction in a quick, easy and intuitive way (situation awareness). The approach is based on several independent non-invasive imaging technologies based on electromagnetic sensing. Sensor cross-validation, synergy, and new data fusion and correlation schemes will permit multi-method, multi-resolution and multi-scale electromagnetic detection and monitoring of surface and subsurface changes of the infrastructure. In keeping with the GMES and European Spatial Data Infrastructure (ESDI) initiatives, the system will adopt open architectures and strive for full interoperability. The system will be tested on two very challenging test beds: a highway bridge and a railway tunnel. It will be based on clear requirements from representative end-users, and technological choices will be based on a long-term cost-benefit analysis. A dissemination plan was included in the project to encourage a wide range of public institutions and private companies to evaluate and adopt the approach for real-time control and distributed monitoring, also in the more general framework of critical and civil infrastructure management and protection. Finally, an exploitation plan will be developed for the commercialization of any derived technology, software, or monitoring concepts.
ISTIMES project is carried out by an international partnership formed by nine partners coming from seven countries: Tecnologie per le Osservazioni della Terra (TeRN), Elsag Datamat (ED) and Dipartimento della Protezione Civile (DPC) from Italy, Eidgenoessische Materialpruefungs-und Forschungsanstalt (EMPA) from Switzerland, Laboratoire Central des Ponts et Chaussées (LCPC) from France, Lund University (ULUND) from Sweden, Tel Aviv University (TAU) from Israel, Territorial Data Elaboration (TDE) from Romania and Norsk Elektro Optikk (NEO) from Norway.
Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédrèric; Cottineau, Louis-Marie; Cuomo, Vincenzo; Della Vecchia, Pietro; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Fédrèric
2010-01-01
The ISTIMES project, funded by the European Commission in the frame of a joint Call "ICT and Security" of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optical fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project.
Transport Infrastructure Surveillance and Monitoring by Electromagnetic Sensing: The ISTIMES Project
Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédrèric.; Cottineau, Louis-Marie; Cuomo, Vincenzo; Vecchia, Pietro Della; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Fédrèric
2010-01-01
The ISTIMES project, funded by the European Commission in the frame of a joint Call "ICT and Security" of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optical fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project. PMID:22163489
Building a multidisciplinary e-infrastructure for the NextData Community
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Rorro, Marco; Mazzetti, Paolo; Fiameni, Giuseppe; Papeschi, Fabrizio; Carpenè, Michele
2014-05-01
In 2012, Italy decided to launch a national initiative called NextData (http://www.nextdataproject.it/): a national system for the retrieval, storage, access and diffusion of environmental and climate data from mountain and marine areas. NextData is funded by the Research and University Ministry as a "Project of Interest". In 2013, NextData funded a "special project", the NextData System of Systems Infrastructure project (ND-SoS-Ina). Its main objective is to design, build and operate in production the NextData multidisciplinary and multi-organizational e-infrastructure for the publication and sharing of its resources (e.g. data, services, vocabularies, models). SoS-Ina realizes the NextData general portal, implementing interoperability among the data archives produced by NextData. The Florentine Division of the Institute of Atmospheric Pollution Research of CNR (CNR-IIA) and CINECA run the project. SoS-Ina (http://essi-lab.eu/nextdata/sosina/) adopted a "System of Systems" (SoS) approach based on a brokering architecture. This has been pursued by applying the brokering technology first developed by the EC-FP7 EuroGEOSS project (http://www.eurogeoss.eu/broker/Pages/AbouttheEuroGEOSSBroker.aspx) and more recently consolidated by GEOSS (Global Earth Observation System of Systems), the international programme of the Group on Earth Observations (GEO) - see http://www.earthobservations.org/documents/geo_ix/20111122_geoss_implementation_highlights.pdf. The NextData general portal architecture will be defined in accordance with the requirements elicited from user communities. The portal will rely on services and interfaces offered by the brokering middleware and will be based on Liferay (http://www.liferay.com/). Liferay is free and open source, provides many built-in applications for social collaboration and content and document management, and is configurable for high availability.
The project comprises three distinct phases and related milestones: (a) the first prototype of the NextData SoS infrastructure, implementing the core functionalities; (b) the consolidated version of the NextData SoS infrastructure, implementing advanced functionalities; and (c) the final and operational NextData SoS infrastructure for data and information sharing and publication. An important outcome of the project will be the advancement of the performance and scalability of the current brokering and portal technologies, exploiting resources and middleware services provided by CINECA.
Role of Computational Fluid Dynamics and Wind Tunnels in Aeronautics R and D
NASA Technical Reports Server (NTRS)
Malik, Murjeeb R.; Bushnell, Dennis M.
2012-01-01
The purpose of this report is to investigate the status of, and future projections for, the question of supplanting wind tunnels with computation in design, and to gauge the potential impact of computational approaches on wind-tunnel utilization, all with an eye toward reducing infrastructure costs at aeronautics R&D centers. Wind tunnels have been closing for myriad reasons, and such closings have reduced infrastructure costs. Further cost reductions are desired, and the work herein attempts to project which wind-tunnel capabilities can be replaced in the future and, if possible, the timing of such replacement. If it is possible to project when a facility could be closed, then maintenance and other associated costs could be rescheduled accordingly (i.e., before the fact) to obtain an even greater infrastructure cost reduction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and the ability to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in a substantial likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnership with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
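The in situ idea described above can be sketched in a few lines: instead of writing every timestep of a simulation to disk for later analysis, the field is reduced to a handful of summary values while it is still in memory. The solver, field size, and statistics below are hypothetical stand-ins, not the project's actual software.

```python
import numpy as np

def simulate_step(state, rng):
    # Stand-in for one solver timestep (hypothetical physics).
    return state + rng.normal(scale=0.01, size=state.shape)

def run_with_in_situ_analysis(steps=100, n=10_000):
    rng = np.random.default_rng(0)
    state = np.zeros(n)
    summaries = []
    for t in range(steps):
        state = simulate_step(state, rng)
        # In situ: reduce the full field to a few scalars while it is
        # still resident in memory, rather than persisting the full
        # state every timestep for post-hoc analysis.
        summaries.append((t, float(state.mean()), float(state.std())))
    return summaries

summaries = run_with_in_situ_analysis()
```

The payoff is in the data volume: each iteration keeps three scalars instead of the full 10,000-element field, which is the storage-versus-compute trade the report describes.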
Grid-enabled mammographic auditing and training system
NASA Astrophysics Data System (ADS)
Yap, M. H.; Gale, A. G.
2008-03-01
Effective use of new technologies to support healthcare initiatives is important, and current research is moving towards implementing secure grid-enabled healthcare provision. In the UK, a large-scale collaborative research project (GIMI: Generic Infrastructures for Medical Informatics), concerned with the development of a secure IT infrastructure to support very widespread medical research across the country, is underway. The UK has some 109 breast screening centers and a growing number of individuals (circa 650) nationally performing approximately 1.5 million screening examinations per year. At the same time, there is a serious and ongoing national workforce issue in screening, which has seen a loss of consultant mammographers and a growth in specially trained technologists and other non-radiologists. There is thus a need to offer effective and efficient mammographic training so as to maintain high levels of screening skills. Consequently, a grid-based system has been proposed, which has the benefit of offering very large volumes of training cases that mammographers can access anytime and anywhere. A database of screening cases, spread geographically across three university systems, is used as a test set of known cases. The GIMI mammography training system first audits these cases to ensure that they are appropriately described and annotated. Subsequently, the cases are utilized for training in the grid-based system that has been developed. This paper briefly reviews the background to the project and then details the ongoing research. In conclusion, we discuss the contributions, limitations, and future plans of such a grid-based approach.
Scaling the CERN OpenStack cloud
NASA Astrophysics Data System (ADS)
Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.
2015-12-01
CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.
ERIC Educational Resources Information Center
Cole, Bruce Kevin
2012-01-01
Evaluations of Public-Private Partnership arrangements as alternatives to traditional government procurement methods for the delivery of public infrastructure projects have been anecdotal at best. This paper proposes a framework to evaluate a public university's infrastructure asset management performance and a specific measure based on a new…
Federal resource guide for infrastructure planning and design.
DOT National Transportation Integrated Search
2015-05-01
This Guide describes the important role of planning and design, typically known as predevelopment, in the life of an infrastructure project and provides: Guiding principles for predevelopment; Case studies highlighting how t...
Oklahoma's transportation infrastructure : inventory and impacts.
DOT National Transportation Integrated Search
2009-10-01
This project comprehensively analyzed Oklahoma's transportation infrastructure and its impact on the state's economy via network analysis techniques that are widely used in and outside geography. The focus was on the context, connectivity, and co...
Outlook for grid service technologies within the @neurIST eHealth environment.
Arbona, A; Benkner, S; Fingberg, J; Frangi, A F; Hofmann, M; Hose, D R; Lonsdale, G; Ruefenacht, D; Viceconti, M
2006-01-01
The aim of the @neurIST project is to create an IT infrastructure for the management of all processes linked to research, diagnosis and treatment development for complex and multi-factorial diseases. The IT infrastructure will be developed for one such disease, cerebral aneurysm and subarachnoid haemorrhage, but its core technologies will be transferable to meet the needs of other medical areas. Since the IT infrastructure for @neurIST will need to encompass data repositories, computational analysis services and information systems handling multi-scale, multi-modal information at distributed sites, the natural basis for the IT infrastructure is a Grid Service middleware. The project will adopt a service-oriented architecture because it aims to provide a system addressing the needs of medical researchers, clinicians and health care specialists (and their IT providers/systems) and medical supplier/consulting industries.
Contour Crafting Simulation Plan for Lunar Settlement Infrastructure Build-Up
NASA Technical Reports Server (NTRS)
Khoshnevis, B.; Carlson, A.; Leach, N.; Thangavelu, M.
2016-01-01
Economically viable and reliable building systems and tool sets are being sought, examined and tested for extraterrestrial infrastructure buildup. This project focused on a unique architecture weaving the robotic building construction technology with designs for assisting rapid buildup of initial operational capability Lunar and Martian bases. The project aimed to study new methodologies to construct certain crucial infrastructure elements in order to evaluate the merits, limitations and feasibility of adapting and using such technologies for extraterrestrial application. Current extraterrestrial settlement buildup philosophy holds that in order to minimize the materials needed to be flown in, at great transportation costs, strategies that maximize the use of locally available resources must be adopted. Tools and equipment flown as cargo from Earth are proposed to build required infrastructure to support future missions and settlements on the Moon and Mars.
NASA Astrophysics Data System (ADS)
Hautière, Nicolas; Bourquin, Frédéric
2017-04-01
Through the centuries, the roads - which today constitute in France a huge transport network one million kilometers in length - have always been able to cope with society's needs and challenges. Consequently, the next generation of road infrastructure will have to take into account at least three societal transitions: ecological, energetic and digital. The goal of the 5th generation road project (R5G©) [1], led by Ifsttar in France and aligned with the Forever Open program [2], is to design and build demonstrators of such future road infrastructures. This presentation covers results related to the greening of road materials [3], the design of energy-positive roads [4, 5], the testing of roads that self-diagnose [6], and the design of roads adapted for connected [7], autonomous [8] and electrified vehicles [9]. In terms of perspectives, we will demonstrate that road infrastructures will soon become a complex system: on one side, road users will benefit from new services; on the other, such massively connected and instrumented infrastructures will potentially become an opportune sensor for knowledge development in geoscience, such as air quality, visibility and fog monitoring. References: [1] R5G project. r5g.ifsttar.fr [2] Forever Open Road project. www.foreveropenroad.eu [3] Biorepavation project. www.infravation.net/projects/BIOREPAVATION [4] N. Le Touz, J. Dumoulin. Numerical study of the thermal behavior of a new deicing road structure design with energy harvesting capabilities. EGU General Assembly 2015, Apr 2015, Vienna, Austria. [5] S. Asfour, F. Bernardin, E. Toussaint, J.-M. Piau. Hydrothermal modeling of porous pavement for its surface de-freezing. Applied Thermal Engineering, Volume 107, 25 August 2016, Pages 493-500. [6] LGV BPL Instrumentation. http://railenium.eu/wp-content/uploads/2016/08/INSTRUMENTATION-BPL-FR.pdf [7] SCOOP@F project.
https://ec.europa.eu/inea/en/connecting-europe-facility/cef-transport/projects-by-country/multi-country/2014-eu-ta-0669-s [8] J. Ehrlich, D. Gruyer, O. Orfila, N. Hautière. Autonomous vehicle: The concept of high quality of service highway. FISITA World Congress, Busan, Korea, 2016. [9] FABRIC project. www.fabric-project.eu
NASA Astrophysics Data System (ADS)
Shibayeva, Marina; Serebryakova, Yelena; Shalnev, Oleg
2017-10-01
Growing demand for increased investment in modernization and development projects for transport infrastructure defines the urgency of the current study. The volume of private-sector investment in the field is insufficient to implement road construction projects, due to their significant capital intensity and long payoff periods. The implementation of socially significant infrastructure projects on the principles of public-private partnership (PPP) is one of the key strategic directions of growth for transport facilities. The authors propose a concept and methodology for modeling investment and innovation activity in transport facility construction. Furthermore, a model is developed to find the balance between public- and private-sector investment in implementing construction projects for transport infrastructure with PPP involvement. The suggested concepts aim to improve the efficiency of investment and innovation activity in the field of transport facility construction on the basis of public- and private-sector collaboration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Matthew M.
2012-03-30
This is the final report for a grant-funded project to financially assist and otherwise support projects that increase E85 infrastructure in Michigan at retail fueling locations. Over the two-year project timeframe, nine E85 and/or flex-fuel pumps were installed around the State of Michigan at locations previously lacking E85 infrastructure. A total of five stations installed the nine pumps, all providing cost share toward the project. Through cost sharing by station partners, the $200,000 provided by the Department of Energy facilitated a total project worth $746,332.85. The project was completed over a two-year timetable (eight quarters). The first quarter of the project focused on outreach to station owners about the incentive for the installation and/or conversion of E85-compatible fueling equipment, including fueling pumps, tanks, and all necessary electrical and plumbing connections. Utilizing the Clean Energy Coalition's (CEC) extensive knowledge of gasoline/ethanol infrastructure throughout Michigan, CEC strategically placed these pumps in locations that strengthen the broad availability of E85 in Michigan. During the first and second quarters, CEC staff approved projects for funding and secured contracts with station owners; the second through eighth quarters were spent working with fueling station owners to complete projects; the third through eighth quarters included time spent promoting projects; and project reporting and evaluation to the US DOE began in the second quarter and ran for the duration of the project. A total of nine pumps were installed (four in Elkton, two in Sebewaing, one in East Lansing, one in Howell, and one in Whitmore Lake). At these combined station locations, a total of 192,445 gallons of E85, 10,786 gallons of E50, and 19,159 gallons of E30 were sold across all reporting quarters of 2011.
Overall, the project has successfully displaced 162,611 gallons (2,663 barrels) of petroleum, and reduced regional GHG emissions by 375 tons in the first year of station deployment.
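The general shape of a displacement calculation from blend sales can be sketched from the gallon figures above. Note this is illustrative only: it applies nominal ethanol volume fractions (E85 = 85%, etc.), whereas the report's own 162,611-gallon figure evidently rests on different blend and energy-content assumptions, so the two numbers are not expected to match.

```python
# Nominal ethanol volume fractions per blend label (illustrative only;
# real blends vary, and the report used its own assumptions).
BLEND_ETHANOL_FRACTION = {"E85": 0.85, "E50": 0.50, "E30": 0.30}
GALLONS_PER_BARREL = 42  # standard petroleum barrel

# Gallons sold in 2011, from the report.
sales = {"E85": 192_445, "E50": 10_786, "E30": 19_159}

# Ethanol gallons sold = sum over blends of (fraction x gallons).
ethanol_gallons = sum(BLEND_ETHANOL_FRACTION[b] * g for b, g in sales.items())
barrels = ethanol_gallons / GALLONS_PER_BARREL
```

Under these nominal fractions the ethanol volume comes to roughly 175k gallons; the gap to the reported displacement underlines how sensitive such totals are to the assumed blend composition.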
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Bedir, Abdulkadir
This report analyzes plug-in electric vehicle (PEV) infrastructure needs in California from 2017 to 2025 in a scenario where the State's zero-emission vehicle (ZEV) deployment goals are achieved by household vehicles. The statewide infrastructure needs are evaluated using the Electric Vehicle Infrastructure Projection tool, which incorporates representative statewide travel data from the 2012 California Household Travel Survey. The infrastructure solution presented in this assessment addresses two primary objectives: (1) enabling travel for battery electric vehicles and (2) maximizing the electric vehicle-miles traveled for plug-in hybrid electric vehicles. The analysis is performed at the county level for each year between 2017 and 2025 while considering potential technology improvements. The results from this study present an infrastructure solution that can facilitate market growth for PEVs to reach the State's ZEV goals by 2025. The overall results show a need for 99k-130k destination chargers, including workplaces and public locations, and 9k-25k fast chargers. The results also show a need for dedicated or shared residential charging solutions at multi-family dwellings, which are expected to host about 120k PEVs by 2025. As a contribution to the scientific literature, this analysis demonstrates the significance of infrastructure reliability and accessibility in quantifying charger demand.
Optimizing the Prioritization of Natural Disaster Recovery Projects
2007-03-01
collection, and basic utility and infrastructure restoration. The restoration of utilities can include temporary bridges, temporary water and sewage lines... interrupted, such as in the case of the 9/11 disaster. Perhaps next time our enemies may target our power grid or water systems. It is the duty of... Transportation: the amount and type of transportation infrastructure damage a repair project addresses. Water: the amount and type of water...
clearScience: Infrastructure for Communicating Data-Intensive Science.
Bot, Brian M; Burdick, David; Kellen, Michael; Huang, Erich S
2013-01-01
Progress in biomedical research requires effective scientific communication to one's peers and to the public. Current research routinely encompasses large datasets and complex analytic processes, and the constraints of traditional journal formats limit the useful transmission of these elements. We are constructing a framework through which authors can provide not only the narrative of what was done, but also the primary and derivative data, the source code, the compute environment, and web-accessible virtual machines. This infrastructure allows authors to "hand their machine", prepopulated with libraries, data, and code, to those interested in reviewing or building on their work. This project, "clearScience," seeks to provide an integrated system that accommodates the ad hoc nature of discovery in the data-intensive sciences and seamless transitions from working to reporting. We demonstrate that rather than merely describing the science being reported, one can deliver the science itself.
NASA Technical Reports Server (NTRS)
Whitehead, A. H., Jr.
1978-01-01
This study was conducted to evaluate the future potential of an advanced air cargo transport. A current-operations analysis is discussed, taking into account the traffic structure, modal cost comparisons, terminal operations, containerization, and institutional factors. Attention is also given to case studies, a demand forecast, and an advanced air cargo systems analysis. The effects of potential improvements on reducing costs are shown. Improvements to the current infrastructure can occur from 1978 to 1985 with off-the-shelf technology, which, when combined with higher load factors for aircraft and containers, can provide up to a 16 percent reduction in total operating costs and a 15 percent rate reduction. The results of the analysis indicate that the proposed changes in the infrastructure and improved cargo loading efficiencies are as important to improving the airlines' financial posture as is the anticipated large dedicated cargo aircraft.
Evaluating betterment projects.
Fleming, Christopher M; Manning, Matthew; Smith, Christine
2016-04-01
In the past decade Australia has experienced a series of large-scale, severe natural disasters, including catastrophic bushfires, widespread and repeated flooding, and intense storms and cyclones. There appears to be a prima facie case for rebuilding damaged infrastructure to a more disaster-resilient (that is, 'betterment') standard. The purpose of this paper is to develop and illustrate a consistent and readily applied method for advancing proposals for the betterment of essential public assets, which can be used by governments at all levels to determine the net benefits of such proposals. Case study results demonstrate that betterment investments have the potential to deliver a positive economic return across a range of asset types and regions. Results, however, are highly sensitive to underlying assumptions, in particular the probability of the natural disaster affecting the infrastructure in the absence of betterment. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
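The sensitivity the authors describe can be made concrete with a simple expected-value sketch: the benefit of betterment is the discounted stream of expected avoided damage, set against the up-front construction premium. All figures, the discount rate, and the horizon below are hypothetical illustrations, not values from the paper.

```python
def betterment_net_benefit(extra_cost, avoided_loss, annual_disaster_prob,
                           discount_rate=0.05, horizon_years=30):
    # Expected present value of avoided damage, minus the up-front
    # betterment premium. All inputs here are hypothetical.
    expected_annual_benefit = annual_disaster_prob * avoided_loss
    pv = sum(expected_annual_benefit / (1 + discount_rate) ** t
             for t in range(1, horizon_years + 1))
    return pv - extra_cost

# Sensitivity to the disaster probability, the key driver noted above:
likely = betterment_net_benefit(1_000_000, 5_000_000, annual_disaster_prob=0.05)
rare = betterment_net_benefit(1_000_000, 5_000_000, annual_disaster_prob=0.005)
```

With a 5% annual event probability the illustrative investment clears its cost comfortably; at 0.5% it does not, which mirrors the paper's point that the assumed disaster probability can flip the sign of the result.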
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a four-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing both the basic and the extended features of the two technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all these problems becomes an important requirement. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the interoperability of OGC Web services with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as the relations between the computational grid and the OGC Web service protocols, the advantages offered by Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach to integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data, management and computation levels. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized for geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high-power computation and security of the Grid, high spatial model resolution and geographical area coverage, and flexible combination and interoperability of the geographical models.
In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is accessible from the enviroGRIDS Portal and, consequently, from end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the user's single point of entry into the system, and the portal presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
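The OGC Web services named above are ordinary HTTP key-value interfaces, so a Grid job or portal component can invoke them with a plain URL. As a minimal sketch (the endpoint, layer name and bounding box below are hypothetical illustrations, not actual enviroGRIDS resources), a WMS 1.1.1 GetMap request can be assembled like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS GetMap request URL (WMS 1.1.1 uses SRS; 1.3.0 uses CRS)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "SRS": srs,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer covering the Black Sea catchment region.
url = wms_getmap_url("https://example.org/geoserver/wms",
                     "catchment:landcover", (27.0, 40.0, 42.0, 48.0))
```

Fetching the resulting URL with any HTTP client returns the rendered map image, which is what makes these services straightforward to wrap inside Grid processing jobs.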
Code of Federal Regulations, 2010 CFR
2010-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION TRANSPORTATION INFRASTRUCTURE MANAGEMENT PROJECTS OF... efficient management of the project in accordance with applicable Federal statutes, regulations, and policy... of State Transportation Departments, with one State acting as the project lead. Eligible project...
Develop Improved Materials to Support the Hydrogen Economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Michael C. Martin
The Edison Materials Technology Center (EMTEC) solicited and funded hydrogen infrastructure related projects that have near term potential for commercialization. The subject technology of each project is related to the US Department of Energy hydrogen economy goals as outlined in the multi-year plan titled 'Hydrogen, Fuel Cells and Infrastructure Technologies Program Multi-Year Research, Development and Demonstration Plan.' Preference was given to cross-cutting materials development projects that might lead to the establishment of manufacturing capability and job creation. EMTEC used the US Department of Energy hydrogen economy goals to find and fund projects with near term commercialization potential. An RFP process aligned with this plan required performance-based objectives with go/no-go technology-based milestones. Protocols established for this program consisted of an RFP solicitation process; white papers and proposals with peer technology and commercialization review (including DoE); EMTEC project negotiation and definition; and DoE cost share approval. Our RFP approach specified proposals/projects for hydrogen production, hydrogen storage or hydrogen infrastructure processing, which may include sensor, separator, compression, maintenance, or delivery technologies. EMTEC was especially alert for projects in the appropriate subject area that have cross-cutting materials technology with near term manufacturing and commercialization opportunities.
NASA Astrophysics Data System (ADS)
Parish, E. S.; Omitaomu, O.; Sylvester, L.; Nugent, P.
2015-12-01
Many U.S. cities are exploring the potential of using green infrastructure (e.g., porous pavements, green roofs, street planters) to reduce urban storm water runoff, which can be both a nuisance and costly to treat. While tools exist to measure local runoff changes resulting from individual green infrastructure (GI) projects, most municipalities currently have no method of analyzing the collective impact of GI projects on urban stormwater systems under future rainfall scenarios and impervious surface distribution patterns. Using the mid-sized city of Knoxville, Tennessee as a case study, we propose a set of indicators that can be used to monitor and analyze the collective effects of GI emplacement on urban storm water runoff volumes, as well as to quantify potential co-benefits of GI projects (e.g., urban heat island reduction, reduced stream scouring) under different climate projection ensembles and population growth scenarios. These indicators are intended to help the city prioritize GI projects as opportunities arise, as well as to track the effectiveness of GI implementation over time. We explore the aggregation of these indicators across different spatial scales (e.g., plot, neighborhood, watershed, city) in order to assess potential changes in climate change resilience resulting from the collective implementation of GI projects across an urban landscape.
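The scale-wise aggregation described above can be sketched as a simple roll-up of a per-project runoff indicator to whichever spatial unit is of interest; the project records, attribute names and numbers below are invented for illustration, not Knoxville data:

```python
from collections import defaultdict

def aggregate_runoff(projects, scale):
    """Sum a runoff-reduction indicator over all GI projects within each
    unit of the chosen spatial scale (e.g. 'neighborhood' or 'watershed')."""
    totals = defaultdict(float)
    for p in projects:
        totals[p[scale]] += p["runoff_reduction_m3"]
    return dict(totals)

# Hypothetical GI project records with made-up scale attributes.
projects = [
    {"neighborhood": "Old City", "watershed": "First Creek",
     "runoff_reduction_m3": 120.0},
    {"neighborhood": "Old City", "watershed": "First Creek",
     "runoff_reduction_m3": 80.0},
    {"neighborhood": "Fourth & Gill", "watershed": "Second Creek",
     "runoff_reduction_m3": 50.0},
]
by_neighborhood = aggregate_runoff(projects, "neighborhood")
by_watershed = aggregate_runoff(projects, "watershed")
```

The same records can then be rolled up at plot, neighborhood, watershed or city level simply by changing the grouping key, which is the essence of the multi-scale indicator comparison.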
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... construction, operation, and maintenance of utility infrastructure upgrades, expansions, and improvements... and wastewater facilities and road improvements to Range 130. All practical means to avoid or minimize...
Preliminary human factors guidelines for traffic management centers
DOT National Transportation Integrated Search
2006-04-01
Sometimes the development of infrastructure can negatively impact habitat and ecosystems. Ways to better avoid, minimize, and mitigate these impacts, as well as the impacts of past infrastructure projects, have been developed. Nevertheless, these avo...
Recycled carpet materials for infrastructure applications.
DOT National Transportation Integrated Search
2013-06-01
The objective of this project was to develop novel composite materials for infrastructure applications by recycling nylon based waste carpets. These novel composites have been proven to possess improved mechanical and sound barrier properties to meet...
Impacts and benefits of implementing BIM on bridge infrastructure projects.
DOT National Transportation Integrated Search
2014-10-01
To date, BIM (Building Information Modeling) is not widely utilized in infrastructure asset management. : Benefits achieved through implementation in vertical construction, however, suggest that BIM represents : significant opportunity for gains in p...
Eligibility of Indoor Plumbing Under Alaska Sanitation Infrastructure Grant Program
Memorandum response to questions that relate to whether indoor plumbing of homes, as part of a wastewater construction project, is an eligible cost item under the EPA Alaska Sanitation Infrastructure Grant Program.
Towards sustainable transport infrastructure : a sectoral approach in practice
DOT National Transportation Integrated Search
1996-07-01
These guidelines provide a comprehensive overview of the issues in moving towards more sustainable transport infrastructure in developing countries. They provide a sectoral framework in which project proposals and requests for European Union assistan...
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2017-04-01
SeaDataCloud marks the third phase of developing the pan-European SeaDataNet infrastructure for marine and ocean data management. The SeaDataCloud project is funded by the EU and runs for 4 years from 1st November 2016. It succeeds the SeaDataNet II (2011 - 2015) and SeaDataNet (2006 - 2011) projects. SeaDataNet has set up and operates a pan-European infrastructure for managing marine and ocean data and is undertaken by National Oceanographic Data Centres (NODCs) and oceanographic data focal points from 34 coastal states in Europe. The infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users a harmonised set of metadata directories and controlled access to the large collections of datasets managed by the interconnected data centres. The population of directories has increased considerably in cooperation with and involvement in many associated EU projects and initiatives such as EMODnet. SeaDataNet at present gives overview and access to more than 1.9 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres from 34 countries riparian to European seas. SeaDataNet is also active in setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139) and OGC (WMS, WFS, CS-W and SWE). Standards and associated SeaDataNet tools are made available at the SeaDataNet portal for wide uptake by data handling and managing organisations. SeaDataCloud aims at further developing standards, innovating services & products, adopting new technologies, and giving more attention to users. Moreover, it is about implementing a cooperation between the SeaDataNet consortium of marine data centres and the EUDAT consortium of e-infrastructure service providers. 
SeaDataCloud aims at considerably advancing services and increasing their usage by adopting cloud and High Performance Computing technology. SeaDataCloud will empower researchers with a packaged collection of services and tools, tailored to their specific needs, supporting research and enabling generation of added-value products from marine and ocean data. Substantial activities will be focused on developing added-value services, such as data subsetting, analysis, visualisation, and publishing workflows for both regular and advanced users, as part of a Virtual Research Environment (VRE). SeaDataCloud targets a number of leading user communities that pose new challenges for upgrading and expanding the SeaDataNet standards and services: science, EMODnet, the Copernicus Marine Environmental Monitoring Service (CMEMS) and EuroGOOS, and international scientific programmes. The presentation will give information on the present services of the SeaDataNet infrastructure, describe the new challenges addressed in SeaDataCloud, and highlight a number of key achievements in SeaDataCloud so far.
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions within the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
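The service-chaining idea behind such workflows can be sketched as plain function composition, where each service consumes the previous service's output; the processing steps and data records below are hypothetical illustrations, not DataFed's actual API:

```python
from functools import reduce

def chain(*services):
    """Compose data-processing services so each output feeds the next one."""
    return lambda data: reduce(lambda d, svc: svc(d), services, data)

# Hypothetical processing steps for a particulate air quality workflow.
def subset_by_region(records):
    """Keep only observations from one monitoring region."""
    return [r for r in records if r["region"] == "midwest"]

def average_pm25(records):
    """Reduce the subset to a single mean PM2.5 value."""
    vals = [r["pm25"] for r in records]
    return sum(vals) / len(vals)

pipeline = chain(subset_by_region, average_pm25)

obs = [{"region": "midwest", "pm25": 12.0},
       {"region": "west", "pm25": 30.0},
       {"region": "midwest", "pm25": 8.0}]
result = pipeline(obs)  # averages only the midwest records
```

In a real service-oriented deployment each step would be a web service call rather than a local function, but the chaining pattern is the same.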
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2015-04-01
The second phase of the project SeaDataNet is well underway since October 2011. The main objective is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via research cruises and monitoring activities in European marine waters and global oceans. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres, as well as the various SeaDataNet standards and tools. SeaDataNet is also setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139), OGC (WMS, WFS, CS-W and SWE), and OpenSearch. The population of directories has increased considerably through cooperation with and involvement in associated EU projects and initiatives. SeaDataNet now gives overview and access to more than 1.6 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres from 34 countries riparian to European seas. Access to marine data is also a key issue for the implementation of the EU Marine Strategy Framework Directive (MSFD). The EU communication 'Marine Knowledge 2020' underpins the importance of data availability and harmonising access to marine data from different sources. SeaDataNet qualified itself for an active role in the data management component of the EMODnet (European Marine Observation and Data network) that is promoted in the EU Communication. Starting in 2009, EMODnet pilot portals have been initiated for marine data themes: digital bathymetry, chemistry, physical oceanography, geology, biology, and seabed habitat mapping. 
These portals are being expanded to all European sea regions as part of EMODnet Phase 2, which started mid 2013. EMODnet encourages more data providers to come forward for data sharing and participating in the process of making complete overviews and homogeneous data products. The EMODnet Bathymetry project is very illustrative of the synergy between SeaDataNet and EMODnet and the added value of generating public data products. The project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets. The portal provides a versatile DTM viewing service with many relevant map layers and functions for data retrieval. A further refinement is taking place as part of phase 2. The presentation will highlight key achievements in SeaDataNet II and give further details and views on the new EMODnet Digital Bathymetry for European seas, to be released in early 2015.
Characterizing Crowd Participation and Productivity of Foldit Through Web Scraping
2016-03-01
Berkeley Open Infrastructure for Network Computing; CDF: Cumulative Distribution Function; CPU: Central Processing Unit; CSSG: Crowdsourced Serious Game...computers at once can create a similar capacity. According to Anderson [6], principal investigator for the Berkeley Open Infrastructure for Network...extraterrestrial life. From this project, a software-based distributed computing platform called the Berkeley Open Infrastructure for Network Computing
A tale of two rain gardens: Barriers and bridges to adaptive ...
Green infrastructure installations such as rain gardens and bioswales are increasingly regarded as viable tools to mitigate stormwater runoff at the parcel level. The use of adaptive management to implement and monitor green infrastructure projects as experimental attempts to manage stormwater has not been adequately explored as a way to optimize green infrastructure performance or increase social and political acceptance. Efforts to improve stormwater management through green infrastructure suffer from the complexity of overlapping jurisdictional boundaries, as well as interacting social and political forces that dictate the flow, consumption, conservation and disposal of urban wastewater flows. Within this urban milieu, adaptive management—rigorous experimentation applied as policy—can inform new wastewater management techniques such as the implementation of green infrastructure projects. In this article, we present a narrative of scientists and practitioners working together to apply an adaptive management approach to green infrastructure implementation for stormwater management in Cleveland, Ohio. In Cleveland, contextual legal requirements and environmental factors created an opportunity for government researchers, stormwater managers and community organizers to engage in the development of two distinct sets of rain gardens, each borne of unique social, economic and environmental processes. In this article we analyze social and political barriers to app
The TENCompetence Infrastructure: A Learning Network Implementation
NASA Astrophysics Data System (ADS)
Vogten, Hubert; Martens, Harrie; Lemmers, Ruud
The TENCompetence project developed a first release of a Learning Network infrastructure to support individuals, groups and organisations in professional competence development. This Learning Network infrastructure was released as open source to the community, thereby allowing users and organisations to use and contribute to this development as they see fit. The infrastructure consists of client applications providing the user experience and server components that provide the services to these clients. These services implement the domain model (Koper 2006) by provisioning the entities of the domain model (see also Sect. 18.4) and henceforth will be referenced as domain entity services.
The Electrolyte Genome project: A big data approach in battery materials discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qu, Xiaohui; Jain, Anubhav; Rajput, Nav Nidhi
2015-06-01
We present a high-throughput infrastructure for the automated calculation of molecular properties with a focus on battery electrolytes. The infrastructure is largely open-source and handles both practical aspects (input file generation, output file parsing, and information management) as well as more complex problems (structure matching, salt complex generation, and failure recovery). Using this infrastructure, we have computed the ionization potential (IP) and electron affinities (EA) of 4830 molecules relevant to battery electrolytes (encompassing almost 55,000 quantum mechanics calculations) at the B3LYP/6-31+G* level. We describe automated workflows for computing redox potential, dissociation constant, and salt-molecule binding complex structure generation. We present routines for automatic recovery from calculation errors, which bring the failure rate from 9.2% to 0.8% for the QChem DFT code. Automated algorithms to check duplication between two arbitrary molecules and structures are described. We present benchmark data on basis sets and functionals for the G2-97 test set; one finding is that an IP/EA calculation method that combines PBE geometry optimization and B3LYP energy evaluation requires less computational cost and yields nearly identical results to a full B3LYP calculation, and could be suitable for the calculation of large molecules. Our data indicate that among the 8 functionals tested, XYGJ-OS and B3LYP are the two best functionals for predicting IP/EA, with RMSEs of 0.12 and 0.27 eV, respectively. Application of our automated workflow to a large set of quinoxaline derivative molecules shows that the functional group effect and substitution position effect can be separated for the IP/EA of quinoxaline derivatives, and that the most sensitive position is different for IP and EA. Published by Elsevier B.V.
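The IP and EA values discussed above are, at bottom, differences of total electronic energies between charge states. A minimal sketch of that arithmetic, assuming total energies in hartree and results reported in eV (the energy values below are invented for illustration, not computed quantities from the paper):

```python
HARTREE_TO_EV = 27.211386  # CODATA conversion factor

def ionization_potential(e_neutral, e_cation):
    """IP = E(cation) - E(neutral); total energies in hartree, result in eV."""
    return (e_cation - e_neutral) * HARTREE_TO_EV

def electron_affinity(e_neutral, e_anion):
    """EA = E(neutral) - E(anion); EA > 0 means the anion lies below the neutral."""
    return (e_neutral - e_anion) * HARTREE_TO_EV

# Illustrative (made-up) total energies for a hypothetical molecule.
ip = ionization_potential(e_neutral=-100.00, e_cation=-99.75)   # ~6.80 eV
ea = electron_affinity(e_neutral=-100.00, e_anion=-100.05)      # ~1.36 eV
```

A high-throughput workflow of the kind described wraps this simple difference inside machinery for geometry optimization, input generation, output parsing and error recovery for each charge state.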
Sustainable infrastructure: A review and a research agenda.
Thomé, Antônio Márcio Tavares; Ceryno, Paula Santos; Scavarda, Annibal; Remmen, Arne
2016-12-15
This paper proposes a taxonomy of themes and a research agenda for sustainable infrastructure (SI), with a focus on sustainable buildings (SB) and green infrastructure (GI). The citation databases of Web of Science formed the basis for a novel strategic thematic analysis of co-citation and co-occurrence of keywords, with a longitudinal identification of themes during the last two decades (from 1995 to 2015) of an emerging and ever-growing research area. SI is a multidisciplinary endeavour, including a diversified array of disciplines such as general engineering, environmental ecology, construction, architecture, urban planning, and geography. This paper shows that the number of publications in SI has grown exponentially since 2003. Over 80% of total citations are concentrated in less than 10% of papers, spread over a large number of journals. Most publications originate from the United States, Europe, Australia, and Asia. The main research streams in SI are green infrastructure, sustainable buildings, and assessment methods. Emerging and prevailing research themes include methodological issues of cost-effectiveness, project management and assessment tools. Substantive issues complement the research agenda of emerging themes in the areas of integration of human, economic and corporate social responsibility values in environmental sustainability, urban landscape and sustainable drainage systems, interdisciplinary research in green materials, integrated policy research in urbanization, agriculture and nature conservation, and extensions of Green Building (GB) and GI to cities of developing countries. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of Bioinformatics Infrastructure for Genomics Research.
Mulder, Nicola J; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Ahmed, Azza; Ahmed, Rehab; Akanle, Bola; Alibi, Mohamed; Armstrong, Don L; Aron, Shaun; Ashano, Efejiro; Baichoo, Shakuntala; Benkahla, Alia; Brown, David K; Chimusa, Emile R; Fadlelmola, Faisal M; Falola, Dare; Fatumo, Segun; Ghedira, Kais; Ghouila, Amel; Hazelhurst, Scott; Isewon, Itunuoluwa; Jung, Segun; Kassim, Samar Kamal; Kayondo, Jonathan K; Mbiyavanga, Mamana; Meintjes, Ayton; Mohammed, Somia; Mosaku, Abayomi; Moussa, Ahmed; Muhammd, Mustafa; Mungloo-Dilmohamud, Zahra; Nashiru, Oyekanmi; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Osamor, Victor; Oyelade, Jellili; Sadki, Khalid; Salifu, Samson Pandam; Soyemi, Jumoke; Panji, Sumir; Radouani, Fouzia; Souiai, Oussama; Tastan Bishop, Özlem
2017-06-01
Although pockets of bioinformatics excellence have developed in Africa, generally, large-scale genomic data analysis has been limited by the availability of expertise and infrastructure. H3ABioNet, a pan-African bioinformatics network, was established to build capacity specifically to enable H3Africa (Human Heredity and Health in Africa) researchers to analyze their data in Africa. Since the inception of the H3Africa initiative, H3ABioNet's role has evolved in response to changing needs from the consortium and the African bioinformatics community. H3ABioNet set out to develop core bioinformatics infrastructure and capacity for genomics research in various aspects of data collection, transfer, storage, and analysis. Various resources have been developed to address genomic data management and analysis needs of H3Africa researchers and other scientific communities on the continent. NetMap was developed and used to build an accurate picture of network performance within Africa and between Africa and the rest of the world, and Globus Online has been rolled out to facilitate data transfer. A participant recruitment database was developed to monitor participant enrollment, and data is being harmonized through the use of ontologies and controlled vocabularies. The standardized metadata will be integrated to provide a search facility for H3Africa data and biospecimens. Because H3Africa projects are generating large-scale genomic data, facilities for analysis and interpretation are critical. H3ABioNet is implementing several data analysis platforms that provide a large range of bioinformatics tools or workflows, such as Galaxy, the Job Management System, and eBiokits. A set of reproducible, portable, and cloud-scalable pipelines to support the multiple H3Africa data types are also being developed and dockerized to enable execution on multiple computing infrastructures. 
In addition, new tools have been developed for analysis of the uniquely divergent African data and for downstream interpretation of prioritized variants. To provide support for these and other bioinformatics queries, an online bioinformatics helpdesk backed by broad consortium expertise has been established. Further support is provided by means of various modes of bioinformatics training. For the past 4 years, the development of infrastructure support and human capacity through H3ABioNet has significantly contributed to the establishment of African scientific networks, data analysis facilities, and training programs. Here, we describe the infrastructure and how it has affected genomics and bioinformatics research in Africa. Copyright © 2017 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.
Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research
NASA Astrophysics Data System (ADS)
Schaap, D.; Thijsse, P.; Glaves, H.
2017-12-01
Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems. 
The potential impact of implementing these solutions for the individual marine data infrastructures is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project.
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni
2017-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and ecosystems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (of multi-terabyte order) related to the output of several climate model simulations, as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large scale distributed testbed across the EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study relates to: (i) multi-model data analysis inter-comparison challenges; (ii) addressed on CMIP5 data; and (iii) which are made available through the IS-ENES/ESGF infrastructure. 
The added value of the solution proposed in the INDIGO-DataCloud project is summarized in the following: (i) it implements a different paradigm (from client- to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.) since everything is managed on the server-side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a "tool" for scientists to run multi-model experiments; and finally (vii) it can drastically reduce the time-to-solution for these experiments from weeks to hours. At the time this contribution is being written, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search & discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing & analytics requirements by providing a paradigm shift based on server-side and high performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.
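A server-side precipitation trend analysis of the kind described ultimately reduces, per grid point or region, to fitting a slope to an annual series. A minimal self-contained sketch of that core step, using synthetic data rather than CMIP5 output:

```python
def linear_trend(years, values):
    """Ordinary least-squares slope: trend in units of `values` per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

# Synthetic annual precipitation series (mm) with an imposed +2 mm/yr trend.
years = list(range(2000, 2010))
precip = [800 + 2 * (y - 2000) for y in years]
trend = linear_trend(years, precip)
```

In the server-side paradigm this computation runs where the multi-terabyte model output already resides, and only the small derived trend values move across the network, which is precisely the data-movement reduction claimed above.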
Business Development Corporation, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasek, S.
1995-12-31
Business Development Corporation, Inc., is a company specializing in opportunity seeking and business development activities in the "new" post-communist Central and Eastern Europe, with particular emphasis on the Republics of Poland and Slovakia. The company currently focuses its expertise on strategic investing and business development between Central Europe and the United States of America. In Poland and Slovakia, the company specializes in developing large scale energy and environmental "infrastructure" development projects on the federal, state, and local levels. In addition, the company assists large state owned industries in the transformation and privatization process. Business Development Corporation has assisted and continues to assist in projects of national importance. The staff of experts advise numerous large Polish and Slovak companies, most state owned or in the process of privatization, on matters of restructuring, finance, capital structure, strategic partnerships or investors, mergers, acquisitions and joint ventures with U.S. based firms. The company also assists and advises on a variety of environmental and energy matters in the public and private sectors.
NASA Astrophysics Data System (ADS)
Melvin, A. M.; Larsen, P.; Boehlert, B.; Martinich, J.; Neumann, J.; Chinowsky, P.; Schweikert, A.; Strzepek, K.
2015-12-01
Climate change poses many risks and challenges for the Arctic and sub-Arctic, including threats to infrastructure. The safety and stability of infrastructure in this region can be impacted by many factors including increased thawing of permafrost soils, reduced coastline protection due to declining arctic sea ice, and changes in inland flooding. The U.S. Environmental Protection Agency (EPA) is coordinating an effort to quantify physical and economic impacts of climate change on public infrastructure across the state of Alaska and estimate how global greenhouse gas (GHG) mitigation may avoid or reduce these impacts. This research builds on the Climate Change Impacts and Risk Analysis (CIRA) project developed for the contiguous U.S., which is described in an EPA report released in June 2015. We are using a multi-model analysis focused primarily on the impacts of changing permafrost, coastal erosion, and inland flooding on a range of infrastructure types, including transportation (e.g. roads, airports), buildings and harbors, energy sources and transmission, sewer and water systems, and others. This analysis considers multiple global GHG emission scenarios ranging from a business as usual future to significant global action. These scenarios drive climate projections through 2100 spanning a range of outcomes to capture variability amongst climate models. Projections are being combined with a recently developed public infrastructure database and integrated into a version of the Infrastructure Planning Support System (IPSS) we are modifying for use in the Arctic and sub-Arctic region. The IPSS tool allows for consideration of both adaptation and reactive responses to climate change. 
Results of this work will address a gap in our understanding of climate change impacts in Alaska, provide estimates of the physical and economic damages we may expect with and without global GHG mitigation, and produce important insights about infrastructure vulnerabilities in response to warming at northern latitudes.
Hazard Management with DOORS: Rail Infrastructure Projects
NASA Astrophysics Data System (ADS)
Hughes, Dave; Saeed, Amer
LOI is a major rail infrastructure project that will contribute to a modernised transport system in time for the 2012 Olympic Games. A review of the procedures and tool infrastructure was conducted in early 2006, coinciding with a planned move to main works. A hazard log support tool was needed to provide an automatic audit trail, version control and support for collaborative working. A DOORS-based Hazard Log (DHL) was selected as the tool strategy. A systematic approach was followed for the development of DHL; after a series of tests and acceptance gateways, DHL was handed over to the project in autumn 2006. The first few months were used for operational trials, and the Hazard Management Procedure was modified to be a hybrid approach that used the strengths of both DHL and Excel. The user experience in the deployment of DHL is summarised and directions for future improvement are identified.
Vulnerability-based evaluation of water supply design under climate change
NASA Astrophysics Data System (ADS)
Umit Taner, Mehmet; Ray, Patrick; Brown, Casey
2015-04-01
Long-lived water supply infrastructures are strategic investments in the developing world, serving the purpose of balancing water deficits compounded by both population growth and socio-economic development. Robust infrastructure design under climate change is compelling, and is often addressed by focusing on the outcomes of climate model projections ('scenario-led' planning) or by identifying design options that are less vulnerable to a wide range of plausible futures ('vulnerability-based' planning). The decision-scaling framework combines these two approaches by first applying a climate stress test to the system to explore vulnerabilities across many traces of the future, and then employing climate projections to inform the decision-making process. In this work, we develop decision scaling's nascent risk management concepts further, directing actions to the vulnerabilities identified during the climate stress test. In the process, we present a new way to inform the climate vulnerability space using climate projections, and demonstrate the use of multiple decision criteria to guide a final design recommendation. The concepts are demonstrated for a water supply project in the Mombasa Province of Kenya, planned to provide domestic and irrigation supply. Six storage design capacities (from 40 to 140 million cubic meters) are explored through a stress test under a large number of climate traces representing both natural climate variability and plausible climate changes. Design outcomes are simulated over a 40-year planning period with a coupled hydrologic-water resources systems model using standard reservoir operation rules. Resulting performance is expressed in terms of water supply reliability and economic efficiency. Ensemble climate projections are used to assign conditional likelihoods to the climate traces using a statistical distance measure.
The final design recommendations are presented and discussed for the decision criteria of expected regret, satisficing, and conditional value-at-risk (CVaR).
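The three decision criteria named above (expected regret, satisficing, and CVaR) can be sketched numerically. The following is a minimal illustration, not the authors' code: the performance matrix, trace likelihoods, satisficing threshold, and tail fraction are all hypothetical stand-ins for the simulated reliability and economic-efficiency outcomes described in the abstract.

```python
import numpy as np

# Hypothetical data: performance[i, j] is the simulated benefit of design
# capacity i under climate trace j; likelihood[j] is the conditional
# probability assigned to trace j from the projection ensemble.
rng = np.random.default_rng(0)
n_designs, n_traces = 6, 500
performance = rng.normal(loc=100.0, scale=20.0, size=(n_designs, n_traces))
likelihood = np.full(n_traces, 1.0 / n_traces)

# Expected regret: shortfall relative to the best design in each trace,
# averaged over the trace likelihoods (lower is better).
regret = performance.max(axis=0) - performance
expected_regret = (regret * likelihood).sum(axis=1)

# Satisficing: likelihood-weighted share of traces meeting a threshold.
threshold = 90.0
satisficing = ((performance >= threshold) * likelihood).sum(axis=1)

# CVaR at the 10% level: mean performance over the worst 10% of traces
# (equal trace weights assumed here for simplicity).
alpha = 0.10
k = max(1, int(alpha * n_traces))
cvar = np.sort(performance, axis=1)[:, :k].mean(axis=1)

best_by_regret = int(np.argmin(expected_regret))
```

In practice the criteria can rank the candidate capacities differently, which is why the abstract presents them side by side rather than collapsing them into a single score.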
Dynamic interaction between vehicles and infrastructure experiment (DIVINE) : policy implications
DOT National Transportation Integrated Search
1998-01-01
The Dynamic Interaction between Vehicles and Infrastructure Experiment (DIVINE) Project provided scientific evidence of the dynamic effects of heavy vehicles and their suspension systems on pavements and bridges. These conclusions are detailed in the...
Dynamic interaction between vehicles and infrastructure experiment (DIVINE) : technical report
DOT National Transportation Integrated Search
1998-10-27
The Dynamic Interaction between Vehicles and Infrastructure Experiment (DIVINE) Project provides scientific evidence of the dynamic effects of heavy vehicles and their suspension systems on pavements and bridges in support of transport policy decisio...
Better state-of-good-repair indicators for the transportation performance index.
DOT National Transportation Integrated Search
2014-07-01
The Transportation Performance Index was developed for the US Chamber of Commerce to track the performance of transportation infrastructure over time and explore the connection between economic health and infrastructure performance. This project ...
Second annual Transportation Infrastructure Engineering Conference.
DOT National Transportation Integrated Search
2013-10-01
The conference will highlight a few of the current projects that have been sponsored by the Center for Transportation Infrastructure and Safety (CTIS), a national University Transportation Center at S&T. In operation since 1998, the CTIS supports ...
An Overview of the Distributed Space Exploration Simulation (DSES) Project
NASA Technical Reports Server (NTRS)
Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.
2007-01-01
This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.
System Engineering Infrastructure Evolution Galileo IOV and the Steps Beyond
NASA Astrophysics Data System (ADS)
Eickhoff, J.; Herpel, H.-J.; Steinle, T.; Birn, R.; Steiner, W.-D.; Eisenmann, H.; Ludwig, T.
2009-05-01
The trend toward more and more constrained financial budgets in satellite engineering requires permanent optimization of S/C system engineering processes and infrastructure. In recent years Astrium has built up a system simulation infrastructure - the "Model-based Development & Verification Environment" (MDVE) - which is by now well known all over Europe and is established as Astrium's standard approach for ESA and DLR projects, and now even for the EU/ESA project Galileo IOV. The key feature of the MDVE/FVE approach is to provide an entire S/C simulation (with full-featured OBC simulation) already in early phases, so that OBSW code tests can start on a simulated S/C and hardware can later be added in the loop step by step, up to an entire "Engineering Functional Model (EFM)" or "FlatSat". The subsequent enhancements to this simulator infrastructure with respect to spacecraft design data handling are reported in the following sections.
NASA Astrophysics Data System (ADS)
Sokolova, N.; Morrison, A.; Haakonsen, T. A.
2015-04-01
Recent advancement of land-based mobile mapping enables rapid and cost-effective collection of high-quality road-related spatial information. Mobile Mapping Systems (MMS) can provide spatial information with sub-decimeter accuracy in nominal operation environments. However, performance in challenging environments such as tunnels is not well characterized. The Norwegian Public Roads Administration (NPRA) manages the country's public road network and its infrastructure, a large segment of which is represented by road tunnels (there are about 1,000 road tunnels in Norway with a combined length of 800 km). In order to adopt mobile mapping technology for streamlining road network and infrastructure management and maintenance tasks, it is important to ensure that the technology is mature enough to meet existing requirements for object positioning accuracy in all types of environments, and to provide homogeneous accuracy over the mapping perimeter. This paper presents results of a testing campaign performed within a project funded by the NPRA as part of the SMarter road traffic with Intelligent Transport Systems (ITS) (SMITS) program. The testing campaign's objective was performance evaluation of high-end commercial MMSs for inventory of public areas, focusing on Global Navigation Satellite System (GNSS) signal-degraded environments.
NASA Astrophysics Data System (ADS)
Ducrot, Raphaëlle
2017-12-01
This paper explores the contradiction between the need for large-scale interventions in rural water supplies and the need for flexibility when providing support for community institutions, by investigating the implementation of the Mozambique National Rural Water Supply and Sanitation Program in a semi-arid district of the Limpopo Basin. Our results showed that coordinated leadership by key committee members and the level of village governance were more important for borehole sustainability than the normative functioning of the committee. In a context in which the centrality of leadership prevails over collective action, the sustainability of rural water infrastructure derives from the ability of leaders to motivate the community to provide supplementary funding. This, in turn, depends on the added value of the water points to the community and on village politics. Any intervention that increased community conflicts, for example because of a lack of transparency or inequitable access to the benefits of the intervention, weakened the coordination and collective action capacity of the community, and hence the sustainability of the infrastructure, even if the intervention was not directly related to water access. These results stress the importance of the project/program implementation pathway.
The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.
Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N
2014-03-01
The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help the assessment of the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from constituting a global, systematic and interoperable approach. To address these weaknesses, the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury and also retrieved historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.
The Emergence of Dominant Design(s) in Large Scale Cyber-Infrastructure Systems
ERIC Educational Resources Information Center
Diamanti, Eirini Ilana
2012-01-01
Cyber-infrastructure systems are integrated large-scale IT systems designed with the goal of transforming scientific practice by enabling multi-disciplinary, cross-institutional collaboration. Their large scale and socio-technical complexity make design decisions for their underlying architecture practically irreversible. Drawing on three…
Testing Omega-P's 650 kW, 1.3 GHz Low-Voltage Multi-Beam Klystron for the Project X Pulsed Linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fermi Research Alliance; Omega-P Inc.
Omega-P Inc. has developed a multi-beam 1.3 GHz klystron (MBK) for the Project X pulsed linac application. Testing of the klystron requires special hardware such as a modulator, RF components, a control system, power supplies, etc., as well as associated infrastructure (electricity, water, safety). This is an expensive part of klystron development for which Omega-P does not have the required equipment. Fermilab will test the MBK at the Fermilab site, contributing to the project all the necessary facilities, infrastructure and manpower for MBK test performance and analysis.
California Hydrogen Infrastructure Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heydorn, Edward C
2013-03-12
Air Products and Chemicals, Inc. has completed a comprehensive, multiyear project to demonstrate a hydrogen infrastructure in California. The specific primary objective of the project was to demonstrate a model of a real-world retail hydrogen infrastructure and acquire sufficient data within the project to assess the feasibility of achieving the nation's hydrogen infrastructure goals. The project helped to advance hydrogen station technology, including the vehicle-to-station fueling interface, through consumer experiences and feedback. By encompassing a variety of fuel cell vehicles, customer profiles and fueling experiences, this project was able to obtain a complete portrait of real market needs. The project also opened its stations to other qualified vehicle providers at the appropriate time to promote widespread use and gain even broader public understanding of a hydrogen infrastructure. The project engaged major energy companies to provide a fueling experience similar to traditional gasoline station sites to foster public acceptance of hydrogen. Work over the course of the project was focused in multiple areas. With respect to the equipment needed, technical design specifications (including both safety and operational considerations) were written, reviewed, and finalized. After finalizing individual equipment designs, complete station designs were started including process flow diagrams and systems safety reviews. Material quotes were obtained, and in some cases, depending on the project status and the lead time, equipment was placed on order and fabrication began. Consideration was given for expected vehicle usage and station capacity, standard features needed, and the ability to upgrade the station at a later date. In parallel with work on the equipment, discussions were started with various vehicle manufacturers to identify vehicle demand (short- and long-term needs).
Discussions included identifying potential areas most suited for hydrogen fueling stations with a focus on safe, convenient, fast-fills. These potential areas were then compared to and overlaid with suitable sites from various energy companies and other potential station operators. Work continues to match vehicle needs with suitable fueling station locations. Once a specific site was identified, the necessary agreements could be completed with the station operator and expected station users. Detailed work could then begin on the site drawings, permits, safety procedures and training needs. Permanent stations were successfully installed in Irvine (delivered liquid hydrogen), Torrance (delivered pipeline hydrogen) and Fountain Valley (renewable hydrogen from anaerobic digester gas). Mobile fueling stations were also deployed to meet short-term fueling needs in Long Beach and Placerville. Once these stations were brought online, infrastructure data was collected and reported to DOE using Air Products' Enterprise Remote Access Monitoring system. Feedback from station operators was incorporated to improve the station user's fueling experience.
Updates of Land Surface and Air Quality Products in NASA MAIRS and NEESPI Data Portals
NASA Technical Reports Server (NTRS)
Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina
2010-01-01
Following successful support of the Northern Eurasia Earth Science Partnership Initiative (NEESPI) project with NASA satellite remote sensing data, since spring 2009 the NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has been collecting more satellite and model data to support the Monsoon Asia Integrated Regional Study (MAIRS) project. The established data management and service infrastructure developed for NEESPI has been used and improved for MAIRS support. Data search, subsetting, and download functions are available through a single system, and a customized Giovanni system has been created for MAIRS. The Web-based online data analysis and visualization system Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) allows scientists to explore, quickly analyze, and download data easily without learning the original data structure and format. Giovanni MAIRS includes satellite observations from multiple sensors and model output from the NASA Global Land Data Assimilation System (GLDAS) and from the NASA atmospheric reanalysis project, MERRA. Currently, we are working on processing and integrating higher-resolution land data into Giovanni, such as vegetation index, land surface temperature, and active fire at 5 km or 1 km from the standard MODIS products. For data that are not archived at the GES DISC, a product metadata portal is under development to serve as a gateway providing product-level information and data access links, covering satellite products, model products and ground-based measurement information collected from MAIRS scientists. Due to the large overlap in geographic coverage and the many similar scientific interests of NEESPI and MAIRS, these data and tools will serve both projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2015-01-27
The climate and weather data science community met December 9-11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.
NASA Astrophysics Data System (ADS)
Shiklomanov, Nikolay; Streletskiy, Dmitry; Swales, Timothy
2014-05-01
Planned socio-economic development during the Soviet period promoted migration into the Arctic and work force consolidation in urbanized settlements to support mineral resource extraction and transportation industries. These policies resulted in a very high level of urbanization in the Soviet Arctic. Despite the mass migration from the northern regions during the 1990s following the collapse of the Soviet Union and diminishing government support, the Russian Arctic population remains predominantly urban. In the five Russian administrative regions underlain by permafrost and bordering the Arctic Ocean, 66 to 82% (depending on the region) of the total population lives in Soviet-era urban communities. The political, economic and demographic changes in the Russian Arctic over the last 20 years are further complicated by climate change, which is greatly amplified in the Arctic region. One of the most significant impacts of climate change on arctic urban landscapes is the warming and degradation of permafrost, which negatively affects the structural integrity of infrastructure. The majority of structures in the Russian Arctic are built according to the passive principle, which promotes equilibrium between the permafrost thermal regime and infrastructure foundations. This presentation focuses on quantitative assessment of potential changes in the stability of Russian urban infrastructure built on permafrost in response to ongoing and future climatic changes, using a permafrost-geotechnical model forced by GCM-projected climate. To address the uncertainties in GCM projections we utilized results from six models that participated in the most recent IPCC model intercomparison project. The analysis was conducted for the entire extent of the Russian permafrost-affected area and for several representative urban communities.
Our results demonstrate that significant observed reduction in urban infrastructure stability throughout the Russian Arctic can be attributed to climatic changes and that projected future climatic changes will further negatively affect communities on permafrost. However, the uncertainties in magnitude and spatial and temporal patterns of projected climate change produced by individual GCMs translate to substantial variability of the future state of infrastructure built on permafrost.
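The abstract does not give the model equations, but a common first-order ingredient of permafrost-geotechnical assessments is the Stefan solution, which relates seasonal thaw depth to the thawing degree-day sum; deeper seasonal thaw under a warmer climate shortens the frozen contact between foundations and permafrost. The sketch below is purely illustrative, not the authors' model, and all parameter values are hypothetical.

```python
import math

def stefan_alt(thawing_degree_days, k_thawed=1.2, vwc=0.3,
               rho_water=1000.0, latent_heat=334000.0):
    """Active-layer thickness (m) via the Stefan solution.

    thawing_degree_days: seasonal sum of positive air temperatures (degC*days)
    k_thawed: thermal conductivity of thawed soil (W/m/K), assumed value
    vwc: volumetric water content (fraction), assumed value
    """
    # Convert degree-days to degree-seconds, then apply the Stefan formula:
    # Z = sqrt(2 * k * I_thaw / (rho_w * w * L))
    seconds = thawing_degree_days * 86400.0
    return math.sqrt(2.0 * k_thawed * seconds /
                     (rho_water * vwc * latent_heat))

alt_now = stefan_alt(800.0)     # present-day degree-day sum (hypothetical)
alt_warm = stefan_alt(1100.0)   # warmer-climate scenario (hypothetical)
```

A warmer scenario yields a deeper active layer (here roughly 1.3 m versus 1.5 m), which is the mechanism behind the projected loss of foundation stability the abstract describes.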
Using offsets to mitigate environmental impacts of major projects: A stakeholder analysis.
Martin, Nigel; Evans, Megan; Rice, John; Lodhia, Sumit; Gibbons, Philip
2016-09-01
Global patterns of development suggest that as more projects are initiated, business will need to find acceptable measures to conserve biodiversity. The application of environmental offsets allows firms to combine their economic interests with those of the environment and society. This article presents the results of a multi-stakeholder analysis related to the design of offsets principles, policies, and regulatory processes in the context of large infrastructure projects. The results indicate that business was primarily interested in using direct offsets and other compensatory measures, known internationally as indirect offsets, to meet its environmental management obligations. In contrast, the environmental sector argued that highly principled and scientifically robust offsets programs should be implemented and maintained for enduring environmental protection. Stakeholder consensus stressed the importance of offsets registers with commensurate monitoring and enforcement. Our findings provide instructive insights into the countervailing views of offsets policy stakeholders. Copyright © 2016 Elsevier Ltd. All rights reserved.
Job Creation and Petroleum Independence with E85 in Texas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walk, Steve
Protec Fuel Management's project objectives are to help design, build, provide, promote and supply biofuels for greater energy independence, national security and domestic economic growth through job creation, infrastructure projects and supply chain business stimulus. Protec Fuel teamed up with station owners to convert 5 existing retail fueling stations to include E85 fuel, serving the existing large number of fleet FFVs and general-public FFVs. The stations are located in high flex-fuel-vehicle locations in the state of Texas. Under the project name "Job Creation and Petroleum Independence with E85 in Texas," Protec Fuel identified and successfully opened stations strategically located to maximize E85 fueling success for fleets and the public. Protec Fuel, industry affiliates and FFV manufacturers are excited about these stations and the opportunities, as they will help reduce emissions and increase jobs, economic stimulus benefits, energy independence and petroleum displacement.
Transportation Infrastructure Design and Construction - Virtual Training Tools
DOT National Transportation Integrated Search
2003-09-01
This project will develop 3D interactive computer-training environments for a major element of transportation infrastructure : hot mix asphalt paving. These tools will include elements of hot mix design (including laboratory equipment) and constructi...
Vehicle-to-infrastructure program cooperative adaptive cruise control.
DOT National Transportation Integrated Search
2015-03-01
This report documents the work completed by the Crash Avoidance Metrics Partners LLC (CAMP) Vehicle to Infrastructure (V2I) Consortium during the project titled Cooperative Adaptive Cruise Control (CACC). Participating companies in the V2I Cons...
Integrating grey and green infrastructure to improve the health and well-being of urban populations
Erika S. Svendsen; Mary E. Northridge; Sara S. Metcalf
2012-01-01
One of the enduring lessons of cities is the essential relationship between grey infrastructure (e.g., streets and buildings) and green infrastructure (e.g., parks and open spaces). The design and management of natural resources to enhance human health and well-being may be traced back thousands of years to the earliest urban civilizations. From the irrigation projects...
Crowdsourcing Physical Network Topology Mapping With Net.Tagger
2016-03-01
backend server infrastructure. This includes a full security audit, better web services handling, and integration with the OSM stack and dataset to... a novel approach to network infrastructure mapping that combines smartphone apps with crowdsourced collection to gather data for offline aggregation... and analysis. The project aims to build a map of physical network infrastructure such as fiber-optic cables, facilities, and access points. The
Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W
2008-05-28
The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures, and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management.
We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.
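The core idea of the abstract, a searchable repository of meta-data records spanning data, software tools, and web-services, can be sketched in a few lines. This is a hypothetical illustration of the concept, not the actual iTools API: the class names, fields, and example resources below are all invented for the sketch.

```python
from dataclasses import dataclass, field

# Hypothetical meta-data record covering the three resource types the
# abstract names: data, software tools, and web-services.
@dataclass
class Resource:
    name: str
    kind: str                       # "data" | "tool" | "web-service"
    description: str
    tags: list = field(default_factory=list)

class MetaDataRepository:
    """Toy repository with keyword search over resource descriptions."""

    def __init__(self):
        self._resources = []

    def register(self, resource):
        assert resource.kind in ("data", "tool", "web-service")
        self._resources.append(resource)

    def search(self, keyword, kind=None):
        """Return resources whose name, description or tags match keyword,
        optionally filtered by resource kind."""
        keyword = keyword.lower()
        hits = [r for r in self._resources
                if keyword in r.name.lower()
                or keyword in r.description.lower()
                or any(keyword in t.lower() for t in r.tags)]
        if kind is not None:
            hits = [r for r in hits if r.kind == kind]
        return hits

repo = MetaDataRepository()
repo.register(Resource("BLAST", "tool", "Sequence alignment search", ["alignment"]))
repo.register(Resource("GenBank", "data", "Nucleotide sequence database", ["sequence"]))
hits = repo.search("sequence")          # matches both example records
```

The ontology-based and hyperbolic-projection browsing modes mentioned in the abstract would sit on top of exactly this kind of meta-data store.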
Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.
2008-01-01
The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures, and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management.
We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu. PMID:18509477
The Electronic Data Methods (EDM) forum for comparative effectiveness research (CER).
Holve, Erin; Segal, Courtney; Lopez, Marianne Hamilton; Rein, Alison; Johnson, Beth H
2012-07-01
AcademyHealth convened the Electronic Data Methods (EDM) Forum to collect, synthesize, and share lessons from eleven projects that are building infrastructure and using electronic clinical data for comparative effectiveness research (CER) and patient-centered outcomes research (PCOR). This paper provides a brief review of the participating projects and a framework of common challenges. EDM Forum staff conducted a text review of the relevant grant programs' funding opportunity announcements, the projects' research plans, and available information on the projects' websites. Additional information was obtained from presentations provided by each project; phone calls with project principal investigators, affiliated partners, and staff from the Agency for Healthcare Research and Quality (AHRQ); and six site visits. Projects participating in the EDM Forum are building infrastructure and developing innovative strategies to address a set of methodological, data, and informatics challenges, identified here in a common framework. The eleven networks represent more than 20 states and include a range of partnership models. Projects vary substantially in size, from 11,000 to more than 7.5 million individuals. Nearly all of the AHRQ priority populations and conditions are addressed. In partnership with the projects, the EDM Forum is focused on identifying and sharing lessons learned to advance the national dialogue on the use of electronic clinical data to conduct CER and PCOR. These efforts share the goal of addressing challenges in traditional research studies and data sources, and aim to build infrastructure and generate evidence to support a learning health care system that can improve patient outcomes.
Roadmap for Developing of Brokering as a Component of EarthCube
NASA Astrophysics Data System (ADS)
Pearlman, J.; Khalsa, S. S.; Browdy, S.; Duerr, R. E.; Nativi, S.; Parsons, M. A.; Pearlman, F.; Robinson, E. M.
2012-12-01
The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Key to achieving the EarthCube vision is establishing a process that will guide the evolution of the infrastructure through community engagement and appropriate investment so that the infrastructure is embraced and utilized by the entire geosciences community. In this presentation we describe a roadmap, developed through the EarthCube Brokering Concept Award, for an evolutionary process of infrastructure and interoperability development. All geoscience communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for consolidating these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. This process of consolidation will be achieved by creating "interfaces," what we call "brokers," between systems. Brokers connect disparate systems without imposing new burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is best done through use-case-driven requirements and agile, iterative development methods. It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. 
These pilots can then grow into larger prototypes addressing intercommunity problems working towards a full-scale socio-technical infrastructure vision. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation as to how well science-driven use cases are being met. Several NSF infrastructure projects are underway and beginning to shape the next generation of information sharing. There is a near term, and possibly unique, opportunity to increase the impact and interconnectivity of these projects, and further improve science research collaboration through brokering. Brokering has been demonstrated to be an essential part of a robust, adaptive infrastructure, but critical questions of governance and detailed implementation remain. Our roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport infrastructure objects. Managing projects of this type demands special probabilistic methods because of the large uncertainty surrounding their implementation; risk management in these projects therefore requires probabilistic and statistical techniques. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow reliable risk assessments to be obtained. The robust approach is based on the principle of maximum likelihood and yields reliable risk estimates in situations of great uncertainty. Applying robust procedures enables a quantitative assessment of the main project risk indicators when managing innovation and investment projects. While the damage from the onset of a risk event can be calculated by any competent specialist, estimating the probability of its occurrence requires the special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
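The abstract does not give the estimator itself, but the core idea of a robust assessment is that location and scale estimates should stay stable when a few extreme observations contaminate the sample. A minimal illustrative sketch (the median/MAD estimator and the simulated overrun data are my assumptions, not the paper's method):

```python
import random
import statistics

def robust_risk_estimate(cost_overruns):
    """Median/MAD-based location and scale estimates, which remain
    stable when a few extreme overruns contaminate the sample."""
    med = statistics.median(cost_overruns)
    mad = statistics.median(abs(x - med) for x in cost_overruns)
    return med, 1.4826 * mad  # 1.4826 rescales MAD to sigma under normality

# Simulated cost-overrun ratios with two extreme outliers appended
random.seed(1)
sample = [random.gauss(1.1, 0.05) for _ in range(50)] + [3.0, 4.5]
loc, scale = robust_risk_estimate(sample)
print(f"robust location={loc:.2f}, scale={scale:.2f}")
print(f"naive mean={statistics.mean(sample):.2f}")
```

The naive mean is pulled upward by the two outliers, while the robust location stays near the bulk of the data, which is the behavior the abstract attributes to robust procedures under great uncertainty.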
Pratas, Diogo; Pinho, Armando J; Rodrigues, João M O S
2014-01-16
The emerging next-generation sequencing (NGS) is bringing, besides naturally huge amounts of data, an avalanche of new specialized tools (for analysis, compression, alignment, among others) and large public and private network infrastructures. A clear need is therefore arising for specific simulation tools for testing and benchmarking: a flexible and portable FASTQ read simulator that does not need a reference sequence, yet produces approximately the same characteristics as real data. We present XS, a FASTQ read simulation tool that is flexible, portable (it does not need a reference sequence) and tunable in terms of sequence complexity. It has several running modes, depending on the time and memory available, and is aimed at testing computing infrastructures, namely cloud computing for large-scale projects, and at testing FASTQ compression algorithms. Moreover, XS can simulate the three main FASTQ components individually (headers, DNA sequences and quality scores). XS provides an efficient and convenient method for fast simulation of FASTQ files, such as those from Ion Torrent (currently not covered by other simulators), Roche-454, Illumina and ABI SOLiD sequencing machines. The tool is publicly available at http://bioinformatics.ua.pt/software/xs/.
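To make the idea of reference-free FASTQ simulation concrete, here is a toy sketch of the core concept (random reads plus matching quality strings); it illustrates the four-line FASTQ record structure only and is not XS's actual algorithm, which additionally tunes sequence complexity:

```python
import random

def simulate_fastq(n_reads, read_len, seed=0, qual_range=(33, 73)):
    """Generate reference-free FASTQ records: each record is a header,
    a random DNA sequence, a '+' separator, and a Phred+33 quality
    string of matching length."""
    rng = random.Random(seed)
    records = []
    for i in range(n_reads):
        seq = "".join(rng.choice("ACGT") for _ in range(read_len))
        qual = "".join(chr(rng.randint(*qual_range)) for _ in range(read_len))
        records.append(f"@sim_read_{i}\n{seq}\n+\n{qual}\n")
    return "".join(records)

print(simulate_fastq(2, 20), end="")
```

Because the generator is seeded, the output is reproducible, which is what makes this kind of simulator usable for benchmarking compression algorithms and computing infrastructures.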
Code of Federal Regulations, 2010 CFR
2010-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION TRANSPORTATION INFRASTRUCTURE MANAGEMENT PROJECTS OF... under this program: (a) A project meeting the definition of an eligible project under 505.5 of this section located fully within one State shall have eligible project costs that are quantified in the...
Transportation Infrastructure: Central Artery/Tunnel Project Faces Continued Financial Uncertainties
DOT National Transportation Integrated Search
1996-05-01
At a cost of over $1 billion a mile, the Central Artery/Tunnel project - an Interstate Highway System project in Boston, Massachusetts - is one of the largest, most complex, and most expensive highway construction projects ever undertaken. In respons...
Space research - At a crossroads
NASA Technical Reports Server (NTRS)
Mcdonald, Frank B.
1987-01-01
Efforts which must be expended if U.S. space research is to regain vitality in the next few years are discussed. Small-scale programs are the cornerstone for big science projects, giving both researchers and students a chance to practice the development of space missions and hardware and identify promising goals for larger projects. Small projects can be carried aloft by balloons, sounding rockets, the Shuttle and ELVs. It is recommended that NASA continue the development of remote sensing systems, and join with other government agencies to fund space-based materials science, space biology and medical research. Increased international cooperation in space projects is necessary for affording moderate to large scale missions, for political reasons, and to maximize available space resources. Finally, the establishment and funding of long-range goals in space, particularly the development of the infrastructure and technologies for the exploration and colonization of the planets, must be viewed as the normal outgrowth of the capabilities being developed for LEO operations.
National Laboratory for Advanced Scientific Visualization at UNAM - Mexico
NASA Astrophysics Data System (ADS)
Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo
2016-04-01
In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system (the Cave), the high-resolution parallel visualization system (the Powerwall), and the high-resolution spherical display (the Earth Simulator). The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large, 3.6 m wide room with images projected on the front, left and right walls as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization, such as geophysical, meteorological, climate and ecology data. 
The HPCC ADA is a 1000+ computing-core system that offers parallel computing resources to applications requiring large amounts of memory as well as large, fast parallel storage systems. The temperature of the entire system is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at the undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.
Resource allocation in road infrastructure using ANP priorities with ZOGP formulation-A case study
NASA Astrophysics Data System (ADS)
Alias, Suriana; Adna, Norfarziah; Soid, Siti Khuzaimah; Kardri, Mahani
2013-09-01
Road infrastructure (RI) project evaluation and selection is concerned with the allocation of scarce organizational resources. This paper suggests an improved RI project selection methodology that reflects interdependencies among evaluation criteria and candidate projects. The Fuzzy Delphi Method (FDM) is used to elicit expert group opinion and to determine the degree of interdependence among the alternative projects. To provide a systematic approach to setting priorities among multiple criteria and trading off among objectives, the Analytic Network Process (ANP) is applied prior to the Zero-One Goal Programming (ZOGP) formulation. Specifically, this paper demonstrates how to combine FDM and ANP with ZOGP through a real-world RI empirical example of an ongoing decision-making project in Johor, Malaysia.
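The final ZOGP step can be illustrated with a toy sketch: given ANP-derived priority weights for candidate projects, choose a 0/1 selection vector that maximizes total priority subject to a budget constraint. The weights, costs and brute-force enumeration below are illustrative assumptions (a real ZOGP model would use a goal programming solver and multiple goals, and the paper's actual data are not reproduced here):

```python
from itertools import product

def select_projects(weights, costs, budget):
    """Zero-one selection: maximize total ANP priority weight subject
    to a single budget constraint, by enumerating all 0/1 assignment
    vectors (tractable only for small candidate sets)."""
    best, best_score = None, -1.0
    for x in product((0, 1), repeat=len(weights)):
        cost = sum(c for c, xi in zip(costs, x) if xi)
        if cost > budget:
            continue
        score = sum(w for w, xi in zip(weights, x) if xi)
        if score > best_score:
            best, best_score = x, score
    return best, best_score

# Hypothetical ANP priorities and costs for four candidate road projects
weights = [0.40, 0.30, 0.20, 0.10]
costs = [60, 40, 30, 20]
x, score = select_projects(weights, costs, budget=100)
print(x, round(score, 2))
```

The interdependencies captured by FDM/ANP enter this formulation only through the weights; that is precisely why the paper applies ANP before, not inside, the ZOGP step.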
Invisible water, visible impact: groundwater use and Indian agriculture under climate change
Zaveri, Esha; Grogan, Danielle S.; Fisher-Vanden, Karen; ...
2016-08-03
India is one of the world's largest food producers, making the sustainability of its agricultural system of global significance. Groundwater irrigation underpins India's agriculture, currently boosting crop production by enough to feed 170 million people. Groundwater overexploitation has led to drastic declines in groundwater levels, threatening to push this vital resource out of reach for millions of small-scale farmers who are the backbone of India's food security. Historically, losing access to groundwater has decreased agricultural production and increased poverty. We take a multidisciplinary approach to assess climate change challenges facing India's agricultural system, and to assess the effectiveness of large-scale water infrastructure projects designed to meet these challenges. We find that even in areas that experience climate change induced precipitation increases, expansion of irrigated agriculture will require increasing amounts of unsustainable groundwater. Finally, the large proposed national river linking project has limited capacity to alleviate groundwater stress. Thus, without intervention, poverty and food insecurity in rural India is likely to worsen.
Invisible water, visible impact: groundwater use and Indian agriculture under climate change
NASA Astrophysics Data System (ADS)
Zaveri, Esha; Grogan, Danielle S.; Fisher-Vanden, Karen; Frolking, Steve; Lammers, Richard B.; Wrenn, Douglas H.; Prusevich, Alexander; Nicholas, Robert E.
2016-08-01
India is one of the world’s largest food producers, making the sustainability of its agricultural system of global significance. Groundwater irrigation underpins India’s agriculture, currently boosting crop production by enough to feed 170 million people. Groundwater overexploitation has led to drastic declines in groundwater levels, threatening to push this vital resource out of reach for millions of small-scale farmers who are the backbone of India’s food security. Historically, losing access to groundwater has decreased agricultural production and increased poverty. We take a multidisciplinary approach to assess climate change challenges facing India’s agricultural system, and to assess the effectiveness of large-scale water infrastructure projects designed to meet these challenges. We find that even in areas that experience climate change induced precipitation increases, expansion of irrigated agriculture will require increasing amounts of unsustainable groundwater. The large proposed national river linking project has limited capacity to alleviate groundwater stress. Thus, without intervention, poverty and food insecurity in rural India is likely to worsen.
The EV Project Price/Fee Models for Publicly Accessible Charging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francfort, James Edward
As plug-in electric vehicles (PEVs) are introduced to the marketplace and gain more consumer acceptance, it is important for a robust and self-sustaining non-residential infrastructure of electric vehicle supply equipment (EVSE) to be established to meet the needs of PEV drivers. While federal and state financial incentives for electric vehicles were in place and remain so today, future incentives are uncertain. In order for PEVs to achieve mainstream adoption, The EV Project pursued an adequate and sustainable commercial or publicly available charging infrastructure to encourage increased PEV purchases by alleviating range anxiety and by removing adoption barriers for consumers without a dedicated overnight parking location to provide a home-base charger. This included determining a business model for publicly accessible charging infrastructure. To establish this business model, The EV Project team created a fee-for-charge model along with various ancillary charging-related offerings that would generate revenue. After placing chargers in the field, the Project rolled out this fee structure.
ENVRIplus - European collaborative development of environmental infrastructures
NASA Astrophysics Data System (ADS)
Asmi, A.; Brus, M.; Kutsch, W. L.; Laj, P.
2016-12-01
European Research Infrastructures (RIs) are built using the ESFRI process, which dictates the steps towards common European RIs. Building each RI separately creates unnecessary barriers for service users (e.g. through differing standards) and is inefficient in, for example, e-science tool or data system development. To address these inter-RI issues, the European Commission has funded several large-scale cluster projects to bring these RIs together already in the planning and development phases, to develop common tools, standards and methodologies, and to learn from existing systems. ENVRIplus is the cluster project for the environmental RIs in Europe and provides a platform for common development and sharing within the RI community. The project is organized around themes, each having several work packages with specific tasks. The major themes of ENVRIplus are: Technical innovation, including tasks such as RI technology transfer, new observation techniques, autonomous operation, etc.; Data for science, with tasks such as RI reference model development, data discovery and citation, data publication, processing, etc.; Access to RIs, with specific tasks on interdisciplinary and transnational access to RI services, and common access governance; Societal relevance and understanding, tackling ethical issues of RI operations, understanding of the human-environmental system, and citizen science approaches, among others; Knowledge transfer, particularly between the RIs and with developing RI organizations, organizing training and staff exchange; and Communication and dissemination, working towards a common environmental RI community (ENVRI community platform), creating an advisory RI discussion board (BEERi), and disseminating the ENVRIplus products globally. Importantly, all ENVRIplus results are open to users from any country. Collaboration with international RIs and user communities is also crucial to the success of the ENVRI initiatives. 
The overall goal is to do science globally, answering critical global and regional challenges. The presentation will not only present the project and its state after nearly two years of operation, but will also present ideas towards building international and even more interdisciplinary collaboration on research infrastructures and among their users.
Broad economic benefits of freight transportation infrastructure improvement.
DOT National Transportation Integrated Search
2012-06-01
This project strives to introduce a novel way to quantify the broad re-organization benefits associated with an : improvement in the freight infrastructure. Using the approach based on 1) the technique known as Field of Influence, and : 2) RAS adjust...
ERIC Educational Resources Information Center
Hassler, Vesna; Biely, Helmut
1999-01-01
Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…
DOT National Transportation Integrated Search
2016-01-11
The goal of the project was the implementation of interferometric synthetic aperture radar : (InSAR) monitoring techniques to allow for early detection of geohazard, potentially : affecting the transportation infrastructure, as well as the monitoring...
The research overview of the US EPA Aging Water Infrastructure Research Program includes the research areas of condition assessment, rehabilitation, and advanced design/treatment concepts, and a research project focused on innovative rehabilitation technologies to reduce costs and increase...
DOT National Transportation Integrated Search
2013-10-01
Creating transportation infrastructure, which can clean up itself and contaminated air surrounding it, can be a : groundbreaking approach in addressing environmental challenges of our time. This project has explored a possibility of : depositing coat...
DOT National Transportation Integrated Search
2014-09-01
This project studied application of acoustic emission (AE) technology to perform structural : health monitoring of highway bridges. Highway bridges are a vital part of transportation : infrastructure and there is need for reliable non-destructive met...
Introduction of Cooperative Vehicle-to-Infrastructure Systems to Improve Speed Harmonization
DOT National Transportation Integrated Search
2016-03-01
This project executed a preliminary experiment of vehicle-to-infrastructure (V2I)-based speed harmonization in which speed guidance was communicated directly to vehicles. This experiment involved a set of micro-simulation experiments and a limited nu...
DOT National Transportation Integrated Search
1997-03-01
This is our Management Advisory Memorandum on the National Airspace : System (NAS) Infrastructure Management System (NIMS) prototype : project in the Federal Aviation Administration (FAA). Our review was : initiated in response to a hotline complaint...
Prototype Rail Crossing Violation Warning Application Project Report.
DOT National Transportation Integrated Search
2017-09-05
This report is the Project Report for the Rail Crossing Violation Warning (RCVW) safety application developed for the project on Rail Crossing Violation Warning Application and Infrastructure Connection, providing a means for equipped connected vehic...
Learn about the Clean Water State Revolving Fund (CWSRF)
The Clean Water State Revolving Fund provides financial assistance for a range of water infrastructure projects. Learn how it works, project eligibility, and types of financial assistance it can provide for water quality projects.
Venezuela's Bolivarian Schools Project.
ERIC Educational Resources Information Center
Diaz, Maria Magnolia Santamaria
2002-01-01
Discusses efforts by the Venezuelan government to improve the nation's school infrastructure through the Bolivarian Schools Project administered by the Ministry of Education, Culture and Sport. The project set educational principles which are guiding current school building efforts. (EV)
Abstracting application deployment on Cloud infrastructures
NASA Astrophysics Data System (ADS)
Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.
2017-10-01
Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as "!CHAOS: a cloud of controls" [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework; "INDIGO-DataCloud" [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds; and "Open City Platform" [3], an Italian project aiming to provide open Cloud solutions for Italian public administrations. We decided to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration layer. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without any knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances in a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation based on the application requirements, the details of the development of both the Heat templates and the web interface, and possible strategies for exploiting this work in Cloud data centers.
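The abstraction the authors describe (user-facing parameters such as file system size and DB cluster size, mapped onto an orchestration template) can be sketched as follows. This is a simplified illustration, not the project's actual templates: it builds a Heat-style template as a Python dict using standard OpenStack resource types, and a real template would also specify server flavors, images and networks:

```python
def render_platform_template(fs_size_gb, db_cluster_size):
    """Build a minimal Heat-style template (as a Python dict) from the
    two user-facing parameters the web interface exposes: a data
    volume size and a NoSQL cluster instance count."""
    return {
        "heat_template_version": "2016-04-08",
        "resources": {
            # Block storage for the application's file system
            "data_volume": {
                "type": "OS::Cinder::Volume",
                "properties": {"size": fs_size_gb},
            },
            # Scalable group of servers for the NoSQL DB cluster
            "nosql_cluster": {
                "type": "OS::Heat::ResourceGroup",
                "properties": {
                    "count": db_cluster_size,
                    "resource_def": {"type": "OS::Nova::Server"},
                },
            },
        },
    }

template = render_platform_template(fs_size_gb=100, db_cluster_size=3)
print(template["resources"]["nosql_cluster"]["properties"]["count"])
```

Scaling a component after deployment then amounts to re-rendering the template with a new count and asking the orchestration service to update the running stack, which is the behavior the web interface exposes.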
den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph
2009-04-22
It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). 
The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.
2017-01-05
AFRL-AFOSR-JP-TR-2017-0002: Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure (Manabu...; grant FA2386...). Distribution unlimited: public release. This report covers the project titled 'Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure'.
Critical Infrastructure: The National Asset Database
2007-07-16
Critical Infrastructure: The National Asset Database ... upon which federal resources, including infrastructure protection grants, are allocated. According to DHS, both of those assumptions are wrong. DHS ... assets that it has determined are critical to the nation. Also, while the National Asset Database has been used to support federal grant-making
Controlled Hydrogen Fleet and Infrastructure Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Scott Staley
2010-03-31
This program was undertaken in response to the US Department of Energy Solicitation DE-PS30-03GO93010, resulting in this Cooperative Agreement with the Ford Motor Company and BP to demonstrate and evaluate hydrogen fuel cell vehicles and required fueling infrastructure. Ford initially placed 18 hydrogen fuel cell vehicles (FCV) in three geographic regions of the US (Sacramento, CA; Orlando, FL; and southeast Michigan). Subsequently, 8 advanced technology vehicles were developed and evaluated by the Ford engineering team in Michigan. BP is Ford's principal partner and co-applicant on this project and provided the hydrogen infrastructure to support the fuel cell vehicles. BP ultimately provided three new fueling stations. The Ford-BP program consists of two overlapping phases. The deliverables of this project, combined with those of other industry consortia, are to be used to provide critical input to hydrogen economy commercialization decisions by 2015. The program's goal is to support industry efforts of the US President's Hydrogen Fuel Initiative in developing a path to a hydrogen economy. This program was designed to seek complete systems solutions to address hydrogen infrastructure and vehicle development, and possible synergies between hydrogen fuel electricity generation and transportation applications. This project, in support of that national goal, was designed to gain real world experience with Hydrogen powered Fuel Cell Vehicles (H2FCV) 'on the road' used in everyday activities, and further, to begin the development of the required supporting H2 infrastructure. Implementation of a new hydrogen vehicle technology is, as expected, complex because of the need for parallel introduction of a viable, available fuel delivery system and sufficient numbers of vehicles to buy fuel to justify expansion of the fueling infrastructure. 
Viability of the fuel structure means widespread, affordable hydrogen which can return a reasonable profit to the fuel provider, while viability of the vehicle requires an expected level of cost, comfort, safety and operation, especially driving range, that consumers require. This presents a classic 'chicken and egg' problem, which Ford believes can be solved with thoughtful implementation plans. The eighteen Ford Focus FCV vehicles that were operated for this demonstration project provided the desired real world experience. Some things worked better than expected. Most notable was the robustness and life of the fuel cell. This is thought to be the result of the full hybrid configuration of the drive system where the battery helps to overcome the performance reduction associated with time related fuel cell degradation. In addition, customer satisfaction surveys indicated that people like the cars and the concept and operated them with little hesitation. Although the demonstrated range of the cars was near 200 miles, operators felt constrained because of the lack of a number of conveniently located fueling stations. Overcoming this major concern requires overcoming a key roadblock, fuel storage, in a manner that permits sufficient quantity of fuel without sacrificing passenger or cargo capability. Fueling infrastructure, on the other hand, has been problematic. Only three of a planned seven stations were opened. The difficulty in obtaining public approval and local government support for hydrogen fuel, based largely on the fear of hydrogen that grew from past disasters and atomic weaponry, has inhibited progress and presents a major roadblock to implementation. In addition the cost of hydrogen production, in any of the methodologies used in this program, does not show a rapid reduction to commercially viable rates. On the positive side of this issue was the demonstrated safety of the fueling station, equipment and process. 
In the Ford program, there were no reported safety incidents.
NASA Astrophysics Data System (ADS)
Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.
2017-12-01
This article considers the support of scientific projects throughout their lifecycle in a computer center, in every aspect of that support. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer-center services. Given the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which imposes higher requirements on the Configuration Management system. For each aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.
A hybrid computational strategy to address WGS variant analysis in >5000 samples.
Huang, Zhuoyi; Rustagi, Navin; Veeraraghavan, Narayanan; Carroll, Andrew; Gibbs, Richard; Boerwinkle, Eric; Venkata, Manjunath Gorentla; Yu, Fuli
2016-09-10
The decreasing cost of sequencing is driving the need for cost-effective, real-time variant calling of whole genome sequencing data. The scale of these projects is far beyond the capacity of the computing resources typically available to research labs. Other infrastructures, such as the AWS cloud environment and supercomputers, also have limitations that make large-scale joint variant calling infeasible; infrastructure-specific variant calling strategies either fail to scale up to large datasets or abandon joint calling altogether. We present a high-throughput framework including multiple variant callers for single nucleotide variant (SNV) calling, which leverages a hybrid computing infrastructure consisting of the AWS cloud, supercomputers and local high performance computing infrastructures. We present a novel binning approach for large-scale joint variant calling and imputation which can scale up to over 10,000 samples while producing SNV callsets with high sensitivity and specificity. As a proof of principle, we present results of analysis on the Cohorts for Heart And Aging Research in Genomic Epidemiology (CHARGE) WGS freeze 3 dataset, in which joint calling, imputation and phasing of over 5300 whole genome samples was produced in under 6 weeks using four state-of-the-art callers. The callers used were SNPTools, GATK-HaplotypeCaller, GATK-UnifiedGenotyper and GotCloud. We used Amazon AWS, a 4000-core in-house cluster at Baylor College of Medicine, IBM power PC Blue BioU at Rice and Rhea at Oak Ridge National Laboratory (ORNL) for the computation. AWS was used for joint calling of 180 TB of BAM files, and the ORNL and Rice supercomputers were used for the imputation and phasing step. All other steps were carried out on the local compute cluster. The entire operation used 5.2 million core hours and transferred a total of only 6 TB of data across the platforms.
Even with the increasing size of whole genome datasets, ensemble joint calling of SNVs for low-coverage data can be accomplished in a scalable, cost-effective and fast manner by using heterogeneous computing platforms, without compromising the quality of variants.
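The divide-and-dispatch idea behind such a hybrid framework can be illustrated with a minimal sketch. The bin size of 500 samples, the round-robin platform assignment, and the helper names (`make_bins`, `dispatch`) are illustrative assumptions, not the framework's actual binning criteria or scheduler:

```python
def make_bins(sample_ids, bin_size=500):
    """Partition samples into fixed-size bins so each bin can be
    processed independently on whatever platform is available.
    (Illustrative only: the real framework's binning criteria and
    merge logic are not reproduced here.)"""
    return [sample_ids[i:i + bin_size]
            for i in range(0, len(sample_ids), bin_size)]

def dispatch(bins):
    """Assign each bin to a compute platform in round-robin fashion,
    a stand-in for the paper's AWS / supercomputer / local-cluster split."""
    platforms = ["aws", "supercomputer", "local"]
    return {i: platforms[i % len(platforms)] for i in range(len(bins))}

# A cohort roughly the size of the CHARGE freeze 3 example (5300 samples).
samples = [f"S{i:05d}" for i in range(5300)]
bins = make_bins(samples)
assignment = dispatch(bins)
```

The point of the sketch is only that once samples are binned, each bin becomes an independent unit of work, so heterogeneous platforms can be used side by side and only the per-bin results need to move between them.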
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop Supercomputer (Raijin), ~ 20 PB data storage using Lustre filesystems and a 3000 core high performance cloud have created a hybrid platform for higher performance computing and data-intensive science to enable large scale earth and climate systems modelling and analysis. There are > 3000 users actively logging in and > 600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large scale data collections. NCI makes available major and long tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology and 6) astronomy, bio and social. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access. Similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allow complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small scale, 'stove-piped' science efforts to large scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. 
Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability and publication. There are also human resource challenges as highly skilled HPC/HPD specialists, specialist programmers, and data scientists are required whose skills can support scaling to the new paradigm of effective and efficient data-intensive earth science analytics on petascale, and soon to be exascale systems.
2011-03-01
Army War College, Strategic Studies Institute, 122 Forbes Avenue, Carlisle, PA 17013. ...projects building Chinese infrastructure in both rural and urban areas. The Chinese model thus instills a lifelong dedication to and...
DOT National Transportation Integrated Search
2010-10-07
This project examined the safety and operation of hydrogen (H2) fueling system infrastructure in northern climates. A multidisciplinary team led by the University of Vermont (UVM), combined with investigators from Zhejiang and Tsinghua Universi...
78 FR 34661 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-10
...: Infrastructure development; collaboration and coordination among partner agencies, organizations, and service...). Project LAUNCH promotes the healthy development and wellness of children ages birth to eight years. A..., build infrastructure, and improve methods for providing services. Grantees implement a range of public...
Using Low Impact Development and Green Infrastructure to Get Benefits From FEMA Programs
LID and Green Infrastructure is a cost-effective, resilient approach to stormwater management. Projects that reduce flood losses may be eligible for grant funding through FEMA and may allow communities to claim points through FEMA's Community Rating System (CRS).
NASA Astrophysics Data System (ADS)
Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.
2013-01-01
The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with, at least, one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
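The update rule described above can be sketched as a toy simulation. The neighbor count, firing probability, threshold, and inhibitory fraction below are illustrative assumptions; the actual probabilistic rules used in Neurona@Home are not specified here:

```python
import random

def step(state, neighbors, inhibitory, p_fire=0.6, threshold=1):
    """One synchronous update of a toy integrate-and-fire cellular automaton.

    state[i] is 1 if node i is active, 0 otherwise. Each active excitatory
    neighbor contributes +1 to a node's input and each active inhibitory
    neighbor contributes -1; a node whose net input reaches `threshold`
    fires with probability p_fire.
    """
    new_state = [0] * len(state)
    for i, nbrs in enumerate(neighbors):
        net = sum((-1 if inhibitory[j] else 1) for j in nbrs if state[j])
        if net >= threshold and random.random() < p_fire:
            new_state[i] = 1
    return new_state

# Random network: 100 nodes, 5 random neighbors each, ~20% inhibitory nodes.
random.seed(0)
n = 100
neighbors = [random.sample(range(n), 5) for _ in range(n)]
inhibitory = [random.random() < 0.2 for _ in range(n)]
state = [1 if random.random() < 0.1 else 0 for _ in range(n)]
for _ in range(10):
    state = step(state, neighbors, inhibitory)
activity = sum(state) / n  # fraction of active nodes after 10 steps
```

Sweeping parameters such as `p_fire` or the inhibitory fraction and recording the long-run activity is the kind of experiment that, at the scale of a million neurons, motivates distributing the work across BOINC volunteers.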
The Next Level in Automated Solar Flare Forecasting: the EU FLARECAST Project
NASA Astrophysics Data System (ADS)
Georgoulis, M. K.; Bloomfield, D.; Piana, M.; Massone, A. M.; Gallagher, P.; Vilmer, N.; Pariat, E.; Buchlin, E.; Baudin, F.; Csillaghy, A.; Soldati, M.; Sathiapal, H.; Jackson, D.; Alingery, P.; Argoudelis, V.; Benvenuto, F.; Campi, C.; Florios, K.; Gontikakis, C.; Guennou, C.; Guerra, J. A.; Kontogiannis, I.; Latorre, V.; Murray, S.; Park, S. H.; Perasso, A.; Sciacchitano, F.; von Stachelski, S.; Torbica, A.; Vischi, D.
2017-12-01
We attempt an informative description of the Flare Likelihood And Region Eruption Forecasting (FLARECAST) project, the European Commission's first large-scale investment to explore the limits of reliability and accuracy achieved in the forecasting of major solar flares. We outline the consortium, top-level objectives and first results of the project, highlighting the diversity and fusion of expertise needed to deliver what was promised. The project's final product, an openly accessible, fully modular and free-to-download flare forecasting facility, will be delivered in early 2018. The project's three objectives, namely science, research-to-operations (R2O) and dissemination / communication, are also discussed: in terms of science, we encapsulate our close-to-final assessment of how close (or far) we are from practically exploitable solar flare forecasting. In terms of R2O, we briefly describe the architecture of the FLARECAST infrastructure, which includes rigorous validation for each forecasting step. Of the project's three communication levers, we finally focus on lessons learned from the two-way interaction with the community of stakeholders and governmental organizations. The FLARECAST project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 640216.
Diaspora, faith, and science: building a Mouride hospital in Senegal.
Foley, Ellen E; Babou, Cheikh Anta
2011-01-01
This article examines a development initiative spearheaded by the members of a transnational diaspora – the creation of a medical hospital in the holy city of Touba in central Senegal. Although the construction of the hospital is decidedly a philanthropic project, Hôpital Matlaboul Fawzaini is better understood as part of the larger place-making project of the Muridiyya and the pursuit of symbolic capital by a particular Mouride "dahira". The "dahira's" project illuminates important processes of forging global connections and transnational localities, and underscores the importance of understanding the complex motivations behind diaspora development. The hospital's history reveals the delicate negotiations between state actors and diaspora organizations, and the complexities of public–private partnerships for development. In a reversal of state withdrawal in the neo-liberal era, a diaspora association was able to wrest new financial commitments from the state by completing a large infrastructure project. Despite this success, we argue that these kinds of projects, which are by nature uneven and sporadic, reflect particular historical conjunctures and do not offer a panacea for the failure of state-led development.
Large Payload Ground Transportation and Test Considerations
NASA Technical Reports Server (NTRS)
Rucker, Michelle A.
2016-01-01
During test and verification planning for the Altair lunar lander project, a National Aeronautics and Space Administration (NASA) study team identified several ground transportation and test issues related to the large payload diameter. Although the entire Constellation Program-including Altair-has since been canceled, issues identified by the Altair project serve as important lessons learned for payloads greater than 7 m diameter being considered for NASA's new Space Launch System (SLS). A transportation feasibility study found that Altair's 8.97 m diameter Descent Module would not fit inside available aircraft. Although the Ascent Module cabin was only 2.35 m diameter, the long reaction control system booms extended nearly to the Descent Module diameter, making it equally unsuitable for air transportation without removing the booms and invalidating assembly workmanship screens or acceptance testing that had already been performed. Ground transportation of very large payloads over extended distances is not generally permitted by most states, so overland transportation alone would not be an option. Limited ground transportation to the nearest waterway may be possible, but water transportation could take as long as 66 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA's Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. 
Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing such as hypergolic fuels, pyrotechnic devices, and high pressure gases. Ironically, the limiting factor to a national heavy lift strategy may not be the rocket technology needed to throw a heavy payload, but rather the terrestrial infrastructure-roads, bridges, airframes, and buildings-necessary to transport, acceptance test, and process large spacecraft. Failure to carefully consider where and how large spacecraft are manufactured, tested, and launched could result in unforeseen cost to modify existing (or develop new) infrastructure, or incur additional risk due to increased handling operations or eliminating key verifications.
Chaffin, Brian C; Shuster, William D; Garmestani, Ahjond S; Furio, Brooke; Albro, Sandra L; Gardiner, Mary; Spring, MaLisa; Green, Olivia Odom
2016-12-01
Green infrastructure installations such as rain gardens and bioswales are increasingly regarded as viable tools to mitigate stormwater runoff at the parcel level. The use of adaptive management to implement and monitor green infrastructure projects as experimental attempts to manage stormwater has not been adequately explored as a way to optimize green infrastructure performance or increase social and political acceptance. Efforts to improve stormwater management through green infrastructure suffer from the complexity of overlapping jurisdictional boundaries, as well as interacting social and political forces that dictate the flow, consumption, conservation and disposal of urban wastewater flows. Within this urban milieu, adaptive management-rigorous experimentation applied as policy-can inform new wastewater management techniques such as the implementation of green infrastructure projects. In this article, we present a narrative of scientists and practitioners working together to apply an adaptive management approach to green infrastructure implementation for stormwater management in Cleveland, Ohio. In Cleveland, contextual legal requirements and environmental factors created an opportunity for government researchers, stormwater managers and community organizers to engage in the development of two distinct sets of rain gardens, each borne of unique social, economic and environmental processes. In this article we analyze social and political barriers to applying adaptive management as a framework for implementing green infrastructure experiments as policy. We conclude with a series of lessons learned and a reflection on the prospects for adaptive management to facilitate green infrastructure implementation for improved stormwater management. Copyright © 2016 Elsevier Ltd. All rights reserved.
Brokering Capabilities for EarthCube - supporting Multi-disciplinary Earth Science Research
NASA Astrophysics Data System (ADS)
Jodha Khalsa, Siri; Pearlman, Jay; Nativi, Stefano; Browdy, Steve; Parsons, Mark; Duerr, Ruth; Pearlman, Francoise
2013-04-01
The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Brokering of data and improvements in discovery and access are key to data exchange and the promotion of collaboration across the geosciences. In this presentation we describe an evolutionary process of infrastructure and interoperability development focused on participation of existing science research infrastructures and augmenting them for improved access. All geosciences communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for leveraging these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. Brokers connect disparate systems with only minimal burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is a governance issue, but it is facilitated by infrastructure capabilities that can impact the uptake of new interdisciplinary collaborations and exchange. Thus brokering must address both the cyberinfrastructure and computer technology requirements and also the social issues to allow improved cross-domain collaborations. This is best done through use-case-driven requirements and agile, iterative development methods.
It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation of how well science-driven use cases are being met. There is a near-term, and possibly unique, opportunity through EarthCube and European e-Infrastructure projects to increase the impact and interconnectivity of projects. In the developments described in this presentation, brokering has been demonstrated to be an essential part of a robust, adaptive technical infrastructure, and demonstration and user scenarios can address both the governance and detailed implementation paths forward. The EarthCube Brokering roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.
A Method for Mapping Future Urbanization in the United States
NASA Technical Reports Server (NTRS)
Bounoua, Lahouari; Nigro, Joseph; Thome, Kurtis; Zhang, Ping; Fathi, Najlaa; Lachir, Asia
2018-01-01
Cities are poised to absorb additional people. Their sustainability, or ability to accommodate a population increase without depleting resources or compromising future growth, depends on whether they harness the efficiency gains from urban land management. Population is often projected as a bulk national number without details about spatial distribution. We use Landsat and population data in a methodology to project and map U.S. urbanization for the year 2020 and document its spatial pattern. This methodology is important to spatially disaggregate projected population and assist land managers to monitor land use, assess infrastructure and distribute resources. We found the U.S. west coast urban areas to have the fastest population growth with relatively small land consumption resulting in future decrease in per capita land use. Except for Miami (FL), most other U.S. large urban areas, especially in the Midwest, are growing spatially faster than their population and inadvertently consuming land needed for ecosystem services. In large cities, such as New York, Chicago, Houston and Miami, land development is expected more in suburban zones than urban cores. In contrast, in Los Angeles land development within the city core is greater than in its suburbs.
Geohydrology of the shallow aquifers in the Boulder-Longmont area, Colorado
Robson, Stanley G.; Heiny, Janet S.; Arnold, L.R.
2000-01-01
Urban areas commonly rely on ground water for at least part of the municipal water supply, and as population increases, urban areas expand and require larger volumes of water. However, the expansion of an urban area can reduce ground-water availability. This may occur through processes of depletion (withdrawal of most of the available ground water), degradation (chemicals used in the urban area seep into the ground and contaminate the ground water), and preemption (cost or restrictions on pumping ground water from under extensively urbanized areas may be prohibitive). Thus, a vital natural resource needed to support the growth of an urban area and its infrastructure can become less available because of growth itself. The diminished availability of natural resources caused by expansion of urban areas is not unique to water resources. For example, large volumes of aggregate (sand and gravel) are used in concrete and asphalt to build and maintain the infrastructure (buildings, roads, airports, and so forth) of an urban area. Yet, mining of aggregate commonly is preempted by urban expansion; for example, it cannot be mined from under a subdivision. Energy resources such as coal, oil, and natural gas likewise are critical to the growth and existence of an urban area but may become less available as an urban area expands and preempts mining and drilling. In 1996, the U.S. Geological Survey began work on a national initiative designed to provide information on the availability of those natural resources (water, minerals, energy, and biota) that are critical to maintaining the Nation's infrastructure or that may become less available because of urban expansion. The initiative began with a 3-year demonstration project to develop procedures for assessing resources and methods for interpreting and publishing information in digital and traditional paper formats. The Front Range urban corridor of Colorado was chosen as the demonstration area (fig.
1), and the project was titled the Front Range Infrastructure Resources Project (FRIRP). This report and those of Robson (1996), Robson and others (1998), and Robson and others (2000a, 2000b, 2000c) (fig. 1) are the results of FRIRP water resources investigations; reports pertaining to geology, minerals, energy, biota, and cartography of the FRIRP are published separately. The water-resources studies of the FRIRP were undertaken in cooperation with the Colorado Department of Natural Resources, Division of Water Resources, and the Colorado Water Conservation Board.
Geohydrology of the shallow aquifers in the Greeley-Nunn area, Colorado
Robson, Stanley G.; Arnold, L.R.; Heiny, Janet S.
2000-01-01
Urban areas commonly rely on ground water for at least part of the municipal water supply, and as population increases, urban areas expand and require larger volumes of water. However, the expansion of an urban area can reduce ground-water availability. This may occur through processes of depletion (withdrawal of most of the available ground water), degradation (chemicals used in the urban area seep into the ground and contaminate the ground water), and preemption (cost or restrictions on pumping ground water from under extensively urbanized areas may be prohibitive). Thus, a vital natural resource needed to support the growth of an urban area and its infrastructure can become less available because of growth itself. The diminished availability of natural resources caused by expansion of urban areas is not unique to water resources. For example, large volumes of aggregate (sand and gravel) are used in concrete and asphalt to build and maintain the infrastructure (buildings, roads, airports, and so forth) of an urban area. Yet, mining of aggregate commonly is preempted by urban expansion; for example, it cannot be mined from under a subdivision. Energy resources such as coal, oil, and natural gas likewise are critical to the growth and vitality of an urban area but may become less available as an urban area expands and preempts mining and drilling. In 1996, the U.S. Geological Survey began work on a national initiative designed to provide information on the availability of those natural resources (water, minerals, energy, and biota) that are critical to maintaining the Nation's infrastructure or that may become less available because of urban expansion. The initiative began with a 3-year demonstration project to develop procedures for assessing resources and methods for interpreting and publishing information in digital and traditional paper formats. The Front Range urban corridor of Colorado was chosen as the demonstration area (fig.
1), and the project was titled the Front Range Infrastructure Resources Project (FRIRP). This report and those of Robson (1996), Robson and others (1998), and Robson and others (2000a, 2000b, 2000c) are the results of FRIRP water-resources investigations; reports pertaining to geology, minerals, energy, biota, and cartography of the FRIRP are published separately. The water resources studies of the FRIRP were undertaken in cooperation with the Colorado Department of Natural Resources, Division of Water Resources, and the Colorado Water Conservation Board.
Geohydrology of the shallow aquifers in the Fort Lupton-Gilchrest area, Colorado
Robson, Stanley G.; Heiny, Janet S.; Arnold, L.R.
2000-01-01
Urban areas commonly rely on ground water for at least part of the municipal water supply, and as population increases, urban areas expand and require larger volumes of water. However, the expansion of an urban area can reduce ground-water availability. This may occur through processes of depletion (withdrawal of most of the available ground water), degradation (chemicals used in the urban area seep into the ground and contaminate the ground water), and preemption (cost or restrictions on pumping ground water from under extensively urbanized areas may be prohibitive). Thus, a vital natural resource needed to support the growth of an urban area and its infrastructure can become less available because of growth itself. The diminished availability of natural resources caused by expansion of urban areas is not unique to water resources. For example, large volumes of aggregate (sand and gravel) are used in concrete and asphalt to build and maintain the infrastructure (buildings, roads, airports, and so forth) of an urban area. Yet, mining of aggregate commonly is preempted by urban expansion; for example, it cannot be mined from under a subdivision. Energy resources such as coal, oil, and natural gas likewise are critical to the growth and existence of an urban area but may become less available as an urban area expands and preempts mining and drilling. In 1996, the U.S. Geological Survey began work on a national initiative designed to provide information on the availability of those natural resources (water, minerals, energy, and biota) that are critical to maintaining the Nation's infrastructure or that may become less available because of urban expansion. The initiative began with a 3-year demonstration project to develop procedures for assessing resources and methods for interpreting and publishing information in digital and traditional paper formats. The Front Range urban corridor of Colorado was chosen as the demonstration area (fig.
1), and the project was titled the Front Range Infrastructure Resources Project (FRIRP). This report and those of Robson (1996), Robson and others (1998), and Robson and others (2000a, 2000b, 2000c) are the results of FRIRP water-resources investigations; reports pertaining to geology, minerals, energy, biota, and cartography of the FRIRP are published separately. The water resources studies of the FRIRP were undertaken in cooperation with the Colorado Department of Natural Resources, Division of Water Resources, and the Colorado Water Conservation Board.
Improving FHWA's Ability to Assess Highway Infrastructure Health : Pilot Study Report
DOT National Transportation Integrated Search
2012-07-01
This report documents the results of a pilot study conducted as part of a project on improving FHWA's ability to assess highway infrastructure health. As part of the pilot study, a section of Interstate 90 through South Dakota, Minnesota, and Wisco...
Lindberg, D A; Humphreys, B L
1995-01-01
The High-Performance Computing and Communications (HPCC) program is a multiagency federal effort to advance the state of computing and communications and to provide the technologic platform on which the National Information Infrastructure (NII) can be built. The HPCC program supports the development of high-speed computers, high-speed telecommunications, related software and algorithms, education and training, and information infrastructure technology and applications. The vision of the NII is to extend access to high-performance computing and communications to virtually every U.S. citizen so that the technology can be used to improve the civil infrastructure, lifelong learning, energy management, health care, etc. Development of the NII will require resolution of complex economic and social issues, including information privacy. Health-related applications supported under the HPCC program and NII initiatives include connection of health care institutions to the Internet; enhanced access to gene sequence data; the "Visible Human" Project; and test-bed projects in telemedicine, electronic patient records, shared informatics tool development, and image systems. PMID:7614116
Cloud Environment Automation: from infrastructure deployment to application monitoring
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.
2017-10-01
The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for the infrastructure maintainers. In this paper we present the research activity, carried out during the Open City Platform (OCP) research project [1], aimed at designing and developing an automatic tool for cloud-based IaaS deployment. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It aims to research, develop and test new open, interoperable, on-demand technological solutions in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.
NASA Astrophysics Data System (ADS)
Clarke, Peter; Davenhall, Clive; Greenwood, Colin; Strong, Matthew
ESLEA, an EPSRC-funded project, aims to demonstrate the potential benefits of circuit-switched optical networks (lightpaths) to the UK e-Science community. This is being achieved by running a number of "proof of benefit" pilot applications over UKLight, the UK's first national optical research network. UKLight provides a new way for researchers to obtain dedicated "lightpaths" between remote sites and to deploy and test novel networking methods and technologies. It facilitates collaboration on global projects by providing a point of access to the fast growing international optical R&D infrastructure. A diverse range of data-intensive fields of academic endeavour are participating in the ESLEA project; all these groups require the integration of high-bandwidth switched lightpath circuits into their experimental and analysis infrastructure for international transport of high-volume applications data. In addition, network protocol research and development of circuit reservation mechanisms has been carried out to help the pilot applications to exploit the UKLight infrastructure effectively. Further information about ESLEA can be viewed at www.eslea.uklight.ac.uk. ESLEA activities are now coming to an end and work will finish from February to July 2007, depending upon the terms of funding of each pilot application. The first quarter of 2007 is considered the optimum time to hold a closing conference for the project. The objectives of the conference are to: 1. Provide a forum for the dissemination of research findings and learning experiences from the ESLEA project. 2. Enable colleagues from the UK and international e-Science communities to present, discuss and learn about the latest developments in networking technology. 3. Raise awareness about the deployment of the UKLight infrastructure and its relationship to SuperJANET 5. 4. Identify potential uses of UKLight by existing or future research projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doug Cathro
The Lake Charles CCS Project is a large-scale industrial carbon capture and sequestration (CCS) project which will demonstrate advanced technologies that capture and sequester carbon dioxide (CO{sub 2}) emissions from industrial sources into underground formations. Specifically the Lake Charles CCS Project will accelerate commercialization of large-scale CO{sub 2} storage from industrial sources by leveraging synergy between a proposed petroleum coke to chemicals plant (the LCC Gasification Project) and the largest integrated anthropogenic CO{sub 2} capture, transport, and monitored sequestration program in the U.S. Gulf Coast Region. The Lake Charles CCS Project will promote the expansion of EOR in Texas and Louisiana and supply greater energy security by expanding domestic energy supplies. The capture, compression, pipeline, injection, and monitoring infrastructure will continue to sequester CO{sub 2} for many years after the completion of the term of the DOE agreement. The objectives of this project are expected to be fulfilled by working through two distinct phases. The overall objective of Phase 1 was to develop a fully definitive project basis for a competitive Renewal Application process to proceed into Phase 2 - Design, Construction and Operations. Phase 1 includes the studies attached hereto that will establish: the engineering design basis for the capture, compression and transportation of CO{sub 2} from the LCC Gasification Project, and the criteria and specifications for a monitoring, verification and accounting (MVA) plan at the Hastings oil field in Texas.
The overall objective of Phase 2, provided a successful competitive down-selection, is to execute design, construction and operations of three capital projects: (1) the CO{sub 2} capture and compression equipment, (2) a Connector Pipeline from the LCC Gasification Project to the Green Pipeline owned by Denbury and an affiliate of Denbury, and (3) a comprehensive MVA system at the Hastings oil field.
Hydrogen Infrastructure Testing and Research Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2017-04-10
Learn about the Hydrogen Infrastructure Testing and Research Facility (HITRF), where NREL researchers are working on vehicle and hydrogen infrastructure projects that aim to enable more rapid inclusion of fuel cell and hydrogen technologies in the market to meet consumer and national goals for emissions reduction, performance, and energy security. As part of NREL’s Energy Systems Integration Facility (ESIF), the HITRF is designed for collaboration with a wide range of hydrogen, fuel cell, and transportation stakeholders.
Methane Gas Emissions - is Older Infrastructure Leakier?
NASA Astrophysics Data System (ADS)
Wendt, L. P.; Caulton, D.; Zondlo, M. A.; Lane, H.; Lu, J.; Golston, L.; Pan, D.
2015-12-01
Large gains in natural gas production from hydraulic fracturing are reinvigorating the US energy economy. Natural gas is a clean-burning fuel with lower emissions than coal or oil. Studies show that methane (CH4) leaks from natural gas infrastructure vary widely. A broader question is whether methane leak rates might offset the benefits of burning natural gas. Excess methane is a major greenhouse gas, with a radiative forcing roughly 25 times that of CO2 when projected over a 100-year period. An extensive field study of 250 wells in the Marcellus Shale conducted in July 2015 examined the emission rates of this region and identified super-emitters. Spud production data will provide information as to whether older infrastructure is responsible for more of the emissions. Emission rates were quantified by applying an inverse Gaussian plume model to methane measurements made downwind of private well pads. Wells studied were selected by prevailing winds, distance from public roads, and topographical information using commercial (ARCGIS and Google Earth), non-profit (drillinginfo), and government (State of PA) databases. Data were collected from the mobile sensing lab (CH4, CO2 and H2O sensors), as well as from a stationary tower. Emission rates from well pads will be compared to their original production (spud dates) to evaluate whether infrastructure age and total production correlate with the observed leak rates. Very preliminary results show no statistical correlation between well pad production rates and observed leak rates.
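The inverse Gaussian plume inference described above can be sketched as follows. This is a minimal illustration of inverting the standard Gaussian plume equation for the source rate Q; the dispersion coefficients and all numerical values are illustrative assumptions, not the study's actual parameters.

```python
import math

def plume_emission_rate(c_excess, u, sigma_y, sigma_z, y=0.0, z=0.0, H=0.0):
    """Estimate the source emission rate Q (mass/s) from a measured
    excess concentration (mass/m^3), by inverting the Gaussian plume model.

    c_excess         : concentration above background at the receptor
    u                : mean wind speed (m/s)
    sigma_y, sigma_z : lateral/vertical dispersion coefficients (m),
                       normally taken from stability-class curves
    y, z             : crosswind and vertical receptor offsets (m)
    H                : effective release height (m)
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Reflection term: the ground acts as a barrier, so an image source
    # at -H is added to the direct vertical term.
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return c_excess * 2 * math.pi * u * sigma_y * sigma_z / (lateral * vertical)
```

At ground level on the plume centerline (y = z = H = 0) this reduces to Q = pi * u * sigma_y * sigma_z * C, so doubling the measured excess concentration doubles the inferred emission rate.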
Multimodal network models for robust transportation systems.
DOT National Transportation Integrated Search
2009-10-01
Since transportation infrastructure projects have a lifetime of many decades, project developers must consider : not only the current demand for the project but also the future demand. Future demand is of course uncertain and should : be treated as s...
15 CFR 292.4 - Information infrastructure projects.
Code of Federal Regulations, 2011 CFR
2011-01-01
... authority of the governing or managing organization to conduct the proposed activities; qualifications of... each solicitation for unique projects. (b) Project objective. The purpose of these projects is to... funding in connection with that award. Renewal of an award to increase funding or extend the period of...
Enabling cross-disciplinary research by linking data to Open Access publications
NASA Astrophysics Data System (ADS)
Rettberg, N.
2012-04-01
OpenAIREplus focuses on the linking of research data to associated publications. The interlinking of research objects has implications for optimising the research process, allowing the sharing, enrichment and reuse of data, and ultimately serving to make open data an essential part of first class research. The growing call for more concrete data management and sharing plans, apparent at funder and national level, is complemented by the increasing support for a scientific infrastructure that supports the seamless access to a range of research materials. This paper will describe the recently launched OpenAIREplus and will detail how it plans to achieve its goals of developing an Open Access participatory infrastructure for scientific information. OpenAIREplus extends the current collaborative OpenAIRE project, which provides European researchers with a service network for the deposit of peer-reviewed FP7 grant-funded Open Access publications. This new project will focus on opening up the infrastructure to data sources from subject-specific communities to provide metadata about research data and publications, facilitating the linking between these objects. The ability to link within a publication out to a citable database, or other research data material, is fairly innovative and this project will enable users to search, browse, view, and create relationships between different information objects. In this regard, OpenAIREplus will build on prototypes of so-called "Enhanced Publications", originally conceived in the DRIVER-II project. OpenAIREplus recognizes the importance of representing the context of publications and datasets, thus linking to resources about the authors, their affiliation, location, project data and funding. The project will explore how links between text-based publications and research data are managed in different scientific fields. 
This complements a previous study in OpenAIRE on current disciplinary practices and future needs for infrastructural Open Access services, taking into account the variety within research approaches. Adopting Linked Data mechanisms on top of citation and content mining, it will approach the interchange of data between generic infrastructures such as OpenAIREplus and subject specific service providers. The paper will also touch on the other challenges envisaged in the project with regard to the culture of sharing data, as well as IPR, licensing and organisational issues.
NASA Astrophysics Data System (ADS)
Hurford, Anthony; Harou, Julien
2015-04-01
Climate change has challenged conventional methods of planning water resources infrastructure investment, which rely on the stationarity of time-series data. It is not clear how best to use projections of future climatic conditions. Many-objective simulation-optimisation and trade-off analysis using evolutionary algorithms have been proposed as an approach to addressing complex planning problems with multiple conflicting objectives. The search for promising assets and policies can be carried out across a range of climate projections, to identify the configurations of infrastructure investment shown by model simulation to be robust under diverse future conditions. Climate projections can be used in different ways within a simulation model to represent the range of possible future conditions and to understand how optimal investments vary under different hydrological conditions. We compare two approaches: optimising over an ensemble of different 20-year flow and PET time-series projections, and optimising separately for individual future scenarios built synthetically from the original ensemble. Comparing the trade-off curves and surfaces generated by the two approaches helps to understand the limits and benefits of optimising under different sets of conditions. The comparison is made for the Tana Basin in Kenya, where climate change combined with multiple conflicting objectives of water management and infrastructure investment makes decision-making particularly challenging.
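The trade-off curves and surfaces mentioned above come from extracting non-dominated (Pareto-optimal) solutions among the evaluated investment configurations. A minimal sketch of that filtering step, for objectives to be minimised, is below; the evolutionary search that generates candidate solutions (e.g. a many-objective algorithm) is not shown.

```python
def dominates(a, b):
    """True if objective vector a is at least as good as b on every
    (minimised) objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors.
    These points form the trade-off curve/surface to present to decision-makers."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]
```

For example, pareto_front([(1, 4), (2, 2), (4, 1), (3, 3)]) keeps the first three points and drops (3, 3), which (2, 2) dominates. Running this filter on results pooled over an ensemble, versus per scenario, is one way to compare the two approaches described above.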
The computing and data infrastructure to interconnect EEE stations
NASA Astrophysics Data System (ADS)
Noferini, F.; EEE Collaboration
2016-07-01
The Extreme Energy Event (EEE) experiment is devoted to the search of high energy cosmic rays through a network of telescopes installed in about 50 high schools distributed throughout the Italian territory. This project requires a peculiar data management infrastructure to collect data registered in stations very far from each other and to allow a coordinated analysis. Such an infrastructure is realized at INFN-CNAF, which operates a Cloud facility based on the OpenStack open-source Cloud framework and provides Infrastructure as a Service (IaaS) for its users. In 2014 EEE started to use it for collecting, monitoring and reconstructing the data acquired in all the EEE stations. For the synchronization between the stations and the INFN-CNAF infrastructure we used BitTorrent Sync, free peer-to-peer software designed to optimize data synchronization between distributed nodes. All data folders are synchronized with the central repository in real time to allow an immediate reconstruction of the data and their publication in a monitoring webpage. We present the architecture and the functionalities of this data management system that provides a flexible environment for the specific needs of the EEE project.
Project evaluation process manual
DOT National Transportation Integrated Search
1997-07-01
Describes the process for evaluating airport environments, safety standards, airport infrastructure, licensing standards and multitransportational systems. The project rating system is intended to be used for determining state and federal funding.
Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project
NASA Astrophysics Data System (ADS)
Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.
2015-12-01
During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interferences with TDAQ operations and it guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.
WATER SUPPLY PIPE REPLACEMENT CONSIDERING SUSTAINABLE TRANSITION TO POPULATION DECREASED SOCIETY
NASA Astrophysics Data System (ADS)
Hosoi, Yoshihiko; Iwasaki, Yoji; Aklog, Dagnachew; Masuda, Takanori
Social infrastructure in Japan is aging while the population is decreasing. Aged infrastructure must be renewed and, at the same time, moved into a new framework suited to a society with a declining population. Furthermore, it must continue to provide sufficient service during the transition period in which renewal projects are carried out. The authors propose sustainable "soft landing" management of infrastructure and apply it to water supply pipe replacement in this study. A methodology was developed for replacing aged pipes that not only targets a new water supply network suited to reduced-population conditions but also ensures supply service and feasibility while the project is carried out. It is applied to a model water supply network and the results are discussed.
Ugolini, Donatella; Neri, Monica; Bennati, Luca; Canessa, Pier Aldo; Casanova, Georgia; Lando, Cecilia; Leoncini, Giacomo; Marroni, Paola; Parodi, Barbara; Simonassi, Claudio; Bonassi, Stefano
2012-03-01
Advances in molecular epidemiology and translational research have led to the need for biospecimen collection. The Cancer of the Respiratory Tract (CREST) biorepository is concerned with pleural malignant mesothelioma (MM) and lung cancer (LC). The biorepository staff has collected demographic and epidemiological data directly from consenting subjects using a structured questionnaire, in agreement with The Public Population Project in Genomics (P(3)G). Clinical and follow-up data were collected. Sample data were also recorded. The architecture is based on a database designed with Microsoft Access. Data standardization was carried out to conform with established conventions or procedures. As of January 31, 2011, the overall number of recruited subjects was 1,857 (454 LC, 245 MM, 130 other cancers and 1,028 controls). Due to its infrastructure, CREST was able to join international projects, sharing samples and/or data with other research groups in the field. The data management system allows CREST to be involved, through a minimum data set, in the national project for the construction of the Italian network of Oncologic BioBanks (RIBBO), and in the infrastructure of a pan-European biobank network (BBMRI). The CREST biorepository is a valuable tool for translational studies on respiratory tract diseases, because of its simple and efficient infrastructure.
The centrality of meta-programming in the ES-DOC eco-system
NASA Astrophysics Data System (ADS)
Greenslade, Mark
2017-04-01
The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is improvements in the documentation experience and broadening the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward engineers artefacts as diverse as: class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets …etc.
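The meta-programming idea sketched above, forward-engineering concrete artefacts such as class hierarchies from a declarative meta-model, can be pictured with a toy example. The meta-model contents and names below are invented for illustration and are not the actual ES-DOC data models or controlled vocabularies.

```python
def build_classes(meta_model):
    """Forward-engineer a Python class hierarchy from a declarative meta-model.

    meta_model maps a class name to (base class name or None, attribute list).
    The same declarative description could equally be rendered as other
    artefacts (documents, spreadsheets, vocabulary terms), which is the
    decoupling of representation from encoding described above.
    """
    classes = {}
    for name, (base, attrs) in meta_model.items():  # bases must come first
        parent = classes.get(base, object)
        # type(name, bases, namespace) creates a class dynamically.
        classes[name] = type(name, (parent,), {a: None for a in attrs})
    return classes

# Hypothetical fragment of a documentation meta-model:
META = {
    "Document": (None, ["doc_id", "version"]),
    "Experiment": ("Document", ["objectives"]),
}
classes = build_classes(META)
```

Here classes["Experiment"] inherits from classes["Document"], so a single edit to the meta-model propagates to every generated artefact.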
EuCARD 2010: European coordination of accelerator research and development
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2010-09-01
Accelerators are basic tools of the experimental physics of elementary particles, nuclear physics, and fourth-generation light sources. They are also used in myriad other applications in research, industry and medicine. For example, transmutation techniques for nuclear waste from the nuclear power and atomic industries are being intensely developed. The European Union invests in the development of accelerator infrastructures inside the framework programs to build the European Research Area. The aim is to build new accelerator research infrastructures, develop the existing ones, and generally make the infrastructures more available to competent users. The paper summarizes the first year of activities of the EU FP7 Project Capacities EuCARD - European Coordination of Accelerator R&D. EuCARD is a common venture of 37 European Accelerator Laboratories, Institutes, Universities and Industrial Partners involved in accelerator sciences and technologies. The project, initiated by ESGARD, is an Integrating Activity co-funded by the European Commission under Framework Program 7 - Capacities for a duration of four years, starting April 1st, 2009. Several teams from Poland participate actively in this project. The contribution from Polish research teams concerns: photonic and electronic measurement-control systems, RF-gun co-design, thin-film superconducting technology, superconducting transport infrastructures, and photon and particle beam measurements and control.
OpenCMISS: a multi-physics & multi-scale computational infrastructure for the VPH/Physiome project.
Bradley, Chris; Bowery, Andy; Britten, Randall; Budelmann, Vincent; Camara, Oscar; Christie, Richard; Cookson, Andrew; Frangi, Alejandro F; Gamage, Thiranja Babarenda; Heidlauf, Thomas; Krittian, Sebastian; Ladd, David; Little, Caton; Mithraratne, Kumar; Nash, Martyn; Nickerson, David; Nielsen, Poul; Nordbø, Oyvind; Omholt, Stig; Pashaei, Ali; Paterson, David; Rajagopal, Vijayaraghavan; Reeve, Adam; Röhrle, Oliver; Safaei, Soroush; Sebastián, Rafael; Steghöfer, Martin; Wu, Tim; Yu, Ting; Zhang, Heye; Hunter, Peter
2011-10-01
The VPH/Physiome Project is developing the model encoding standards CellML (cellml.org) and FieldML (fieldml.org) as well as web-accessible model repositories based on these standards (models.physiome.org). Freely available open source computational modelling software is also being developed to solve the partial differential equations described by the models and to visualise results. The OpenCMISS code (opencmiss.org), described here, has been developed by the authors over the last six years to replace the CMISS code that has supported a number of organ system Physiome projects. OpenCMISS is designed to encompass multiple sets of physical equations and to link subcellular and tissue-level biophysical processes into organ-level processes. In the Heart Physiome project, for example, the large deformation mechanics of the myocardial wall need to be coupled to both ventricular flow and embedded coronary flow, and the reaction-diffusion equations that govern the propagation of electrical waves through myocardial tissue need to be coupled with equations that describe the ion channel currents that flow through the cardiac cell membranes. In this paper we discuss the design principles and distributed memory architecture behind the OpenCMISS code. We also discuss the design of the interfaces that link the sets of physical equations across common boundaries (such as fluid-structure coupling), or between spatial fields over the same domain (such as coupled electromechanics), and the concepts behind CellML and FieldML that are embodied in the OpenCMISS data structures. We show how all of these provide a flexible infrastructure for combining models developed across the VPH/Physiome community.
Terra Populus and DataNet Collaboration
NASA Astrophysics Data System (ADS)
Kugler, T.; Ruggles, S.; Fitch, C. A.; Clark, P. D.; Sobek, M.; Van Riper, D.
2012-12-01
Terra Populus, part of NSF's new DataNet initiative, is developing organizational and technical infrastructure to integrate, preserve, and disseminate data describing changes in the human population and environment over time. Terra Populus will incorporate large microdata and aggregate census datasets from the United States and around the world, as well as land use, land cover, climate and other environmental datasets. These data are widely dispersed, exist in a variety of data structures, have incompatible or inadequate metadata, and have incompatible geographic identifiers. Terra Populus is developing methods of integrating data from different domains and translating across data structures based on spatio-temporal linkages among data contents. The new infrastructure will enable researchers to identify and merge data from heterogeneous sources to study the relationships between human behavior and the natural world. Terra Populus will partner with data archives, data producers, and data users to create a sustainable international organization that will guarantee preservation and access over multiple decades. Terra Populus is also collaborating with the other projects in the DataNet initiative - DataONE, the DataNet Federation Consortium (DFC) and Sustainable Environment-Actionable Data (SEAD). Taken together, the four projects address aspects of the entire data lifecycle, including planning, collection, documentation, discovery, integration, curation, preservation, and collaboration; and encompass a wide range of disciplines including earth sciences, ecology, social sciences, hydrology, oceanography, and engineering. The four projects are pursuing activities to share data, tools, and expertise between pairs of projects as well as collaborating across the DataNet program on issues of cyberinfrastructure and community engagement. 
Topics to be addressed through program-wide collaboration include technical, organizational, and financial sustainability; semantic integration; data management training and education; and cross-disciplinary awareness of data resources.
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
NASA Astrophysics Data System (ADS)
Beranzoli, L.; Best, M.; Embriaco, D.; Favali, P.; Juniper, K.; Lo Bue, N.; Lara-Lopez, A.; Materia, P.; Ó Conchubhair, D.; O'Rourke, E.; Proctor, R.; Weller, R. A.
2017-12-01
Understanding the effects of multiple drivers on marine ecosystems at various scales, from regional (such as climate and ocean circulation) to local (such as seafloor gas emissions and harmful underwater noise), requires long time-series of integrated and standardised datasets. Large-scale research infrastructures for ocean observation are able to provide such time-series for a variety of ocean process physical parameters (mass and energy exchanges among surface, water column and benthic boundary layer) that constitute important and necessary measures of environmental conditions and change/development over time. Information deduced from these data is essential for the study, modelling and prediction of marine ecosystem changes and can reveal and potentially confirm deterioration and threats. The COOPLUS European Commission project brings together research infrastructures (RIs) with the aim of coordinating multilateral cooperation among RIs and identifying common priorities, actions, instruments and resources. COOPLUS will produce a Strategic Research and Innovation Agenda (SRIA), which will be a shared roadmap for mid- to long-term collaboration. In particular, the marine RIs collaborating in COOPLUS, namely the European Multidisciplinary Seafloor and water column Observatory (EMSO, Europe), the Ocean Observatories Initiative (OOI, USA), Ocean Networks Canada (ONC), and the Integrated Marine Observing System (IMOS, Australia), can represent a source of important data for researchers of marine ecosystems. The RIs can then, in turn, receive suggestions from researchers for implementing new measurements and stimulating cross-cutting collaborations and data integration and standardisation from their user community. This poster provides a description of EMSO, OOI, ONC and IMOS for the benefit of marine ecosystem studies and presents examples of where the analyses of time-series have revealed noteworthy environmental conditions, temporal trends and events.
Publication of sensor data in the long-term environmental monitoring infrastructure TERENO
NASA Astrophysics Data System (ADS)
Stender, V.; Schroeder, M.; Klump, J. F.
2014-12-01
Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centers of the Helmholtz Association. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved and on the availability of the captured data. Data discovery and dissemination are facilitated not only through data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data provided by the web services of the individual observatories and offers tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes data available through standard web services. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also publishes time series of environmental sensor data through the online research data publication platform DataCite. The metadata required by DataCite are created in an automated process by extracting information from the SWE SensorML to create ISO 19115 compliant metadata. The GFZ data management tool kit panMetaDocs is used to register Digital Object Identifiers (DOI) and preserve file based datasets. In addition to DOIs, the International Geo Sample Number (IGSN) is used to uniquely identify research specimens.
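The automated metadata step described above can be pictured as a field mapping from a parsed sensor description to the publication platform's schema. The field names below are hypothetical simplifications for illustration, not the actual SensorML or DataCite schemas, and the sensor identifier is invented.

```python
def sensor_to_citation_metadata(sensor_record):
    """Map a simplified, hypothetical parsed sensor record to a minimal
    citation-style metadata dict of the kind a DOI registration needs."""
    return {
        "creators": sensor_record["operator"],
        "title": f"Time series from sensor {sensor_record['sensor_id']}",
        "publisher": sensor_record["data_centre"],
        "publicationYear": sensor_record["deployment_year"],
        "resourceType": "Dataset",
    }

# Invented example record standing in for parsed SensorML output:
record = {
    "operator": "GFZ German Research Centre for Geosciences",
    "sensor_id": "TERENO-NE-042",       # hypothetical identifier
    "data_centre": "TERENO Northeast",
    "deployment_year": 2014,
}
metadata = sensor_to_citation_metadata(record)
```

Automating this mapping is what lets each of the 200+ instruments' time series be registered without manual metadata entry.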
Multi-objective optimization of GENIE Earth system models.
Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J
2009-07-13
The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
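The surrogate-assisted multi-objective loop sketched in the abstract can be illustrated with a toy example; the two-objective "simulation" and the Pareto filter below are stand-ins chosen for readability, not the GENIE code or its response surface machinery:

```python
# Illustrative sketch (not the GENIE implementation): the high-throughput step
# evaluates many candidate parameter settings, then a Pareto filter keeps the
# non-dominated trade-offs between two objectives (both minimised). The toy
# simulate() function is an assumption standing in for an Earth system run.
import random

def simulate(x):
    # Toy two-objective "model": competing error measures with a trade-off.
    return (x ** 2, (x - 2) ** 2)

def pareto_front(points):
    # Keep points not dominated by any other point (minimisation in both).
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

random.seed(0)
candidates = [random.uniform(-1, 3) for _ in range(50)]
evaluated = [simulate(x) for x in candidates]   # the "high-throughput" step
front = pareto_front(evaluated)                 # trade-off set for the next RSM
```

In the actual algorithm, the Pareto set would guide the next iteration of response surface building and searching; here only the dominance filtering is shown.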
Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Rodila, D.; Bacu, V.; Gorgan, D.
2012-04-01
The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially because of the large amounts of Geospatial data these applications require and the large geographical areas they cover. Parallelization addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, and Multicore offer the functionality needed to solve important problems in the Earth Science domain: storage, distribution, management, processing, and security of Geospatial data, and the execution of complex processing through task and data parallelism. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, together with standardized and specialized tools for storing, analyzing, processing, and visualizing the Geospatial data concerning this area. To achieve these objectives, enviroGRIDS executes different Earth Science applications, such as hydrological models and Geospatial Web services standardized by the Open Geospatial Consortium (OGC), on parallel and distributed architectures to maximize performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of Geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3].
The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing remains a paradigm with critical open problems despite great effort and investment. It offers a new way of delivering resources while drawing on a large set of old and new technologies and tools to provide the necessary functionality. The main challenges in Cloud computing, many of them also identified in the Open Cloud Manifesto (2009), concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, and more. We propose a platform able to execute different Geospatial applications on different parallel and distributed architectures such as Grid, Cloud, and Multicore, with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost, and other constraints. Redirection of execution to a selected architecture is realized through a specialized component, with the purpose of offering a flexible way to achieve the best performance under the existing restrictions.
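The architecture-selection component described above might look like the following rule-based sketch; the job attributes, thresholds, and rules are purely illustrative assumptions, not enviroGRIDS internals:

```python
# Hedged sketch of a "specialized component" that redirects a Geospatial job
# to Grid, Cloud, or Multicore. All attribute names and thresholds below are
# hypothetical, chosen only to show the dispatch pattern.
def select_architecture(job):
    if job.get("data_sensitive"):          # e.g. licensing/security constraints
        return "Multicore"                 # keep processing on local hardware
    if job.get("cpu_hours", 0) > 1000:     # large batch of independent runs
        return "Grid"
    if job.get("elastic_demand"):          # bursty, pay-per-use workload
        return "Cloud"
    return "Multicore"                     # small jobs stay local by default

swat_batch = {"cpu_hours": 5000}           # e.g. a SWAT calibration ensemble
print(select_architecture(swat_batch))     # → Grid
```

A production component would of course weigh cost, queue state, and measured performance rather than fixed rules; the sketch only shows where such a decision point sits in the platform.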
Chervenak, Ann L; van Erp, Theo G M; Kesselman, Carl; D'Arcy, Mike; Sobell, Janet; Keator, David; Dahm, Lisa; Murry, Jim; Law, Meng; Hasso, Anton; Ames, Joseph; Macciardi, Fabio; Potkin, Steven G
2012-01-01
Progress in our understanding of brain disorders increasingly relies on the costly collection of large standardized brain magnetic resonance imaging (MRI) data sets. Moreover, the clinical interpretation of brain scans benefits from compare and contrast analyses of scans from patients with similar, and sometimes rare, demographic, diagnostic, and treatment status. A solution to both needs is to acquire standardized, research-ready clinical brain scans and to build the information technology infrastructure to share such scans, along with other pertinent information, across hospitals. This paper describes the design, deployment, and operation of a federated imaging system that captures and shares standardized, de-identified clinical brain images in a federation across multiple institutions. In addition to describing innovative aspects of the system architecture and our initial testing of the deployed infrastructure, we also describe the Standardized Imaging Protocol (SIP) developed for the project and our interactions with the Institutional Review Board (IRB) regarding handling patient data in the federated environment.
Chervenak, Ann L.; van Erp, Theo G.M.; Kesselman, Carl; D’Arcy, Mike; Sobell, Janet; Keator, David; Dahm, Lisa; Murry, Jim; Law, Meng; Hasso, Anton; Ames, Joseph; Macciardi, Fabio; Potkin, Steven G.
2015-01-01
PMID:22941984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marco Carvalho; Richard Ford
2012-05-14
Supervisory Control and Data Acquisition (SCADA) systems are a type of Industrial Control System characterized by the centralized (or hierarchical) monitoring and control of geographically dispersed assets. SCADA systems combine acquisition and network components to provide data gathering, transmission, and visualization for centralized monitoring and control. However, these integrated capabilities, especially when built over legacy systems and protocols, generally result in vulnerabilities that can be exploited by attackers, with potentially disastrous consequences. Our research project proposal was to investigate new approaches for secure and survivable SCADA systems. In particular, we were interested in the resilience and adaptability of large-scale mission-critical monitoring and control infrastructures. Our research proposal was divided into two main tasks. The first task centered on the design and investigation of algorithms for survivable SCADA systems and a prototype framework demonstration. The second task centered on the characterization and demonstration of the proposed approach in illustrative scenarios (simulated or emulated).
On the Storm Surge and Sea Level Rise Projections for Infrastructure Risk Analysis and Adaptation
Storm surge can cause coastal hydrology changes, flooding, water quality changes, and even inundation of low-lying terrain. Strong wave actions and disruptive winds can damage water infrastructure and other environmental assets (hazardous and solid waste management facilities, w...
DOT National Transportation Integrated Search
2014-05-01
This project seeks to develop a rapidly deployable, low-cost, and wireless system for bridge : weigh-in-motion (BWIM) and nondestructive evaluation (NDE). The system is proposed to : assist in monitoring transportation infrastructure safety, for the ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
... information about electricity infrastructure's current and projected communications requirements, as well as...'s electricity infrastructure need to employ adequate communications technologies that serve their... Smart Grid and the other technologies that will evolve and change how electricity is produced, consumed...
Information Infrastructure Sourcebook.
ERIC Educational Resources Information Center
Kahin, Brian, Ed.
This volume is designed to provide planners and policymakers with a single volume reference book on efforts to define and develop policy for the National Information Infrastructure. The sourcebook is divided into five sections: (1) official documents; (2) vision statements and position papers; (3) program and project descriptions (all sectors);…
DOT National Transportation Integrated Search
2015-09-23
This research project aimed to develop a remote sensing system capable of rapidly identifying fine-scale damage to critical transportation infrastructure following hazard events. Such a system must be pre-planned for rapid deployment, automate proces...