Sample records for infrastructure analysis presentation

  1. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
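
    The cash flow perspective described above (infrastructure costs, financing assumptions, and revenue over time) can be illustrated with a very small net-present-value calculation. This is only a generic sketch: the figures and the simple NPV formula are assumptions made here for illustration and are not taken from the NREL models.

```python
# Minimal discounted cash flow sketch for a hypothetical station.
# All figures are illustrative assumptions, not values from the NREL models.

def net_present_value(capital_cost, annual_revenue, annual_opex, discount_rate, years):
    """Discount each year's net cash flow to present value and subtract the upfront capital cost."""
    npv = -capital_cost
    for year in range(1, years + 1):
        npv += (annual_revenue - annual_opex) / (1.0 + discount_rate) ** year
    return npv

if __name__ == "__main__":
    # Hypothetical station: $1.5M capex, $400k/yr revenue, $250k/yr O&M, 8% discount rate, 15-year horizon.
    print(round(net_present_value(1_500_000, 400_000, 250_000, 0.08, 15), 2))
```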

  2. National Water Infrastructure Adaptation Assessment, Part I: Climate Change Adaptation Readiness Analysis

    EPA Science Inventory

    The report “National Water Infrastructure Adaptation Assessment” comprises four parts (Parts I to IV), each in an independent volume. The Part I report presented herein describes a preliminary regulatory and technical analysis of water infrastructure and regulations in the ...

  3. Overview of Infrastructure Science and Analysis for Homeland Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backhaus, Scott N.

    This presentation offers an analysis of infrastructure science with goals to provide third-party independent science based input into complex problems of national concern and to use scientific analysis to "turn down the noise" around complex problems.

  4. Connectivity and Resilience: A Multidimensional Analysis of Infrastructure Impacts in the Southwestern Amazon

    ERIC Educational Resources Information Center

    Perz, Stephen G.; Shenkin, Alexander; Barnes, Grenville; Cabrera, Liliana; Carvalho, Lucas A.; Castillo, Jorge

    2012-01-01

    Infrastructure is a worldwide policy priority for national development via regional integration into the global economy. However, economic, ecological and social research draws contrasting conclusions about the consequences of infrastructure. We present a synthetic approach to the study of infrastructure, focusing on a multidimensional treatment…

  5. Utilizing Semantic Big Data for realizing a National-scale Infrastructure Vulnerability Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinthavali, Supriya; Shankar, Mallikarjun

    Critical Infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CI interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches on vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) constructing and managing a CI network-of-networks graph, (2) performing analytic operations at scale, and (3) interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline to realize a national-scale network-based vulnerability analysis system.

  6. Lessons for the Global Spatial Data Infrastructure : international case study analysis

    DOT National Transportation Integrated Search

    2002-01-01

    This report presents a RAND analysis of international collaboration for the Global Spatial Data Infrastructure (GSDI). Ten in-depth international and regional collaboration case studies were conducted to assess lessons learned for GSDI development an...

  7. California Plug-In Electric Vehicle Infrastructure Projections: 2017-2025 - Future Infrastructure Needs for Reaching the State's Zero Emission-Vehicle Deployment Goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Bedir, Abdulkadir

    This report analyzes plug-in electric vehicle (PEV) infrastructure needs in California from 2017 to 2025 in a scenario where the State's zero-emission vehicle (ZEV) deployment goals are achieved by household vehicles. The statewide infrastructure needs are evaluated by using the Electric Vehicle Infrastructure Projection tool, which incorporates representative statewide travel data from the 2012 California Household Travel Survey. The infrastructure solution presented in this assessment addresses two primary objectives: (1) enabling travel for battery electric vehicles and (2) maximizing the electric vehicle-miles traveled for plug-in hybrid electric vehicles. The analysis is performed at the county level for each year between 2017 and 2025 while considering potential technology improvements. The results from this study present an infrastructure solution that can facilitate market growth for PEVs to reach the State's ZEV goals by 2025. The overall results show a need for 99k-130k destination chargers, including workplaces and public locations, and 9k-25k fast chargers. The results also show a need for dedicated or shared residential charging solutions at multi-family dwellings, which are expected to host about 120k PEVs by 2025. As an improvement to the scientific literature, this analysis presents the significance of infrastructure reliability and accessibility for the quantification of charger demand.

  8. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructure to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure and the importance of the infrastructure, as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, where the risk to a primary road, water supply and power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful tool for screening of undesirable events, with the ultimate goal of reducing societal vulnerability.
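
    As a rough illustration of the level 2 semi-quantitative screening described above, the sketch below ranks vulnerability indicators on a 1-5 scale and combines them with a hazard frequency and the number of users into a relative risk score. The indicator names, the normalisation, and the aggregation rule are assumptions made here for illustration, not the scoring rules from the paper.

```python
# Illustrative sketch of an indicator-based, semi-quantitative risk screening.
# Indicators are ranked 1-5; the aggregation rule below is an assumption for illustration.

def screen_scenario(hazard_frequency_per_year, users, indicators):
    """Combine hazard frequency, number of users, and 1-5 vulnerability indicators
    into a single relative risk score for ranking scenarios."""
    for name, score in indicators.items():
        if not 1 <= score <= 5:
            raise ValueError(f"indicator {name} must be ranked on a 1-5 scale")
    vulnerability = sum(indicators.values()) / (5 * len(indicators))  # normalised to 0.2-1.0
    return hazard_frequency_per_year * users * vulnerability

landslide_on_road = screen_scenario(
    hazard_frequency_per_year=0.05,   # assumed return period of 20 years
    users=12_000,                     # assumed daily users of the road
    indicators={"robustness": 4, "importance": 5, "interdependency": 3},
)
print(f"relative risk score: {landslide_on_road:.1f}")
```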

  9. Transportation Community Institutional Infrastructure Study : Volume 1. National Transportation Needs Mail Survey.

    DOT National Transportation Integrated Search

    1976-04-01

    The results of the Transportation Community Infrastructure Study are presented as a three-volume series. This series presents a surveyed priority of topics for information exchange, a case study of a proposed training program, and an analysis of the tr...

  10. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  11. Application of GIS in exploring spatial dimensions of Efficiency in Competitiveness of Regions

    NASA Astrophysics Data System (ADS)

    Rahmat, Shahid; Sen, Joy

    2017-04-01

    Infrastructure is an important component in building the competitiveness of a region. The present global scenario of economic slowdown is driven by a slump in demand for goods and services and by the decreasing capacity of government institutions to invest in public infrastructure. A strategy for augmenting the competitiveness of a region can therefore be built around a more efficient distribution of public infrastructure. This efficiency reduces the burden on government institutions and improves the relative output of the region for relatively less investment. A rigorous literature study followed by an expert opinion survey (RIDIT scores) reveals that railway, road, ICT and electricity infrastructure is crucial for the competitiveness of a region. Discussions with experts in the ICT, railway and electricity sectors were conducted to identify the issues, hurdles and possible solutions for the development of these sectors. In an underdeveloped country like India, financial resources for investment in the infrastructure sector are heavily constrained, so judicious planning of resource allocation for infrastructure provision is very important for efficient and sustainable development. Data Envelopment Analysis (DEA) is a mathematical programming optimization tool that measures technical efficiency in the multiple-input and/or multiple-output case by constructing a relative technical efficiency score. This paper uses DEA to identify the efficiency with which the present level of selected infrastructure components (railway, road, ICT and electricity) is utilized to build the competitiveness of the region, and it identifies spatial patterns of infrastructure efficiency with the help of spatial autocorrelation and hot-spot analysis in ArcGIS. The analysis leads to policy implications for the efficient allocation of financial resources for infrastructure provision in the region, building a prerequisite for boosting regional competitiveness.
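
    The DEA step described above can be sketched with a standard input-oriented CCR model solved as a linear program. The region data, the choice of inputs and outputs, and the use of SciPy's linprog are assumptions for illustration; the paper's actual DEA specification may differ.

```python
# Illustrative input-oriented CCR DEA model solved with linear programming.
# Inputs/outputs and region data are invented for the example, not from the paper.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, unit):
    """Return the CCR technical efficiency score (0-1] of decision-making unit `unit`."""
    n_units = inputs.shape[0]
    n_in, n_out = inputs.shape[1], outputs.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n_units + 1)
    c[0] = 1.0                                # minimise theta
    A_ub, b_ub = [], []
    for i in range(n_in):                     # sum_j lambda_j * x_ij <= theta * x_i,unit
        A_ub.append(np.concatenate(([-inputs[unit, i]], inputs[:, i])))
        b_ub.append(0.0)
    for r in range(n_out):                    # sum_j lambda_j * y_rj >= y_r,unit
        A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
        b_ub.append(-outputs[unit, r])
    bounds = [(0, None)] * (n_units + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical regions: inputs = [road km, electricity capacity], output = [gross regional product]
X = np.array([[120.0, 80.0], [150.0, 60.0], [90.0, 90.0]])
Y = np.array([[300.0], [280.0], [310.0]])
for k in range(3):
    print(f"region {k}: efficiency = {dea_efficiency(X, Y, k):.3f}")
```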

  12. Map Matching and Real World Integrated Sensor Data Warehousing (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burton, E.

    2014-02-01

    The inclusion of interlinked temporal and spatial elements within integrated sensor data enables a tremendous degree of flexibility when analyzing multi-component datasets. The presentation illustrates how to warehouse, process, and analyze high-resolution integrated sensor datasets to support complex system analysis at the entity and system levels. The example cases presented utilize in-vehicle sensor system data to assess vehicle performance, while integrating a map matching algorithm to link vehicle data to roads to demonstrate the enhanced analysis possible via interlinking data elements. Furthermore, in addition to the flexibility provided, the examples presented illustrate concepts of maintaining proprietary operational information (Fleet DNA) and privacy of study participants (Transportation Secure Data Center) while producing widely distributed data products. Should real-time operational data be logged at high resolution across multiple infrastructure types, map matched to their associated infrastructure, and distributed employing a similar approach, dependencies between urban infrastructure components could be better understood. This understanding is especially crucial for the cities of the future, where transportation will rely more on grid infrastructure to support its energy demands.
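
    The map matching step mentioned above, linking vehicle data points to roads, can in its simplest form be a nearest-segment search. The naive sketch below treats coordinates as planar and uses invented road segments and GPS points; it is not the algorithm used in the cited work.

```python
# Naive map-matching sketch: snap each GPS point to the nearest road segment by
# perpendicular distance. Coordinates are treated as planar for simplicity; the road
# segments and points are invented. This is not the algorithm used in the cited work.
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment a-b (all 2D tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

roads = {"Main St": ((0, 0), (10, 0)), "1st Ave": ((5, -5), (5, 5))}
gps_trace = [(1.0, 0.3), (5.2, 2.0), (8.9, -0.4)]
for point in gps_trace:
    match = min(roads, key=lambda r: point_segment_distance(point, *roads[r]))
    print(f"{point} -> {match}")
```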

  13. First results from a combined analysis of CERN computing infrastructure metrics

    NASA Astrophysics Data System (ADS)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.

  14. Soak Up the Rain New England Webinar Series: National ...

    EPA Pesticide Factsheets

    Presenters will provide an introduction to the most recent EPA green infrastructure tools for R1 stakeholders and their use in making decisions about implementing green infrastructure. We will discuss structuring your green infrastructure decision, finding appropriate information and tools, evaluating options, and selecting the right Best Management Practices mix for your needs. WMOST (Watershed Management Optimization Support Tool) - for screening a wide range of practices for cost-effectiveness in achieving watershed or water utilities management goals. GIWiz (Green Infrastructure Wizard) - a web application connecting communities to EPA Green Infrastructure tools and resources. Opti-Tool - designed to assist in developing technically sound and optimized cost-effective stormwater management plans. National Stormwater Calculator - a desktop application for estimating the impact of land cover change and green infrastructure controls on stormwater runoff. DASEES-GI (Decision Analysis for a Sustainable Environment, Economy, and Society) - a framework for linking objectives and measures with green infrastructure methods.

  15. Human-Technology Centric In Cyber Security Maintenance For Digital Transformation Era

    NASA Astrophysics Data System (ADS)

    Ali, Firkhan Ali Bin Hamid; Zalisham Jali, Mohd, Dr

    2018-05-01

    Digital transformation within organizations continues to expand, driven by strong demand for ICT services across both government agencies and the private sector. While digital transformation has led manufacturers to incorporate sensors and software analytics into their offerings, the same innovation has also brought pressure to offer clients more accommodating appliance deployment options, so careful planning is needed to implement cyber infrastructure and equipment. Cyber security plays an important role in ensuring that ICT components and infrastructure perform well in support of the organization's business. This paper presents a study of security management models to guide security maintenance of existing cyber infrastructure, combining security workforce considerations with the security processes involved in maintaining that infrastructure. The assessment focuses on cyber security maintenance within security models for cyber infrastructure and presents an approach for theoretical and practical analysis based on selected security management models. The proposed model then evaluates the analysis, which can be used to obtain insights into a configuration and to specify desired and undesired configurations. Cyber security maintenance within the security management model was implemented in a prototype and evaluated for practical and theoretical scenarios. Furthermore, a framework model is presented that allows configuration changes in agile, dynamic cyber infrastructure environments to be evaluated with regard to properties such as vulnerabilities or expected availability. From a security perspective, this evaluation can be used to monitor the security level of a configuration over its lifetime and to indicate degradations.

  16. The NWRA Classification Infrastructure: description and extension to the Discriminant Analysis Flare Forecasting System (DAFFS)

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, Graham; Wagner, Eric

    2018-04-01

    A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
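
    A generic flavour of non-parametric two-population discrimination combined with Bayes' theorem, as described above, is sketched below using kernel density estimates of the class-conditional densities. The toy data, bandwidth, and use of scikit-learn are assumptions for illustration; this is not the NWRA/DAFFS implementation.

```python
# Generic sketch of non-parametric two-population discrimination with Bayes' theorem:
# class-conditional densities are estimated by kernel density estimation and combined
# with prior class rates to yield a probabilistic forecast. Toy data only; this is not
# the NWRA/DAFFS implementation.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
flare_quiet = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))    # parametrised regions
flare_imminent = rng.normal(loc=[2.0, 1.5], scale=1.0, size=(40, 2))

kde_quiet = KernelDensity(bandwidth=0.5).fit(flare_quiet)
kde_imminent = KernelDensity(bandwidth=0.5).fit(flare_imminent)
prior_imminent = len(flare_imminent) / (len(flare_imminent) + len(flare_quiet))

def flare_probability(x):
    """Posterior probability of the 'flare-imminent' population via Bayes' theorem."""
    x = np.atleast_2d(x)
    p_x_imminent = np.exp(kde_imminent.score_samples(x))
    p_x_quiet = np.exp(kde_quiet.score_samples(x))
    numerator = p_x_imminent * prior_imminent
    return numerator / (numerator + p_x_quiet * (1 - prior_imminent))

print(flare_probability([1.8, 1.2]))
```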

  17. A Systems Approach to Develop Sustainable Water Supply Infrastructure and Management

    EPA Science Inventory

    In a visit to Zhejiang University, China, Dr. Y. Jeffrey Yang will discuss in this presentation the system approach for urban water infrastructure sustainability. Through a system analysis, it becomes clear at an urban scale that the energy and water efficiencies of a water supp...

  18. Improving linear transport infrastructure efficiency by automated learning and optimised predictive maintenance techniques (INFRALERT)

    NASA Astrophysics Data System (ADS)

    Jiménez-Redondo, Noemi; Calle-Cordón, Alvaro; Kandler, Ute; Simroth, Axel; Morales, Francisco J.; Reyes, Antonio; Odelius, Johan; Thaduri, Aditya; Morgado, Joao; Duarte, Emmanuele

    2017-09-01

    The on-going H2020 project INFRALERT aims to increase rail and road infrastructure capacity in the current framework of increased transportation demand by developing and deploying solutions to optimise maintenance interventions planning. It includes two real pilots for road and railways infrastructure. INFRALERT develops an ICT platform (the expert-based Infrastructure Management System, eIMS) which follows a modular approach including several expert-based toolkits. This paper presents the methodologies and preliminary results of the toolkits for i) nowcasting and forecasting of asset condition, ii) alert generation, iii) RAMS & LCC analysis and iv) decision support. The results of these toolkits in a meshed road network in Portugal under the jurisdiction of Infraestruturas de Portugal (IP) are presented showing the capabilities of the approaches.

  19. Low carbon technology performance vs infrastructure vulnerability: analysis through the local and global properties space.

    PubMed

    Dawson, David A; Purnell, Phil; Roelich, Katy; Busch, Jonathan; Steinberger, Julia K

    2014-11-04

    Renewable energy technologies, necessary for low-carbon infrastructure networks, are being adopted to help reduce fossil fuel dependence and meet carbon mitigation targets. The evolution of these technologies has progressed based on the enhancement of technology-specific performance criteria, without explicitly considering the wider system (global) impacts. This paper presents a methodology for simultaneously assessing local (technology) and global (infrastructure) performance, allowing key technological interventions to be evaluated with respect to their effect on the vulnerability of wider infrastructure systems. We use exposure of low carbon infrastructure to critical material supply disruption (criticality) to demonstrate the methodology. A series of local performance changes are analyzed; and by extension of this approach, a method for assessing the combined criticality of multiple materials for one specific technology is proposed. Via a case study of wind turbines at both the material (magnets) and technology (turbine generators) levels, we demonstrate that analysis of a given intervention at different levels can lead to differing conclusions regarding the effect on vulnerability. Infrastructure design decisions should take a systemic approach; without these multilevel considerations, strategic goals aimed to help meet low-carbon targets, that is, through long-term infrastructure transitions, could be significantly jeopardized.

  20. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats, selecting an efficient storage format for map reduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  1. Optical fibre multi-parameter sensing with secure cloud based signal capture and processing

    NASA Astrophysics Data System (ADS)

    Newe, Thomas; O'Connell, Eoin; Meere, Damien; Yuan, Hongwei; Leen, Gabriel; O'Keeffe, Sinead; Lewis, Elfed

    2016-05-01

    Recent advancements in cloud computing technologies in the context of optical and optical fibre based systems are reported. The proliferation of real-time and multi-channel based sensor systems represents significant growth in data volume. This, coupled with a growing need for security, presents many challenges and a huge opportunity for an evolutionary step in the widespread application of these sensing technologies. A tiered infrastructural system approach is adopted that is designed to facilitate the delivery of Optical Fibre-based "SENsing as a Service" (SENaaS). Within this infrastructure, novel optical sensing platforms, deployed within different environments, are interfaced with a Cloud-based backbone infrastructure which facilitates the secure collection, storage and analysis of real-time data. Feedback systems, which harness this data to effect a change within the monitored location/environment/condition, are also discussed. The cloud based system presented here can also be used with chemical and physical sensors that require real-time data analysis, processing and feedback.

  2. Risk assessment of the fatality due to explosion in land mass transport infrastructure by fast transient dynamic analysis.

    PubMed

    Giannopoulos, G; Larcher, M; Casadei, F; Solomos, G

    2010-01-15

    The terrorist attacks in New York shocked the world community, clearly showing the vulnerability of air transport to such events. However, the terrorist attacks in Madrid and London showed that land mass transport infrastructure is equally vulnerable to similar attacks. The fact that there has not been substantial investment in risk analysis and in the evaluation of the possible effects of such events on land mass transportation infrastructure leaves large room for new developments that could eventually fill this gap. In the present work, a comprehensive study of land mass infrastructure under explosion events has been carried out using the finite element code EUROPLEXUS. This study includes a train station, a metro station and a metro carriage, providing valuable simulation data for a variety of different situations. For the analysis of these structures it was necessary to apply a laser scanning method for the acquisition of geometrical data, to improve the simulation capabilities of EUROPLEXUS by adding failure capabilities for specific finite elements, to implement new material models (e.g. glass), and to add new modules for data post-processing to calculate the risk of fatal and non-fatal injuries. These improvements are explained in the present work, with emphasis on the newly developed risk analysis features of EUROPLEXUS.

  3. Study on transport infrastructure as mechanism of long-term urban planning strategies

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Martynov, Kirill; Khusnutdinov, Rinat

    2017-10-01

    In this article, the authors study transport infrastructure and develop an algorithm for assessing the quality of transport networks and the connectivity of urban development areas. The results are presented for several central city quarters of Arkhangelsk. The analysis clustered objects (individual quarters of Arkhangelsk) using SOM into comparable groups with a high level of similarity of characteristics within each group, yielding 5 clusters with different levels of transport infrastructure. The novelty of the study lies in the justification of the advantages of applying structural analysis for the qualitative ranking of areas. The advantage of the proposed methodology is that it makes it possible both to compare the transport infrastructure quality of different city quarters and to determine a strategy for its development with a list of specific activities.

  4. Space-Based Information Infrastructure Architecture for Broadband Services

    NASA Technical Reports Server (NTRS)

    Price, Kent M.; Inukai, Tom; Razdan, Rajendev; Lazeav, Yvonne M.

    1996-01-01

    This study addressed four tasks: (1) identify satellite-addressable information infrastructure markets; (2) perform network analysis for space-based information infrastructure; (3) develop conceptual architectures; and (4) economic assessment of architectures. The report concludes that satellites will have a major role in the national and global information infrastructure, requiring seamless integration between terrestrial and satellite networks. The proposed LEO, MEO, and GEO satellite systems have satellite characteristics that vary widely. They include delay, delay variations, poorer link quality and beam/satellite handover. The barriers against seamless interoperability between satellite and terrestrial networks are discussed. These barriers are the lack of compatible parameters, standards and protocols, which are presently being evaluated and reduced.

  5. URBAN-NET: A Network-based Infrastructure Monitoring and Analysis System for Emergency Management and Public Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Sangkeun; Chen, Liangzhe; Duan, Sisi

    Critical Infrastructures (CIs) such as energy, water, and transportation are complex networks that are crucial for sustaining day-to-day commodity flows vital to national security, economic stability, and public safety. The nature of these CIs is such that failures caused by an extreme weather event or a man-made incident can trigger widespread cascading failures, sending ripple effects at regional or even national scales. To minimize such effects, it is critical for emergency responders to identify existing or potential vulnerabilities within CIs during such stressor events in a systematic and quantifiable manner and take appropriate mitigating actions. We present here a novel critical infrastructure monitoring and analysis system named URBAN-NET. The system includes a software stack and tools for monitoring CIs, pre-processing data, interconnecting multiple CI datasets as a heterogeneous network, identifying vulnerabilities through graph-based topological analysis, and predicting consequences based on what-if simulations along with visualization. As a proof-of-concept, we present several case studies to show the capabilities of our system. We also discuss remaining challenges and future work.
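
    The graph-based topological analysis mentioned above can be illustrated with a general-purpose graph library: rank nodes of an interdependency graph by betweenness centrality and simulate single-asset failures. The toy network and the choice of metrics are assumptions for illustration, not necessarily what URBAN-NET computes.

```python
# Minimal sketch of graph-based vulnerability screening on an interdependent-infrastructure
# graph: rank nodes by betweenness centrality and check how removing each one fragments the
# network. The toy graph and the choice of metric are illustrative assumptions.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("substation_A", "pump_station_1"), ("substation_A", "traffic_ctrl_1"),
    ("substation_B", "pump_station_2"), ("pump_station_1", "hospital"),
    ("pump_station_2", "hospital"), ("traffic_ctrl_1", "hospital"),
    ("substation_A", "substation_B"),
])

centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    H = G.copy()
    H.remove_node(node)                       # what-if: this asset fails
    components = nx.number_connected_components(H)
    print(f"{node:15s} centrality={score:.2f} components_after_failure={components}")
```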

  6. On the use of IT investment assessment methods in the area of spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Zwirowicz-Rutkowska, Agnieszka

    2016-06-01

    One of the important issues concerning development of spatial data infrastructures (SDIs) is the carrying out of economic and financial analysis. It is essential to determine expenses and also assess effects resulting from the development and use of infrastructures. Costs and benefits assessment could be associated with assessment of the infrastructure effectiveness and efficiency as well as the infrastructure value, understood as the infrastructure impact on economic aspects of an organisational performance, both of an organisation which realises an SDI project and all users of the infrastructure. The aim of this paper is an overview of various assessment methods of investment as well as an analysis of different types of costs and benefits used for information technology (IT) projects. Based on the literature, the analysis of the examples of the use of these methods in the area of spatial data infrastructures is also presented. Furthermore, the issues of SDI projects and investments are outlined. The results of the analysis indicate usefulness of the financial methods from different fields of management in the area of SDI building, development and use. The author proposes, in addition to the financial methods, the adaptation of the various techniques used for IT investments and their development, taking into consideration the SDI specificity for the purpose of assessment of different types of costs and benefits and integration of financial aspects with non-financial ones. Among the challenges are identification and quantification of costs and benefits, as well as establishing measures which would fit the characteristics of the SDI project and artefacts resulting from the project realisation. Moreover, aspects of subjectivity and variability in time should be taken into account as the consequences of definite goals and policies as well as business context of organisation undertaking the project or using its artefacts and also investors.

  7. Potential impacts of tephra fallout from a large-scale explosive eruption at Sakurajima volcano, Japan

    NASA Astrophysics Data System (ADS)

    Biass, S.; Todde, A.; Cioni, R.; Pistolesi, M.; Geshi, N.; Bonadonna, C.

    2017-10-01

    We present an exposure analysis of infrastructure and lifeline to tephra fallout for a future large-scale explosive eruption of Sakurajima volcano. An eruption scenario is identified based on the field characterization of the last subplinian eruption at Sakurajima and a review of reports of the eruptions that occurred in the past six centuries. A scenario-based probabilistic hazard assessment is performed using the Tephra2 model, considering various eruption durations to reflect complex eruptive sequences of all considered reference eruptions. A quantitative exposure analysis of infrastructures and lifelines is presented primarily using open-access data. The post-event impact assessment of Magill et al. (Earth Planets Space 65:677-698, 2013) after the 2011 VEI 2 eruption of Shinmoedake is used to discuss the vulnerability and the resilience of infrastructures during a future large eruption of Sakurajima. Results indicate a main eastward dispersal, with longer eruption durations increasing the probability of tephra accumulation in proximal areas and reducing it in distal areas. The exposure analysis reveals that 2300 km of road network, 18 km2 of urban area, and 306 km2 of agricultural land have a 50% probability of being affected by an accumulation of tephra of 1 kg/m2. A simple qualitative exposure analysis suggests that the municipalities of Kagoshima, Kanoya, and Tarumizu are the most likely to suffer impacts. Finally, the 2011 VEI 2 eruption of Shinmoedake demonstrated that the already implemented mitigation strategies have increased resilience and improved recovery of affected infrastructures. Nevertheless, the extent to which these mitigation actions will perform during the VEI 4 eruption presented here is unclear and our hazard assessment points to possible damages on the Sakurajima peninsula and the neighboring municipality of Tarumizu.
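
    The scenario-based probabilistic hazard assessment described above amounts to estimating, for each location, the fraction of simulated eruptions in which tephra load exceeds a threshold (e.g. 1 kg/m2). The sketch below uses random numbers as stand-ins for Tephra2 model runs; the grid, distribution, and thresholds are purely illustrative.

```python
# Sketch of turning an ensemble of simulated tephra loads into an exceedance probability map,
# as done in scenario-based probabilistic hazard assessments. The random "simulations" below
# stand in for Tephra2 model runs and are purely illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_runs, grid_shape = 500, (50, 50)
# Stand-in for 500 Tephra2 runs: tephra load (kg/m^2) on a 50x50 grid, lognormally distributed.
loads = rng.lognormal(mean=-1.0, sigma=1.5, size=(n_runs, *grid_shape))

threshold = 1.0                                        # kg/m^2, as in the exposure analysis
exceedance_prob = (loads >= threshold).mean(axis=0)    # fraction of runs exceeding threshold
exposed_cells = (exceedance_prob >= 0.5).sum()
print(f"grid cells with >=50% probability of exceeding {threshold} kg/m^2: {exposed_cells}")
```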

  8. Impact modeling and prediction of attacks on cyber targets

    NASA Astrophysics Data System (ADS)

    Khalili, Aram; Michalk, Brian; Alford, Lee; Henney, Chris; Gilbert, Logan

    2010-04-01

    In most organizations, IT (information technology) infrastructure exists to support the organization's mission. The threat of cyber attacks poses risks to this mission. Current network security research focuses on the threat of cyber attacks to the organization's IT infrastructure; however, the risks to the overall mission are rarely analyzed or formalized. This connection of IT infrastructure to the organization's mission is often neglected or carried out ad-hoc. Our work bridges this gap and introduces analyses and formalisms to help organizations understand the mission risks they face from cyber attacks. Modeling an organization's mission vulnerability to cyber attacks requires a description of the IT infrastructure (network model), the organization mission (business model), and how the mission relies on IT resources (correlation model). With this information, proper analysis can show which cyber resources are of tactical importance in a cyber attack, i.e., controlling them enables a large range of cyber attacks. Such analysis also reveals which IT resources contribute most to the organization's mission, i.e., lack of control over them gravely affects the mission. These results can then be used to formulate IT security strategies and explore their trade-offs, which leads to better incident response. This paper presents our methodology for encoding IT infrastructure, organization mission and correlations, our analysis framework, as well as initial experimental results and conclusions.

  9. Rapid Arctic Changes due to Infrastructure and Climate (RATIC) in the Russian North

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Kofinas, G.; Raynolds, M. K.; Kanevskiy, M. Z.; Shur, Y.; Ambrosius, K.; Matyshak, G. V.; Romanovsky, V. E.; Kumpula, T.; Forbes, B. C.; Khukmotov, A.; Leibman, M. O.; Khitun, O.; Lemay, M.; Allard, M.; Lamoureux, S. F.; Bell, T.; Forbes, D. L.; Vincent, W. F.; Kuznetsova, E.; Streletskiy, D. A.; Shiklomanov, N. I.; Fondahl, G.; Petrov, A.; Roy, L. P.; Schweitzer, P.; Buchhorn, M.

    2015-12-01

    The Rapid Arctic Transitions due to Infrastructure and Climate (RATIC) initiative is a forum developed by the International Arctic Science Committee (IASC) Terrestrial, Cryosphere, and Social & Human working groups for developing and sharing new ideas and methods to facilitate the best practices for assessing, responding to, and adaptively managing the cumulative effects of Arctic infrastructure and climate change. An IASC white paper summarizes the activities of two RATIC workshops at the Arctic Change 2014 Conference in Ottawa, Canada and the 2015 Third International Conference on Arctic Research Planning (ICARP III) meeting in Toyama, Japan (Walker & Pierce, ed. 2015). Here we present an overview of the recommendations from several key papers and posters presented at these conferences with a focus on oil and gas infrastructure in the Russian north and comparison with oil development infrastructure in Alaska. These analyses include: (1) the effects of gas- and oilfield activities on the landscapes and the Nenets indigenous reindeer herders of the Yamal Peninsula, Russia; (2) a study of urban infrastructure in the vicinity of Norilsk, Russia, (3) an analysis of the effects of pipeline-related soil warming on trace-gas fluxes in the vicinity of Nadym, Russia, (4) two Canadian initiatives that address multiple aspects of Arctic infrastructure called Arctic Development and Adaptation to Permafrost in Transition (ADAPT) and the ArcticNet Integrated Regional Impact Studies (IRIS), and (5) the effects of oilfield infrastructure on landscapes and permafrost in the Prudhoe Bay region, Alaska.

  10. Dynamic Collaboration Infrastructure for Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.

    2016-12-01

    Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources" which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure available that can be accessed from environments like HydroShare is increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast number of data and computing infrastructure without needing to correspondingly learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may require processing a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." In this presentation we discuss the results of this proof-of-concept prototype which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.

  11. Simulation and Verification of Synchronous Set Relations in Rewriting Logic

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Munoz, Cesar A.

    2011-01-01

    This paper presents a mathematical foundation and a rewriting logic infrastructure for the execution and property verification of synchronous set relations. The mathematical foundation is given in the language of abstract set relations. The infrastructure consists of an order-sorted rewrite theory in Maude, a rewriting logic system, that enables the synchronous execution of a set relation provided by the user. By using the infrastructure, existing algorithm verification techniques already available in Maude for traditional asynchronous rewriting, such as reachability analysis and model checking, are automatically available to synchronous set rewriting. The use of the infrastructure is illustrated with an executable operational semantics of a simple synchronous language and the verification of temporal properties of a synchronous system.

  12. CALS Infrastructure Analysis. Draft. Volume 21

    DOT National Transportation Integrated Search

    1990-03-01

    This executive overview to the DoD CALS Infrastructure Analysis Report summarizes the Components' current effort to modernize the DoD technical data infrastructure. This infrastructure includes all existing and planned capabilities to acquire, manage...

  13. Spatial aspects of the research on tourist infrastructure with the use of the cartographic method on the basis of Roztoczański National Park

    NASA Astrophysics Data System (ADS)

    Kałamucki, Krzysztof; Kamińska, Anna; Buk, Dorota

    2012-01-01

    The aim of the research was to demonstrate changes in tourist trails and in the distribution of tourist infrastructure spots in the area of Roztoczański National Park and its vicinity. Another, equally important, aim was to assess the usefulness of the cartographic method, and of cartographic presentation methods, in researching tourist infrastructure. The research covered the region of Roztoczański National Park. The following elements of tourist infrastructure were selected for the analysis: linear elements (walking trails, education paths) and spot elements (accommodation, eating places and accompanying facilities). In order to recreate the state of the infrastructure over the last 50 years, it was necessary to analyse the following source material: tourist maps issued as independent publications, maps issued as supplements to tour guides, and aerial photography. Information from text sources was also used, e.g. from tourist guides, leaflets and monographs. The temporal framework was defined as 50 years, from the 1960s until 2009, divided into five 10-year periods. In order to present the state of tourist infrastructure and its spatial and qualitative changes, 6 maps were produced (maps of states and types of changes). The spatial analyses and the interpretation of the maps of states and changes in tourist infrastructure made it possible to capture both qualitative and quantitative changes. The changes in the trails were not regular: some parts of trails did not change for 40 years, while others were constructed during the last decade. Presently, the area is densely covered with tourist trails and education paths. Measurements of the lengths of tourist trails and their parts with regard to land cover and road category allowed the character of the trails and the scope of changes to be determined. The analyses proved the usefulness of cartographic methods in researching tourist infrastructure in its spatial and quantitative aspects.

  14. The diverse use of clouds by CMS

    DOE PAGES

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...

    2015-12-23

    The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We also present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  15. Application of large-scale computing infrastructure for diverse environmental research applications using GC3Pie

    NASA Astrophysics Data System (ADS)

    Maffioletti, Sergio; Dawes, Nicholas; Bavay, Mathias; Sarni, Sofiane; Lehning, Michael

    2013-04-01

    The Swiss Experiment platform (SwissEx: http://www.swiss-experiment.ch) provides a distributed storage and processing infrastructure for environmental research experiments. The aim of the second phase project (the Open Support Platform for Environmental Research, OSPER, 2012-2015) is to develop the existing infrastructure to provide scientists with an improved workflow, including pre-defined, documented and connected processing routines. A large-scale computing and data facility is required to provide reliable and scalable access to data for analysis, and it is desirable that such an infrastructure should be free of traditional data handling methods. Such an infrastructure has been developed using the cloud-based part of the Swiss national infrastructure SMSCG (http://www.smscg.ch) and Academic Cloud. The infrastructure under construction supports two main usage models: 1) Ad-hoc data analysis scripts: simple processing scripts, written by the environmental researchers themselves, which can be applied to large data sets via the high power infrastructure. Examples of this type are spatial statistical analysis scripts (R-based), mostly computed on raw meteorological and/or soil moisture data, providing processed output in the form of a grid, a plot, or a kml. 2) Complex models: a more intense data analysis pipeline centered (initially) around the physical process model Alpine3D and the MeteoIO plugin; depending on the data set, this may require a tightly coupled infrastructure. SMSCG already supports Alpine3D executions as both regular grid jobs and as virtual software appliances. A dedicated appliance with the Alpine3D-specific libraries has been created and made available through the SMSCG infrastructure. The analysis pipelines are activated and supervised by simple control scripts that, depending on the data fetched from the meteorological stations, launch new instances of the Alpine3D appliance, execute location-based subroutines at each grid point and store the results back into the central repository for post-processing. An optional extension of this infrastructure will be to provide a 'ring buffer'-type database infrastructure, such that model results (e.g. test runs made to check parameter dependency or for development) can be visualised and downloaded after completion without submitting them to permanent storage. Data organization: data collected from sensors are archived and classified in distributed sites connected with an open-source software middleware, GSN. Publicly available data are accessible through common web services and via a cloud storage server (based on Swift); collocation of the data and processing in the cloud would eventually eliminate data transfer requirements. Execution control logic: execution of the data analysis pipelines (for both the R-based analysis and the Alpine3D simulations) has been implemented using the GC3Pie framework developed by UZH (https://code.google.com/p/gc3pie/). This allows large-scale, fault-tolerant execution of the pipelines to be described in terms of software appliances, and GC3Pie also allows supervision of the execution of large campaigns of appliances as a single simulation. This poster will present the fundamental architectural components of the data analysis pipelines together with initial experimental results.

  16. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2017-03-01

    This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.

  17. Regional Charging Infrastructure for Plug-In Electric Vehicles: A Case Study of Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric; Raghavan, Sesha; Rames, Clement

    Given the complex issues associated with plug-in electric vehicle (PEV) charging and options in deploying charging infrastructure, there is interest in exploring scenarios of future charging infrastructure deployment to provide insight and guidance to national and regional stakeholders. The complexity and cost of PEV charging infrastructure pose challenges to decision makers, including individuals, communities, and companies considering infrastructure installations. The value of PEVs to consumers and fleet operators can be increased with well-planned and cost-effective deployment of charging infrastructure. This will increase the number of miles driven electrically and accelerate PEV market penetration, increasing the shared value of charging networks to an expanding consumer base. Given these complexities and challenges, the objective of the present study is to provide additional insight into the role of charging infrastructure in accelerating PEV market growth. To that end, existing studies on PEV infrastructure are summarized in a literature review. Next, an analysis of current markets is conducted with a focus on correlations between PEV adoption and public charging availability. A forward-looking case study is then conducted focused on supporting 300,000 PEVs by 2025 in Massachusetts. The report concludes with a discussion of potential methodology for estimating economic impacts of PEV infrastructure growth.

  18. A Delphi-Based Framework for systems architecting of in-orbit exploration infrastructure for human exploration beyond Low Earth Orbit

    NASA Astrophysics Data System (ADS)

    Aliakbargolkar, Alessandro; Crawley, Edward F.

    2014-01-01

    The current debate in the U.S. Human Spaceflight Program focuses on the development of the next generation of man-rated heavy lift launch vehicles. While launch vehicle systems are of critical importance for future exploration, a comprehensive analysis of the entire exploration infrastructure is required to avoid costly pitfalls at early stages of the design process. This paper addresses this need by presenting a Delphi-Based Systems Architecting Framework for integrated architectural analysis of future in-orbit infrastructure for human space exploration beyond Low Earth Orbit. The paper is structured in two parts. The first part consists of an expert elicitation study to identify objectives for the in-space transportation infrastructure. The study was conducted between November 2011 and January 2012 with 15 senior experts involved in human spaceflight in the United States and Europe. The elicitation study included the formation of three expert panels representing exploration, science, and policy stakeholders engaged in a 3-round Delphi study. The rationale behind the Delphi approach, as imported from social science research, is discussed. Finally, a novel version of the Delphi method is presented and applied to technical decision-making and systems architecting in the context of human space exploration. The second part of the paper describes a tradespace exploration study of in-orbit infrastructure coupled with a requirements definition exercise informed by expert elicitation. The uncertainties associated with technical requirements and stakeholder goals are explicitly considered in the analysis. The outcome of the expert elicitation process portrays an integrated view of perceived stakeholder needs within the human spaceflight community. Needs are subsequently converted into requirements and coupled to the system architectures of interest to analyze the correlation between exploration, science, and policy goals. Pareto analysis is used to identify architectures of interest for further consideration by decision-makers. The paper closes with a summary of insights and develops a strategy for evolutionary development of the exploration infrastructure of the incoming decades. The most important result produced by this analysis is the identification of a critical irreducible ambiguity undermining value delivery for the in-space transportation infrastructure of the next three decades: destination choice. Consensus on destination is far from being reached by the community at large, with particular reference to exploration and policy stakeholders. The realization of this ambiguity is a call for NASA to promote an open forum on this topic, and to develop a strong case for policy makers to incentivize investments in the human spaceflight industry in the next decades.

  19. A service-based BLAST command tool supported by cloud infrastructures.

    PubMed

    Carrión, Abel; Blanquer, Ignacio; Hernández, Vicente

    2012-01-01

    Notwithstanding the benefits of distributed-computing infrastructures for empowering bioinformatics analysis tools with the needed computing and storage capability, the actual use of these infrastructures is still low. Learning curves and deployment difficulties have reduced the impact on the wider research community. This article presents a porting strategy for BLAST based on a multiplatform client and a service that provides the same interface as sequential BLAST, thus reducing the learning curve and minimizing the impact on integration into existing workflows. The porting was done using the execution and data access components from the EC project Venus-C and the Windows Azure infrastructure provided in this project. The results obtained demonstrate a low overhead on the global execution framework and reasonable speed-up and cost-efficiency with respect to a sequential version.

  20. Civil infrastructure monitoring for IVHS using optical fiber sensors

    NASA Astrophysics Data System (ADS)

    de Vries, Marten J.; Arya, Vivek; Grinder, C. R.; Murphy, Kent A.; Claus, Richard O.

    1995-01-01

    Early deployment of Intelligent Vehicle Highway Systems would necessitate the internal instrumentation of infrastructure for emergency preparedness. Existing quantitative analysis and visual analysis techniques are time-consuming, cost-prohibitive, and often unreliable. Fiber optic sensors are rapidly replacing conventional instrumentation because of their small size, light weight, immunity to electromagnetic interference, and extremely high information-carrying capability. In this paper, research on novel optical fiber sensing techniques for health monitoring of civil infrastructure such as highways and bridges is reported. Design, fabrication, and implementation of fiber optic sensor configurations used for measurements of strain are discussed. Results from field tests conducted to demonstrate the effectiveness of fiber sensors at determining quantitative strain vector components near crack locations in bridges are presented. Emerging applications of fiber sensors for vehicle flow, vehicle speed, and weigh-in-motion measurements are also discussed.

  1. 76 FR 34286 - ITS Joint Program Office; Webinar on Connected Vehicle Infrastructure Deployment Analysis Report...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... Deployment Analysis Report Review; Notice of Public Meeting AGENCY: Research and Innovative Technology... discuss the Connected Vehicle Infrastructure Deployment Analysis Report. The webinar will provide an... and Transportation Officials (AASHTO) Connected Vehicle Infrastructure Deployment Analysis Report...

  2. Users' perception as a tool to improve urban beach planning and management.

    PubMed

    Cervantes, Omar; Espejel, Ileana; Arellano, Evarista; Delhumeau, Sheila

    2008-08-01

    Four beaches that share physiographic characteristics (sandy, wide, and long) but differ in socioeconomic and cultural terms (three are located in northwestern Mexico and one in California, USA) were evaluated by beach users. Surveys (565) composed of 36 questions were handed out to beach users on weekends and holidays in 2005. The 25 questions that revealed the most information were selected by factor analysis and classified by cluster analysis. Beach users' preferences were assigned a value by comparing the present survey results with the characteristics of an "ideal" recreational urban beach. Cluster analysis separated three groups of questions: (a) services and infrastructure, (b) recreational activities, and (c) beach conditions. Cluster linkage distance (r=0.82, r=0.78, r=0.67) was used as a weight and multiplied by the value of beach descriptive factors. Mazatlán and Oceanside obtained the highest values because they have sufficient infrastructure and services; in contrast, Ensenada and Rosarito were rated medium and low because infrastructure and services are lacking. The presently proposed method can contribute to improving current beach evaluations because the final score represents the beach users' evaluation of the quality of the beach. The weight considered in the present study marks the beach users' preferences among the studied beaches. Adding this weight to beach evaluation will contribute to more specific beach planning in which users' perception is considered.
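
    A rough sketch of the weighting step described above (each question group's score multiplied by its cluster linkage distance) follows; only the three linkage weights come from the abstract, while the per-beach factor scores are invented placeholders.

    ```python
    # Rough sketch of weighting beach descriptive factors by cluster linkage
    # distance, as described in the abstract. Factor scores are placeholders.
    linkage_weights = {
        "services_and_infrastructure": 0.82,
        "recreational_activities": 0.78,
        "beach_conditions": 0.67,
    }

    # Hypothetical normalized scores (0-1) for one beach, by question group.
    beach_scores = {
        "services_and_infrastructure": 0.70,
        "recreational_activities": 0.55,
        "beach_conditions": 0.80,
    }

    weighted_total = sum(
        linkage_weights[group] * score for group, score in beach_scores.items()
    )
    print(f"Weighted beach score: {weighted_total:.2f}")
    ```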

  3. Users' Perception as a Tool to Improve Urban Beach Planning and Management

    NASA Astrophysics Data System (ADS)

    Cervantes, Omar; Espejel, Ileana; Arellano, Evarista; Delhumeau, Sheila

    2008-08-01

    Four beaches that share physiographic characteristics (sandy, wide, and long) but differ in socioeconomic and cultural terms (three are located in northwestern Mexico and one in California, USA) were evaluated by beach users. Surveys (565) composed of 36 questions were handed out to beach users on weekends and holidays in 2005. The 25 questions that revealed the most information were selected by factor analysis and classified by cluster analysis. Beach users’ preferences were assigned a value by comparing the present survey results with the characteristics of an “ideal” recreational urban beach. Cluster analysis separated three groups of questions: (a) services and infrastructure, (b) recreational activities, and (c) beach conditions. Cluster linkage distance ( r = 0.82, r = 0.78, r = 0.67) was used as a weight and multiplied by the value of beach descriptive factors. Mazatlán and Oceanside obtained the highest values because they have sufficient infrastructure and services; in contrast, Ensenada and Rosarito were rated medium and low because infrastructure and services are lacking. The presently proposed method can contribute to improving current beach evaluations because the final score represents the beach users’ evaluation of the quality of the beach. The weight considered in the present study marks the beach users’ preferences among the studied beaches. Adding this weight to beach evaluation will contribute to more specific beach planning in which users’ perception is considered.

  4. The co-integration analysis of relationship between urban infrastructure and urbanization - A case of Shanghai

    NASA Astrophysics Data System (ADS)

    Wang, Qianlu

    2017-10-01

    Urban infrastructure and urbanization influence each other, and quantitative analysis of the relationship between them plays a significant role in promoting social development. Based on data on infrastructure and the proportion of urban population in Shanghai from 1988 to 2013, the paper uses the econometric methods of co-integration testing, an error correction model, and Granger causality testing to empirically analyze the relationship between Shanghai's infrastructure and urbanization. The results show that: 1) Shanghai's urban infrastructure has a positive effect on the development of urbanization and on narrowing the population gap; 2) when short-term fluctuations deviate from the long-term equilibrium, the system pulls the non-equilibrium state back to equilibrium with an adjustment intensity of 0.342670, and hospital infrastructure is not only an important variable for urban development in the short term but also a leading infrastructure in the process of urbanization in Shanghai; 3) there is Granger causality between road infrastructure and urbanization, there is no Granger causality between water infrastructure and urbanization, and the hospital and school components of social infrastructure have unidirectional Granger causality with urbanization.
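
    The econometric steps named in the abstract, co-integration testing and Granger causality testing, can be sketched with statsmodels as below; the two series are synthetic stand-ins for the Shanghai infrastructure and urbanization data.

    ```python
    # Sketch of a co-integration test and a Granger causality test with
    # statsmodels, using synthetic series in place of the Shanghai data.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import coint, grangercausalitytests

    rng = np.random.default_rng(0)
    years = pd.RangeIndex(1988, 2014)
    trend = np.linspace(0, 10, len(years))
    infrastructure = trend + rng.normal(0, 0.5, len(years))        # synthetic
    urbanization = 0.8 * trend + rng.normal(0, 0.5, len(years))    # synthetic

    # Engle-Granger co-integration test: a small p-value suggests a long-run
    # equilibrium relationship between the two series.
    t_stat, p_value, _ = coint(urbanization, infrastructure)
    print(f"co-integration p-value: {p_value:.3f}")

    # Granger causality: does lagged infrastructure help predict urbanization?
    data = pd.DataFrame({"urbanization": urbanization,
                         "infrastructure": infrastructure})
    res = grangercausalitytests(data[["urbanization", "infrastructure"]], maxlag=2)
    print("Granger p-value (lag 2):", round(res[2][0]["ssr_ftest"][1], 3))
    ```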

  5. Long-Term Impacts of Precolonial Institutions, Geography and Ecological Diversity on Access to Public Infrastructure Services in Nigeria

    NASA Astrophysics Data System (ADS)

    Archibong, B.

    2014-12-01

    Do precolonial institutions, geography and ecological diversity affect population access to public infrastructure services over a century later? Can local leaders from historically centralized or 'conqueror' groups still influence access to public goods today? Do precolonial states located in ecologically diverse environments have better access to water, power and sanitation resources today? A growing body of literature examining the sources of the current state of African economic development has cited the enduring impacts of precolonial institutions and geography on contemporary African economic development using large sample cross-sectional analysis. In this paper, I focus on within country effects of local ethnic and political state institutions on access to public infrastructure services in present day Nigeria. Specifically, I combine information on the spatial distribution of ethnic states and ecological diversity in Nigeria circa mid 19th century and political states in Nigeria circa 1785 and 1850 with information, from a novel geocoded survey dataset, on access to public infrastructure at the local government level in present day Nigeria to examine the impact of precolonial state centralization on the current unequal access to public infrastructure services in Nigeria, accounting for the effects of ecological diversity and other geographic covariates. Some preliminary results show evidence for the long-term impacts of institutions, geography and ecological diversity on access to public infrastructure in Nigeria.

  6. Infrastructural requirements for local implementation of safety policies: the discordance between top-down and bottom-up systems of action.

    PubMed

    Timpka, Toomas; Nordqvist, Cecilia; Lindqvist, Kent

    2009-03-09

    Safety promotion is planned and practised not only by public health organizations, but also by other welfare state agencies, private companies and non-governmental organizations. The term 'infrastructure' originally denoted the underlying resources needed for warfare, e.g. roads, industries, and an industrial workforce. Today, 'infrastructure' refers to the physical elements, organizations and people needed to run projects in different societal arenas. The aim of this study was to examine associations between infrastructure and local implementation of safety policies in injury prevention and safety promotion programs. Qualitative data on municipalities in Sweden designated as Safe Communities were collected from focus group interviews with municipal politicians and administrators, as well as from policy documents, and materials published on the Internet. Actor network theory was used to identify weaknesses in the present infrastructure and determine strategies that can be used to resolve these. The weakness identification analysis revealed that the factual infrastructure available for effectuating national strategies varied between safety areas and approaches, basically reflecting differences between bureaucratic and network-based organizational models. At the local level, a contradiction between safety promotion and the existence of quasi-markets for local public service providers was found to predispose for a poor local infrastructure diminishing the interest in integrated inter-agency activities. The weakness resolution analysis showed that development of an adequate infrastructure for safety promotion would require adjustment of the legal framework regulating injury data exchange, and would also require rational financial models for multi-party investments in local infrastructures. We found that the "silo" structure of government organization and assignment of resources was a barrier to collaborative action for safety at a community level. It may therefore be overly optimistic to take for granted that different approaches to injury control, such as injury prevention and safety promotion, can share infrastructure. Similarly, it may be unrealistic to presuppose that safety promotion can reach its potential in terms of injury rate reductions unless the critical infrastructure for this is in place. Such an alignment of the infrastructure to organizational processes requires more than financial investments.

  7. NISAC | National Infrastructure Simulation and Analysis Center | NISAC

    Science.gov Websites

    The National Infrastructure Simulation and Analysis Center (NISAC) is a modeling, simulation, and analysis program within the Department of ...

  8. Analysis of Stakeholder-Defined Needs in Northeast U.S. Coastal Communities to Determine Gaps in Research Informing Coastal Resilience Planning

    NASA Astrophysics Data System (ADS)

    Molino, G. D.; Kenney, M. A.; Sutton-Grier, A.; Penn, K.

    2017-12-01

    The impacts of climate change on our coastlines are increasing pressure on communities, ecosystems, infrastructure, and state-to-local economies in the northeastern United States (U.S.). As a result of current or imminent risk of acute and chronic hazards, local, state and regional entities have taken steps to identify and address vulnerabilities to climate change. Decisions to increase coastal infrastructure resilience through grey, green, and cultural infrastructure solutions require physical, natural, and social science that is useful for decision-making, together with effective science translation mechanisms. Despite the desire to conduct or fund science that meets the needs of communities, there has been no comprehensive analysis to determine stakeholder-defined research needs. To address this gap, this study conducts a stakeholder needs analysis in northeast U.S. coastal communities to determine gaps in information and translation processes supporting coastal resilience planning. Documents were sourced from local, state, and regional organizations in both the public and private sectors, using the northeast region defined by the third National Climate Assessment. Modeled after Dilling et al. (2015), a deductive coding schema was developed that categorized documents using specific search terms such as "Location and condition of infrastructure" and "Proactive planning". A qualitative document analysis was then executed using NVivo to formally identify patterns and themes present in stakeholder surveys, workshop proceedings, and reports. Initial stakeholder priorities centered around incorporation of climate science into planning and decision making regarding vulnerabilities of infrastructure, enhanced emergency planning and response, and communication of key information.

  9. Onsite and Electric Backup Capabilities at Critical Infrastructure Facilities in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Julia A.; Wallace, Kelly E.; Kudo, Terence Y.

    2016-04-01

    The following report, prepared by Argonne National Laboratory’s (Argonne’s) Risk and Infrastructure Science Center (RISC), details an analysis of electric power backup at national critical infrastructure facilities as captured through the Department of Homeland Security’s (DHS’s) Enhanced Critical Infrastructure Program (ECIP) Initiative. Between January 1, 2011, and September 2014, 3,174 ECIP facility surveys were conducted. This study focused first on backup capabilities by infrastructure type and then expanded to infrastructure type by census region.
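
    A summary of backup capability by infrastructure type and census region, of the kind described above, can be sketched with pandas; the survey rows below are invented placeholders, not ECIP data.

    ```python
    # Sketch of summarizing electric backup capability by infrastructure type
    # and census region. The rows below are invented placeholders.
    import pandas as pd

    # Each row is one (invented) facility survey; has_backup is 1 if the
    # facility reported onsite or electric backup capability, else 0.
    surveys = pd.DataFrame([
        {"sector": "Water",      "region": "Midwest", "has_backup": 1},
        {"sector": "Water",      "region": "South",   "has_backup": 0},
        {"sector": "Healthcare", "region": "Midwest", "has_backup": 1},
        {"sector": "Healthcare", "region": "West",    "has_backup": 1},
        {"sector": "Energy",     "region": "South",   "has_backup": 0},
        {"sector": "Energy",     "region": "West",    "has_backup": 1},
    ])

    # Share of facilities with backup by infrastructure type...
    print(surveys.groupby("sector")["has_backup"].mean())
    # ...and by infrastructure type and census region.
    print(surveys.pivot_table(index="sector", columns="region",
                              values="has_backup", aggfunc="mean"))
    ```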

  10. NiftyNet: a deep-learning platform for medical imaging.

    PubMed

    Gibson, Eli; Li, Wenqi; Sudre, Carole; Fidon, Lucas; Shakir, Dzhoshkun I; Wang, Guotai; Eaton-Rosen, Zach; Gray, Robert; Doel, Tom; Hu, Yipeng; Whyntie, Tom; Nachev, Parashkev; Modat, Marc; Barratt, Dean C; Ourselin, Sébastien; Cardoso, M Jorge; Vercauteren, Tom

    2018-05-01

    Medical image analysis and computer-assisted intervention problems are increasingly being addressed with deep-learning-based solutions. Established deep-learning platforms are flexible but do not provide specific functionality for medical image analysis, and adapting them for this domain of application requires substantial implementation effort. Consequently, there has been substantial duplication of effort and incompatible infrastructure developed across many research groups. This work presents the open-source NiftyNet platform for deep learning in medical imaging. The ambition of NiftyNet is to accelerate and simplify the development of these solutions, and to provide a common mechanism for disseminating research outputs for the community to use, adapt and build upon. The NiftyNet infrastructure provides a modular deep-learning pipeline for a range of medical imaging applications including segmentation, regression, image generation and representation learning applications. Components of the NiftyNet pipeline including data loading, data augmentation, network architectures, loss functions and evaluation metrics are tailored to, and take advantage of, the idiosyncrasies of medical image analysis and computer-assisted intervention. NiftyNet is built on the TensorFlow framework and supports features such as TensorBoard visualization of 2D and 3D images and computational graphs by default. We present three illustrative medical image analysis applications built using NiftyNet infrastructure: (1) segmentation of multiple abdominal organs from computed tomography; (2) image regression to predict computed tomography attenuation maps from brain magnetic resonance images; and (3) generation of simulated ultrasound images for specified anatomical poses. The NiftyNet infrastructure enables researchers to rapidly develop and distribute deep learning solutions for segmentation, regression, image generation and representation learning applications, or extend the platform to new applications.

  11. A framework for quantifying and optimizing the value of seismic monitoring of infrastructure

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr

    2017-04-01

    This paper outlines a framework for quantifying and optimizing the value of information from structural health monitoring (SHM) technology deployed on large infrastructure, which may sustain damage in a series of earthquakes (the main shock and the aftershocks). The evolution of the damage state of the infrastructure, without or with SHM, is presented as a time-dependent, stochastic, discrete-state, observable and controllable nonlinear dynamical system. Pre-posterior Bayesian analysis and a decision tree are used for quantifying and optimizing the value of SHM information. An optimization problem is then formulated to decide on the adoption of SHM and on how to optimally manage the usage, operations, and repair schedule of the possibly damaged infrastructure using the information from SHM. The objective function to minimize is the expected total cost or risk.
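
    The expected-cost comparison that underlies such a value-of-information calculation can be illustrated with a toy decision sketch; every probability and cost below is invented, and monitoring is assumed to reveal the damage state perfectly, which is a simplification of the pre-posterior analysis in the paper.

    ```python
    # Toy sketch: compare expected total cost of operating a structure after an
    # earthquake with and without SHM information. All numbers are invented.

    # Damage-state probabilities after the main shock (prior belief).
    p_damage = {"none": 0.7, "moderate": 0.2, "severe": 0.1}

    # Consequence cost if the structure is kept in service in each true state,
    # versus the cost of closing and repairing it immediately.
    cost_keep_open = {"none": 0.0, "moderate": 5.0, "severe": 50.0}
    cost_close_repair = {"none": 2.0, "moderate": 4.0, "severe": 10.0}

    def expected_cost(action_costs):
        return sum(p_damage[s] * action_costs[s] for s in p_damage)

    # Without monitoring: pick the single action with the lower expected cost.
    no_shm = min(expected_cost(cost_keep_open), expected_cost(cost_close_repair))

    # With (assumed perfect) monitoring: the damage state is revealed, so the
    # cheaper action can be chosen state by state; add the SHM system cost.
    shm_system_cost = 0.5
    with_shm = shm_system_cost + sum(
        p_damage[s] * min(cost_keep_open[s], cost_close_repair[s])
        for s in p_damage
    )

    print(f"Expected cost without SHM: {no_shm:.2f}")
    print(f"Expected cost with SHM:    {with_shm:.2f}")
    print(f"Value of SHM information:  {no_shm - with_shm:.2f}")
    ```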

  12. Cheap Oil and the Impact on Rebuilding Syria

    DTIC Science & Technology

    2015-10-30

    Syrian energy infrastructure will not be cost effective in the current oil pricing environment. I will present a quick overview of oil’s historic role...in the Syrian economy, followed by a synopsis of the current state of Syria’s oil infrastructure. An analysis of the impact of low oil prices on...similarities between countries, a specific comparison will be made to Yemen in an effort to predict the challenges that Syria will face when entering the

  13. Conception of the system for traffic measurements based on piezoelectric foils

    NASA Astrophysics Data System (ADS)

    Płaczek, M.

    2016-08-01

    A concept for a mechatronic system for traffic measurements, based on piezoelectric transducers used as sensors, is presented. The aim of the project is to theoretically and experimentally analyse the dynamic response of road infrastructure forced by vehicle motion. The subject of the project therefore lies on the borderline between civil and mechanical engineering and covers a wide range of issues in both areas. To measure the dynamic response of the tested pieces of road infrastructure, the application of piezoelectric transducers, in particular piezoelectric films (MFC - Macro Fiber Composite), is proposed. The purpose is to verify the possibility of using composite piezoelectric transducers as sensors in traffic surveillance systems - innovative methods of controlling road infrastructure and traffic. This paper reports the work done to obtain basic information about the analysed systems and their behaviour under excitation by passing vehicles. It is very important to verify whether such systems can operate based on the analysis of the dynamic response of road infrastructure measured using piezoelectric transducers. The obtained results show that this could be possible.

  14. Developing a European grid infrastructure for cancer research: vision, architecture and services

    PubMed Central

    Tsiknakis, M; Rueping, S; Martin, L; Sfakianakis, S; Bucur, A; Sengstag, T; Brochhausen, M; Pucaski, J; Graf, N

    2007-01-01

    Life sciences are currently at the centre of an information revolution. The nature and amount of information now available opens up areas of research that were once in the realm of science fiction. During this information revolution, the data-gathering capabilities have greatly surpassed the data-analysis techniques. Data integration across heterogeneous data sources and data aggregation across different aspects of the biomedical spectrum, therefore, is at the centre of current biomedical and pharmaceutical R&D. This paper reports on original results from the ACGT integrated project, focusing on the design and development of a European Biomedical Grid infrastructure in support of multi-centric, post-genomic clinical trials (CTs) on cancer. Post-genomic CTs use multi-level clinical and genomic data and advanced computational analysis and visualization tools to test hypotheses in trying to identify the molecular reasons for a disease and the stratification of patients in terms of treatment. The paper provides a presentation of the needs of users involved in post-genomic CTs and presents indicative scenarios, which drive the requirements of the engineering phase of the project. Subsequently, the initial architecture specified by the project is presented, and its services are classified and discussed. A range of such key services, including the Master Ontology on Cancer, which lie at the heart of the integration architecture of the project, is presented. Special efforts have been taken to describe the methodological and technological framework of the project, enabling the creation of a legally compliant and trustworthy infrastructure. Finally, a short discussion of the forthcoming work is included, and the potential involvement of the cancer research community in further development or utilization of the infrastructure is described. PMID:22275955

  15. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    NASA Astrophysics Data System (ADS)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and access tools for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.

  16. Field data collection, analysis, and adaptive management of green infrastructure in the urban water cycle in Cleveland and Columbus, OH

    NASA Astrophysics Data System (ADS)

    Darner, R.; Shuster, W.

    2016-12-01

    Expansion of the urban environment can alter the landscape and creates challenges for how cities deal with energy and water. Large volumes of stormwater in areas that have combined septic and stormwater systems present one challenge. Managing the water as near to the source as possible creates an environment that allows more infiltration and evapotranspiration. Stormwater control measures (SCM) associated with this type of development, often called green infrastructure, include rain gardens, pervious or porous pavements, bioswales, green or blue roofs, and others. In this presentation, we examine the hydrology of green infrastructure in urban sewersheds in Cleveland and Columbus, OH. We present the need for data throughout the water cycle and the challenges of collecting field data at a small scale (a single rain garden instrumented to measure inflows, outflow, weather, soil moisture, and groundwater levels) and at a macro scale (a project including low-cost rain gardens, highly engineered rain gardens, groundwater wells, weather stations, soil moisture, and combined sewer flow monitoring). Results will include quantifying the effectiveness of SCMs in intercepting stormwater for different precipitation event sizes. Small-scale deployment analysis will demonstrate the role of active adaptive management in the ongoing optimization over multiple years of data collection.
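
    One simple way to quantify interception effectiveness by precipitation event size, as mentioned above, is a grouped summary of per-event retention; the event records below are synthetic placeholders.

    ```python
    # Sketch of quantifying how much stormwater a rain garden intercepts for
    # different precipitation event sizes. Event records are synthetic.
    import pandas as pd

    events = pd.DataFrame({
        "precip_mm":  [3.0, 8.0, 15.0, 25.0, 40.0, 6.0, 12.0, 30.0],
        "inflow_l":   [120, 400, 900, 1600, 2800, 300, 700, 2000],
        "outflow_l":  [0,   20,  150, 600,  1500, 5,   90,  900],
    })

    # Fraction of inflow retained (intercepted) per event.
    events["retained_frac"] = 1 - events["outflow_l"] / events["inflow_l"]

    # Group events into size classes and summarize retention by class.
    bins = [0, 5, 10, 20, 50]
    labels = ["<5 mm", "5-10 mm", "10-20 mm", "20-50 mm"]
    events["size_class"] = pd.cut(events["precip_mm"], bins=bins, labels=labels)
    print(events.groupby("size_class", observed=True)["retained_frac"].mean())
    ```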

  17. DNAseq Workflow in a Diagnostic Context and an Example of a User Friendly Implementation.

    PubMed

    Wolf, Beat; Kuonen, Pierre; Dandekar, Thomas; Atlan, David

    2015-01-01

    Over recent years next generation sequencing (NGS) technologies evolved from costly tools used by very few, to a much more accessible and economically viable technology. Through this recently gained popularity, its use-cases expanded from research environments into clinical settings. But the technical know-how and infrastructure required to analyze the data remain an obstacle for a wider adoption of this technology, especially in smaller laboratories. We present GensearchNGS, a commercial DNAseq software suite distributed by Phenosystems SA. The focus of GensearchNGS is the optimal usage of already existing infrastructure, while keeping its use simple. This is achieved through the integration of existing tools in a comprehensive software environment, as well as custom algorithms developed with the restrictions of limited infrastructures in mind. This includes the possibility to connect multiple computers to speed up computing intensive parts of the analysis such as sequence alignments. We present a typical DNAseq workflow for NGS data analysis and the approach GensearchNGS takes to implement it. The presented workflow goes from raw data quality control to the final variant report. This includes features such as gene panels and the integration of online databases, like Ensembl for annotations or Cafe Variome for variant sharing.

  18. Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.

    NASA Astrophysics Data System (ADS)

    Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan

    2017-09-01

    Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated to these interventions in the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs with specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
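
    The integration of RAMS parameters into an LCC estimate can be sketched by converting an assumed constant failure rate into expected corrective interventions over the asset life and discounting the resulting costs; the rates and costs below are invented and are not INFRALERT figures.

    ```python
    # Sketch of folding reliability (failure rate) and maintainability (repair
    # cost) into a life-cycle maintenance cost estimate. Numbers are invented.

    ASSET_LIFE_YEARS = 30
    DISCOUNT_RATE = 0.04

    # Assumed constant failure rate (failures per asset per year) and costs.
    failure_rate_per_year = 0.15            # reliability input
    cost_per_corrective_repair = 12_000.0   # maintainability / intervention cost
    annual_preventive_cost = 1_500.0        # planned maintenance

    def discounted_annual_cost(year, annual_cost):
        """Present value of a cost incurred in a given year."""
        return annual_cost / (1 + DISCOUNT_RATE) ** year

    lcc_maintenance = sum(
        discounted_annual_cost(
            year,
            failure_rate_per_year * cost_per_corrective_repair
            + annual_preventive_cost,
        )
        for year in range(1, ASSET_LIFE_YEARS + 1)
    )
    print(f"Discounted maintenance cost over {ASSET_LIFE_YEARS} years: "
          f"{lcc_maintenance:,.0f}")
    ```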

  19. Next Generation Monitoring: Tier 2 Experience

    NASA Astrophysics Data System (ADS)

    Fay, R.; Bland, J.; Jones, S.

    2017-10-01

    Monitoring IT infrastructure is essential for maximizing availability and minimizing disruption by detecting failures and developing issues. The HEP group at Liverpool have recently updated our monitoring infrastructure with the goal of increasing coverage, improving visualization capabilities, and streamlining configuration and maintenance. Here we present a summary of Liverpool’s experience, the monitoring infrastructure, and the tools used to build it. In brief, system checks are configured in Puppet using Hiera, and managed by Sensu, replacing Nagios. Centralised logging is managed with Elasticsearch, together with Logstash and Filebeat. Kibana provides an interface for interactive analysis, including visualization and dashboards. Metric collection is also configured in Puppet, managed by collectd and stored in Graphite, with Grafana providing a visualization and dashboard tool. The Uchiwa dashboard for Sensu provides a web interface for viewing infrastructure status. Alert capabilities are provided via external handlers. A custom alert handler is in development to provide an easily configurable, extensible and maintainable alert facility.

  20. Good practices on cost - effective road infrastructure safety investments.

    PubMed

    Yannis, George; Papadimitriou, Eleonora; Evgenikos, Petros; Dragomanovits, Anastasios

    2016-12-01

    The paper presents the findings of a research project aiming to quantify and subsequently classify several infrastructure-related road safety measures, based on the international experience attained through extensive and selected literature review and additionally on a full consultation process including questionnaire surveys addressed to experts and relevant workshops. Initially, a review of selected research reports was carried out and an exhaustive list of road safety infrastructure investments covering all types of infrastructure was compiled. Individual investments were classified according to the infrastructure investment area and the type of investment and were thereafter analysed on the basis of key safety components. These investments were subsequently ranked in relation to their safety effects and implementation costs and, on the basis of this ranking, a set of five most promising investments was selected for an in-depth analysis. The results suggest that the overall cost effectiveness of a road safety infrastructure investment is not always in direct correlation with the safety effect, and it is recommended that cost-benefit ratios and safety effects always be examined in conjunction with each other in order to identify the optimum solution for a specific road safety problem in specific conditions and with specific objectives.
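
    The ranking of investments by safety effect and cost described above can be sketched as a benefit-cost ratio computation; the measures and figures below are placeholders rather than the project's estimates.

    ```python
    # Sketch of ranking road-safety infrastructure investments by benefit-cost
    # ratio. Measures and figures are placeholders, not the study's estimates.
    investments = [
        # (name, expected crash-cost savings, implementation + maintenance cost)
        ("Median barrier",        4_000_000, 1_200_000),
        ("Roundabout conversion", 2_500_000,   900_000),
        ("Shoulder rumble strips",  800_000,   150_000),
        ("Road lighting upgrade",   600_000,   400_000),
    ]

    ranked = sorted(
        ((name, benefit / cost) for name, benefit, cost in investments),
        key=lambda item: item[1],
        reverse=True,
    )
    for name, bcr in ranked:
        print(f"{name:25s} benefit-cost ratio = {bcr:.1f}")

    # As the abstract notes, the BCR should be read together with the absolute
    # safety effect: a high ratio on a cheap measure may still prevent few crashes.
    ```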

  1. Elastic extension of a local analysis facility on external clouds for the LHC experiments

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Codispoti, G.; Rinaldi, L.; Aiftimiei, D. C.; Bonacorsi, D.; Calligola, P.; Dal Pra, S.; De Girolamo, D.; Di Maria, R.; Grandi, C.; Michelotto, D.; Panella, M.; Taneja, S.; Semeria, F.

    2017-10-01

    The computing infrastructures serving the LHC experiments have been designed to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, the LHC experiments are exploring the opportunity to access Cloud resources provided by external partners or commercial providers. In this work we present the proof of concept of the elastic extension of a local analysis facility, specifically the Bologna Tier-3 Grid site, for the LHC experiments hosted at the site, on an external OpenStack infrastructure. We focus on the Cloud Bursting of the Grid site using DynFarm, a newly designed tool that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on an OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time they serve as an extension of the farm for the local usage.

  2. The role of minimum supply and social vulnerability assessment for governing critical infrastructure failure: current gaps and future agenda

    NASA Astrophysics Data System (ADS)

    Garschagen, Matthias; Sandholz, Simone

    2018-04-01

    Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks which role social vulnerability assessments and minimum supply considerations can, should and do - or do not - play for the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review on achievements and remaining gaps in the management of critical infrastructure and the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts with a regional focus on policies in Germany and the EU. In its second part, the paper then responds to the identified gaps by developing a heuristic model on the linkages of critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.

  3. System Dynamics Approach for Critical Infrastructure and Decision Support. A Model for a Potable Water System.

    NASA Astrophysics Data System (ADS)

    Pasqualini, D.; Witkowski, M.

    2005-12-01

    The Critical Infrastructure Protection / Decision Support System (CIP/DSS) project, supported by the Science and Technology Office, has been developing a risk-informed Decision Support System that provides insights for making critical infrastructure protection decisions. The system considers seventeen different Department of Homeland Security defined Critical Infrastructures (potable water system, telecommunications, public health, economics, etc.) and their primary interdependencies. These infrastructures have been modeled in a single model, the CIP/DSS Metropolitan Model. The modeling approach used is system dynamics. System dynamics modeling combines control theory and nonlinear dynamics theory; a model is defined by a set of coupled differential equations and seeks to explain how the structure of a given system determines its behavior. In this poster we present a system dynamics model for one of the seventeen critical infrastructures, a generic metropolitan potable water system (MPWS). The goals are threefold: 1) to gain a better understanding of the MPWS infrastructure; 2) to identify improvements that would help protect the MPWS; and 3) to understand the consequences, interdependencies, and impacts when perturbations occur to the system. The model represents raw water sources, the metropolitan water treatment process, storage of treated water, damage and repair to the MPWS, distribution of water, and end user demand, but does not explicitly represent the detailed network topology of an actual MPWS. The MPWS model is dependent upon inputs from the metropolitan population, energy, telecommunication, public health, and transportation models as well as the national water and transportation models. We present modeling results and sensitivity analysis indicating critical choke points and negative and positive feedback loops in the system. A general scenario is also analyzed where the potable water system responds to a generic disruption.
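
    A system dynamics stock-and-flow formulation of this kind reduces to coupled ordinary differential equations; a minimal single-stock sketch (treated-water storage with a temporary treatment disruption), with invented parameters, is shown below.

    ```python
    # Minimal system-dynamics sketch: one stock (treated water in storage) with
    # inflow from treatment and outflow to demand, plus a disruption to
    # treatment capacity. All parameters are invented for illustration.
    from scipy.integrate import solve_ivp

    TREATMENT_CAPACITY = 100.0   # ML/day under normal operation
    DEMAND = 90.0                # ML/day end-user demand
    STORAGE_MAX = 500.0          # ML storage capacity

    def treatment_rate(t):
        # A disruption halves treatment capacity between day 10 and day 15.
        return TREATMENT_CAPACITY * (0.5 if 10 <= t <= 15 else 1.0)

    def d_storage_dt(t, y):
        storage = y[0]
        inflow = treatment_rate(t)
        # Demand is served from inflow and storage while storage remains.
        outflow = DEMAND if storage > 0.0 else min(DEMAND, inflow)
        net = inflow - outflow
        if storage >= STORAGE_MAX and net > 0:   # tank full: spill the excess
            net = 0.0
        return [net]

    sol = solve_ivp(d_storage_dt, t_span=(0, 60), y0=[300.0], max_step=0.5)
    print(f"Minimum storage over the run: {sol.y[0].min():.1f} ML")
    ```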

  4. A centralized informatics infrastructure for the National Institute on Drug Abuse Clinical Trials Network.

    PubMed

    Pan, Jeng-Jong; Nahm, Meredith; Wakim, Paul; Cushing, Carol; Poole, Lori; Tai, Betty; Pieper, Carl F

    2009-02-01

    Clinical trial networks (CTNs) were created to provide a sustaining infrastructure for the conduct of multisite clinical trials. As such, they must withstand changes in membership. Centralization of infrastructure including knowledge management, portfolio management, information management, process automation, work policies, and procedures in clinical research networks facilitates consistency and ultimately research. In 2005, the National Institute on Drug Abuse (NIDA) CTN transitioned from a distributed data management model to a centralized informatics infrastructure to support the network's trial activities and administration. We describe the centralized informatics infrastructure and discuss our challenges to inform others considering such an endeavor. During the migration of a clinical trial network from a decentralized to a centralized data center model, descriptive data were captured and are presented here to assess the impact of centralization. We present the framework for the informatics infrastructure and evaluative metrics. The network has decreased the time from last patient-last visit to database lock from an average of 7.6 months to 2.8 months. The average database error rate decreased from 0.8% to 0.2%, with a corresponding decrease in the interquartile range from 0.04%-1.0% before centralization to 0.01-0.27% after centralization. Centralization has provided the CTN with integrated trial status reporting and the first standards-based public data share. A preliminary cost-benefit analysis showed a 50% reduction in data management cost per study participant over the life of a trial. A single clinical trial network comprising addiction researchers and community treatment programs was assessed. The findings may not be applicable to other research settings. The identified informatics components provide the information and infrastructure needed for our clinical trial network. Post centralization data management operations are more efficient and less costly, with higher data quality.

  5. Simulating Impacts of Disruptions to Liquid Fuels Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Michael; Corbet, Thomas F.; Baker, Arnold B.

    This report presents a methodology for estimating the impacts of events that damage or disrupt liquid fuels infrastructure. The impact of a disruption depends on which components of the infrastructure are damaged, the time required for repairs, and the position of the disrupted components in the fuels supply network. Impacts are estimated for seven stressing events in regions of the United States, which were selected to represent a range of disruption types. For most of these events the analysis is carried out using the National Transportation Fuels Model (NTFM) to simulate the system-level liquid fuels sector response. Results are presented for each event, and a brief cross comparison of event simulation results is provided.

  6. Galaxy CloudMan: delivering cloud compute clusters.

    PubMed

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  7. Galaxy CloudMan: delivering cloud compute clusters

    PubMed Central

    2010-01-01

    Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983

  8. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  9. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  10. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  11. User-level framework for performance monitoring of HPC applications

    NASA Astrophysics Data System (ADS)

    Hristova, R.; Goranov, G.

    2013-10-01

    HP-SEE links the existing HPC facilities in South East Europe into a common infrastructure. Analysis of the performance monitoring of High-Performance Computing (HPC) applications in the infrastructure can be useful for the end user as a diagnostic of the overall performance of his applications. The existing monitoring tools for HP-SEE provide the end user with only aggregated information for all applications. Usually, the user does not have permissions to select only the information relevant to him and to his applications. In this article we present a framework for performance monitoring of the HPC applications in the HP-SEE infrastructure. The framework provides standardized performance metrics, which every user can use in order to monitor his applications. Furthermore, a programming interface has been developed as part of the framework. The interface allows the user to publish metrics data from his application and to read and analyze the gathered information. Publishing and reading through the framework is possible only with a grid certificate valid for the infrastructure. Therefore the user is authorized to access only the data for his applications.

  12. Factors Relating Infrastructure Provision by Developer in Formal Housing

    NASA Astrophysics Data System (ADS)

    Putri, H. T.; Maryati, S.; Humaira, A. N. S.

    2018-03-01

    In big cities, housing developers play a significant role in infrastructure provision. Nevertheless, in some cases developers have not fulfilled their role of equipping housing with the needed infrastructure. The objective of this study is to explore the characteristics of, and the factors related to, infrastructure provision in formal housing built by developers, using quantitative and association analysis methods. The infrastructure considered covers the clean water, sewage, drainage, and solid waste systems. This study used Parongpong District, West Bandung Regency, as a case study, an area where infrastructure needs are not fulfilled. Based on the analysis, it can be concluded that there is some variation in infrastructure provision and that the factor related to this condition is the income level of the target house owners.

  13. The Efficacy of Blue-Green Infrastructure for Pluvial Flood Prevention under Conditions of Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Babovic, Filip; Mijic, Ana; Madani, Kaveh

    2017-04-01

    Urban areas around the world are growing in size and importance; however, cities experience elevated risks of pluvial flooding due to the prevalence of impermeable land surfaces within them. Urban planners and engineers encounter a great deal of uncertainty when planning adaptations to these flood risks, due to the interaction of multiple factors such as climate change and land use change. This leads to conditions of deep uncertainty. Blue-Green (BG) solutions utilise natural vegetation and processes to absorb and retain runoff while providing a host of other social, economic and environmental services. When utilised in conjunction with Decision Making under Deep Uncertainty (DMDU) methodologies, BG infrastructure provides a flexible and adaptable method of "no-regret" adaptation; resulting in a practical, economically efficient, and socially acceptable solution for flood risk mitigation. This work presents the methodology for analysing the impact of BG infrastructure in the context of the Adaptation Tipping Points approach to protect against pluvial flood risk in an iterative manner. An economic analysis of the adaptation pathways is also conducted in order to better inform decision-makers on the benefits and costs of the adaptation options presented. The methodology was applied to a case study in the Cranbrook Catchment in the North East of London. Our results show that BG infrastructure performs better under conditions of uncertainty than traditional grey infrastructure.

  14. Assessment of online public opinions on large infrastructure projects: A case study of the Three Gorges Project in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn

    Public opinion becomes increasingly salient in the ex post evaluation stage of large infrastructure projects, which have significant impacts on the environment and society. However, traditional survey methods are inefficient in the collection and assessment of public opinion due to its large quantity and diversity. Recently, social media platforms have provided a rich data source for monitoring and assessing public opinion on controversial infrastructure projects. This paper proposes an assessment framework to transform unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, namely, Weibo. Assessment results present spatio-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments, and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applied as a methodological strategy to assess public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework. • We revealed spatial-temporal patterns of post intensity and sentiment polarity on the TGP. • We drew implications for a more in-depth understanding of the public opinion on large infrastructure projects.
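
    The two natural language processing steps named in the framework, sentiment analysis and topic modeling, can be sketched generically with scikit-learn; the comments below are English placeholders and the lexicon-based sentiment rule is a simplification, not the paper's method.

    ```python
    # Generic sketch of the two NLP steps in the framework: a crude lexicon-based
    # sentiment score and LDA topic modeling. Comments are invented placeholders.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    comments = [
        "the dam brings clean power and flood control great project",
        "worried about resettlement and the loss of river habitat",
        "power supply is more reliable since the project started",
        "sediment and water quality problems keep getting worse",
    ]

    POSITIVE = {"clean", "great", "reliable", "control"}
    NEGATIVE = {"worried", "loss", "problems", "worse"}

    def sentiment(text):
        words = text.split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    print([sentiment(c) for c in comments])   # >0 positive, <0 negative

    # Topic modeling with LDA on a bag-of-words representation.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(comments)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    terms = vectorizer.get_feature_names_out()
    for i, topic in enumerate(lda.components_):
        top = [terms[j] for j in topic.argsort()[-4:][::-1]]
        print(f"topic {i}: {top}")
    ```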

  15. Cultured Construction: Global Evidence of the Impact of National Values on Piped-to-Premises Water Infrastructure Development.

    PubMed

    Kaminsky, Jessica A

    2016-07-19

    In 2016, the global community undertook the Sustainable Development Goals. One of these goals seeks to achieve universal and equitable access to safe and affordable drinking water for all people by the year 2030. In support of this undertaking, this paper seeks to discover the cultural work done by piped water infrastructure across 33 nations with developed and developing economies that have experienced change in the percentage of population served by piped-to-premises water infrastructure at the national level of analysis. To do so, I regressed the 1990-2012 change in piped-to-premises water infrastructure coverage against Hofstede's cultural dimensions, controlling for per capita GDP, the 1990 baseline level of coverage, percent urban population, overall 1990-2012 change in improved sanitation (all technologies), and per capita freshwater resources. Separate analyses were carried out for the urban, rural, and aggregate national contexts. Hofstede's dimensions provide a measure of cross-cultural difference; high or low scores are not in any way intended to represent better or worse but rather serve as a quantitative way to compare aggregate preferences for ways of being and doing. High scores in the cultural dimensions of Power Distance, Individualism-Collectivism, and Uncertainty Avoidance explain increased access to piped-to-premises water infrastructure in the rural context. Higher Power Distance and Uncertainty Avoidance scores are also statistically significant for increased coverage in the urban and national aggregate contexts. These results indicate that, as presently conceived, piped-to-premises water infrastructure fits best with spatial contexts that prefer hierarchy and centralized control. Furthermore, water infrastructure is understood to reduce uncertainty regarding the provision of individually valued benefits. The results of this analysis identify global trends that enable engineers and policy makers to design and manage more culturally appropriate and socially sustainable water infrastructure by better fitting technologies to user preferences.
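
    The regression described above can be sketched with statsmodels; the data frame below holds synthetic values, and the column names are shorthand for the listed predictors (Hofstede dimensions plus controls), not the study's dataset.

    ```python
    # Sketch of regressing 1990-2012 change in piped-to-premises water coverage
    # on Hofstede's cultural dimensions with controls. Data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 33  # nations in the study
    df = pd.DataFrame({
        "coverage_change": rng.normal(10, 5, n),
        "power_distance": rng.uniform(10, 100, n),
        "individualism": rng.uniform(10, 100, n),
        "uncertainty_avoidance": rng.uniform(10, 100, n),
        "gdp_per_capita": rng.lognormal(9, 1, n),
        "baseline_1990": rng.uniform(20, 90, n),
        "pct_urban": rng.uniform(20, 90, n),
    })

    model = smf.ols(
        "coverage_change ~ power_distance + individualism + uncertainty_avoidance"
        " + np.log(gdp_per_capita) + baseline_1990 + pct_urban",
        data=df,
    ).fit()
    print(model.summary())
    ```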

  16. Landscape Characterization and Representativeness Analysis for Understanding Sampling Network Coverage

    DOE Data Explorer

    Maddalena, Damian; Hoffman, Forrest; Kumar, Jitendra; Hargrove, William

    2014-08-01

    Sampling networks rarely conform to spatial and temporal ideals, often comprising sampling points that are unevenly distributed and located in less-than-ideal locations due to access constraints, budget limitations, or political conflict. Quantifying the global, regional, and temporal representativeness of these networks through the coverage of the network infrastructure highlights the capabilities and limitations of the data collected, facilitates upscaling and downscaling for modeling purposes, and improves planning for future infrastructure investment under current conditions and future modeled scenarios. The work presented here utilizes multivariate spatiotemporal clustering analysis and representativeness analysis for quantitative landscape characterization and assessment of the Fluxnet, RAINFOR, and ForestGEO networks. Results include ecoregions that highlight patterns of bioclimatic, topographic, and edaphic variables and quantitative representativeness maps of individual and combined networks.
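
    A hedged sketch of the two steps named above: k-means clustering of environmental variables into "ecoregions", then a simple representativeness score as the distance from each cell to its nearest network site in the standardized variable space. All inputs are synthetic; the cluster count and distance measure are assumptions.

    ```python
    # Synthetic landscape characterization and representativeness sketch.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(1)
    cells = rng.normal(size=(500, 4))                   # grid cells x environmental variables
    sites = cells[rng.choice(500, 12, replace=False)]   # hypothetical network sites

    scaler = StandardScaler().fit(cells)
    cells_z, sites_z = scaler.transform(cells), scaler.transform(sites)

    ecoregion = KMeans(n_clusters=8, n_init=10, random_state=1).fit_predict(cells_z)
    representativeness = cdist(cells_z, sites_z).min(axis=1)  # low = well represented

    for k in range(8):
        mask = ecoregion == k
        print(f"ecoregion {k}: mean dissimilarity to network {representativeness[mask].mean():.2f}")
    ```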

  17. GOMMA: a component-based infrastructure for managing and analyzing life science ontologies and their evolution

    PubMed Central

    2011-01-01

    Background Ontologies are increasingly used to structure and semantically describe entities of domains, such as genes and proteins in the life sciences. Their increasing size and the high frequency of updates, resulting in a large set of ontology versions, necessitate efficient management and analysis of these data. Results We present GOMMA, a generic infrastructure for managing and analyzing life science ontologies and their evolution. GOMMA utilizes a generic repository to uniformly and efficiently manage ontology versions and different kinds of mappings. Furthermore, it provides components for ontology matching and for determining evolutionary ontology changes. These components are used by analysis tools, such as the Ontology Evolution Explorer (OnEX) and the detection of unstable ontology regions. We introduce the component-based infrastructure and show analysis results for selected components and life science applications. GOMMA is available at http://dbs.uni-leipzig.de/GOMMA. Conclusions GOMMA provides a comprehensive and scalable infrastructure to manage large life science ontologies and analyze their evolution. Key functions include generic storage of ontology versions and mappings, support for ontology matching, and determination of ontology changes. The supported features for analyzing ontology changes are helpful to assess their impact on ontology-dependent applications such as term enrichment. GOMMA complements OnEX by providing functionalities to manage various versions of mappings between two ontologies and allows combining different match approaches. PMID:21914205

  18. Environmental impacts of dispersed development from federal infrastructure projects.

    PubMed

    Southerland, Mark T

    2004-06-01

    Dispersed development, also referred to as urban growth or sprawl, is a pattern of low-density development spread over previously rural landscapes. Such growth can result in adverse impacts to air quality, water quality, human health, aquatic and terrestrial ecosystems, agricultural land, military training areas, water supply and wastewater treatment, recreational resources, viewscapes, and cultural resources. The U.S. Environmental Protection Agency (U.S. EPA) is charged with protecting public health and the environment, which includes consideration of impacts from dispersed development. Specifically, because federal infrastructure projects can affect the progress of dispersed development, the secondary impacts resulting from it must be assessed in documents prepared under the National Environmental Policy Act (NEPA). The Council on Environmental Quality (CEQ) has oversight for NEPA and Section 309 of the Clean Air Act requires that U.S. EPA review and comment on federal agency NEPA documents. The adverse effects of dispersed development can be induced by federal infrastructure projects including transportation, built infrastructure, modifications in natural infrastructure, public land conversion and redevelopment of properties, construction of federal facilities, and large traffic or major growth generation developments requiring federal permits. This paper presents an approach that U.S. EPA reviewers and NEPA practitioners can use to provide accurate, realistic, and consistent analysis of secondary impacts of dispersed development resulting from federal infrastructure projects. It also presents 24 measures that can be used to mitigate adverse impacts from dispersed development by modifying project location and design, participating in preservation or restoration activities, or informing and supporting local communities in planning.

  19. Vibration energy harvesting based monitoring of an operational bridge undergoing forced vibration and train passage

    NASA Astrophysics Data System (ADS)

    Cahill, Paul; Hazra, Budhaditya; Karoumi, Raid; Mathewson, Alan; Pakrashi, Vikram

    2018-06-01

    The application of energy harvesting technology for monitoring civil infrastructure is a burgeoning topic of interest. The ability of kinetic energy harvesters to scavenge ambient vibration energy can be useful for large civil infrastructure under operational conditions, particularly for bridge structures. The experimental integration of such harvesters with full-scale structures and the subsequent use of the harvested energy directly for the purposes of structural health monitoring shows promise. This paper presents the first experimental deployment of piezoelectric vibration energy harvesting devices for monitoring a full-scale bridge undergoing forced dynamic vibrations under operational conditions, using energy harvesting signatures against time. The calibration of the harvesters is presented, along with details of the host bridge structure and the dynamic assessment procedures. The measured responses of the harvesters from the tests are presented, and the use of the harvesters for the purposes of structural health monitoring (SHM) is investigated using empirical mode decomposition analysis, following a bespoke data cleaning approach. Finally, the use of sequential Karhunen-Loeve transforms to detect train passages during the dynamic assessment is presented. This study is expected to further develop interest in energy-harvesting based monitoring of large infrastructure for both research and commercial purposes.
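
    As a loose illustration of the event-detection idea above, the sketch below runs a sliding-window singular value decomposition (a simple stand-in for the sequential Karhunen-Loeve transform) over a synthetic harvester voltage signal and flags windows whose leading singular value jumps, as a train passage would cause; the signal, window sizes and threshold are all assumptions, not the bridge data.

    ```python
    # Sliding-window SVD over a synthetic harvester signal to flag an "event".
    import numpy as np

    rng = np.random.default_rng(2)
    fs = 100
    t = np.arange(0, 60, 1 / fs)                       # 60 s at 100 Hz
    signal = 0.05 * rng.normal(size=t.size)            # ambient response
    signal[3000:3500] += 0.5 * np.sin(2 * np.pi * 4 * t[3000:3500])  # simulated passage

    win, hop = 200, 50
    leading_sv = []
    for start in range(0, signal.size - win, hop):
        # Embed the window as a trajectory matrix and take its leading singular value.
        seg = signal[start:start + win]
        traj = np.lib.stride_tricks.sliding_window_view(seg, 20)
        leading_sv.append(np.linalg.svd(traj, compute_uv=False)[0])

    leading_sv = np.array(leading_sv)
    threshold = 5 * np.median(leading_sv)              # robust, assumption-laden threshold
    events = np.where(leading_sv > threshold)[0] * hop / fs
    print("possible passages near t (s):", np.round(events, 1))
    ```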

  20. Utilization of Multimedia Laboratory: An Acceptance Analysis using TAM

    NASA Astrophysics Data System (ADS)

    Modeong, M.; Palilingan, V. R.

    2018-02-01

    Multimedia is often utilized by teachers to present learning materials. Learning delivered through multimedia enables people to understand up to 60% of the information in general learning. To apply creative learning in the classroom, multimedia presentation needs a laboratory as a space that provides for multimedia needs. This study aims to reveal the level of student acceptance of multimedia laboratories by explaining the direct and indirect effects of internal support and technology infrastructure. The Technology Acceptance Model (TAM) is used as the basis of measurement in this research; through the perceptions of usefulness, ease of use, and intention, it is recognized as capable of predicting user acceptance of technology. The study used a quantitative method. The data analysis uses path analysis with model trimming, performed to improve the path analysis structure by removing exogenous variables that have insignificant path coefficients. The results state that internal support and technology infrastructure are well mediated by the TAM variables in measuring the level of technology acceptance. The implications suggest that TAM can measure the success of multimedia laboratory utilization in the Faculty of Engineering, UNIMA.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Brooker, A.; Burton, E.

    This presentation discusses current research at NREL on advanced wireless power transfer vehicle and infrastructure analysis. The potential benefits of E-roadway include more electrified driving miles from battery electric vehicles, plug-in hybrid electric vehicles, or even properly equipped hybrid electric vehicles (i.e., more electrified miles could be obtained from a given battery size, or electrified driving miles could be maintained while using smaller and less expensive batteries, thereby increasing cost competitiveness and potential market penetration). The system optimization aspect is key given the potential impact of this technology on the vehicles, the power grid and the road infrastructure.

  2. Risk-informed Management of Water Infrastructure in the United States: History, Development, and Best Practices

    NASA Astrophysics Data System (ADS)

    Wolfhope, J.

    2017-12-01

    This presentation will focus on the history, development, and best practices for evaluating the risks associated with the portfolio of water infrastructure in the United States. These practices have evolved from the early development of the Federal Guidelines for Dam Safety and the establishment of the National Dam Safety Program, to the most recent update of the Best Practices for Dam and Levee Risk Analysis jointly published by the U.S. Department of Interior Bureau of Reclamation and the U.S. Army Corps of Engineers. Since President Obama signed the Water Infrastructure Improvements for the Nation (WIIN) Act on December 16, 2016, adding a new grant program under FEMA's National Dam Safety Program, the focus has been on establishing a risk-based priority system for use in identifying eligible high hazard potential dams for which grants may be made. Finally, the presentation provides thoughts on the future direction and priorities for managing the risk of dams and levees in the United States.

  3. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure

    EPA Science Inventory

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and gree...

  4. Women in EPOS: the role of women in a large pan-European Research Infrastructure for Solid Earth sciences

    NASA Astrophysics Data System (ADS)

    Calignano, Elisa; Freda, Carmela; Baracchi, Laura

    2017-04-01

    Women are outnumbered by men in geosciences senior research positions, but what is the situation if we consider large pan-European Research Infrastructures? With this contribution we want to show an analysis of the role of women in the implementation of the European Plate Observing System (EPOS): a planned research infrastructure for European Solid Earth sciences, integrating national and transnational research infrastructures to enable innovative multidisciplinary research. EPOS involves 256 national research infrastructures, 47 partners (universities and research institutes) from 25 European countries and 4 international organizations. The EPOS integrated platform demands significant coordination between diverse solid Earth disciplinary communities, national research infrastructures and the policies and initiatives they drive, geoscientists and information technologists. The EPOS architecture takes into account governance, legal, financial and technical issues and is designed so that the enterprise works as a single, but distributed, sustainable research infrastructure. A solid management structure is vital for the successful implementation and sustainability of EPOS. The internal organization relies on community-specific Working Packages (WPs), Transversal WPs in charge of the overall EPOS integration and implementation, several governing, executive and advisory bodies, a Project Management Office (PMO) and the Project Coordinator. Driven by the timely debate on gender balance and commitment of the European Commission to promote gender equality in research and innovation, we decided to conduct a mapping exercise on a project that crosses European national borders and that brings together diverse geoscience disciplines under one management structure. We present an analysis of women representation in decision-making positions in each EPOS Working Package (WP Leader, proxy, legal, financial and IT contact persons), in the Boards and Councils and in the PMO, together with statistics on women participation based on the project intranet, which counts more than 500 users. The analysis allows us not only to assess the gender balance in decision-making positions in a pan-European research infrastructure, but also to investigate how women's participation varies with different aspects of the project implementation (management, coordination, legal, financial or technical). Most of the women in EPOS are active geoscientists (academic or in national research institutes), or have a scientific background. By interviewing some of them we report also on how being involved in the project affects their careers. We believe this kind of analysis is an important starting point to promote awareness and achieve gender equality in research and innovation.

  5. Policy Model of Sustainable Infrastructure Development (Case Study : Bandarlampung City, Indonesia)

    NASA Astrophysics Data System (ADS)

    Persada, C.; Sitorus, S. R. P.; Marimin; Djakapermana, R. D.

    2018-03-01

    Infrastructure development affects not only the economic aspect but also the social and environmental aspects, which are the main dimensions of sustainable development. The many aspects and actors involved in urban infrastructure development require a comprehensive and integrated policy towards sustainability. Therefore, it is necessary to formulate an infrastructure development policy that considers the various dimensions of sustainable development. The main objective of this research is to formulate a policy for sustainable infrastructure development. In this research, urban infrastructure covers transportation, water systems (drinking water, storm water, wastewater), green open spaces and solid waste. The research was conducted in Bandarlampung City. The study uses comprehensive modeling, namely Multi Dimensional Scaling (MDS) with the Rapid Appraisal of Infrastructure (Rapinfra), the Analytic Network Process (ANP), and a system dynamics model. The findings of the MDS analysis showed that the sustainability status of Bandarlampung City's infrastructure is less than sustainable. The ANP analysis produced the 8 indicators most influential in the development of sustainable infrastructure. The system dynamics model offered 4 scenarios for a sustainable urban infrastructure policy model. The best scenario was implemented through 3 policies: integrated infrastructure management, population control, and local economic development.

  6. Integrating socio-economic and infrastructural dimension to reveal hazard vulnerability of coastal districts

    NASA Astrophysics Data System (ADS)

    Mazumdar, Jublee; Paul, Saikat

    2015-04-01

    Losses of life and property due to natural hazards have intensified in the past decade, motivating a shift of disaster management away from simple post-event resettlement and rehabilitation. The degree of exposure to hazard for a homogeneous population does not depend entirely on proximity to the source of the hazard event. Socio-economic factors and infrastructural capability play an important role in determining the vulnerability of a place. This study investigates the vulnerability of the eastern coastal states of India to tropical cyclones. The record of the past hundred years shows that the physical vulnerability of the eastern coastal states is four times that of the western coastal states in terms of frequency and intensity of tropical cyclones. These physical factors have played an imperative role in determining the vulnerability of the eastern coast; however, socio-economic and infrastructural factors influence the risk of exposure exponentially. Inclusion of these indicators provides better insight into the preparedness and resilience of settlements to hazard events. In this regard, the present study is an effort to develop an Integrated Vulnerability Model (IVM) based on socio-economic and infrastructural factors for the districts of the eastern coastal states of India. A method is proposed for quantifying the socio-economic and infrastructural vulnerability to tropical cyclones in these districts. The variables included in the study are extracted from the Census of India, 2011 at the district-level administrative unit. In the analysis, a large number of variables are reduced to a smaller number of factors, using principal component analysis, that represent socio-economic and infrastructure vulnerability to tropical cyclones. Subsequently, the factor scores in the Socio-economic Vulnerability Index (SeVI) and Infrastructure Vulnerability Index (InVI) are standardized from 0 to 1, indicating the range from low to high vulnerability. The factor scores are then mapped for spatial analysis. Utilizing SeVI and InVI, the highly vulnerable districts are identified that are likely to face significant challenges in coping with tropical cyclones and that require strategies addressing the various aspects of socio-economic and infrastructural vulnerability. Moreover, this model can be incorporated not only into multi-level governance but can also be integrated with real-time weather forecasts to identify predicted areas of vulnerability.
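
    A minimal sketch, under invented data, of the index construction sequence described above: PCA reduces many district-level variables to a few factors, whose combined scores are rescaled to a 0-1 vulnerability index. The variable count, weighting and scaling choices are illustrative assumptions, not the study's.

    ```python
    # PCA-based vulnerability index sketch on synthetic district-level variables.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler, MinMaxScaler

    rng = np.random.default_rng(3)
    districts = 40
    socio_economic = rng.normal(size=(districts, 12))   # e.g. literacy, poverty, housing, ...

    scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(socio_economic))
    # Combine retained components (equal weights here) and rescale to [0, 1].
    sevi = MinMaxScaler().fit_transform(scores.sum(axis=1, keepdims=True)).ravel()
    print("most vulnerable districts (indices):", np.argsort(sevi)[-5:])
    ```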

  7. Estimating net changes in life-cycle emissions from adoption of emerging civil infrastructure technologies.

    PubMed

    Amponsah, Isaac; Harrison, Kenneth W; Rizos, Dimitris C; Ziehl, Paul H

    2008-01-01

    There is a net emissions change when adopting new materials for use in civil infrastructure design. To evaluate the total net emissions change, one must consider changes in manufacture and associated life-cycle emissions, as well as changes in the quantity of material required. In addition, in principle one should also consider any differences in costs of the two designs because cost savings can be applied to other economic activities with associated environmental impacts. In this paper, a method is presented that combines these considerations to permit an evaluation of the net change in emissions when considering the adoption of emerging technologies/materials for civil infrastructure. The method factors in data on differences between a standard and new material for civil infrastructure, material requirements as specified in designs using both materials, and price information. The life-cycle assessment approach known as economic input-output life-cycle assessment (EIO-LCA) is utilized. A brief background on EIO-LCA is provided because its use is central to the method. The methodology is demonstrated with analysis of a switch from carbon steel to high-performance steel in military bridge design. The results are compared with a simplistic analysis that accounts for the weight reduction afforded by use of the high-performance steel but assuming no differences in manufacture.
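
    The accounting logic described above can be made concrete with a toy worked example: the net change combines the difference in manufacturing emissions for the two material quantities with emissions from re-spending any cost savings elsewhere in the economy (the EIO-LCA idea). Every number below is invented for illustration, not drawn from the paper.

    ```python
    # Toy net-emissions-change calculation for switching bridge materials.
    std_qty_t, new_qty_t = 120.0, 95.0     # tonnes of steel in each design
    std_ef, new_ef = 1.9, 2.3              # t CO2e per tonne produced (illustrative)
    std_price, new_price = 900.0, 1100.0   # $ per tonne (illustrative)
    economy_intensity = 0.4e-3             # t CO2e per $ of average re-spending (illustrative)

    direct_change = new_qty_t * new_ef - std_qty_t * std_ef
    cost_savings = std_qty_t * std_price - new_qty_t * new_price   # negative = new design costs more
    respending = max(cost_savings, 0.0) * economy_intensity

    print(f"direct change: {direct_change:+.1f} t CO2e")
    print(f"re-spending effect: {respending:+.1f} t CO2e")
    print(f"net change: {direct_change + respending:+.1f} t CO2e")
    ```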

  8. AGING WATER INFRASTRUCTURE RESEARCH PROGRAM: ADDRESSING THE CHALLENGE THROUGH INNOVATION

    EPA Science Inventory

    A driving force behind the Sustainable Water Infrastructure (SI) initiative and the Aging Water Infrastructure (AWI) research program is the Clean Water and Drinking Water Infrastructure Gap Analysis. In this report, EPA estimated that if operation, maintenance, and capital inves...

  9. Caribou distribution during the post-calving period in relation to infrastructure in the Prudhoe Bay oil field, Alaska

    USGS Publications Warehouse

    Cronin, Matthew A.; Amstrup, Steven C.; Durner, George M.; Noel, Lynn E.; McDonald, Trent L.; Ballard, Warren B.

    1998-01-01

    There is concern that caribou (Rangifer tarandus) may avoid roads and facilities (i.e., infrastructure) in the Prudhoe Bay oil field (PBOF) in northern Alaska, and that this avoidance can have negative effects on the animals. We quantified the relationship between caribou distribution and PBOF infrastructure during the post-calving period (mid-June to mid-August) with aerial surveys from 1990 to 1995. We conducted four to eight surveys per year with complete coverage of the PBOF. We identified active oil field infrastructure and used a geographic information system (GIS) to construct ten 1 km wide concentric intervals surrounding the infrastructure. We tested whether caribou distribution is related to distance from infrastructure with a chi-squared habitat utilization-availability analysis and log-linear regression. We considered bulls, calves, and total caribou of all sex/age classes separately. The habitat utilization-availability analysis indicated there was no consistent trend of attraction to or avoidance of infrastructure. Caribou frequently were more abundant than expected in the intervals close to infrastructure, and this trend was more pronounced for bulls and for total caribou of all sex/age classes than for calves. Log-linear regressions (with Poisson error structure) of numbers of caribou against distance from infrastructure were also performed, with and without combining data into the 1 km distance intervals. The analysis without intervals revealed no relationship between caribou distribution and distance from oil field infrastructure, or between caribou distribution and Julian date, year, or distance from the Beaufort Sea coast. The log-linear regression with caribou combined into distance intervals showed the density of bulls and total caribou of all sex/age classes declined with distance from infrastructure. Our results indicate that during the post-calving period: 1) caribou distribution is largely unrelated to distance from infrastructure; 2) caribou regularly use habitats in the PBOF; 3) caribou often occur close to infrastructure; and 4) caribou do not appear to avoid oil field infrastructure.
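
    A hedged sketch of the log-linear (Poisson) regression mentioned above, fit to simulated counts rather than the Prudhoe Bay survey data; the sample size and coefficients are arbitrary.

    ```python
    # Poisson GLM of animal counts against distance from infrastructure (simulated data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    distance_km = rng.uniform(0, 10, 300)
    counts = rng.poisson(lam=np.exp(1.2 - 0.05 * distance_km))   # weak simulated decline

    X = sm.add_constant(distance_km)
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(fit.summary())
    print("distance coefficient:", fit.params[1])
    ```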

  10. The computing and data infrastructure to interconnect EEE stations

    NASA Astrophysics Data System (ADS)

    Noferini, F.; EEE Collaboration

    2016-07-01

    The Extreme Energy Event (EEE) experiment is devoted to the search for high energy cosmic rays through a network of telescopes installed in about 50 high schools distributed throughout the Italian territory. This project requires a particular data management infrastructure to collect data registered in stations very far from each other and to allow a coordinated analysis. Such an infrastructure is realized at INFN-CNAF, which operates a Cloud facility based on the OpenStack open-source Cloud framework and provides Infrastructure as a Service (IaaS) for its users. In 2014 EEE started to use it for collecting, monitoring and reconstructing the data acquired in all the EEE stations. For the synchronization between the stations and the INFN-CNAF infrastructure we used BitTorrent Sync, a free peer-to-peer software package designed to optimize data synchronization between distributed nodes. All data folders are synchronized with the central repository in real time to allow an immediate reconstruction of the data and their publication on a monitoring webpage. We present the architecture and the functionalities of this data management system, which provides a flexible environment for the specific needs of the EEE project.

  11. Lifecycle Cost Analysis of Green Infrastructure. U.S. EPA National Stormwater Calculator: Low Impact Development Stormwater Control Cost Estimation Module & Future Enhancements

    EPA Science Inventory

    This presentation will cover the new cost estimation module of the US EPA National Stormwater Calculator and future enhancements, including a new mobile web app version of the tool. The presentation mainly focuses on how the calculator may be used to provide planning level capita...

  12. A model for assessing habitat fragmentation caused by new infrastructures in extensive territories - evaluation of the impact of the Spanish strategic infrastructure and transport plan.

    PubMed

    Mancebo Quintana, S; Martín Ramos, B; Casermeiro Martínez, M A; Otero Pastor, I

    2010-05-01

    The aim of the present work is to design a model for evaluating the impact of planned infrastructures on species survival at the territorial scale by calculating a connectivity index. The method developed involves determining the effective distance of displacement between patches of the same habitat, simplifying earlier models so that there is no dependence on specific variables for each species. A case study is presented in which the model was used to assess the impact of the forthcoming roads and railways included in the Spanish Strategic Infrastructure and Transport Plan (PEIT, in its Spanish initials). This study took into account the habitats of peninsular Spain, which occupies an area of some 500,000 km(2). In this territory, the areas deemed to provide natural habitats are defined by Directive 92/43/EEC. The impact of new infrastructures on connectivity was assessed by comparing two scenarios, with and without the plan, for the major new road and railway networks. The calculation of the connectivity index (CI) requires the use of a raster methodology based on the Arc/Info geographical information system (GIS). The actual calculation was performed using a program written in Arc/Info Macro Language (AML); this program is available in FragtULs (Mancebo Quintana, 2007), a set of tools for calculating indicators of fragmentation caused by transport infrastructure (http://topografia.montes.upm.es/fragtuls.html). The indicator of connectivity proposed allows the estimation of the connectivity between all the patches of a territory, with no artificial (non-ecologically based) boundaries imposed. The model proposed appears to be a useful tool for the analysis of fragmentation caused by plans for large territories. Copyright 2009 Elsevier Ltd. All rights reserved.
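
    A rough sketch of the kind of connectivity index described above: least-cost (effective) distances between habitat patches are computed over a resistance raster and summed through a negative-exponential kernel. The raster, patch locations and kernel constant below are hypothetical, and the paper's Arc/Info AML implementation in FragtULs differs in detail.

    ```python
    # Connectivity index from least-cost distances on a toy resistance raster.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(5)
    resistance = rng.uniform(1, 5, size=(30, 30))     # cost to cross each cell
    resistance[:, 15] = 50.0                          # a hypothetical new road corridor

    # 4-neighbour graph whose edge weights are mean cell resistances.
    G = nx.grid_2d_graph(30, 30)
    for u, v in G.edges:
        G[u][v]["weight"] = (resistance[u] + resistance[v]) / 2

    patches = [(5, 5), (25, 5), (5, 25), (25, 25)]    # patch centroids (cells)
    ci = 0.0
    for i in range(len(patches)):
        dists = nx.single_source_dijkstra_path_length(G, patches[i], weight="weight")
        for j in range(i + 1, len(patches)):
            ci += np.exp(-0.05 * dists[patches[j]])   # negative-exponential kernel
    print(f"connectivity index: {ci:.3f}")
    ```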

  13. To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne

    2012-08-01

    A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.

  14. Critical Infrastructure Vulnerability to Spatially Localized Failures with Applications to Chinese Railway System.

    PubMed

    Ouyang, Min; Tian, Hui; Wang, Zhenghua; Hong, Liu; Mao, Zijun

    2017-01-17

    This article studies a general type of initiating event in critical infrastructures, called spatially localized failures (SLFs), defined as the failure of a set of infrastructure components distributed in a spatially localized area due to damage sustained, while other components outside the area do not directly fail. These failures can be regarded as a special type of intentional attack, such as a bomb or explosive assault, or as a generalized modeling of the impact of localized natural hazards on large-scale systems. This article introduces three SLFs models: node-centered SLFs, district-based SLFs, and circle-shaped SLFs, and proposes an SLFs-induced vulnerability analysis method covering three aspects: identification of critical locations; comparison of infrastructure vulnerability to random failures, topologically localized failures and SLFs; and quantification of infrastructure information value. The proposed SLFs-induced vulnerability analysis method is finally applied to the Chinese railway system and can also be easily adapted to analyze other critical infrastructures for valuable protection suggestions. © 2017 Society for Risk Analysis.
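
    A minimal sketch, on a random geometric network rather than the Chinese railway system, of the circle-shaped SLFs model named above: all nodes within a radius of a candidate centre fail, and vulnerability is read off as the loss of the giant connected component. The network, radius and metric are illustrative assumptions.

    ```python
    # Circle-shaped spatially localized failures on a synthetic spatial network.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(6)
    n = 200
    coords = {i: rng.uniform(0, 100, 2) for i in range(n)}
    G = nx.random_geometric_graph(n, radius=12, pos={i: tuple(c) for i, c in coords.items()})

    def residual_fraction(centre, r):
        """Fraction of nodes left in the giant component after a circular failure."""
        failed = [i for i, c in coords.items() if np.linalg.norm(c - centre) <= r]
        H = G.copy()
        H.remove_nodes_from(failed)
        if H.number_of_nodes() == 0:
            return 0.0
        return max(len(c) for c in nx.connected_components(H)) / n

    # Scan candidate centres to identify the most critical location.
    worst = min(((residual_fraction(coords[i], 15.0), i) for i in range(n)))
    print(f"most critical centre: node {worst[1]}, residual giant component {worst[0]:.2f}")
    ```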

  15. Research infrastructure support to address ecosystem dynamics

    NASA Astrophysics Data System (ADS)

    Los, Wouter

    2014-05-01

    Predicting the evolution of ecosystems in response to climate change or human pressures is a challenge. Even understanding past or current processes is complicated as a result of the many interactions and feedbacks that occur within and between components of the system. This talk will present an example of current research on changes in landscape evolution, hydrology, soil biogeochemical processes, zoological food webs, and plant community succession, and how these affect feedbacks to components of the systems, including the climate system. Multiple observations, experiments, and simulations provide a wealth of data, but not necessarily understanding. Model development on the coupled processes at different spatial and temporal scales is sensitive to variations in data and to parameter changes. Fast high-performance computing may help to visualize the effect of these changes and the potential stability (and reliability) of the models. This may then allow for iteration between data production and models towards stable models, reducing uncertainty and improving the prediction of change. The role of research infrastructures becomes crucial in overcoming barriers to such research. Environmental infrastructures cover physical site facilities, dedicated instrumentation and e-infrastructure. The LifeWatch infrastructure for biodiversity and ecosystem research will provide services for data integration, analysis and modeling. But it has to cooperate intensively with the other kinds of infrastructures in order to support the iteration between data production and model computation. The cooperation in the ENVRI project (Common operations of environmental research infrastructures) is one of the initiatives to foster such multidisciplinary research.

  16. Sea Level Rise Impacts On Infrastructure Vulnerability

    NASA Astrophysics Data System (ADS)

    Pasqualini, D.; Mccown, A. W.; Backhaus, S.; Urban, N. M.

    2015-12-01

    An increase in global sea level is one of the potential consequences of climate change and represents a threat to U.S. coastal regions, which are highly populated and home to critical infrastructure. The potential danger caused by sea level rise may escalate if it is coupled with an increase in the frequency and intensity of storms that may strike these regions. These coupled threats present a clear risk to population and critical infrastructure and are concerns for Federal, State, and particularly local response and recovery planners. Understanding the effect of sea level rise on the risk to critical infrastructure is crucial for long-term planning and for mitigating potential damages. In this work we quantify how infrastructure vulnerability to a range of storms changes due to an increase in sea level. Our study focuses on the Norfolk area of the U.S. We assess the direct damage to drinking water and wastewater facilities and to the power sector caused by a distribution of synthetic hurricanes. In addition, our analysis estimates the indirect consequences of these damages on population and economic activities, also accounting for interdependencies across infrastructures. While projections unanimously indicate an increase in the rate of sea level rise, the scientific community does not agree on the size of this rate. Our risk assessment accounts for this uncertainty by simulating a distribution of sea level rise for a specific climate scenario. Using our impact assessment results and assuming an increase in future hurricane frequencies and intensities, we also estimate the expected benefits for critical infrastructure.

  17. Overset Grid Methods Applied to Nonlinear Potential Flows

    NASA Technical Reports Server (NTRS)

    Holst, Terry; Kwak, Dochan (Technical Monitor)

    2000-01-01

    The objectives of this viewgraph presentation are to develop a Chimera-based potential-flow methodology that is compatible with OVERFLOW and the OVERFLOW infrastructure, creating options for an advanced problem-solving environment, and to significantly reduce turnaround time for aerodynamic analysis and design (primarily at cruise conditions).

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Jean-Paul; Guttromson, Ross; Silva-Monroy, Cesar

    This report has been written for the Department of Energy’s Energy Policy and Systems Analysis Office to inform their writing of the Quadrennial Energy Review in the area of energy resilience. The topics of measuring and increasing energy resilience are addressed, including definitions, means of measuring, and analytic methodologies that can be used to make decisions for policy, infrastructure planning, and operations. A risk-based framework is presented which provides a standard definition of a resilience metric. Additionally, a process is identified which explains how the metrics can be applied. Research and development is articulated that will further accelerate the resilience of energy infrastructures.

  19. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
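
    As a toy illustration of the approach described above, the sketch below keeps run metadata in a relational store so that a query, rather than ad hoc file parsing, selects the data entering an analysis. The schema and values are invented, and the paper's actual system uses a full DBMS plus cluster and Grid resources rather than in-memory SQLite.

    ```python
    # Relational storage of experiment metadata so queries become part of the analysis.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE runs (
        subject TEXT, task TEXT, tr_seconds REAL, n_volumes INTEGER)""")
    con.executemany(
        "INSERT INTO runs VALUES (?, ?, ?, ?)",
        [("s01", "listening", 2.0, 240), ("s01", "rest", 2.0, 180),
         ("s02", "listening", 1.5, 320)],
    )

    # A query that could feed a group analysis: total usable scan time per task,
    # restricted to runs long enough to estimate the model.
    rows = con.execute("""
        SELECT task, SUM(tr_seconds * n_volumes) AS seconds
        FROM runs WHERE n_volumes >= 200 GROUP BY task""").fetchall()
    print(rows)
    ```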

  20. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  1. IT Infrastructure Projects: A Framework for Analysis. ECAR Research Bulletin

    ERIC Educational Resources Information Center

    Grochow, Jerrold M.

    2014-01-01

    Just as maintaining a healthy infrastructure of water delivery and roads is essential to the functioning of cities and towns, maintaining a healthy infrastructure of information technology is essential to the functioning of universities. Deterioration in IT infrastructure can lead to deterioration in research, teaching, and administration. Given…

  2. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment, in particular for the first steps of skimming rapidly through hundreds of TB of low-relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters, and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
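
    A hedged PySpark sketch of the skimming-and-aggregation step described above: filter a large set of semi-structured monitoring records down to the relevant subset and aggregate per host. The inline records, field names and metric are invented stand-ins for the hundreds of TB of low-relevance data mentioned in the abstract.

    ```python
    # Skim and aggregate monitoring records with Spark (toy inline data).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("metrics-skim").getOrCreate()
    records = [
        {"host": "node01", "metric": "load", "value": 3.2, "ts": "2017-01-01T00:00"},
        {"host": "node01", "metric": "load", "value": 9.7, "ts": "2017-01-01T00:05"},
        {"host": "node02", "metric": "temp", "value": 41.0, "ts": "2017-01-01T00:00"},
    ]
    df = spark.createDataFrame(records)

    # Keep only the metric of interest, then aggregate per host.
    relevant = df.filter(F.col("metric") == "load")
    summary = relevant.groupBy("host").agg(F.avg("value").alias("mean_load"),
                                           F.max("value").alias("peak_load"))
    summary.show()
    ```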

  3. In Situ Methods, Infrastructures, and Applications on High Performance Computing Platforms, a State-of-the-art (STAR) Report

    DOE PAGES

    Bethel, EW; Bauer, A; Abbasi, H; ...

    2016-06-10

    The considerable interest in the high performance computing (HPC) community in analyzing and visualizing data without first writing to disk, i.e., in situ processing, is due to several factors. First is an I/O cost savings, where data is analyzed/visualized while being generated, without first storing to a filesystem. Second is the potential for increased accuracy, where fine temporal sampling of transient analysis might expose some complex behavior missed in coarse temporal sampling. Third is the ability to use all available resources, CPUs and accelerators, in the computation of analysis products. This STAR paper brings together researchers, developers and practitioners using in situ methods in extreme-scale HPC with the goal of presenting existing methods, infrastructures, and a range of computational science and engineering applications using in situ analysis and visualization.

  4. Constructing Pairing-Friendly Elliptic Curves under Embedding Degree 1 for Securing Critical Infrastructures.

    PubMed

    Wang, Maocai; Dai, Guangming; Choo, Kim-Kwang Raymond; Jayaraman, Prem Prakash; Ranjan, Rajiv

    2016-01-01

    Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector because a user's public key can be computed directly from the user's identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve the efficiency of identity-based cryptography, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on an analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Then, three related algorithms to construct pairing-friendly elliptic curves are put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation also indicates that it is more efficient to compute the Tate pairing with our curves than with those of Bertoni et al.

  5. Infrastructure for deployment of power systems

    NASA Technical Reports Server (NTRS)

    Sprouse, Kenneth M.

    1991-01-01

    A preliminary effort in characterizing the types of stationary lunar power systems which may be considered for emplacement on the lunar surface from the proposed initial 100-kW unit in 2003 to later units ranging in power from 25 to 825 kW is presented. Associated with these power systems are their related infrastructure hardware including: (1) electrical cable, wiring, switchgear, and converters; (2) deployable radiator panels; (3) deployable photovoltaic (PV) panels; (4) heat transfer fluid piping and connection joints; (5) power system instrumentation and control equipment; and (6) interface hardware between lunar surface construction/maintenance equipment and power system. This report: (1) presents estimates of the mass and volumes associated with these power systems and their related infrastructure hardware; (2) provides task breakdown description for emplacing this equipment; (3) gives estimated heat, forces, torques, and alignment tolerances for equipment assembly; and (4) provides other important equipment/machinery requirements where applicable. Packaging options for this equipment will be discussed along with necessary site preparation requirements. Design and analysis issues associated with the final emplacement of this power system hardware are also described.

  6. Constructing Pairing-Friendly Elliptic Curves under Embedding Degree 1 for Securing Critical Infrastructures

    PubMed Central

    Dai, Guangming

    2016-01-01

    Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector because a user's public key can be computed directly from the user's identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve the efficiency of identity-based cryptography, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on an analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Then, three related algorithms to construct pairing-friendly elliptic curves are put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation also indicates that it is more efficient to compute the Tate pairing with our curves than with those of Bertoni et al. PMID:27564373

  7. Biomedical image analysis and processing in clouds

    NASA Astrophysics Data System (ADS)

    Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John

    2013-10-01

    The Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and gives researchers access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.

  8. A proto-Data Processing Center for LISA

    NASA Astrophysics Data System (ADS)

    Cavet, Cécile; Petiteau, Antoine; Le Jeune, Maude; Plagnol, Eric; Marin-Martholaz, Etienne; Bayle, Jean-Baptiste

    2017-05-01

    Preparation for the LISA mission requires studying and defining a new data analysis framework, capable of dealing with highly heterogeneous CPU needs and of exploiting emergent information technologies. In this context, a prototype of the mission's Data Processing Center (DPC) has been initiated. The DPC is designed to efficiently manage computing constraints and to offer a common infrastructure where the whole collaboration can contribute to development work. Several tools, such as continuous integration (CI), have already been delivered to the collaboration and are presently used for simulations and performance studies. This article presents the progress made on this collaborative environment and also discusses possible next steps towards an on-demand computing infrastructure. This activity is supported by CNES as part of the French contribution to LISA.

  9. [Improvement of sanitary and epidemiological safety of rail transport--a requirement of the new legislation of the Russian Federation].

    PubMed

    2012-01-01

    A brief analysis of the legal framework of recent years, both in the sphere of technical regulation and in the field of sanitary and epidemiological welfare of the population, is presented in this article. The necessity of including requirements for sanitary-epidemiological safety and hygiene regulations in the technical regulations for the safety of railway rolling stock and elements of railway infrastructure has been demonstrated. Fragments of technical regulations for railway equipment and infrastructure elements, including the basic requirements for sanitary-epidemiological security, are presented. The authors' position on reworking the regulatory framework in the field of sanitary-epidemiological welfare of the population into standardization documents, in accordance with the requirements of the federal law "On Technical Regulation", is also presented.

  10. Toward Information Infrastructure Studies: Ways of Knowing in a Networked Environment

    NASA Astrophysics Data System (ADS)

    Bowker, Geoffrey C.; Baker, Karen; Millerand, Florence; Ribes, David

    This article presents Information Infrastructure Studies, a research area that takes up some core issues in digital information and organization research. Infrastructure Studies simultaneously addresses the technical, social, and organizational aspects of the development, usage, and maintenance of infrastructures in local communities as well as global arenas. While infrastructure is understood as a broad category referring to a variety of pervasive, enabling network resources such as railroad lines, plumbing and pipes, electrical power plants and wires, this article focuses on information infrastructure, such as computational services and help desks, or federating activities such as scientific data repositories and archives spanning the multiple disciplines needed to address such issues as climate warming and the biodiversity crisis. These are elements associated with the internet and, frequently today, associated with cyberinfrastructure or e-science endeavors. We argue that a theoretical understanding of infrastructure provides the context for needed dialogue between design, use, and sustainability of internet-based infrastructure services. This article outlines this research area and its overarching themes. Part one of the paper presents definitions for infrastructure and cyberinfrastructure, reviewing salient previous work. Part two portrays key ideas from infrastructure studies (knowledge work, social and political values, new forms of sociality, etc.). In closing, the character of the field today is considered.

  11. Software and hardware infrastructure for research in electrophysiology

    PubMed Central

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646

  12. Software and hardware infrastructure for research in electrophysiology.

    PubMed

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Rondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Stěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software.

  13. Evaluating Investments in Natural Gas Vehicles and Infrastructure for Your Fleet: Vehicle Infrastructure Cash-Flow Estimation -- VICE 2.0; Clean Cities, Energy Efficiency & Renewable Energy (EERE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzales, John

    2015-04-02

    Presentation by Senior Engineer John Gonzales on Evaluating Investments in Natural Gas Vehicles and Infrastructure for Your Fleet using the Vehicle Infrastructure Cash-flow Estimation (VICE) 2.0 model.

  14. [Relationship between water supply, sanitation, public health, and environment: elements for the formulation of a sanitary infrastructure planning model].

    PubMed

    Soares, Sérgio R A; Bernardes, Ricardo S; Netto, Oscar de M Cordeiro

    2002-01-01

    Understanding the relations among sanitation infrastructure, public health, and the environment is a fundamental prerequisite for planning sanitation infrastructure in urban areas. This article thus suggests elements for developing a planning model for sanitation infrastructure. The authors performed a historical survey of environmental and public health issues related to the sector, an analysis of the conceptual frameworks involving public health and sanitation systems, and a systematization of the various effects that water supply and sanitation have on public health and the environment. Evaluation of these effects should guarantee the correct analysis of possible alternatives, address environmental and public health objectives (the main purpose of sanitation infrastructure), and provide the most reasonable indication of actions. The suggested systematization of sanitation systems' effects at each step of their implementation is an advance, given the association between these fundamental elements, towards formulating a planning model for sanitation infrastructure.

  15. A generally applicable lightweight method for calculating a value structure for tools and services in bioinformatics infrastructure projects.

    PubMed

    Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin

    2017-10-30

    Sustainable noncommercial bioinformatics infrastructures are a prerequisite to use and take advantage of the potential of big data analysis for research and economy. Consequently, funders, universities and institutes as well as users ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of tools and services offered without determining exactly the total costs of ownership. Five representative scenarios for value estimation from a rough estimation to a detailed breakdown of costs are presented. To account for the diversity in bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given on how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of payment of services by funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.
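
    In the spirit of the "service provision unit" idea described above, the toy calculation below combines units delivered, personnel hours and a share of indirect costs into an estimated annual value per service; all category names, rates and figures are hypothetical, not de.NBI numbers.

    ```python
    # Toy value estimation per service from provision units, personnel time and overhead.
    services = {
        # units per year, cost per unit (EUR), personnel hours, share of overhead
        "tool_usage":  {"units": 12000, "unit_cost": 0.15, "hours": 300, "overhead": 0.2},
        "db_analysis": {"units": 4000,  "unit_cost": 0.60, "hours": 500, "overhead": 0.3},
        "training":    {"units": 20,    "unit_cost": 450., "hours": 400, "overhead": 0.1},
    }
    hourly_rate = 45.0          # EUR per personnel hour (illustrative)
    indirect_costs = 25000.0    # e.g. electricity, housing, administration (illustrative)

    for name, s in services.items():
        value = (s["units"] * s["unit_cost"]
                 + s["hours"] * hourly_rate
                 + s["overhead"] * indirect_costs)
        print(f"{name}: estimated annual value {value:,.0f} EUR")
    ```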

  16. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been growing rapidly in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipeline facilities are considered European critical infrastructure sectors, as per Annex I of Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above-mentioned Directive. In the field of risk assessment, there is a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) covers the identification of hazards (including possible natural hazards) for each installation/section within a given site; it is followed by a criterial analysis and then by a detailed analysis step. The criterial evaluation is used as a ranking system to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include them all in the detailed risk assessment, which can be time and resource consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn about the overall risk characteristics of the site. The proposed methodology can thus be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of critical infrastructure, mainly the oil and gas subsectors. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
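    The criterial (ranking) step screens installations so that only significant contributors enter the detailed assessment. A minimal sketch of such a screening is shown below; the criteria, weights, scores and threshold are invented for illustration and are not the criteria proposed in the paper.

    ```python
    # Sketch of a criterial (ranking) step: score each installation against a few
    # criteria and keep only those above a threshold for the detailed risk assessment.
    # Criteria, weights, scores and the threshold are illustrative assumptions.

    installations = {
        "crude storage tank farm": {"inventory": 5, "substance_hazard": 4, "proximity": 3},
        "gas compression station": {"inventory": 3, "substance_hazard": 4, "proximity": 2},
        "office and workshop area": {"inventory": 1, "substance_hazard": 1, "proximity": 2},
    }
    weights = {"inventory": 0.5, "substance_hazard": 0.3, "proximity": 0.2}
    threshold = 2.5  # minimum weighted score for inclusion in the detailed assessment

    def weighted_score(criteria_scores):
        return sum(weights[c] * v for c, v in criteria_scores.items())

    scores = {name: weighted_score(s) for name, s in installations.items()}
    selected = {name: sc for name, sc in scores.items() if sc >= threshold}

    for name, score in sorted(selected.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {score:.2f} -> detailed risk assessment")
    ```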

  17. Cargo Logistics Airlift Systems Study (CLASS). Volume 1: Analysis of current air cargo system

    NASA Technical Reports Server (NTRS)

    Burby, R. J.; Kuhlman, W. H.

    1978-01-01

    The material presented in this volume is classified into the following sections: (1) analysis of current routes; (2) air eligibility criteria; (3) current direct support infrastructure; (4) comparative mode analysis; (5) political and economic factors; and (6) future potential market areas. An effort was made to keep the observations and findings relating to the current systems as objective as possible in order not to bias the analysis of future air cargo operations reported in Volume 3 of the CLASS final report.

  18. Thoughts on Beijing's Long-Term Rural Infrastructure Management and Protection Issues from the Perspective of the Government to Effectively Perform Their Duties

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    Strengthening rural infrastructure management and giving full play to the benefits of infrastructure is of great significance for promoting rural economic and social development. Issues of protection, use, and facility energy use are prominent in Beijing's rural infrastructure management. Based on a comprehensive and detailed analysis of the causes of these problems, this paper puts forward concrete, feasible countermeasures for the government to perform its functions effectively with respect to rural infrastructure: clarifying property ownership; implementing a special-funds audit system for rural infrastructure management; implementing assessment methods for rural infrastructure maintenance and management; and so on.

  19. Comparing drinking water treatment costs to source water protection costs using time series analysis.

    EPA Science Inventory

    We present a framework to compare water treatment costs to source water protection costs, an important knowledge gap for drinking water treatment plants (DWTPs). This trade-off helps to determine what incentives a DWTP has to invest in natural infrastructure or pollution reductio...

  20. Climate and change: simulating flooding impacts on urban transport network

    NASA Astrophysics Data System (ADS)

    Pregnolato, Maria; Ford, Alistair; Dawson, Richard

    2015-04-01

    National-scale climate projections indicate that in the future there will be hotter and drier summers, warmer and wetter winters, together with rising sea levels. The frequency of extreme weather events is expected to increase, causing severe damage to the built environment and disruption of infrastructures (Dawson, 2007), whilst population growth and changing demographics are placing new demands on urban infrastructure. It is therefore essential to ensure infrastructure networks are robust to these changes. This research addresses these challenges by focussing on the development of probabilistic tools for managing risk by modelling urban transport networks within the context of extreme weather events. This paper presents a methodology to investigate the impacts of extreme weather events on the urban environment, in particular infrastructure networks, through a combination of climate simulations and spatial representations. By overlaying spatial data on hazard thresholds from a flood model and a flood safety function, mitigated by potential adaptation strategies, different levels of disruption to commuting journeys on road networks are evaluated. The method follows the catastrophe modelling approach and consists of a spatial model combining deterministic loss models and probabilistic risk assessment techniques. It can be applied to present conditions as well as uncertain future scenarios, allowing the examination of impacts alongside socio-economic and climate changes. The hazard is determined by simulating free surface water flooding with the software CityCAT (Glenis et al., 2013). The outputs are overlaid onto the spatial locations of a simple network model in GIS, which uses journey-to-work (JTW) observations supplemented with speed and capacity information. To calculate the disruptive effect of flooding on transport networks, a function relating water depth to safe driving speed has been developed by combining data from experimental reports (Morris et al., 2011), safety literature (Great Britain Department for Transport, 1999), analysis of videos of cars driving through floodwater, and expert judgement. A preliminary analysis has been run in the Tyne & Wear region (North-East England) to demonstrate how the approach can be used to assess disruption to commuter journeys due to flooding; this application is demonstrated in the paper. The research will also investigate the effectiveness of adaptation strategies for extreme rainfall events, such as permeable surfaces and roof storage for buildings. Multiple scenarios (from everyday rainfall to extreme weather phenomena) will be modelled, with different rainfall rates, rainfall durations and return periods. Scenarios in which no interventions are adopted will be compared with those improved by one of the adaptation options to determine the cost-effectiveness of the solutions considered. Integrating spatial analysis of transport use with an urban flood model and flood safety function enables the investigation of the impacts of extreme weather on infrastructure networks. Further work will develop the analysis in a number of ways: (i) testing a range of flood events with different severity and frequency, (ii) exploring the influence of climate and socio-economic change, (iii) analysing multiple hazard events, and (iv) considering cascading disruption across different infrastructure networks.
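    A depth-disruption function of the kind described maps flood water depth on a road link to a safe driving speed and hence a travel-time penalty. The toy sketch below uses an assumed piecewise-linear shape and an assumed 30 cm impassability cut-off; it is not the curve fitted by the authors.

    ```python
    # Toy depth-disruption function for road links: water depth on a link reduces the
    # safe driving speed, which inflates travel time. The piecewise-linear shape and
    # the 30 cm cut-off are assumptions, not the fitted curve from the paper.

    def safe_speed_kmh(depth_m, free_flow_kmh=48.0, max_passable_depth_m=0.30):
        if depth_m <= 0.0:
            return free_flow_kmh
        if depth_m >= max_passable_depth_m:
            return 0.0  # link treated as impassable
        return free_flow_kmh * (1.0 - depth_m / max_passable_depth_m)

    def delay_factor(depth_m):
        """Ratio of flooded to dry travel time on a link (infinite if impassable)."""
        v = safe_speed_kmh(depth_m)
        return float("inf") if v == 0.0 else safe_speed_kmh(0.0) / v

    for d in (0.0, 0.05, 0.15, 0.35):
        print(f"depth {d:.2f} m -> safe speed {safe_speed_kmh(d):.1f} km/h, delay x{delay_factor(d):.2f}")
    ```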

  1. Geovisualization applications to examine and explore high-density and hierarchical critical infrastructure data

    NASA Astrophysics Data System (ADS)

    Edsall, Robert; Hembree, Harvey

    2018-05-01

    The geospatial research and development team in the National and Homeland Security Division at Idaho National Laboratory was tasked with providing tools to derive insight from the substantial amount of data currently available - and continuously being produced - associated with the critical infrastructure of the US. This effort is in support of the Department of Homeland Security, whose mission includes the protection of this infrastructure and the enhancement of its resilience to hazards, both natural and human. We present geovisual-analytics-based approaches for analysis of vulnerabilities and resilience of critical infrastructure, designed so that decision makers, analysts, and infrastructure owners and managers can manage risk, prepare for hazards, and direct resources before and after an incident that might result in an interruption in service. Our designs are based on iterative discussions with DHS leadership and analysts, who in turn will use these tools to explore and communicate data in partnership with utility providers, law enforcement, and emergency response and recovery organizations, among others. In most cases these partners desire summaries of large amounts of data, but increasingly, our users seek the additional capability of focusing on, for example, a specific infrastructure sector, a particular geographic region, or time period, or of examining data in a variety of generalization or aggregation levels. These needs align well with tenets of information-visualization design; in this paper, selected applications among those that we have designed are described and positioned within geovisualization, geovisual analytical, and information visualization frameworks.

  2. A case analysis of INFOMED: the Cuban national health care telecommunications network and portal.

    PubMed

    Séror, Ann C

    2006-01-27

    The Internet and telecommunications technologies contribute to national health care system infrastructures and extend global health care services markets. The Cuban national health care system offers a model to show how a national information portal can contribute to system integration, including research, education, and service delivery as well as international trade in products and services. The objectives of this paper are (1) to present the context of the Cuban national health care system since the revolution in 1959, (2) to identify virtual institutional infrastructures of the system associated with the Cuban National Health Care Telecommunications Network and Portal (INFOMED), and (3) to show how they contribute to Cuban trade in international health care service markets. Qualitative case research methods were used to identify the integrated virtual infrastructure of INFOMED and to show how it reflects socialist ideology. Virtual institutional infrastructures include electronic medical and information services and the structure of national networks linking such services. Analysis of INFOMED infrastructures shows integration of health care information, research, and education as well as the interface between Cuban national information networks and the global Internet. System control mechanisms include horizontal integration and coordination through virtual institutions linked through INFOMED, and vertical control through the Ministry of Public Health and the government hierarchy. Telecommunications technology serves as a foundation for a dual market structure differentiating domestic services from international trade. INFOMED is a model of interest for integrating health care information, research, education, and services. The virtual infrastructures linked through INFOMED support the diffusion of Cuban health care products and services in global markets. Transferability of this model is contingent upon ideology and interpretation of values such as individual intellectual property and confidentiality of individual health information. Future research should focus on examination of these issues and their consequences for global markets in health care.

  3. A Case Analysis of INFOMED: The Cuban National Health Care Telecommunications Network and Portal

    PubMed Central

    2006-01-01

    Background The Internet and telecommunications technologies contribute to national health care system infrastructures and extend global health care services markets. The Cuban national health care system offers a model to show how a national information portal can contribute to system integration, including research, education, and service delivery as well as international trade in products and services. Objective The objectives of this paper are (1) to present the context of the Cuban national health care system since the revolution in 1959, (2) to identify virtual institutional infrastructures of the system associated with the Cuban National Health Care Telecommunications Network and Portal (INFOMED), and (3) to show how they contribute to Cuban trade in international health care service markets. Methods Qualitative case research methods were used to identify the integrated virtual infrastructure of INFOMED and to show how it reflects socialist ideology. Virtual institutional infrastructures include electronic medical and information services and the structure of national networks linking such services. Results Analysis of INFOMED infrastructures shows integration of health care information, research, and education as well as the interface between Cuban national information networks and the global Internet. System control mechanisms include horizontal integration and coordination through virtual institutions linked through INFOMED, and vertical control through the Ministry of Public Health and the government hierarchy. Telecommunications technology serves as a foundation for a dual market structure differentiating domestic services from international trade. Conclusions INFOMED is a model of interest for integrating health care information, research, education, and services. The virtual infrastructures linked through INFOMED support the diffusion of Cuban health care products and services in global markets. Transferability of this model is contingent upon ideology and interpretation of values such as individual intellectual property and confidentiality of individual health information. Future research should focus on examination of these issues and their consequences for global markets in health care. PMID:16585025

  4. The political economy of United States multiutilities: The United States electric power industry and communication services

    NASA Astrophysics Data System (ADS)

    Quail, Christine M.

    This study consists of a political economic analysis of the multiutility industry, the industry located at the confluence of electric utilities, telephone, cable, and Internet markets. The study uses a theoretical framework based in political economy and urban theory. Methodologies used include industrial analysis and instrumental analysis. A discussion of technological convergence establishes the technical means by which multiutilities developed. Refusing technological determinism, however, the study presents a critical analysis of the history, philosophy, and regulation of utilities. Distinctions are made between public and private ownership structures in the electric utility industry. Next, the study embarks on an industrial analysis of the multiutility industry. The industrial analysis includes a discussion of the industry's history, markets, ownership types, and legal struggles. Following the broad industrial overview, two case studies are presented: Hawarden Integrated Technology, Energy and Communications (HITEC), and Con Edison Communications, LLC. HITEC is a public multiutility in the City of Hawarden, Iowa. Con Edison Communications is a private multiutility, based in New York City. The case studies provide a vehicle by which theoretical and philosophical underpinnings, as well as general trends, in the multiutility industry are localized and concretized. Finally, the study draws conclusions about the nature, history, and future of public versus private control of multiutilities' converged communications infrastructures. Questions of democratic control of media infrastructures are raised.

  5. Time Series Analysis of Energy Production and Associated Landscape Fragmentation in the Eagle Ford Shale Play.

    PubMed

    Pierre, Jon Paul; Young, Michael H; Wolaver, Brad D; Andrews, John R; Breton, Caroline L

    2017-11-01

    Spatio-temporal trends in infrastructure footprints, energy production, and landscape alteration were assessed for the Eagle Ford Shale of Texas. The period of analysis was over four 2-year periods (2006-2014). Analyses used high-resolution imagery, as well as pipeline data to map EF infrastructure. Landscape conditions from 2006 were used as baseline. Results indicate that infrastructure footprints varied from 94.5 km² in 2008 to 225.0 km² in 2014. By 2014, decreased land-use intensities (ratio of land alteration to energy production) were noted play-wide. Core-area alteration by period was highest (3331.6 km²) in 2008 at the onset of play development, and increased from 582.3 to 3913.9 km² by 2014, though substantial revegetation of localized core areas was observed throughout the study (i.e., alteration improved in some areas and worsened in others). Land-use intensity in the eastern portion of the play was consistently lower than that in the western portion, while core alteration remained relatively constant east to west. Land alteration from pipeline construction was ~65 km² for all time periods, except in 2010 when alteration was recorded at 47 km². Percent of total alteration from well-pad construction increased from 27.3% in 2008 to 71.5% in 2014. The average number of wells per pad across all 27 counties increased from 1.15 to 1.7. This study presents a framework for mapping landscape alteration from oil and gas infrastructure development. However, the framework could be applied to other energy development programs, such as wind or solar fields, or any other regional infrastructure development program.
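    Land-use intensity as used here is simply the ratio of altered land area to energy produced in a period. A minimal sketch of that ratio is shown below; the 2008 and 2014 footprint areas follow the abstract, while the intermediate footprints and all production figures are invented placeholders, not Eagle Ford data.

    ```python
    # Land-use intensity = altered land area / energy produced, tracked per 2-year period.
    # 2008 and 2014 footprints follow the abstract; 2010/2012 footprints and all
    # production figures (BOE) are placeholders invented purely to illustrate the ratio.

    footprints_km2 = {2008: 94.5, 2010: 130.0, 2012: 180.0, 2014: 225.0}     # 2010/2012 assumed
    production_mmboe = {2008: 40.0, 2010: 160.0, 2012: 420.0, 2014: 700.0}   # assumed

    for year in sorted(footprints_km2):
        intensity = footprints_km2[year] / production_mmboe[year]  # km2 per million BOE
        print(f"{year}: {intensity:.3f} km² per million BOE")
    ```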

  6. Time Series Analysis of Energy Production and Associated Landscape Fragmentation in the Eagle Ford Shale Play

    NASA Astrophysics Data System (ADS)

    Pierre, Jon Paul; Young, Michael H.; Wolaver, Brad D.; Andrews, John R.; Breton, Caroline L.

    2017-11-01

    Spatio-temporal trends in infrastructure footprints, energy production, and landscape alteration were assessed for the Eagle Ford Shale of Texas. The period of analysis was over four 2-year periods (2006-2014). Analyses used high-resolution imagery, as well as pipeline data to map EF infrastructure. Landscape conditions from 2006 were used as baseline. Results indicate that infrastructure footprints varied from 94.5 km2 in 2008 to 225.0 km2 in 2014. By 2014, decreased land-use intensities (ratio of land alteration to energy production) were noted play-wide. Core-area alteration by period was highest (3331.6 km2) in 2008 at the onset of play development, and increased from 582.3 to 3913.9 km2 by 2014, though substantial revegetation of localized core areas was observed throughout the study (i.e., alteration improved in some areas and worsened in others). Land-use intensity in the eastern portion of the play was consistently lower than that in the western portion, while core alteration remained relatively constant east to west. Land alteration from pipeline construction was 65 km2 for all time periods, except in 2010 when alteration was recorded at 47 km2. Percent of total alteration from well-pad construction increased from 27.3% in 2008 to 71.5% in 2014. The average number of wells per pad across all 27 counties increased from 1.15 to 1.7. This study presents a framework for mapping landscape alteration from oil and gas infrastructure development. However, the framework could be applied to other energy development programs, such as wind or solar fields, or any other regional infrastructure development program.

  7. Solving the Software Legacy Problem with RISA

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Gabriel, C.

    2012-09-01

    Nowadays hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment life time. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple data processing software and infrastructure life-cycles, using Java applications and web-service wrappers around existing software. This architecture employs embedded SAS in virtual machines, ensuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.

  8. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  9. Predictive Power of Clean Bed Filtration Theory for Fecal Indicator Bacteria Removal in Stormwater Biofilters

    NASA Astrophysics Data System (ADS)

    Parker, E.; Rippy, M.; Mehring, A.; Winfrey, B.; Ambrose, R. F.; Levin, L. A.; Grant, S. B.

    2017-12-01

    Green infrastructure (also referred to as low impact development, or LID) has the potential to transform urban stormwater runoff from an environmental threat to a valuable water resource. Here we focus on the removal of fecal indicator bacteria (FIB, a pollutant responsible for runoff associated inland and coastal beach closures) in stormwater biofilters (a common type of green infrastructure). Drawing on a combination of previously published and new laboratory studies of FIB removal in biofilters, we find that 66% of the variance in FIB removal rates can be explained by clean bed filtration theory (CBFT, 31%), antecedent dry period (14%), study effect (8%), biofilter age (7%), and the presence or absence of shrubs (6%). Our analysis suggests that, with the exception of shrubs, plants affect FIB removal indirectly by changing the infiltration rate, not directly by changing the FIB removal mechanisms or altering filtration rates in ways not already accounted for by CBFT. The analysis presented here represents a significant step forward in our understanding of how physicochemical theories (such as CBFT) can be melded with hydrology, engineering design, and ecology to improve the water quality benefits of green infrastructure.
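    The clean bed filtration calculation behind CBFT can be illustrated with the standard Yao-type expression relating removal to bed depth, porosity, collector size, attachment efficiency, and single-collector contact efficiency. The sketch below uses that textbook form with placeholder parameter values; it is not the parameterization fitted in this study.

    ```python
    import math

    # Standard clean-bed filtration expression (Yao-type form):
    #   C/C0 = exp( -(3/2) * (1 - porosity) / d_collector * alpha * eta0 * bed_depth )
    # Parameter values below are placeholders, not those fitted in the study.

    def fractional_removal(bed_depth_m, porosity, d_collector_m, alpha, eta0):
        ratio = math.exp(-1.5 * (1.0 - porosity) / d_collector_m * alpha * eta0 * bed_depth_m)
        return 1.0 - ratio  # fraction of incoming bacteria retained in the bed

    removal = fractional_removal(bed_depth_m=0.6, porosity=0.40,
                                 d_collector_m=0.5e-3, alpha=0.05, eta0=0.01)
    print(f"Predicted FIB removal: {removal * 100:.1f}%")
    ```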

  10. Toolkit of Available EPA Green Infrastructure Modeling ...

    EPA Pesticide Factsheets

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).

  11. People at risk - nexus critical infrastructure and society

    NASA Astrophysics Data System (ADS)

    Heiser, Micha; Thaler, Thomas; Fuchs, Sven

    2016-04-01

    Strategic infrastructure networks include the highly complex and interconnected systems that are so vital to a city or state that any sudden disruption can result in debilitating impacts on human life, the economy and society as a whole. Recently, various studies have applied complex network-based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards - a major part of them, particularly after the 9/11 incident, related to terrorist attacks. Here, vulnerability is generally defined as the performance drop of an infrastructure system under a given disruptive event. The performance can be measured by different metrics, which correspond to various levels of resilience. In this paper, we address the vulnerability and exposure of critical infrastructure in the Eastern Alps. The Federal State of Tyrol is an international transport route and an essential component of north-south transport connectivity in Europe. Any interruption of the transport flow leads to incommensurable consequences in terms of indirect losses, since the system does not feature redundant elements at comparable economic efficiency. Natural hazard processes such as floods, debris flows, rock falls and avalanches endanger this infrastructure line, as shown by the large flood events of 2005 and 2012 and the rock falls of 2014, which had strong impacts on critical infrastructure, including disruption of railway lines (in 2005 and 2012) and of highways and motorways (in 2014). The aim of this paper is to present how critical infrastructures as well as communities and societies are vulnerable to, and can be made resilient against, natural hazard risks and the related cascading effects on different compartments (industrial, infrastructural, societal, institutional, cultural, etc.), which depend on the type of hazard (avalanches, torrential flooding, debris flow, rock falls). Specific themes will be addressed in various case studies to allow cross-learning and cross-comparison of, for example, rural and urban areas, and different scales. Correspondingly, scale-specific resilience indicators and metrics will be developed to tailor methods to specific needs according to the scale of assessment (micro/local and macro/regional) and to the type of infrastructure. The traditional indicators normally used in structural analysis are not sufficient to understand how events happening on the networks can have cascading consequences. Moreover, effects have multidimensional (technical, economic, organizational and human), multiscale (micro and macro) and temporal characteristics (short- to long-term incidence). These considerations lead to different activities: 1) computation of classic structural analysis indicators on the case studies in order to obtain an identity of the transport infrastructure, and 2) development of a set of new measures of resilience. To mitigate natural hazard risk, a large number of protection measures of different types have been constructed following inhomogeneous reliability standards. The focus of this case study will be on resilience issues and decision making in the context of a large-scale sectoral approach focused on the transport infrastructure network.
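    Vulnerability defined as a performance drop can be computed by comparing a network metric before and after a disruptive event. Below is a minimal sketch using global efficiency on a toy graph, assuming the networkx package is available; the graph and the "damaged" edges are invented and have no relation to the Tyrolean network studied here.

    ```python
    import networkx as nx

    # Vulnerability as performance drop: compare a network performance metric
    # (here, global efficiency) before and after removing links hit by a hazard.
    # The toy graph and the "damaged" edges are invented for illustration.

    G = nx.Graph()
    G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D"), ("D", "E")])

    baseline = nx.global_efficiency(G)

    damaged = G.copy()
    damaged.remove_edges_from([("B", "D"), ("D", "E")])  # e.g. links cut by a debris flow
    after = nx.global_efficiency(damaged)

    performance_drop = (baseline - after) / baseline
    print(f"Global efficiency: {baseline:.3f} -> {after:.3f} (drop {performance_drop:.1%})")
    ```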

  12. Interactive Model-Centric Systems Engineering (IMCSE) Phase Two

    DTIC Science & Technology

    2015-02-28

    Excerpt (recovered fragments): Backend Implementation; Figure 10, Interactive Epoch-Era Analysis leverages humans-in-the-loop analysis and supporting infrastructure; preliminary supporting infrastructure, which will inform the transition strategies, additional case application and prototype user testing.

  13. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  14. Implementation of green infrastructure concept in Citarum Watershed

    NASA Astrophysics Data System (ADS)

    Maryati, Sri; Humaira, An Nisaa'Siti

    2017-03-01

    Green infrastructure has several benefits compared to grey infrastructure in terms of environmental services and sustainability, such as reducing energy consumption, improving air quality, providing carbon sequestration, and increasing property values. Nevertheless, in practice the implementation of the concept in Indonesia is still limited. Implementation of such a concept has to be guided by planning documents. In this paper, the treatment of the green infrastructure concept in the current spatial plan and other planning documents is assessed. The purpose of this research is to determine how far the green infrastructure concept is integrated into the planning system, based on an analysis of planning documents in the Citarum Watershed and expert interviews with local stakeholders. A content analysis method is used to analyze the documents and interview results. The result shows that the green infrastructure concept has not been widely accommodated in the spatial plan or other planning documents. There are some challenges in implementing the concept, including the reward and punishment system (incentives and disincentives), coordination, and a lack of human resources.

  15. Fuzzy architecture assessment for critical infrastructure resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, George

    2012-12-01

    This paper presents an approach for the selection of alternative architectures in a connected infrastructure system to increase resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule based method of selecting among alternative infrastructure architectures. This methodology includes considerations which are most important when deciding on an approach to resilience. The paper concludes with a proposed approach which builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience for the evaluation of architectures for adoption into the final system architecture.
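    A fuzzy-rule based selection of this kind combines membership functions over system attributes with a small rule set and a defuzzification step. The sketch below is a minimal illustration under that general idea; the triangular membership shapes, the two rules, the scoring scheme, and the candidate architectures are all invented, not taken from the paper.

    ```python
    # Minimal fuzzy-rule sketch for ranking infrastructure architectures by resilience.
    # Triangular memberships, the two rules, and all scores are illustrative assumptions.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on the interval [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def resilience_score(redundancy, recovery_speed):
        """Fire two simple rules and aggregate the activations."""
        # Rule 1: IF redundancy is high AND recovery is fast THEN resilience is high.
        high = min(tri(redundancy, 0.5, 1.0, 1.5), tri(recovery_speed, 0.5, 1.0, 1.5))
        # Rule 2: IF redundancy is low OR recovery is slow THEN resilience is low.
        low = max(tri(redundancy, -0.5, 0.0, 0.5), tri(recovery_speed, -0.5, 0.0, 0.5))
        return high - low  # crude defuzzification: positive favours the architecture

    alternatives = {"ring topology": (0.8, 0.6), "star topology": (0.3, 0.9)}
    for name, (red, rec) in alternatives.items():
        print(f"{name}: resilience score {resilience_score(red, rec):+.2f}")
    ```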

  16. Greenhouse Gas Mitigation in Chinese Eco-Industrial Parks by Targeting Energy Infrastructure: A Vintage Stock Model.

    PubMed

    Guo, Yang; Tian, Jinping; Chertow, Marian; Chen, Lujun

    2016-10-03

    Mitigating greenhouse gas (GHG) emissions in China's industrial sector is crucial for addressing climate change. We developed a vintage stock model to quantify the GHG mitigation potential and cost effectiveness in Chinese eco-industrial parks by targeting energy infrastructure with five key measures. The model, integrating energy efficiency assessments, GHG emission accounting, cost-effectiveness analyses, and scenario analyses, was applied to 548 units of energy infrastructure in 106 parks. The results indicate that two measures (shifting coal-fired boilers to natural gas-fired boilers and replacing coal-fired units with natural gas combined cycle units) present a substantial potential to mitigate GHGs (42%-46%) compared with the baseline scenario. The other three measures (installation of municipal solid waste-to-energy units, replacement of small-capacity coal-fired units with large units, and implementation of turbine retrofitting) present potential mitigation values of 6.7%, 0.3%, and 2.1%, respectively. In most cases, substantial economic benefits also can be achieved by GHG emission mitigation. An uncertainty analysis showed that enhancing the annual working time or serviceable lifetime levels could strengthen the GHG mitigation potential at a lower cost for all of the measures.
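    A vintage stock model tracks each unit of energy infrastructure by commissioning year and evaluates measures (such as coal-to-gas switching) against a baseline. The much-reduced sketch below illustrates the bookkeeping only; the unit list, emission factors, and cut-off year are invented and are not values from the study.

    ```python
    # Much-reduced vintage stock sketch: each boiler/CHP unit carries a commissioning
    # year, capacity and fuel; a measure replaces eligible coal units with gas and the
    # GHG difference against the baseline is accumulated. All figures are invented.

    EMISSION_FACTOR_T_PER_MWH = {"coal": 0.95, "natural_gas": 0.40}  # assumed values

    units = [
        {"fuel": "coal", "commissioned": 1998, "annual_mwh": 120_000},
        {"fuel": "coal", "commissioned": 2011, "annual_mwh": 300_000},
        {"fuel": "natural_gas", "commissioned": 2014, "annual_mwh": 150_000},
    ]

    def annual_emissions(stock):
        return sum(u["annual_mwh"] * EMISSION_FACTOR_T_PER_MWH[u["fuel"]] for u in stock)

    def apply_coal_to_gas(stock, retire_before=2005):
        """Measure: shift coal units older than a cut-off year to natural gas."""
        return [dict(u, fuel="natural_gas") if u["fuel"] == "coal"
                and u["commissioned"] < retire_before else u for u in stock]

    baseline = annual_emissions(units)
    scenario = annual_emissions(apply_coal_to_gas(units))
    print(f"Baseline: {baseline:,.0f} t CO2e/yr; measure scenario: {scenario:,.0f} t CO2e/yr "
          f"({(baseline - scenario) / baseline:.1%} mitigation)")
    ```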

  17. Potential of best practice to reduce impacts from oil and gas projects in the Amazon.

    PubMed

    Finer, Matt; Jenkins, Clinton N; Powers, Bill

    2013-01-01

    The western Amazon continues to be an active and controversial zone of hydrocarbon exploration and production. We argue for the urgent need to implement best practices to reduce the negative environmental and social impacts associated with the sector. Here, we present a three-part study aimed at resolving the major obstacles impeding the advancement of best practice in the region. Our focus is on Loreto, Peru, one of the largest and most dynamic hydrocarbon zones in the Amazon. First, we develop a set of specific best practice guidelines to address the lack of clarity surrounding the issue. These guidelines incorporate both engineering-based criteria and key ecological and social factors. Second, we provide a detailed analysis of existing and planned hydrocarbon activities and infrastructure, overcoming the lack of information that typically hampers large-scale impact analysis. Third, we evaluate the planned activities and infrastructure with respect to the best practice guidelines. We show that Loreto is an extremely active hydrocarbon front, highlighted by a number of recent oil and gas discoveries and a sustained government push for increased exploration. Our analyses reveal that the use of technical best practice could minimize future impacts by greatly reducing the amount of required infrastructure such as drilling platforms and access roads. We also document a critical need to consider more fully the ecological and social factors, as the vast majority of planned infrastructure overlaps sensitive areas such as protected areas, indigenous territories, and key ecosystems and watersheds. Lastly, our cost analysis indicates that following best practice does not impose substantially greater costs than conventional practice, and may in fact reduce overall costs. Barriers to the widespread implementation of best practice in the Amazon clearly exist, but our findings show that there can be great benefits to its implementation.

  18. Early Support Development of Children with Disorders of the Biopsychosocial Functioning in Poland

    ERIC Educational Resources Information Center

    Czyz, Anna

    2017-01-01

    This article presents the results of a research study on the system of early child development support with developmental disabilities and their families in Poland. The analysis covered areas such as proximity and accessibility of services, infrastructural conditions, preparation of personnel, and occurrence of systemic barriers. The article…

  19. Cyber resilience: a review of critical national infrastructure and cyber security protection measures applied in the UK and USA.

    PubMed

    Harrop, Wayne; Matteson, Ashley

    This paper presents cyber resilience as key strand of national security. It establishes the importance of critical national infrastructure protection and the growing vicarious nature of remote, well-planned, and well executed cyber attacks on critical infrastructures. Examples of well-known historical cyber attacks are presented, and the emergence of 'internet of things' as a cyber vulnerability issue yet to be tackled is explored. The paper identifies key steps being undertaken by those responsible for detecting, deterring, and disrupting cyber attacks on critical national infrastructure in the United Kingdom and the USA.

  20. Reducing construction waste: A study of urban infrastructure projects.

    PubMed

    de Magalhães, Ruane Fernandes; Danilevicz, Ângela de Moura Ferreira; Saurin, Tarcisio Abreu

    2017-09-01

    The construction industry is well-known for producing waste detrimental to the environment, and its impacts have increased with the development process of cities. Although there are several studies focused on the environmental impact of residential and commercial buildings, less knowledge is available regarding decreasing construction waste (CW) generation in urban infrastructure projects. This study presents best practices to reduce waste in such projects, stressing the role of decision-making in the design stage and the effective management of construction processes in the public sector. The best practices were identified from a literature review, document analysis of 14 urban infrastructure projects, and both qualitative and quantitative surveys with 18 experts (architects and engineers) playing different roles in those projects. The contributions of this research are: (i) the identification of the main building techniques related to the urban design typologies analyzed; (ii) the identification of cause-effect relationships between design choices and the CW generation diagnosis; (iii) the proposal of a checklist to support the decision-making process, which can be used as a control and evaluation instrument when developing urban infrastructure designs focused on construction waste minimization (CWM). Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Interactive Model-Centric Systems Engineering (IMCSE) Phase 1

    DTIC Science & Technology

    2014-09-30

    Excerpt (recovered fragments): supporting infrastructure and testing; Supporting MPTs - during Phase 1 the opportunity arose to develop several MPTs to support IMCSE, including supporting infrastructure; the Analysis will be completed and tested with a case application, along with preliminary supporting infrastructure, which will then be used to inform the ...

  2. Green Infrastructure Research and Demonstration at the Edison Environmental Center

    EPA Science Inventory

    This presentation will review the need for storm water control practices and will present a portion of the green infrastructure research and demonstration being performed at the Edison Environmental Center.

  3. Are We Ready for Mass Fatality Incidents? Preparedness of the US Mass Fatality Infrastructure.

    PubMed

    Merrill, Jacqueline A; Orr, Mark; Chen, Daniel Y; Zhi, Qi; Gershon, Robyn R

    2016-02-01

    To assess the preparedness of the US mass fatality infrastructure, we developed and tested metrics for 3 components of preparedness: organizational, operational, and resource sharing networks. In 2014, data were collected from 5 response sectors: medical examiners and coroners, the death care industry, health departments, faith-based organizations, and offices of emergency management. Scores were calculated within and across sectors and a weighted score was developed for the infrastructure. A total of 879 respondents reported highly variable organizational capabilities: 15% had responded to a mass fatality incident (MFI); 42% reported staff trained for an MFI, but only 27% for an MFI involving hazardous contaminants. Respondents estimated that 75% of their staff would be willing and able to respond, but only 53% if contaminants were involved. Most perceived their organization as somewhat prepared, but 13% indicated "not at all." Operational capability scores ranged from 33% (death care industry) to 77% (offices of emergency management). Network capability analysis found that only 42% of possible reciprocal relationships between resource-sharing partners were present. The cross-sector composite score was 51%; that is, half the key capabilities for preparedness were in place. The sectors in the US mass fatality infrastructure report suboptimal capability to respond. National leadership is needed to ensure sector-specific and infrastructure-wide preparedness for a large-scale MFI.
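    Two of the metrics described here lend themselves to simple calculations: the share of resource-sharing ties that are reciprocated in a directed network, and a weighted cross-sector composite score. The sketch below illustrates both; the toy edge list, the sector weights, and all scores other than the 33% and 77% quoted in the abstract are invented, and the sector abbreviations are shorthand introduced only for this example.

    ```python
    # Sketch of two preparedness metrics: (1) share of resource-sharing ties that are
    # reciprocated in a directed network, (2) a weighted composite of sector scores.
    # The toy edge list, most scores, and the weights are invented for illustration.

    edges = {("ME/C", "OEM"), ("OEM", "ME/C"), ("OEM", "HD"), ("HD", "OEM"),
             ("FBO", "HD"), ("DCI", "OEM")}

    reciprocated = sum(1 for (a, b) in edges if (b, a) in edges)
    reciprocity = reciprocated / len(edges)
    print(f"Reciprocal ties: {reciprocity:.0%} of directed resource-sharing links")

    sector_scores = {"ME/C": 0.55, "DCI": 0.33, "HD": 0.60, "FBO": 0.45, "OEM": 0.77}
    weights = {"ME/C": 0.3, "DCI": 0.2, "HD": 0.2, "FBO": 0.1, "OEM": 0.2}  # assumed
    composite = sum(weights[s] * sector_scores[s] for s in sector_scores)
    print(f"Weighted cross-sector composite: {composite:.0%}")
    ```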

  4. Managing the water-energy-food nexus: Opportunities in Central Asia

    NASA Astrophysics Data System (ADS)

    Jalilov, Shokhrukh-Mirzo; Amer, Saud A.; Ward, Frank A.

    2018-02-01

    This article examines the impacts of infrastructure development and climate variability on economic outcomes for the Amu Darya Basin in Central Asia. It aims to identify the most economically productive mix of expanded reservoir storage under which economic benefit sharing can occur and the economic welfare of all riparians is improved. Policies examined include four combinations of storage infrastructure for each of two climate futures. An empirical optimization model is developed and applied to identify opportunities for improving the welfare of Tajikistan, Uzbekistan, Afghanistan, and Turkmenistan. The analysis 1) characterizes politically constrained and economically optimized water-use patterns for these combinations of expanded reservoir storage capacity and 2) describes Pareto-improving packages of expanded storage capacity that could raise economic welfare for all four riparians, and accounts for impacts under each of the two climate scenarios. Results indicate that a combination of targeted water storage infrastructure and efficient water allocation could produce outcomes for which the discounted net present value of benefits is favorable for each riparian. Results identify a framework to provide economic motivation for all riparians to cooperate through development of water storage infrastructure. Our findings illustrate the principle that development of water infrastructure can expand the negotiation space in which all communities can gain economic benefits in the face of limited water supply. Still, despite our optimistic findings, patient and deliberate negotiation will be required to transform potential improvements into actual gains.
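    A welfare-maximising allocation of this kind can be posed as a constrained optimization. The sketch below is a much-simplified linear-programming stand-in using SciPy's linprog; the benefit coefficients, demand caps, and water budget are invented numbers, and the actual model in the article is far richer (storage dynamics, climate scenarios, political constraints).

    ```python
    from scipy.optimize import linprog

    # Much-simplified welfare-maximising water allocation across four riparians,
    # subject to a shared basin water budget and per-country demand caps.
    # Benefit coefficients ($/unit), demand caps, and the budget are invented.

    benefit = {"Tajikistan": 40, "Uzbekistan": 55, "Afghanistan": 35, "Turkmenistan": 45}
    demand_cap = {"Tajikistan": 20, "Uzbekistan": 35, "Afghanistan": 15, "Turkmenistan": 25}
    water_budget = 70  # total allocatable water (arbitrary units)

    countries = list(benefit)
    c = [-benefit[k] for k in countries]          # linprog minimises, so negate benefits
    A_ub = [[1.0] * len(countries)]               # sum of allocations <= water budget
    b_ub = [water_budget]
    bounds = [(0, demand_cap[k]) for k in countries]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    for k, x in zip(countries, res.x):
        print(f"{k}: {x:.1f} units")
    print(f"Total benefit: {-res.fun:,.0f}")
    ```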

  5. Crash data analyses for vehicle-to-infrastructure communications for safety applications.

    DOT National Transportation Integrated Search

    2012-11-01

    This report presents the potential safety benefits of wireless communication between the roadway infrastructure and vehicles, : (i.e., vehicle-to-infrastructure (V2I) safety). Specifically, it identifies the magnitude, characteristics, and cost of cr...

  6. OOI CyberInfrastructure - Next Generation Oceanographic Research

    NASA Astrophysics Data System (ADS)

    Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.

    2008-12-01

    Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future scientific collaborative research, as they open up new ways of fusing data from different sources and across various domains, and analysis on wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component addresses this challenge by providing broad access from sensor networks for data acquisition up to computational grids for massive computations and binding infrastructure facilitating policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort, namely, a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport based on a messaging infrastructure over the AMQP protocol, and the preservation based on a distributed file system through SDSC iRODS.

  7. "Bottleneck study" : transportation infrastructure and traffic management analysis of cross border bottlenecks.

    DOT National Transportation Integrated Search

    2004-11-01

    The motivation behind the Transportation Infrastructure and Traffic Management Analysis of : Cross Border Bottlenecks study was generated by the U.S.-Mexico Border Partnership Action : Plan (Action item #2 of the 22-Point Smart Border Action Plan: De...

  8. BIM cost analysis of transport infrastructure projects

    NASA Astrophysics Data System (ADS)

    Volkov, Andrey; Chelyshkov, Pavel; Grossman, Y.; Khromenkova, A.

    2017-10-01

    The article describes a method for analyzing the energy costs of transport infrastructure objects using BIM software. The paper considers several options for the orientation of a building using the SketchUp and IES VE software programs; these options allow choosing the best direction for the building facades. Particular attention is given to the distribution of the temperature field in a cross-section of the wall, according to the calculation made in the ELCUT software. Issues related to the calculation of solar radiation penetration into a building and the selection of translucent structures are also considered in the paper. The article presents data on building codes relating to the transport sector, on the basis of which the calculations were made. The authors emphasize that BIM programs should be implemented and used in order to optimize the thermal behavior of a building and increase its energy efficiency using climatic data.
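    Under steady-state conduction, the temperature distribution across a layered wall follows directly from the layer thermal resistances. The sketch below shows that standard calculation with assumed layer thicknesses, conductivities, and boundary temperatures (surface film resistances are ignored); these are not values from the ELCUT model in the article.

    ```python
    # Steady-state 1D temperature distribution across a layered wall:
    # heat flux q = (T_in - T_out) / sum(R_i), with per-area resistance R_i = thickness / k.
    # Layer properties and boundary temperatures are assumptions, not ELCUT inputs.

    layers = [  # (name, thickness in m, thermal conductivity in W/(m*K))
        ("inner plaster", 0.02, 0.70),
        ("brick masonry", 0.38, 0.56),
        ("mineral wool", 0.10, 0.04),
        ("outer render", 0.02, 0.90),
    ]
    T_in, T_out = 20.0, -10.0  # indoor / outdoor temperatures, degrees C (assumed)

    resistances = [t / k for _, t, k in layers]
    q = (T_in - T_out) / sum(resistances)  # heat flux in W/m^2

    T = T_in
    print(f"Heat flux: {q:.1f} W/m², interface temperatures:")
    for (name, _, _), R in zip(layers, resistances):
        T -= q * R  # temperature drop across each layer is proportional to its resistance
        print(f"  after {name}: {T:.1f} °C")
    ```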

  9. City of Hoboken Energy Surety Analysis: Preliminary Design Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason Edwin; Baca, Michael J.; Munoz-Ramos, Karina

    2014-09-01

    In 2012, Hurricane Sandy devastated much of the U.S. northeast coastal areas. Among those hardest hit was the small community of Hoboken, New Jersey, located on the banks of the Hudson River across from Manhattan. This report describes a city-wide electrical infrastructure design that uses microgrids and other infrastructure to ensure the city retains functionality should such an event occur in the future. The designs ensure that up to 55 critical buildings will retain power during blackout or flooded conditions and include analysis for microgrid architectures, performance parameters, system control, renewable energy integration, and financial opportunities (while grid connected). The results presented here are not binding and are subject to change based on input from the Hoboken stakeholders, the integrator selected to manage and implement the microgrid, or other subject matter experts during the detailed (final) phase of the design effort.

  10. Assessment of Change in Green Infrastructure Components Using Morphological Spatial Pattern Analysis for the Conterminous United States

    EPA Science Inventory

    Green infrastructure is a widely used framework for conservation planning in the United States and elsewhere. The main components of green infrastructure are hubs and corridors. Hubs are large areas of natural vegetation, and corridors are linear features that connect hubs. W...

  11. Optimizing Estimates of Impervious Cover and Riparian Zone Condition in New England Watersheds: A Green Infrastructure Analysis.

    EPA Science Inventory

    Under EPA’s Green Infrastructure Initiative, a variety of research activities are underway to evaluate the effectiveness of green infrastructure in mitigating the effects of urbanization and stormwater impacts on stream biota and habitat. Effectiveness of both site-scale st...

  12. A data infrastructure for the assessment of health care performance: lessons from the BRIDGE-health project.

    PubMed

    Bernal-Delgado, Enrique; Estupiñán-Romero, Francisco

    2018-01-01

    The integration of different administrative data sources from a number of European countries has been shown to be useful in the assessment of unwarranted variations in health care performance. This essay describes the procedures used to set up a data infrastructure (e.g., data access and exchange, definition of the minimum common wealth of data required, and the development of the relational logic data model) and the methods used to produce trustworthy healthcare performance measurements (e.g., ontology standardisation and quality assurance analysis). The paper ends by providing some hints on how to use these lessons in an eventual European infrastructure for public health research and monitoring. Although the relational data infrastructure developed has proven accurate, effective in comparing health system performance across different countries, and efficient enough to deal with hundreds of millions of episodes, the logic data model might not be responsive if the European infrastructure aims at including electronic health records and carrying out multi-cohort, multi-intervention comparative effectiveness research. The deployment of a distributed infrastructure based on semantic interoperability, where individual data remain in-country and open-access scripts for data management and analysis travel around the hubs composing the infrastructure, might be a sensible way forward.

  13. Managing a tier-2 computer centre with a private cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara

    2014-06-01

    In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.

  14. Soak Up the Rain New England Webinar Series: National Stormwater Calculator

    EPA Science Inventory

    Presenters will provide an introduction to the most recent EPA green infrastructure tools for R1 stakeholders and their use in making decisions about implementing green infrastructure. We will discuss structuring your green infrastructure decision, finding appropriate information...

  15. Towards Social Radiology as an Information Infrastructure: Reconciling the Local With the Global

    PubMed Central

    2014-01-01

    The current widespread use of medical images and imaging procedures in clinical practice and patient diagnosis has brought about an increase in the demand for sharing medical imaging studies among health professionals in an easy and effective manner. This article reveals the existence of a polarization between the local and global demands for radiology practice. While there are no major barriers for sharing such studies, when access is made from a (local) picture archive and communication system (PACS) within the domain of a healthcare organization, there are a number of impediments for sharing studies among health professionals on a global scale. Social radiology as an information infrastructure involves the notion of a shared infrastructure as a public good, affording a social space where people, organizations and technical components may spontaneously form associations in order to share clinical information linked to patient care and radiology practice. This article shows however, that such polarization establishes a tension between local and global demands, which hinders the emergence of social radiology as an information infrastructure. Based on an analysis of the social space for radiology practice, the present article has observed that this tension persists due to the inertia of a locally installed base in radiology departments, for which common teleradiology models are not truly capable of reorganizing as a global social space for radiology practice. Reconciling the local with the global signifies integrating PACS and teleradiology into an evolving, secure, heterogeneous, shared, open information infrastructure where the conceptual boundaries between (local) PACS and (global) teleradiology are transparent, signaling the emergence of social radiology as an information infrastructure. PMID:25600710

  16. "Bottleneck study" : transportation infrastructure and traffic management analysis of cross border bottlenecks. [Executive summary].

    DOT National Transportation Integrated Search

    2004-11-01

    The motivation behind the Transportation Infrastructure and Traffic Management Analysis of : Cross Border Bottlenecks study was generated by the U.S.-Mexico Border Partnership Action : Plan (Action item #2 of the 22-Point Smart Border Action Plan: De...

  17. Modeling joint restoration strategies for interdependent infrastructure systems.

    PubMed

    Zhang, Chao; Kong, Jingjing; Simonovic, Slobodan P

    2018-01-01

    Life in the modern world depends on multiple critical services provided by infrastructure systems, which are interdependent at multiple levels. To effectively respond to infrastructure failures, this paper proposes a model for developing an optimal joint restoration strategy for interdependent infrastructure systems following a disruptive event. First, models for (i) describing the structure of interdependent infrastructure systems and (ii) their interaction processes are presented. Both models consider failure types, infrastructure operating rules and interdependencies among systems. Second, an optimization model for determining an optimal joint restoration strategy at the infrastructure component level, by minimizing the economic loss from infrastructure failures, is proposed. The utility of the model is illustrated using a case study of electric-water systems. Results show that a small number of failed infrastructure components can trigger high-level failures in interdependent systems, and that the optimal joint restoration strategy varies with failure occurrence time. The proposed models can help decision makers to understand the mechanisms of infrastructure interactions and search for an optimal joint restoration strategy, which can significantly enhance the safety of infrastructure systems.
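    A much-reduced version of the restoration-ordering problem can be illustrated with a single repair crew: if each failed component incurs a constant economic loss per hour until it is restored, repairing components in decreasing order of loss rate divided by repair time minimizes the cumulative loss (the classic weighted-shortest-processing-time rule). The component data below are invented, and the paper's model additionally encodes failure types, operating rules, and interdependencies.

    ```python
    # Toy restoration scheduling: with one crew, cumulative economic loss is minimised
    # by repairing components in decreasing order of loss_rate / repair_time.
    # Component data are invented for illustration.

    components = [  # (name, economic loss per hour while failed, repair time in hours)
        ("substation feeder", 12_000, 6),
        ("water booster pump", 5_000, 2),
        ("trunk water main", 9_000, 8),
    ]

    order = sorted(components, key=lambda c: c[1] / c[2], reverse=True)

    elapsed, total_loss = 0.0, 0.0
    for name, loss_rate, repair_time in order:
        elapsed += repair_time
        total_loss += loss_rate * elapsed  # a component keeps losing value until restored
        print(f"restore {name} at t = {elapsed:.0f} h")
    print(f"Cumulative loss for this order: ${total_loss:,.0f}")
    ```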

  18. An EUDET/AIDA Pixel Beam Telescope for Detector Development

    NASA Astrophysics Data System (ADS)

    Rubinskiy, I.; EUDET Consortium; AIDA Consortium

    A high resolution (σ < 2 μm) beam telescope based on monolithic active pixel sensors (MAPS) was developed within the EUDET collaboration. EUDET was a coordinated detector R&D programme for the future International Linear Collider providing test beam infrastructure to detector R&D groups. The telescope consists of six sensor planes with a pixel pitch of either 18.4 μm or 10 μm and can be operated inside a solenoidal magnetic field of up to 1.2 T. General purpose cooling, positioning, data acquisition (DAQ) and offline data analysis tools are available for the users. The excellent resolution, readout rate and DAQ integration capabilities made the telescope a primary beam test tool also for several CERN based experiments. In this report the performance of the final telescope is presented. The plans for an even more flexible telescope with three different pixel technologies (ATLAS Pixel, Mimosa, Timepix) within the new European detector infrastructure project AIDA are also presented.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc W; Wood, Eric W

    The plug-in electric vehicle (PEV) market is experiencing rapid growth with dozens of battery electric (BEV) and plug-in hybrid electric (PHEV) models already available and billions of dollars being invested by automotive manufacturers in the PEV space. Electric range is increasing thanks to larger and more advanced batteries, and significant infrastructure investments are being made to enable higher power fast charging. Costs are falling and PEVs are becoming more competitive with conventional vehicles. Moreover, new technologies such as connectivity and automation hold the promise of enhancing the value proposition of PEVs. This presentation outlines a suite of projects funded by the U.S. Department of Energy's Vehicle Technologies Office to conduct assessments of the economic value and charging infrastructure requirements of the evolving PEV market. Individual assessments include national evaluations of PEV economic value (assuming 73M PEVs on the road in 2035), national analysis of charging infrastructure requirements (with community and corridor level resolution), and case studies of PEV ownership in Columbus, OH and Massachusetts.

  20. Fragmented Flows: Water Supply in Los Angeles County

    NASA Astrophysics Data System (ADS)

    Pincetl, Stephanie; Porse, Erik; Cheng, Deborah

    2016-08-01

    In the Los Angeles metropolitan region, nearly 100 public and private entities are formally involved in the management and distribution of potable water—a legacy rooted in fragmented urban growth in the area and late 19th century convictions about local control of services. Yet, while policy debates focus on new forms of infrastructure, restructured pricing mechanisms, and other technical fixes, the complex institutional architecture of the present system has received little attention. In this paper, we trace the development of this system, describe its interconnections and disjunctures, and demonstrate the invisibility of water infrastructure in LA in multiple ways—through mapping, statistical analysis, and historical texts. Perverse blessings of past water abundance led to a complex, but less than resilient, system with users accustomed to cheap, easily accessible water. We describe the lack of transparency and accountability in the current system, as well as its shortcomings in building needed new infrastructure and instituting new water rate structures. Adapting to increasing water scarcity and likely droughts must include addressing the architecture of water management.

  1. A hybrid computational strategy to address WGS variant analysis in >5000 samples.

    PubMed

    Huang, Zhuoyi; Rustagi, Navin; Veeraraghavan, Narayanan; Carroll, Andrew; Gibbs, Richard; Boerwinkle, Eric; Venkata, Manjunath Gorentla; Yu, Fuli

    2016-09-10

    The decreasing costs of sequencing are driving the need for cost-effective and real-time variant calling of whole genome sequencing data. The scale of these projects is far beyond the capacity of typical computing resources available to most research labs, and other infrastructures, such as the cloud AWS environment and supercomputers, have limitations that make large-scale joint variant calling infeasible: infrastructure-specific variant calling strategies either fail to scale up to large datasets or abandon joint calling strategies. We present a high throughput framework including multiple variant callers for single nucleotide variant (SNV) calling, which leverages a hybrid computing infrastructure consisting of cloud AWS, supercomputers and local high performance computing infrastructures. We present a novel binning approach for large scale joint variant calling and imputation which can scale up to over 10,000 samples while producing SNV callsets with high sensitivity and specificity. As a proof of principle, we present results of analysis on the Cohorts for Heart And Aging Research in Genomic Epidemiology (CHARGE) WGS freeze 3 dataset, in which joint calling, imputation and phasing of over 5300 whole genome samples was produced in under 6 weeks using four state-of-the-art callers. The callers used were SNPTools, GATK-HaplotypeCaller, GATK-UnifiedGenotyper and GotCloud. We used Amazon AWS, a 4000-core in-house cluster at Baylor College of Medicine, IBM power PC Blue BioU at Rice and Rhea at Oak Ridge National Laboratory (ORNL) for the computation. AWS was used for joint calling of 180 TB of BAM files, and the ORNL and Rice supercomputers were used for the imputation and phasing step. All other steps were carried out on the local compute cluster. The entire operation used 5.2 million core hours and transferred only a total of 6 TB of data across the platforms. Even with increasing sizes of whole genome datasets, ensemble joint calling of SNVs for low coverage data can be accomplished in a scalable, cost effective and fast manner by using heterogeneous computing platforms without compromising on the quality of variants.
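
    The binning idea can be pictured with a short sketch. This is not the authors' pipeline; it only shows, under simple assumptions, how a genome might be cut into fixed-size bins and distributed round-robin across heterogeneous platforms for joint calling. The chromosome lengths are GRCh38 values; the platform names and bin size are placeholders.

        # Generic genome-binning sketch (illustrative only, not the CHARGE workflow).
        from itertools import cycle

        CHROM_LENGTHS = {"chr20": 64_444_167, "chr21": 46_709_983}  # GRCh38
        BIN_SIZE = 5_000_000  # 5 Mb bins, an arbitrary illustrative choice

        def make_bins(chrom_lengths, bin_size):
            for chrom, length in chrom_lengths.items():
                for start in range(0, length, bin_size):
                    yield (chrom, start, min(start + bin_size, length))

        def assign(bins, platforms):
            """Round-robin assignment of bins to compute platforms."""
            plan = {p: [] for p in platforms}
            for region, platform in zip(bins, cycle(platforms)):
                plan[platform].append(region)
            return plan

        plan = assign(make_bins(CHROM_LENGTHS, BIN_SIZE),
                      ["aws", "local_cluster", "supercomputer"])
        for platform, regions in plan.items():
            print(platform, len(regions), "bins")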

  2. Field Evaluation of Innovative Wastewater Collection System Condition Assessment Technologies

    EPA Science Inventory

    As part of an effort to address aging infrastructure needs, the U.S. Environmental Protection Agency (USEPA) initiated research under the Aging Water Infrastructure program, part of the USEPA Office of Water’s Sustainable Infrastructure Initiative. This presentation discusses fi...

  3. FY16 Analysis report: Financial systems dependency on communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyeler, Walter E.

    Within the Department of Homeland Security (DHS), the Office of Cyber and Infrastructure Analysis (OCIA)'s National Infrastructure Simulation and Analysis Center (NISAC) develops capabilities to support the DHS mission and the resilience of the Nation’s critical infrastructure. At Sandia National Laboratories, under DHS/OCIA direction, NISAC is developing models of financial sector dependence on communications. This capability is designed to improve DHS's ability to assess potential impacts of communication disruptions to major financial services and the effectiveness of possible mitigations. This report summarizes findings and recommendations from the application of that capability as part of the FY2016 NISAC program plan.

  4. Presentation: EPA’s Stormwater Program and Improving Resiliency with Green Infrastructure

    EPA Pesticide Factsheets

    This presentation, EPA’s Stormwater Program and Improving Resiliency with Green Infrastructure, was given at the STAR Human and Ecological Health Impacts Associated with Water Reuse and Conservation Practices Kick-off Meeting and Webinar.

  5. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    PubMed

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. OpenCyto: An Open Source Infrastructure for Scalable, Robust, Reproducible, and Automated, End-to-End Flow Cytometry Data Analysis

    PubMed Central

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W.; Ramey, John; Davis, Mark M.; Kalams, Spyros A.; De Rosa, Stephen C.; Gottardo, Raphael

    2014-01-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational cytometry and facilitate reproducible analysis in a unified environment. PMID:25167361

  7. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    PubMed

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W; Ramey, John; Davis, Mark M; Kalams, Spyros A; De Rosa, Stephen C; Gottardo, Raphael

    2014-08-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational cytometry and facilitate reproducible analysis in a unified environment.

  8. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a Parallel Digital Forensics infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets only expected to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics. This report documents the implementation of the parallel digital forensics (PDF) infrastructure architecture.

  9. Operational models of infrastructure resilience.

    PubMed

    Alderson, David L; Brown, Gerald G; Carlyle, W Matthew

    2015-04-01

    We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience. © 2014 Society for Risk Analysis.
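
    The notion of operational resilience as retained system function can be sketched with a tiny network-flow example. This is not the authors' models; it assumes a notional capacitated network in which "function" is the maximum flow delivered to demand, and resilience to losing a component is the fraction of baseline function retained.

        # Minimal operational-resilience sketch using max flow (notional system).
        import networkx as nx

        def build_notional_network():
            G = nx.DiGraph()
            G.add_edge("source", "plant_1", capacity=60)
            G.add_edge("source", "plant_2", capacity=40)
            G.add_edge("plant_1", "substation", capacity=60)
            G.add_edge("plant_2", "substation", capacity=40)
            G.add_edge("substation", "demand", capacity=100)
            return G

        baseline, _ = nx.maximum_flow(build_notional_network(), "source", "demand")

        for lost in ["plant_1", "plant_2", "substation"]:
            G = build_notional_network()
            G.remove_node(lost)                       # simulate losing one component
            value, _ = nx.maximum_flow(G, "source", "demand")
            print(f"lose {lost}: {value}/{baseline} of function retained")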

  10. Infrastructure and Resources of Private Schools in Cali and the Implementation of the Bilingual Colombia Program

    ERIC Educational Resources Information Center

    Miranda, Norbella; Echeverry, Ángela Patricia

    2010-01-01

    Institutional factors affect the implementation of educational policies. Physical school infrastructure and the availability of resources determine to a certain extent whether a policy may be successfully transformed into practice. This article provides a description and analysis of school infrastructure and resources of private institutions of…

  11. Design of an environmental field observatory for quantifying the urban water budget

    Treesearch

    Claire Welty; Andrew J. Miller; Kenneth T. Belt; James A. Smith; Lawrence E. Band; Peter M. Groffman; Todd M. Scanlon; Juying Warner; Robert J. Ryan; Robert J. Shedlock; Michael P. McGuire

    2007-01-01

    Quantifying the water budget of urban areas presents special challenges, owing to the influence of subsurface infrastructure that can cause short-circuiting of natural flowpaths. In this paper we review some considerations for data collection and analysis in support of determining urban water budget components, with a particular emphasis on groundwater, using Baltimore...

  12. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.

    This presentation provides an overview of the Scenario Evaluation and Regionalization Analysis (SERA) model, describes the methodology for developing scenarios for hydrogen infrastructure development, outlines an example "Hydrogen Success" scenario, and discusses detailed scenario metrics for a particular case study region, the Northeast Corridor.

  13. 21st Century Water Asset Accounting - Case Studies Report (WERF Report INFR6R12a)

    EPA Science Inventory

    America’s decaying water infrastructure presents significant financial and logistical challenges for water utilities. Green infrastructure has been gaining traction as a viable alternative and complement to traditional “grey” infrastructure for water management. Current accounti...

  14. Toolkit of Available EPA Green Infrastructure Modeling Software. National Stormwater Calculator

    EPA Science Inventory

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementat...

  15. Development of a Water Infrastructure Knowledge Database

    EPA Science Inventory

    This paper presents a methodology for developing a national database, as applied to water infrastructure systems, which includes both drinking water and wastewater. The database is branded as "WATERiD" and can be accessed at www.waterid.org. Water infrastructure in the U.S. is ag...

  16. Structural health monitoring of civil infrastructure.

    PubMed

    Brownjohn, J M W

    2007-02-15

    Structural health monitoring (SHM) is a term increasingly used in the last decade to describe a range of systems implemented on full-scale civil infrastructures and whose purposes are to assist and inform operators about continued 'fitness for purpose' of structures under gradual or sudden changes to their state, to learn about either or both of the load and response mechanisms. Arguably, various forms of SHM have been employed in civil infrastructure for at least half a century, but it is only in the last decade or two that computer-based systems are being designed for the purpose of assisting owners/operators of ageing infrastructure with timely information for their continued safe and economic operation. This paper describes the motivations for and recent history of SHM applications to various forms of civil infrastructure and provides case studies on specific types of structure. It ends with a discussion of the present state-of-the-art and future developments in terms of instrumentation, data acquisition, communication systems and data mining and presentation procedures for diagnosis of infrastructural 'health'.

  17. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  18. Enterprise infocommunication infrastructure in training of IT-professionals

    NASA Astrophysics Data System (ADS)

    Eminov, F. I.; Golitsyna, I. N.; Eminov, B. F.

    2018-05-01

    The paper presents the enterprise infocommunication infrastructure and its management features as factors influencing the training of IT professionals within the traditional educational process. It shows how educational content for modern IT specialists can be developed on the basis of the infocommunication infrastructure of a modern enterprise and its interdisciplinary connections. Such an approach requires special forms and methods of training, adapted to the level of development of the professional environment of IT professionals.

  19. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  20. Integration of end-user Cloud storage for CMS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, which is implemented and commissioned over the world’s largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  1. Integration of end-user Cloud storage for CMS analysis

    DOE PAGES

    Riahi, Hassen; Aimar, Alberto; Ayllon, Alejandro Alvarez; ...

    2017-05-19

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, which is implemented and commissioned over the world’s largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. Finally, the commissioning experience of CERNBox for the distributed data analysis activity is also presented.

  2. School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India

    NASA Astrophysics Data System (ADS)

    Agrawal, S.; Gupta, R. D.

    2016-06-01

    GIS is a collection of tools and techniques that operate on geospatial data and support analysis and decision making. Education is an inherent part of any civil society, and proper educational facilities generate high quality human resources for any nation. Therefore, the government needs an efficient system that can help in analysing the current state of education and its progress, and that can support decision making and policy framing. GIS can serve these requirements not only for the government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on the existing education policy and the state of its implementation. School mapping plays an important role in this respect. School mapping consists of building a geospatial database of schools that supports infrastructure development, policy analysis and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarva Shiksha Abhiyan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This research work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.
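
    A brief sketch can illustrate the kind of geospatial query that school mapping enables, for example flagging habitations that lie beyond a threshold distance from the nearest school. This is not the paper's workflow; the coordinates, names and 1 km threshold below are hypothetical, with locations treated as planar metres.

        # Hypothetical nearest-school query (illustrative of school-mapping analysis).
        from shapely.geometry import Point

        schools = {"school_1": Point(0, 0), "school_2": Point(4_000, 1_000)}
        villages = {"village_A": Point(500, 300),
                    "village_B": Point(2_500, 2_500),
                    "village_C": Point(9_000, 9_000)}
        THRESHOLD_M = 1_000   # assumed neighbourhood norm, for illustration only

        for name, loc in villages.items():
            nearest = min(schools.values(), key=loc.distance)
            status = "served" if loc.distance(nearest) <= THRESHOLD_M else "underserved"
            print(f"{name}: nearest school {loc.distance(nearest):.0f} m away -> {status}")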

  3. A network-based framework for assessing infrastructure resilience: a case study of the London metro system.

    PubMed

    Chopra, Shauhrat S; Dillon, Trent; Bilec, Melissa M; Khanna, Vikas

    2016-05-01

    Modern society is increasingly dependent on the stability of a complex system of interdependent infrastructure sectors. It is imperative to build the resilience of large-scale infrastructures like metro systems to address the threat of natural disasters and man-made attacks in urban areas. Analysis is needed to ensure that these systems are capable of withstanding and containing unexpected perturbations, and to develop heuristic strategies for guiding the design of more resilient networks in the future. We present a comprehensive, multi-pronged framework that analyses information on network topology, spatial organization and passenger flow to understand the resilience of the London metro system. The topology of the London metro system is not fault tolerant in terms of maintaining connectivity at the periphery of the network, since it does not exhibit small-world properties. The passenger strength distribution follows a power law, suggesting that while the London metro system is robust to random failures, it is vulnerable to disruptions at a few critical stations. The analysis further identifies particular sources of structural and functional vulnerabilities that need to be mitigated to improve the resilience of the London metro network. The insights from our framework provide useful strategies to build resilience for both existing and upcoming metro systems. © 2016 The Author(s).
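
    The topological-robustness point can be illustrated with a small sketch comparing random and targeted station removals on a surrogate network. This is not the paper's data or code; the Barabási–Albert graph below merely stands in for a real metro topology.

        # Random vs. targeted node removal on a surrogate network (illustrative only).
        import random
        import networkx as nx

        def largest_component_fraction(G):
            if G.number_of_nodes() == 0:
                return 0.0
            return max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()

        def attack(G, targeted, n_remove=30):
            G = G.copy()
            for _ in range(n_remove):
                if targeted:
                    node = max(G.degree, key=lambda kv: kv[1])[0]   # highest-degree station
                else:
                    node = random.choice(list(G.nodes))
                G.remove_node(node)
            return largest_component_fraction(G)

        G0 = nx.barabasi_albert_graph(300, 2, seed=1)   # scale-free surrogate network
        print("after random removals  :", round(attack(G0, targeted=False), 2))
        print("after targeted removals:", round(attack(G0, targeted=True), 2))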

  4. On-track testing of a power harvesting device for railroad track health monitoring

    NASA Astrophysics Data System (ADS)

    Hansen, Sean E.; Pourghodrat, Abolfazl; Nelson, Carl A.; Fateh, Mahmood

    2010-03-01

    A considerable proportion of railroad infrastructure exists in regions which are comparatively remote. Because of the cost of extending electrical infrastructure into these areas, road crossings often lack warning light systems or crossing gates and are commonly marked only with reflective signage. Distributed sensor networks could support railroad track health monitoring in these remote areas, but they face the same limitation on available electrical infrastructure. This motivated the development of an energy harvesting solution for remote railroad deployment. This paper describes on-track experimental testing of a mechanical device for harvesting mechanical power from passing railcar traffic, in view of supplying electrical power to warning light systems at crossings and to remote networks of sensors. The device is mounted to and spans two rail ties and transforms the vertical rail displacement into electrical energy through mechanical amplification and rectification into a PMDC generator. A prototype was tested under loaded and unloaded railcar traffic at low speeds. Stress analysis and speed scaling analysis are presented, results of the on-track tests are compared and contrasted to previous laboratory testing, discrepancies between the two are explained, and conclusions are drawn regarding the suitability of the device for illuminating high-efficiency LED lights at railroad crossings and powering track-health sensor networks.

  5. Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization Case Study

    NASA Astrophysics Data System (ADS)

    Hassan, A. H.; Fluke, C. J.; Barnes, D. G.

    2012-09-01

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with a goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy to use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
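
    The chunked, map-reduce style of computation mentioned above (even for simple statistics such as a minimum, maximum or histogram) can be sketched briefly. This is not the authors' framework; it only shows how per-chunk partial results can be combined so that no single process needs the full dataset in memory.

        # Streaming min/max/histogram over data chunks (illustrative sketch).
        import numpy as np

        def streaming_stats(chunks, bins, value_range):
            vmin, vmax = np.inf, -np.inf
            hist = np.zeros(bins, dtype=np.int64)
            for chunk in chunks:
                vmin = min(vmin, chunk.min())
                vmax = max(vmax, chunk.max())
                h, _ = np.histogram(chunk, bins=bins, range=value_range)
                hist += h                       # partial histograms simply add up
            return vmin, vmax, hist

        rng = np.random.default_rng(0)
        chunks = (rng.normal(size=1_000_000) for _ in range(10))  # stand-in for file/GPU chunks
        vmin, vmax, hist = streaming_stats(chunks, bins=64, value_range=(-5, 5))
        print(vmin, vmax, hist.sum())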

  6. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
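
    The kind of inter-distribution exploration Distributome supports can be hinted at with a short scipy.stats sketch that fits several candidate families to one sample and compares goodness of fit. This is not the Distributome toolkit itself; the synthetic sample and candidate families are illustrative choices.

        # Comparing candidate distribution families on one sample (illustrative).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        sample = rng.gamma(shape=2.0, scale=3.0, size=2_000)   # synthetic data

        for name, dist in [("gamma", stats.gamma),
                           ("lognorm", stats.lognorm),
                           ("norm", stats.norm)]:
            params = dist.fit(sample)                          # maximum-likelihood fit
            ks = stats.kstest(sample, dist.cdf, args=params)   # goodness of fit
            print(f"{name:8s} KS statistic = {ks.statistic:.3f}")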

  7. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  8. Overview of Ongoing NRMRL GI Research

    EPA Science Inventory

    This presentation is an overview of ongoing NRMRL Green Infrastructure research and addresses the question: What do we need to know to present a cogent estimate of the value of Green Infrastructure? Discussions included are: stormwater well study, rain gardens and permeable su...

  9. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies that threaten mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
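
    The stake-estimation idea, loss to each stakeholder aggregated over requirements, components and threats, can be sketched as a chain of matrices. The structure and every number below are hypothetical illustrations, not the CSES or MAAP artifacts themselves.

        # Hypothetical stakeholder-loss estimation as a matrix chain (illustrative).
        import numpy as np

        stakes = np.array([[300_000, 120_000],   # stakeholder A's $ stake in req 1, req 2
                           [ 80_000, 200_000]])  # stakeholder B
        dependency = np.array([[0.9, 0.2],       # P(requirement fails | component fails)
                               [0.1, 0.8]])
        impact = np.array([[0.5, 0.05],          # P(component fails | threat occurs)
                           [0.1, 0.40]])
        threat_prob = np.array([0.02, 0.10])     # assumed threat likelihood per period

        expected_loss = stakes @ dependency @ impact @ threat_prob
        for name, loss in zip(["stakeholder A", "stakeholder B"], expected_loss):
            print(f"{name}: expected loss ~ ${loss:,.0f} per period")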

  10. Modeling joint restoration strategies for interdependent infrastructure systems

    PubMed Central

    Simonovic, Slobodan P.

    2018-01-01

    Life in the modern world depends on multiple critical services provided by infrastructure systems which are interdependent at multiple levels. To effectively respond to infrastructure failures, this paper proposes a model for developing an optimal joint restoration strategy for interdependent infrastructure systems following a disruptive event. First, models for (i) describing the structure of interdependent infrastructure systems and (ii) their interaction process are presented. Both models account for failure types, infrastructure operating rules and interdependencies among systems. Second, an optimization model for determining an optimal joint restoration strategy at the infrastructure component level, by minimizing the economic loss from the infrastructure failures, is proposed. The utility of the model is illustrated using a case study of electric-water systems. Results show that a small number of failed infrastructure components can trigger high-level failures in interdependent systems, and that the optimal joint restoration strategy varies with failure occurrence time. The proposed models can help decision makers understand the mechanisms of infrastructure interactions and search for an optimal joint restoration strategy, which can significantly enhance the safety of infrastructure systems. PMID:29649300

  11. Risk and Reliability of Infrastructure Asset Management Workshop

    DTIC Science & Technology

    2006-08-01

    of assets within the portfolio for use in Risk and Reliability analysis ... US Army Corps of Engineers assesses its Civil Works infrastructure and applies risk and reliability in the management of that infrastructure. The ... the Corps must complete assessments across its portfolio of major assets before risk management can be used in decision making. Effective risk

  12. Vehicle Infrastructure Cash-Flow Estimation--VICE 2.0; Clean Cities, Energy Efficiency & Renewable Energy (EERE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.

    This presentation discusses the differences between the original Vehicle and Infrastructure Cash-Flow Evaluation (VICE) Model and the revamped version, VICE 2.0. The enhanced tool can now help assess projects to acquire vehicles and infrastructure, or to acquire vehicles only.

  13. Aging Water Infrastructure Research Program Update: Innovation & Research for the 21st Century

    EPA Science Inventory

    This slide presentation summarizes key elements of the EPA Office of Research and Development’s (ORD) Aging Water Infrastructure (AWI) Research program. An overview of the national problems posed by aging water infrastructure is followed by a brief description of EPA’s overall...

  14. The Hydrologic Implications Of Unique Urban Soil Horizon Sequencing On The Functions Of Passive Green Infrastructure

    EPA Science Inventory

    Green infrastructure represents a broad set of site- to landscape-scale practices that can be flexibly implemented to increase sewershed retention capacity, and can thereby improve on the management of water quantity and quality. Although much green infrastructure presents as for...

  15. Defining resilience within a risk-informed assessment framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Garill A.; Unwin, Stephen D.; Holter, Gregory M.

    2011-08-01

    The concept of resilience is the subject of considerable discussion in academic, business, and governmental circles. The United States Department of Homeland Security, for one, has emphasised the need to consider resilience in safeguarding critical infrastructure and key resources. The concept of resilience is complex, multidimensional, and defined differently by different stakeholders. The authors contend that there is a benefit in moving from discussing resilience as an abstraction to defining resilience as a measurable characteristic of a system. This paper proposes defining resilience measures using elements of a traditional risk assessment framework to help clarify the concept of resilience and as a way to provide non-traditional risk information. The authors show that various, diverse dimensions of resilience can be quantitatively defined in a common risk assessment framework based on the concept of loss of service. This allows the comparison of options for improving the resilience of infrastructure and presents a means to perform cost-benefit analysis. This paper discusses definitions and key aspects of resilience, presents equations for the risk of loss of infrastructure function that incorporate four key aspects of resilience that could prevent or mitigate that loss, describes proposed resilience factor definitions based on those risk impacts, and provides an example that illustrates how resilience factors would be calculated using a hypothetical scenario.
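
    The flavour of a loss-of-service risk measure that folds in resilience can be sketched with hypothetical numbers. The factor names (absorption, adaptation, restoration) and all values below are assumptions for illustration only, not the report's actual equations.

        # Hypothetical risk-of-lost-service calculation with resilience factors.
        def risk_of_lost_service(scenarios):
            total = 0.0
            for s in scenarios:
                mitigation = (1 - s["absorb"]) * (1 - s["adapt"]) * (1 - s["restore"])
                total += s["freq_per_yr"] * s["service_days_lost"] * mitigation
            return total   # expected service-days lost per year

        scenarios = [
            {"freq_per_yr": 0.10, "service_days_lost": 30,
             "absorb": 0.4, "adapt": 0.3, "restore": 0.5},
            {"freq_per_yr": 0.02, "service_days_lost": 90,
             "absorb": 0.2, "adapt": 0.1, "restore": 0.3},
        ]
        print(f"expected loss ~ {risk_of_lost_service(scenarios):.2f} service-days/year")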

  16. Urban Mobility Analysis on Efficiency and Sustainability by Means of Transportation

    NASA Astrophysics Data System (ADS)

    Branea, Ana-Maria; Gaman, Marius; Badescu, Stefana

    2017-10-01

    Patterns of urban land use are inherently linked to the predominant means of transportation, each generating and being generated by the other. While each mode of transportation shapes a different development typology, a clear understanding of their interrelations and dependencies is needed in order to create a comprehensive mobility strategy. The study proposes a 15-criteria analysis framework developed to identify and quantify the key aspects of the main modes of transportation. The analysis framework was applied in a year-long study of Timisoara, Romania, comprising hard, quantitative data (digital simulations and mobility pattern analysis) and soft data (quality assessments and perceived needs and satisfaction levels). The research was carried out in clear opposition to the national trend of official mobility strategies focusing on accommodating increased levels of car traffic on the underdeveloped existing road infrastructure. By analysing the efficiency and sustainability of all four main modes of transportation, the results offer a holistic, comprehensive view. While no mobility strategy can focus on a single means of transportation, despite current practices, the article presents in detail only the research on cycling infrastructure and use, as it is the most underdeveloped and least discussed at the national level and was shown by our study to be the most efficient for a city of Timisoara’s size and characteristics. By identifying a clear link between urban land use patterns, infrastructure quality and perceptions, and the most efficient means of transportation for each particular city type, mobility strategies could shift the trend of urban development towards a more sustainable one.

  17. Dynamic malware analysis using IntroVirt: a modified hypervisor-based system

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.

    2013-05-01

    In this paper, we present a system for Dynamic Malware Analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced analysis techniques for stealth-malware analysis. This system allows for complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt is capable of bypassing virtual machine detection capabilities of even the most sophisticated malware, by spoofing returns to system call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.

  18. Cost-Benefit Analysis of Green Infrastructures on Community Stormwater Reduction and Utilization: A Case of Beijing, China.

    PubMed

    Liu, Wen; Chen, Weiping; Feng, Qi; Peng, Chi; Kang, Peng

    2016-12-01

    Cost-benefit analysis is needed to guide the planning, design and construction of green infrastructure practices in rapidly urbanized regions. We developed a framework to calculate the costs and benefits of different green infrastructures for stormwater reduction and utilization. A typical community of 54,783 m² in Beijing was selected for a case study. For the four designed green infrastructure scenarios (green space depression, porous brick pavement, storage pond, and their combination), the average annual costs of green infrastructure facilities range from 40.54 to 110.31 thousand yuan, and the average cost per m³ of stormwater reduction and utilization is 4.61 yuan. The total average annual benefits of stormwater reduction and utilization by green infrastructures of the community range from 63.24 to 250.15 thousand yuan, and the benefit per m³ of stormwater reduction and utilization ranges from 5.78 to 11.14 yuan. The average ratio of average annual benefit to cost of the four green infrastructure facilities is 1.91. The integrated facilities had the highest economic feasibility with a benefit to cost ratio of 2.27, followed by the storage pond construction with a benefit to cost ratio of 2.14. The results suggest that while stormwater reduction and utilization by green infrastructures have higher construction and maintenance costs, their comprehensive benefits, including source water replacement benefits, environmental benefits and avoided cost benefits, are potentially significant. Green infrastructure practices should be promoted for the sustainable management of urban stormwater.
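
    The benefit-cost arithmetic reduces to simple ratios of average annual benefit to average annual cost. The sketch below uses the cost and benefit ranges reported in the abstract, but the pairing of particular costs and benefits with particular scenarios is an illustrative assumption, not the paper's exact table.

        # Benefit-cost ratios from annual costs and benefits (thousand yuan; pairings assumed).
        scenarios = {
            "green space depression": {"cost": 40.54, "benefit": 63.24},
            "integrated facilities":  {"cost": 110.31, "benefit": 250.15},
        }
        for name, s in scenarios.items():
            print(f"{name}: benefit/cost = {s['benefit'] / s['cost']:.2f}")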

  19. Adapting New Space System Designs into Existing Ground Infrastructure

    NASA Technical Reports Server (NTRS)

    Delgado, Hector N.; McCleskey, Carey M.

    2008-01-01

    As routine space operations extend beyond earth orbit, the ability for ground infrastructures to take on new launch vehicle systems and a more complex suite of spacecraft and payloads has become a new challenge. The U.S. Vision for Space Exploration and its Constellation Program provides opportunities for our space operations community to meet this challenge. Presently, as new flight and ground systems add to the overall ground-based and space-based capabilities for NASA and its international partners, specific choices are being made as to what to abandon, what to retain, and what to build new. The total ground and space-based infrastructure must support a long-term, sustainable operation after it is all constructed, deployed, and activated. This paper addresses key areas of engineering concern during conceptual design, development, and routine operations, with a particular focus on: (1) legacy system reusability, (2) system supportability attributes and operations characteristics, (3) ground systems design trades and criteria, and (4) a technology application survey. Each key area explored weighs the merits of reusability of the infrastructure in terms of: engineering analysis methods and techniques; top-level facility, systems, and equipment design criteria; and some suggested methods for making the operational system attributes (the "-ilities") highly visible to the design teams and decisionmakers throughout the design process.

  20. Expecting the Unexpected: Towards Robust Credential Infrastructure

    NASA Astrophysics Data System (ADS)

    Xu, Shouhuai; Yung, Moti

    Cryptographic credential infrastructures, such as public key infrastructure (PKI), allow the building of trust relationships in electronic society and electronic commerce. At the center of credential infrastructures is the methodology of digital signatures. However, methods that assure that credentials and signed messages possess trustworthiness and longevity are not well understood, nor are they adequately addressed in either the literature or practice. We believe that, as a basic engineering principle, these properties have to be built into the credential infrastructure rather than be treated as an afterthought, since they are crucial to the long-term success of this notion. In this paper we present a step in the direction of dealing with these issues. Specifically, we present the basic engineering reasoning as well as a model that helps understand (somewhat formally) the trustworthiness and longevity of digital signatures, and then we give basic mechanisms that help improve these notions.

  1. Complex Systems Analysis | Energy Analysis | NREL

    Science.gov Websites

    [Web page diagram: drawings of a power plant, solar arrays, an industrial plant, a house labeled "Monitor Energy Use", a geothermal power plant, and rooftop PV connected to transmission infrastructure, illustrating generators, flexibility and storage.]

  2. Impact of public electric vehicle charging infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levinson, Rebecca S.; West, Todd H.

    Our work uses market analysis and simulation to explore the potential of public charging infrastructure to spur US battery electric vehicle (BEV) sales, increase national electrified mileage, and lower greenhouse gas (GHG) emissions. By employing both scenario and parametric analysis for policy driven injection of public charging stations we find the following: (1) For large deployments of public chargers, DC fast chargers are more effective than level 2 chargers at increasing BEV sales, increasing electrified mileage, and lowering GHG emissions, even if only one DC fast charging station can be built for every ten level 2 charging stations. (2) A national initiative to build DC fast charging infrastructure will see diminishing returns on investment at approximately 30,000 stations. (3) Some infrastructure deployment costs can be defrayed by passing them back to electric vehicle consumers, but once those costs to the consumer reach the equivalent of approximately 12¢/kWh for all miles driven, almost all gains to BEV sales and GHG emissions reductions from infrastructure construction are lost.
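
    The cost pass-back threshold can be made concrete with a rough, assumption-laden calculation: at an assumed BEV efficiency and annual mileage, a 12¢/kWh surcharge applied to all charging translates into a yearly per-vehicle cost. Both assumed inputs below are placeholders, not figures from the study.

        # Rough per-driver cost of a 12 cents/kWh infrastructure surcharge (assumed inputs).
        SURCHARGE_PER_KWH = 0.12       # $/kWh, the threshold cited in the abstract
        EFFICIENCY_MI_PER_KWH = 3.5    # assumed BEV efficiency
        ANNUAL_MILES = 12_000          # assumed annual mileage

        annual_kwh = ANNUAL_MILES / EFFICIENCY_MI_PER_KWH
        print(f"surcharge cost ~ ${annual_kwh * SURCHARGE_PER_KWH:,.0f} per vehicle per year")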

  3. Impact of public electric vehicle charging infrastructure

    DOE PAGES

    Levinson, Rebecca S.; West, Todd H.

    2017-10-16

    Our work uses market analysis and simulation to explore the potential of public charging infrastructure to spur US battery electric vehicle (BEV) sales, increase national electrified mileage, and lower greenhouse gas (GHG) emissions. By employing both scenario and parametric analysis for policy driven injection of public charging stations we find the following: (1) For large deployments of public chargers, DC fast chargers are more effective than level 2 chargers at increasing BEV sales, increasing electrified mileage, and lowering GHG emissions, even if only one DC fast charging station can be built for every ten level 2 charging stations. (2) A national initiative to build DC fast charging infrastructure will see diminishing returns on investment at approximately 30,000 stations. (3) Some infrastructure deployment costs can be defrayed by passing them back to electric vehicle consumers, but once those costs to the consumer reach the equivalent of approximately 12¢/kWh for all miles driven, almost all gains to BEV sales and GHG emissions reductions from infrastructure construction are lost.

  4. Effects of 45 Years of Heavy Road Traffic and Infrastructure on Permafrost and Tundra at Prudhoe Bay, Alaska

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Buchhorn, M.; Raynolds, M. K.; Kanevskiy, M. Z.; Matyshak, G. V.; Shur, Y.; Peirce, J.

    2015-12-01

    The upper permafrost of the Prudhoe Bay Oilfield, the largest oil field in the United States and in North America, contains significant amounts of excess ground ice, mainly in ice wedges. An increase in infrastructure development and road traffic since the initial development of the Prudhoe Bay Oilfield in 1968 has resulted in extensive flooding, accumulation of road dust, and roadside snowbanks, all of which affect the vegetation and alter the thermal properties of the ground surface. As part of the NSF's Arctic Science, Engineering, and Education for Sustainability (ArcSEES) project, we established four transects in 2014 and 2015 to document the effects of infrastructure and heavy road traffic on adjacent tundra. Two transects were established perpendicular to the Prudhoe Bay Spine Road north of Lake Colleen and two perpendicular to the Dalton Highway next to the Deadhorse airstrip. In 1949, prior to infrastructure development, rather homogeneous networks of low-centered polygons with less than 30 cm of trough-rim elevation contrast covered both locations. We present the detailed results of vegetation analysis, ice-core drilling, and extensive topographic surveys along the transects. A time series of aerial photographs from 1949 to 2014 (yearly since 1969) documents the changing landscapes in relation to the record of air temperature, active-layer depths, and permafrost temperatures at Deadhorse. Flooding, road dust, and snow drifts have all contributed to warmer soil temperatures and deeper active layers near the road, and each has altered the plant canopy in a different way. The altered plant canopies in turn further changed the surface albedo and the ground temperatures. Historical photos indicate that between 1989 and 2012 a regional thawing of the ice wedges occurred, increasing the extent of thermokarst. Our analysis demonstrates the cumulative effects of infrastructure-related and climate-related factors on these ice-rich permafrost landscapes.

  5. Deep Uncertainties in Sea-Level Rise and Storm Surge Projections: Implications for Coastal Flood Risk Management.

    PubMed

    Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus

    2017-09-05

    Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
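
    For context, the classic Van Dantzig (1956) formulation that the paper builds on trades a linear heightening cost against discounted expected flood damages. The minimal single-objective sketch below uses hypothetical placeholder parameters and does not reproduce the paper's multiobjective formulation, structural uncertainties, or global sensitivity analysis.

      import math

      # Minimal single-objective Van Dantzig-style trade-off (hypothetical numbers).
      # Heightening a dike by x metres costs I0 + k*x; the annual flood probability
      # decays as p0*exp(-alpha*x); expected discounted damages are approximated
      # by p(x)*V/delta for damage value V and discount rate delta.
      I0, k = 10e6, 40e6        # fixed and per-metre investment cost (USD)
      p0, alpha = 0.01, 2.0     # current flood probability and decay rate (1/m)
      V, delta = 5e9, 0.04      # damage if flooded (USD) and discount rate

      def total_cost(x):
          investment = I0 + k * x if x > 0 else 0.0
          expected_damage = p0 * math.exp(-alpha * x) * V / delta
          return investment + expected_damage

      # Brute-force search over candidate heightenings (0 to 5 m).
      best = min((i * 0.01 for i in range(501)), key=total_cost)
      print(f"Cost-minimizing heightening: {best:.2f} m, "
            f"total cost: {total_cost(best) / 1e6:.1f} M USD")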

  6. Intelligent Transportation Infrastructure Deployment Analysis System

    DOT National Transportation Integrated Search

    1997-01-01

    Much of the work on Intelligent Transportation Systems (ITS) to date has emphasized technologies, Standards/protocols, architecture, user services, core infrastructure requirements, and various other technical and institutional issues. ITS implementa...

  7. A System for Integrated Reliability and Safety Analyses

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles

    1999-01-01

    We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.

  8. SmartPort: A Platform for Sensor Data Monitoring in a Seaport Based on FIWARE

    PubMed Central

    Fernández, Pablo; Santana, José Miguel; Ortega, Sebastián; Trujillo, Agustín; Suárez, José Pablo; Domínguez, Conrado; Santana, Jaisiel; Sánchez, Alejandro

    2016-01-01

    Seaport monitoring and management is a significant research area, in which infrastructure automatically collects big data sets that lead the organization in its multiple activities. Thus, this problem is heavily related to the fields of data acquisition, transfer, storage, big data analysis and information visualization. Las Palmas de Gran Canaria port is a good example of how a seaport generates big data volumes through a network of sensors. They are placed on meteorological stations and maritime buoys, registering environmental parameters. Likewise, the Automatic Identification System (AIS) registers several dynamic parameters about the tracked vessels. However, such an amount of data is useless without a system that enables a meaningful visualization and helps make decisions. In this work, we present SmartPort, a platform that offers a distributed architecture for the collection of the port sensors’ data and a rich Internet application that allows the user to explore the geolocated data. The presented SmartPort tool is a representative, promising and inspiring approach to manage and develop a smart system. It covers a demanding need for big data analysis and visualization utilities for managing complex infrastructures, such as a seaport. PMID:27011192
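
    Since the abstract does not describe SmartPort's internal data model, the sketch below only illustrates, in general terms, how a single buoy reading might be pushed to a FIWARE Orion Context Broker through the NGSI-v2 entity API; the broker URL, entity identifier, and attribute names are all hypothetical.

      import requests

      # Hypothetical example of pushing one buoy reading to a FIWARE Orion
      # Context Broker via the NGSI-v2 API. Entity id/type and attribute names
      # are illustrative only, not SmartPort's actual schema.
      ORION_URL = "http://localhost:1026/v2/entities"   # assumed local broker

      reading = {
          "id": "urn:ngsi-ld:Buoy:LasPalmas:001",        # hypothetical identifier
          "type": "MaritimeBuoy",
          "waveHeight": {"value": 1.8, "type": "Number"},               # metres
          "waterTemperature": {"value": 21.4, "type": "Number"},        # deg C
          "observedAt": {"value": "2016-01-01T12:00:00Z", "type": "DateTime"},
      }

      resp = requests.post(ORION_URL, json=reading)
      resp.raise_for_status()   # expect 201 Created on success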

  9. 77 FR 62521 - National Infrastructure Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ... oral comments after the presentation of the report from the Regional Resilience Working Group. We... a presentation from the NIAC Regional Resilience Working Group documenting their work to date on the Regional Resilience Study, which includes the role and impact of critical infrastructure on regional...

  10. Accounting for Poverty in Infrastructure Reform: Learning from Latin America's Experience. WBI Development Studies.

    ERIC Educational Resources Information Center

    Estache, Antonio; Foster, Vivien; Wodon, Quentin

    This book explores the connections between infrastructure reform and poverty alleviation in Latin America based on a detailed analysis of the effects of a decade of reforms. The book demonstrates that because the access to, and affordability of, basic services is still a major problem, infrastructure investment will be a core component of poverty…

  11. Transportation of Large Wind Components: A Review of Existing Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mooney, Meghan; Maclaurin, Galen

    2016-09-01

    This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.

  12. NASA World Wind: Infrastructure for Spatial Data

    NASA Technical Reports Server (NTRS)

    Hogan, Patrick

    2011-01-01

    The world has great need for analysis of Earth observation data, be it climate change, carbon monitoring, disaster response, national defense or simply local resource management. To best provide for spatial and time-dependent information analysis, the world benefits from an open standards and open source infrastructure for spatial data. In the spirit of NASA's motto "for the benefit of all" NASA invites the world community to collaboratively advance this core technology. The World Wind infrastructure for spatial data both unites and challenges the world for innovative solutions analyzing spatial data while also allowing absolute command and control over any respective information exchange medium.

  13. Transportation systems analyses: Volume 1: Executive Summary

    NASA Astrophysics Data System (ADS)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed include, but are not limited to, the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort; therefore, there could be no single SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  14. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  15. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  16. Overview of U.S. EPA Aging Water Infrastructure Research Program - Interfacing with the Water Industry on Technology Assessment

    EPA Science Inventory

    This slide presentation summarizes key elements of the EPA Office of Research and Development’s (ORD) Aging Water Infrastructure (AWI) Research program. An overview of the national problems posed by aging water infrastructure is followed by a brief description of EPA’s overall r...

  17. Real Option Cost Vulnerability Analysis of Electrical Infrastructure

    NASA Astrophysics Data System (ADS)

    Prime, Thomas; Knight, Phil

    2015-04-01

    Critical infrastructure such as electricity substations is vulnerable to various geo-hazards that arise from climate change. These geo-hazards range from increased vegetation growth to increased temperatures and flood inundation. Of all the identified geo-hazards, coastal flooding has the greatest impact, but to date has had a low probability of occurring. However, in the face of climate change, coastal flooding is likely to occur more often as extreme water levels are experienced more frequently due to sea-level rise (SLR). Knowing what impact coastal flooding will have now and in the future on critical infrastructure such as electrical substations is important for long-term management. Using a flood inundation model, present-day and future flood events have been simulated, from 1 in 1 year events up to 1 in 10,000 year events. The modelling makes an integrated assessment of impact by using sea level and surge to simulate a storm tide. The geographical area the model covers is part of the northwest UK coastline, with a range of urban and rural areas. The ensemble of flood maps generated allows the identification of critical infrastructure exposed to coastal flooding. Vulnerability has been assessed using an Estimated Annual Damage (EAD) value. Sampling SLR annual probability distributions produces a projected "pathway" for SLR up to 2100. EAD is then calculated using a relationship derived from the flood model. Repeating the sampling process allows a distribution of EAD up to 2100 to be produced. These values are discounted to present-day values using an appropriate discount rate. If the cost of building and maintaining defences is also subtracted, a Net Present Value (NPV) of building the defences can be calculated. This distribution of NPV can be used as part of a cost modelling process involving real options. A real option is the right, but not the obligation, to undertake an investment decision. In terms of investment in critical infrastructure resilience, this means that a real option can be deferred or exercised depending on the climate future that is realised. The real option value is defined as the maximum positive NPV found across the range of potential SLR "futures". Real options add value in that flood defences need not be built until there is real value in doing so. The cost modelling output is an accessible database of real option values varying spatially across the model domain (for each critical infrastructure asset) and temporally up to 2100. The analysis has shown that in 2100, 8.2% of the substations analysed have greater than a 1 in 2 chance of exercising the real option to build flood defences against coastal flooding. The cost modelling tool and flood maps that have been developed will help stakeholders decide where and when to invest in mitigating against coastal flooding.
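
    The cost chain described above (flood damages, to Estimated Annual Damage, to discounted NPV of defences, to a real option value across SLR futures) can be sketched in a few lines. The damage curve, defence cost, discount rate, and SLR sampling below are hypothetical stand-ins for the paper's flood-model-derived relationships.

      import random

      # Toy EAD -> NPV -> real-option chain. The damage-vs-SLR relationship,
      # defence cost and discount rate are illustrative; the paper derives them
      # from a flood inundation model of the northwest UK coast.
      def ead(slr_m):
          """Estimated Annual Damage (USD/yr) as a toy increasing function of SLR."""
          return 2e5 + 1.5e6 * slr_m

      def npv_of_defence(slr_path, defence_cost=5e6, avoided=0.9, rate=0.035):
          """Discounted avoided damages minus defence cost along one SLR pathway."""
          benefit = sum(avoided * ead(slr) / (1 + rate) ** t
                        for t, slr in enumerate(slr_path, start=1))
          return benefit - defence_cost

      random.seed(1)
      years = 85                     # roughly 2015 to 2100
      npvs = []
      for _ in range(1000):          # sample many SLR "futures"
          slr_2100 = random.uniform(0.3, 1.2)              # metres, assumed range
          path = [slr_2100 * (t + 1) / years for t in range(years)]
          npvs.append(npv_of_defence(path))

      # Real option value: the maximum positive NPV across futures (0 if none).
      real_option_value = max(0.0, max(npvs))
      share_positive = sum(v > 0 for v in npvs) / len(npvs)
      print(f"Real option value: {real_option_value / 1e6:.1f} M USD; "
            f"futures where the defence pays off: {share_positive:.0%}")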

  18. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T.

    Cyber-physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber components and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses from the NESCOR Working Group study. From the Section 5 electric sector representative failure scenarios, we extracted the four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability) to the system. These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber-physical infrastructure network with respect to confidentiality, integrity, and availability.
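
    As a toy illustration of the game-theoretic framing only (not the paper's ABGT simulation), the sketch below sets up a small attacker/defender payoff matrix over the three threat categories mentioned and finds each side's best pure-strategy response; the payoff numbers are invented.

      # Toy attacker/defender matrix game over the three threat categories.
      # Entries are defender losses; all numbers are invented for illustration
      # and bear no relation to the NESCOR failure scenarios.
      categories = ["confidentiality", "integrity", "availability"]

      # loss[d][a]: defender hardens category d, attacker targets category a.
      loss = [
          [1.0, 6.0, 4.0],
          [5.0, 1.5, 4.5],
          [5.5, 6.5, 0.5],
      ]

      # Defender's minimax pure strategy: minimise the worst-case loss.
      worst_case = [max(row) for row in loss]
      d_best = min(range(3), key=lambda d: worst_case[d])

      # Attacker's best response to that defence.
      a_best = max(range(3), key=lambda a: loss[d_best][a])

      print(f"Defender hardens {categories[d_best]} "
            f"(worst-case loss {worst_case[d_best]}); "
            f"attacker targets {categories[a_best]} (loss {loss[d_best][a_best]})")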

  19. Life science research and drug discovery at the turn of the 21st century: the experience of SwissBioGrid.

    PubMed

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-04-22

    It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein targets on the coat of the dengue virus. Both present interesting lessons about how scientific practices are changing when they tackle the problems of large-scale data analysis and data management by means of creating a novel technical infrastructure. In the experience of SwissBioGrid, data intensive discovery has a lot to gain from close collaboration with industry and harnessing distributed computing power. Yet the diversity in life science research implies only a limited role for generic infrastructure; and the transience of support means that researchers need to integrate their efforts with others if they want to sustain the benefits of their success, which are otherwise lost.

  20. Technography and Design-Actuality Gap-Analysis of Internet Computer Technologies-Assisted Education: Western Expectations and Global Education

    ERIC Educational Resources Information Center

    Greenhalgh-Spencer, Heather; Jerbi, Moja

    2017-01-01

    In this paper, we provide a design-actuality gap-analysis of the internet infrastructure that exists in developing nations and nations in the global South with the deployed internet computer technologies (ICT)-assisted programs that are designed to use internet infrastructure to provide educational opportunities. Programs that specifically…

  1. e-Infrastructures for e-Sciences 2013 A CHAIN-REDS Workshop organised under the aegis of the European Commission

    NASA Astrophysics Data System (ADS)

    The CHAIN-REDS Project is organising a workshop on "e-Infrastructures for e-Sciences" focusing on Cloud Computing and Data Repositories under the aegis of the European Commission and in co-location with the International Conference on e-Science 2013 (IEEE2013) that will be held in Beijing, P.R. of China on October 17-22, 2013. The core objective of the CHAIN-REDS project is to promote, coordinate and support the effort of a critical mass of non-European e-Infrastructures for Research and Education to collaborate with Europe addressing interoperability and interoperation of Grids and other Distributed Computing Infrastructures (DCI). From this perspective, CHAIN-REDS will optimise the interoperation of European infrastructures with those present in 6 other regions of the world, both from a development and use point of view, and catering to different communities. Overall, CHAIN-REDS will provide input for future strategies and decision-making regarding collaboration with other regions on e-Infrastructure deployment and availability of related data; it will raise the visibility of e-Infrastructures towards intercontinental audiences, covering most of the world and will provide support to establish globally connected and interoperable infrastructures, in particular between the EU and the developing regions. Organised by IHEP, INFN and Sigma Orionis with the support of all project partners, this workshop will aim at: - Presenting the state of the art of Cloud computing in Europe and in China and discussing the opportunities offered by having interoperable and federated e-Infrastructures; - Exploring the existing initiatives of Data Infrastructures in Europe and China, and highlighting the Data Repositories of interest for the Virtual Research Communities in several domains such as Health, Agriculture, Climate, etc.

  2. Oklahoma's transportation infrastructure : inventory and impacts.

    DOT National Transportation Integrated Search

    2009-10-01

    This project comprehensively analyzed Oklahoma's transportation infrastructure and its impact on the state's economy via network analysis techniques that are widely used in and outside geography. The focus was on the context, connectivity, and co...

  3. AASHTO connected vehicle infrastructure deployment analysis.

    DOT National Transportation Integrated Search

    2011-06-17

    This report describes a deployment scenario for Connected Vehicle infrastructure by state and local transportation agencies, together with a series of strategies and actions to be performed by AASHTO to support application development and deployment.

  4. National connected vehicle field infrastructure footprint analysis.

    DOT National Transportation Integrated Search

    2014-06-01

    The fundamental premise of the connected vehicle initiative is that enabling wireless connectivity among vehicles, the infrastructure, and mobile devices will bring about transformative changes in safety, mobility, and the environmental impacts in th...

  5. Quantification of physical and economic impacts of climate change on public infrastructure in Alaska and benefits of global greenhouse gas mitigation

    NASA Astrophysics Data System (ADS)

    Melvin, A. M.; Larsen, P.; Boehlert, B.; Martinich, J.; Neumann, J.; Chinowsky, P.; Schweikert, A.; Strzepek, K.

    2015-12-01

    Climate change poses many risks and challenges for the Arctic and sub-Arctic, including threats to infrastructure. The safety and stability of infrastructure in this region can be impacted by many factors including increased thawing of permafrost soils, reduced coastline protection due to declining arctic sea ice, and changes in inland flooding. The U.S. Environmental Protection Agency (EPA) is coordinating an effort to quantify physical and economic impacts of climate change on public infrastructure across the state of Alaska and estimate how global greenhouse gas (GHG) mitigation may avoid or reduce these impacts. This research builds on the Climate Change Impacts and Risk Analysis (CIRA) project developed for the contiguous U.S., which is described in an EPA report released in June 2015. We are using a multi-model analysis focused primarily on the impacts of changing permafrost, coastal erosion, and inland flooding on a range of infrastructure types, including transportation (e.g. roads, airports), buildings and harbors, energy sources and transmission, sewer and water systems, and others. This analysis considers multiple global GHG emission scenarios ranging from a business as usual future to significant global action. These scenarios drive climate projections through 2100 spanning a range of outcomes to capture variability amongst climate models. Projections are being combined with a recently developed public infrastructure database and integrated into a version of the Infrastructure Planning Support System (IPSS) we are modifying for use in the Arctic and sub-Arctic region. The IPSS tool allows for consideration of both adaptation and reactive responses to climate change. Results of this work will address a gap in our understanding of climate change impacts in Alaska, provide estimates of the physical and economic damages we may expect with and without global GHG mitigation, and produce important insights about infrastructure vulnerabilities in response to warming at northern latitudes.

  6. Attenuation of Storm Surge Flooding By Wetlands in the Chesapeake Bay: An Integrated Geospatial Framework Evaluating Impacts to Critical Infrastructure

    NASA Astrophysics Data System (ADS)

    Khalid, A.; Haddad, J.; Lawler, S.; Ferreira, C.

    2014-12-01

    Areas along the Chesapeake Bay and its tributaries are extremely vulnerable to hurricane flooding, as evidenced by the costly effects and severe impacts of recent storms along the Virginia coast, such as Hurricane Isabel in 2003 and Hurricane Sandy in 2012. Coastal wetlands, in addition to their ecological importance, are expected to mitigate the impact of storm surge by acting as a natural protection against hurricane flooding. Quantifying such interactions helps to provide a sound scientific basis to support planning and decision making. Using storm surge flooding from various historical hurricanes, simulated using a coupled hydrodynamic wave model (ADCIRC-SWAN), we propose an integrated framework yielding a geospatial identification of the capacity of Chesapeake Bay wetlands to protect critical infrastructure. Spatial identification of Chesapeake Bay wetlands is derived from the National Wetlands Inventory (NWI), National Land Cover Database (NLCD), and the Coastal Change Analysis Program (C-CAP). Inventories of population and critical infrastructure are extracted from US Census block data and FEMA's HAZUS-Multi Hazard geodatabase. Geospatial and statistical analyses are carried out to develop a relationship between wetland land cover, hurricane flooding, population and infrastructure vulnerability. These analyses result in the identification and quantification of populations and infrastructure in flooded areas that lie within a reasonable buffer surrounding the identified wetlands. Our analysis thus produces a spatial perspective on the potential for wetlands to attenuate hurricane flood impacts in critical areas. Statistical analysis will support hypothesis testing to evaluate the benefits of wetlands from a flooding and storm-surge attenuation perspective. Results from geospatial analysis are used to identify where interactions with critical infrastructure are relevant in the Chesapeake Bay.

  7. Potential of Best Practice to Reduce Impacts from Oil and Gas Projects in the Amazon

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Powers, Bill

    2013-01-01

    The western Amazon continues to be an active and controversial zone of hydrocarbon exploration and production. We argue for the urgent need to implement best practices to reduce the negative environmental and social impacts associated with the sector. Here, we present a three-part study aimed at resolving the major obstacles impeding the advancement of best practice in the region. Our focus is on Loreto, Peru, one of the largest and most dynamic hydrocarbon zones in the Amazon. First, we develop a set of specific best practice guidelines to address the lack of clarity surrounding the issue. These guidelines incorporate both engineering-based criteria and key ecological and social factors. Second, we provide a detailed analysis of existing and planned hydrocarbon activities and infrastructure, overcoming the lack of information that typically hampers large-scale impact analysis. Third, we evaluate the planned activities and infrastructure with respect to the best practice guidelines. We show that Loreto is an extremely active hydrocarbon front, highlighted by a number of recent oil and gas discoveries and a sustained government push for increased exploration. Our analyses reveal that the use of technical best practice could minimize future impacts by greatly reducing the amount of required infrastructure such as drilling platforms and access roads. We also document a critical need to consider more fully the ecological and social factors, as the vast majority of planned infrastructure overlaps sensitive areas such as protected areas, indigenous territories, and key ecosystems and watersheds. Lastly, our cost analysis indicates that following best practice does not impose substantially greater costs than conventional practice, and may in fact reduce overall costs. Barriers to the widespread implementation of best practice in the Amazon clearly exist, but our findings show that there can be great benefits to its implementation. PMID:23650541

  8. Are Public-Private Partnerships an Appropriate Governance Structure for Power Plants? A Transaction Cost Analysis

    NASA Astrophysics Data System (ADS)

    Ho, S. Ping; Hsu, Yaowen

    2015-04-01

    To keep pace with rapid economic growth, many countries need an increasing number of power plants to meet rising electricity demand. Since the high capital requirements of power plants present a major issue for these countries, public-private partnerships (PPPs) have been considered an alternative for providing power plant infrastructure. In particular, in emerging or developing countries, PPPs may be the fastest way to provide the infrastructure needed. However, while PPPs are a promising alternative for providing various types of infrastructure, many failed power plant PPP projects have made it evident that PPPs, under certain situations, can be very costly or even the wrong choice of governance structure. While the higher efficiency due to better pooling of resources is greatly emphasized in PPPs, the embedded transaction inefficiencies are often understated or even ignored. Through the lens of Transaction Cost Economics (TCE), this paper aims to answer why and when PPPs may become a costly governance structure for power plants. Specifically, we develop a TCE-based theory of PPPs as a governance structure. This theory suggests that three major opportunism problems embedded in infrastructure PPPs can cause substantial transaction costs and render PPPs a costly governance structure. The three main opportunism problems are the principal-principal problem, the firm's hold-up problem, and the government-led hold-up problem. Moreover, project and institutional characteristics that may lead to opportunism problems are identified. Based on these characteristics, an opportunism-focused transaction cost analysis (OTCA) for PPPs as a governance structure is proposed to supplement the current practice of PPP feasibility analysis. As part of the theory development, a case study of PPP power plants is performed to evaluate the proposed theory and to illustrate how the proposed OTCA can be applied in practice. Policies and administration strategies for power plant PPPs are derived based on the proposed theory.

  9. Crowdsourcing Physical Network Topology Mapping With Net.Tagger

    DTIC Science & Technology

    2016-03-01

    backend server infrastructure. This includes a full security audit, better web services handling, and integration with the OSM stack and dataset to... a novel approach to network infrastructure mapping that combines smartphone apps with crowdsourced collection to gather data for offline aggregation and analysis. The project aims to build a map of physical network infrastructure such as fiber-optic cables, facilities, and access points.

  10. An Analysis of IT Governance Practices in the Federal Government: Protecting U.S. Critical Infrastructure from Cyber Terrorist Attacks

    ERIC Educational Resources Information Center

    Johnson, R. LeWayne

    2012-01-01

    Much of the governing process in the United States (U.S.) today depends on a reliable and well protected public information technology (IT) infrastructure. The Department of Homeland Security (DHS) is tasked with the responsibility of protecting the country's IT infrastructure. Critics contend that the DHS has failed to address planning and…

  11. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  12. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  13. Bioinformatics clouds for big data manipulation.

    PubMed

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  14. Layer 1 VPN services in distributed next-generation SONET/SDH networks with inverse multiplexing

    NASA Astrophysics Data System (ADS)

    Ghani, N.; Muthalaly, M. V.; Benhaddou, D.; Alanqar, W.

    2006-05-01

    Advances in next-generation SONET/SDH along with GMPLS control architectures have enabled many new service provisioning capabilities. In particular, a key services paradigm is the emergent Layer 1 virtual private network (L1 VPN) framework, which allows multiple clients to utilize a common physical infrastructure and provision their own 'virtualized' circuit-switched networks. This precludes expensive infrastructure builds and increases resource utilization for carriers. Along these lines, a novel L1 VPN services resource management scheme for next-generation SONET/SDH networks is proposed that fully leverages advanced virtual concatenation and inverse multiplexing features. Additionally, both centralized and distributed GMPLS-based implementations are also tabled to support the proposed L1 VPN services model. Detailed performance analysis results are presented along with avenues for future research.
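
    The virtual concatenation and inverse multiplexing features mentioned above amount to simple sizing arithmetic: a client rate is split across N smaller, independently routed members. The member payload rates below are the standard SDH container capacities, but the provisioning helper itself is only a hypothetical sketch of how an L1 VPN manager might size a VCAT group.

      import math

      # Hypothetical VCAT group sizing for an L1 VPN request. Payload rates are
      # the standard SDH container capacities (Mbit/s); the helper is illustrative.
      MEMBER_PAYLOAD_MBPS = {"VC-12": 2.176, "VC-3": 48.384, "VC-4": 149.76}

      def vcat_group(client_rate_mbps, member="VC-4"):
          """Return (member count, total capacity, unused capacity) in Mbit/s."""
          payload = MEMBER_PAYLOAD_MBPS[member]
          n = math.ceil(client_rate_mbps / payload)
          capacity = n * payload
          return n, capacity, capacity - client_rate_mbps

      # Example: a 1 Gbit/s Ethernet client mapped onto a VC-4-Nv group.
      n, cap, spare = vcat_group(1000.0, "VC-4")
      print(f"VC-4-{n}v group: {cap:.1f} Mbit/s capacity, {spare:.1f} Mbit/s spare")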

  15. Impact assessment of a high-speed railway line on species distribution: application to the European tree frog (Hyla arborea) in Franche-Comté.

    PubMed

    Clauzel, Céline; Girardet, Xavier; Foltête, Jean-Christophe

    2013-09-30

    The aim of the present work is to assess the potential long-distance effect of a high-speed railway line on the distribution of the European tree frog (Hyla arborea) in eastern France by combining graph-based analysis and species distribution models. This combination is a way to integrate patch-level connectivity metrics on different scales into a predictive model. The approach used is put in place before the construction of the infrastructure and allows areas potentially affected by isolation to be mapped. Through a diachronic analysis, comparing species distribution before and after the construction of the infrastructure, we identify changes in the probability of species presence and we determine the maximum distance of impact. The results show that the potential impact decreases with distance from the high-speed railway line and the largest disturbances occur within the first 500 m. Between 500 m and 3500 m, the infrastructure generates a moderate decrease in the probability of presence with maximum values close to -40%. Beyond 3500 m the average disturbance is less than -10%. The spatial extent of the impact is greater than the dispersal distance of the tree frog, confirming the assumption of the long-distance effect of the infrastructure. This predictive modelling approach appears to be a useful tool for environmental impact assessment and strategic environmental assessment. The results of the species distribution assessment may provide guidance for field surveys and support for conservation decisions by identifying the areas most affected. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. 32 CFR 240.4 - Policy.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... enterprise information infrastructure requirements. (c) The academic disciplines, with concentrations in IA..., computer systems analysis, cyber operations, cybersecurity, database administration, data management... infrastructure development and academic research to support the DoD IA/IT critical areas of interest. ...

  17. 32 CFR 240.4 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... enterprise information infrastructure requirements. (c) The academic disciplines, with concentrations in IA..., computer systems analysis, cyber operations, cybersecurity, database administration, data management... infrastructure development and academic research to support the DoD IA/IT critical areas of interest. ...

  18. 32 CFR 240.4 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... enterprise information infrastructure requirements. (c) The academic disciplines, with concentrations in IA..., computer systems analysis, cyber operations, cybersecurity, database administration, data management... infrastructure development and academic research to support the DoD IA/IT critical areas of interest. ...

  19. Use of certain alternative fuels in road transport in Poland

    NASA Astrophysics Data System (ADS)

    Gis, W.; Pielecha, J.; Waśkiewicz, J.; Gis, M.; Menes, M.

    2016-09-01

    The development of biomethane and hydrogen technology in road transport in the EU countries is recommended, among others, in Directive 2014/94/EU of the European Parliament and of the Council of 22 October 2014. Under the provisions of the Directive, EU countries are recommended to use biomethane and progressively ensure the accessibility of hydrogen cars on their territories, and above all to ensure that hydrogen vehicles can be driven between the Member States. The territorial accessibility for biomethane vehicles is determined by the availability of biomethane refuelling infrastructure, in the first place in cities and then at the road network distances recommended in the Directive. The territorial accessibility for hydrogen vehicles is determined by the availability of hydrogen refuelling infrastructure, in the first place along the TEN-T network. The article presents the possibilities of using these alternative fuels in Poland and some of the results of research and analysis in this area.

  20. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.

  1. INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)

    NASA Astrophysics Data System (ADS)

    Arezzini, S.; Carboni, A.; Caruso, G.; Ciampa, A.; Coscetti, S.; Mazzoni, E.; Piras, S.

    2014-06-01

    The INFN-Pisa Tier2 infrastructure is described. It is optimized not only for GRID CPU and storage access, but also for more interactive use of the resources, in order to provide good solutions for the final data analysis step. The data center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (such as RooFit and RooStats) implemented on multicore systems. In particular, a POSIX file storage access integrated with standard SRM access is provided. The unified storage infrastructure is therefore described, based on GPFS and Xrootd, used both for the SRM data repository and for interactive POSIX access. Such a common infrastructure allows users transparent access to the Tier2 data for their interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with the login mechanism integrated with the INFN-AAI (national INFN infrastructure) to extend site access and use to a geographically distributed community. This infrastructure is also used for a national computing facility serving the INFN theoretical community, enabling a synergic use of computing and storage resources. Our center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, parallel use via an InfiniBand connection) has been installed and managed, and we are now upgrading this facility so that it will provide resources for all the intermediate-level HPC computing needs of the INFN national theoretical community.

  2. Towards a more sustainable transport infrastructure: how spatial geological data can be utilized to improve early stage Life cycle assessment of road infrastructure

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline; Miliutenko, Sofiia; Björklund, Anna; Mörtberg, Ulla; Olofsson, Bo; Toller, Susanna

    2017-04-01

    Environmental impacts during the life cycle stages of transport infrastructure are substantial, including, among others, greenhouse gas (GHG) emissions as well as resource and energy use. For transport infrastructure to be sustainable, such issues need to be integrated in the planning process. Environmental Impact Assessment (EIA) is required by the European Union (EU) in order to ensure that all environmental aspects are considered during planning of road infrastructure projects. As a part of this process, the European Commission has suggested the use of life cycle assessment (LCA) for assessing life cycle energy use and GHG emissions. When analyzing life cycle impacts of the road infrastructure itself, it was shown that earthworks and materials used for the road construction account for a large share of the total energy use and GHG emissions. Those aspects are largely determined by the geological conditions at the site of construction: parameters such as soil thickness, slope, bedrock quality and soil type. The geological parameters determine the amounts of earthworks (i.e. volumes of soil and rock that will be excavated and blasted), the transportation need for excavated materials, as well as the availability of building materials. The study presents a new geographic information system (GIS)-based approach for utilizing spatial geological data in three dimensions (i.e. length, width and depth) in order to improve estimates of earthworks during the early stages of road infrastructure planning. Three main methodological steps were undertaken: mass balance calculation, life cycle inventory analysis, and spatial mapping of GHG emissions and energy use. The proposed GIS-based approach was then evaluated by comparison with the actual volumes of extracted material from a real road construction project. The results showed that the estimate of filling material was the most accurate, while the estimates for excavated soil and blasted rock varied widely from the actual values. It was also found that the total volume of excavated and ripped soils did not change when accounting for geological stratigraphy. The proposed GIS-based approach shows promising results for use in LCA at an early stage of road infrastructure planning, and by providing better data quality, GIS in combination with LCA can enable planning for a more sustainable transport infrastructure.
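
    A minimal sketch of the mass-balance and life-cycle-inventory steps described above follows: cut and fill volumes along a hypothetical alignment are balanced, the deficit or surplus is assumed hauled, and volumes are converted to GHG emissions with placeholder emission factors (not values from the study).

      # Toy earthworks mass balance and GHG inventory. Volumes and emission
      # factors are invented; a real inventory would come from geological data
      # and project-specific machinery and haulage figures.
      segments = [  # (cut m3, fill m3) per road segment
          (1200.0,  800.0),
          ( 300.0, 1500.0),
          ( 900.0,  400.0),
      ]

      EF_EXCAVATION = 2.5    # kg CO2e per m3 excavated (assumed)
      EF_HAULAGE    = 0.18   # kg CO2e per m3 per km hauled (assumed)
      HAUL_DISTANCE = 12.0   # km to borrow pit or deposit site (assumed)

      total_cut = sum(c for c, _ in segments)
      total_fill = sum(f for _, f in segments)
      imbalance = abs(total_cut - total_fill)   # imported fill or deposited surplus

      ghg_kg = total_cut * EF_EXCAVATION + imbalance * HAUL_DISTANCE * EF_HAULAGE
      print(f"Cut {total_cut:.0f} m3, fill {total_fill:.0f} m3, "
            f"imbalance {imbalance:.0f} m3, GHG {ghg_kg / 1000:.1f} t CO2e")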

  3. The Diverse Data, User Driven Services and the Power of Giovanni at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Shen, Suhung

    2017-01-01

    This presentation provides an overview of remote sensing and model data at GES (Goddard Earth Sciences) DISC (Data and Information Services Center); Overview of data services at GES DISC (Registration with NASA data system; Searching and downloading data); Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure): online data exploration tool; and NASA Earth Data and Information System.

  4. Remote Sensing of Arctic Environmental Conditions and Critical Infrastructure using Infra-Red (IR) Cameras and Unmanned Air Vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Hatfield, M. C.; Webley, P.; Saiet, E., II

    2014-12-01

    Numerous scientific and logistical applications exist in Alaska and other arctic regions requiring analysis of expansive, remote areas in the near infrared (NIR) and thermal infrared (TIR) bands. These include characterization of wild land fire plumes and volcanic ejecta, detailed mapping of lava flows, and inspection of lengthy segments of critical infrastructure, such as the Alaska pipeline and railroad system. Obtaining timely, repeatable, calibrated measurements of these extensive features and infrastructure networks requires localized, taskable assets such as UAVs. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) provides practical solutions to these problem sets by pairing various IR sensors with a combination of fixed-wing and multi-rotor air vehicles. Fixed-wing assets, such as the Insitu ScanEagle, offer long reach and extended duration capabilities to quickly access remote locations and provide enduring surveillance of the target of interest. Rotary-wing assets, such as the Aeryon Scout or the ACUASI-built Ptarmigan hexcopter, provide a precision capability for detailed horizontal mapping or vertical stratification of atmospheric phenomena. When included with other ground capabilities, we will show how they can assist in decision support and hazard assessment as well as giving those in emergency management a new ability to increase knowledge of the event at hand while reducing the risk to all involved. Here, in this presentation, we illustrate how UAVs can provide the ideal tool to map and analyze hazardous events and critical infrastructure under extreme environmental conditions.

  5. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after evaluating events in real time. The research work formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data on a widely used platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms by utilizing the sensor web and spatial information, achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and nowadays the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  6. Reliability analysis of interdependent lattices

    NASA Astrophysics Data System (ADS)

    Limiao, Zhang; Daqing, Li; Pengju, Qin; Bowen, Fu; Yinan, Jiang; Zio, Enrico; Rui, Kang

    2016-06-01

    Network reliability analysis has drawn much attention recently due to the risks of catastrophic damage in networked infrastructures. These infrastructures are dependent on each other as a result of various interactions. However, most of the reliability analyses of these interdependent networks do not consider spatial constraints, which are found important for robustness of infrastructures including power grid and transport systems. Here we study the reliability properties of interdependent lattices with different ranges of spatial constraints. Our study shows that interdependent lattices with strong spatial constraints are more resilient than interdependent Erdös-Rényi networks. There exists an intermediate range of spatial constraints, at which the interdependent lattices have minimal resilience.
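
    A starting point for percolation-style robustness experiments of the kind described above is sketched below: remove a random fraction of nodes from a 2D lattice and from an Erdős-Rényi graph of the same size and edge count, and compare the surviving giant components. This single-network sketch (using networkx) deliberately omits the interdependence between coupled networks that the paper studies, so its results are not expected to reproduce the paper's findings.

      import random
      import networkx as nx

      def giant_fraction(graph, remove_fraction, seed=0):
          """Fraction of original nodes left in the largest connected component."""
          rng = random.Random(seed)
          g = graph.copy()
          k = int(remove_fraction * g.number_of_nodes())
          g.remove_nodes_from(rng.sample(list(g.nodes()), k))
          if g.number_of_nodes() == 0:
              return 0.0
          largest = max(nx.connected_components(g), key=len)
          return len(largest) / graph.number_of_nodes()

      side = 30
      lattice = nx.grid_2d_graph(side, side)                       # mean degree ~4
      er = nx.gnm_random_graph(side * side,
                               lattice.number_of_edges(), seed=1)  # same edge count

      for f in (0.2, 0.4, 0.6):
          print(f"remove {f:.0%}: lattice {giant_fraction(lattice, f):.2f}, "
                f"ER {giant_fraction(er, f):.2f}")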

  7. LANL: Weapons Infrastructure Briefing to Naval Reactors, July 18, 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chadwick, Frances

    Presentation slides address: The Laboratory infrastructure supports hundreds of high hazard, complex operations daily; LANL’s unique science and engineering infrastructure is critical to delivering on our mission; LANL FY17 Budget & Workforce; Direct-Funded Infrastructure Accounts; LANL Org Chart; Weapons Infrastructure Program Office; The Laboratory’s infrastructure relies on both Direct and Indirect funding; NA-50’s Operating, Maintenance & Recapitalization funding is critical to the execution of the mission; Los Alamos is currently executing several concurrent Line Item projects; Maintenance @ LANL; NA-50 is helping us to address D&D needs; We are executing a CHAMP Pilot Project at LANL; G2 = Main Tool for Program Management; MDI: Future Investments are centered on facilities with a high Mission Dependency Index; Los Alamos hosted first “Deep Dive” in November 2016; Safety, Infrastructure & Operations is one of the most important programs at LANL, and is foundational for our mission success.

  8. The history of infrastructures and the future of cyberinfrastructure in the Earth system sciences

    NASA Astrophysics Data System (ADS)

    Edwards, P. N.

    2012-12-01

    Infrastructures display similar historical patterns of inception, development, growth and decay. They typically begin as centralized systems which later proliferate into competing variants. Users' desire for seamless functionality tends eventually to push these variants toward interoperability, usually through "gateway" technologies that link incompatible systems into networks. Another stage is reached when these networks are linked to others, as in the cases of container transport (connecting trucking, rail, and shipping) or the Internet. End stages of infrastructure development include "splintering" (specialized service tiering) and decay, as newer infrastructures displace older ones. Temporal patterns are also visible in historical infrastructure development. This presentation, by a historian of science and technology, describes these patterns through examples of both physical and digital infrastructures, focusing on the global weather forecast infrastructure since the 19th century. It then investigates how some of these patterns might apply to the future of cyberinfrastructure for the Earth system sciences.

  9. Multiscale Laboratory Infrastructure and Services to users: Plans within EPOS

    NASA Astrophysics Data System (ADS)

    Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; EPOS WG6, Corrado Cimarelli

    2015-04-01

    The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. Many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are: • To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities. • To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. • To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution. If the EPOS Implementation Phase proposal presently under construction is successful, then a range of services and transnational activities will be put in place to realize these objectives.

  10. Geospatial Data as a Service: Towards planetary scale real-time analytics

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data needs to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data. There are currently different interfaces that have been defined to provide data services. Unfortunately, there is considerable difference on the standards, protocols or data models which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observations, climate and weather forecasting are examples of these communities which generate large amounts of geospatial data. The NCI has been carrying out significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform where new infrastructures can be built. One of the key components these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down adapting to fluctuations in the demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment. We show our approach offers the community valuable exploratory analysis capabilities, for dealing with petabyte-scale geospatial data collections.

  11. Geographic Hotspots of Critical National Infrastructure.

    PubMed

    Thacker, Scott; Barr, Stuart; Pant, Raghav; Hall, Jim W; Alderson, David

    2017-12-01

    Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location. © 2017 Society for Risk Analysis.
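
    A hedged sketch of the hotspot step follows: spatially discrete criticality values (users disrupted per asset) are smoothed with a weighted kernel density estimate and the highest-density cells are flagged. The coordinates, criticality values and the top-5% cut-off are synthetic illustrations, not the study's data or its significance test.

    ```python
    # Illustrative sketch: weighted KDE over asset criticality values to obtain a
    # continuous criticality surface and flag hotspot cells.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    asset_xy = rng.uniform(0, 100, size=(2, 500))            # asset locations (km)
    criticality = rng.lognormal(mean=8, sigma=2, size=500)    # users disrupted per asset

    # Weighted KDE: assets that disrupt more users contribute more density.
    kde = gaussian_kde(asset_xy, weights=criticality)

    # Evaluate on a regular grid to obtain a continuous criticality surface.
    xs, ys = np.meshgrid(np.linspace(0, 100, 200), np.linspace(0, 100, 200))
    surface = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

    # Flag hotspot cells, here simply the top 5% of the surface.
    hotspots = surface > np.percentile(surface, 95)
    print(f"{hotspots.sum()} of {hotspots.size} grid cells flagged as hotspots")
    ```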

  12. Current Capabilities, Requirements and a Proposed Strategy for Interdependency Analysis in the UK

    NASA Astrophysics Data System (ADS)

    Bloomfield, Robin; Chozos, Nick; Salako, Kizito

    The UK government recently commissioned a research study to identify the state-of-the-art in Critical Infrastructure modelling and analysis, and the government/industry requirements for such tools and services. This study (Cetifs) concluded with a strategy aiming to bridge the gaps between the capabilities and requirements, which would establish interdependency analysis as a commercially viable service in the near future. This paper presents the findings of this study that was carried out by CSR, City University London, Adelard LLP, a safety/security consultancy and Cranfield University, defense academy of the UK.

  13. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of ground breaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc). These studies are based on mining of data archived by the PanDA workload management system.

  14. Controlling factors of the parental safety perception on children's travel mode choice.

    PubMed

    Nevelsteen, Kristof; Steenberghen, Thérèse; Van Rompaey, Anton; Uyttersprot, Liesbeth

    2012-03-01

    The travel mode of children changed significantly over the last 20 years, with a decrease of children travelling as pedestrians or cyclists. This study focuses on six to twelve year old children. Parents determine to a large extent the mode choice of children in this age category. Based on the analysis of an extensive survey, the research shows that traffic infrastructure has a significant impact on parental decision making concerning children's travel mode choice, by affecting both the real and the perceived traffic safety. Real traffic safety is quantified in terms of numbers of accidents and road infrastructure. For the perceived traffic safety a parental allowance probability is calculated per road type to show that infrastructure characteristics influence parental decision making on the children's mode choice. A binary logistic model shows that this allowance is determined by age, gender and traffic infrastructure near the child's home or near destinations frequently visited by children. Since both real and perceived traffic safety are influenced by infrastructure characteristics, a spatial analysis of parental perception and accident statistics can be used to indicate the locations where infrastructure improvements will be most effective to increase the number of children travelling - safely - as pedestrians or cyclists. Copyright © 2011 Elsevier Ltd. All rights reserved.
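
    The following sketch shows the general form of such a binary logistic model of parental allowance, assuming synthetic data and placeholder variable names (age, gender, a busy-road indicator) rather than the survey's actual coding.

    ```python
    # Minimal sketch of a binary logistic model of parental allowance as a
    # function of child age, gender and a road-infrastructure indicator.
    # The data frame is synthetic; variable names are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 800
    df = pd.DataFrame({
        "age": rng.integers(6, 13, n),        # child age in years
        "girl": rng.integers(0, 2, n),        # 1 = girl, 0 = boy
        "busy_road": rng.integers(0, 2, n),   # 1 = busy road near home
    })
    # Synthetic response: older children and quieter roads raise allowance odds.
    logit = -6.0 + 0.6 * df["age"] - 0.3 * df["girl"] - 1.2 * df["busy_road"]
    df["allowed"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    model = smf.logit("allowed ~ age + girl + busy_road", data=df).fit(disp=0)
    print(model.params)   # log-odds effects of age, gender and infrastructure
    ```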

  15. Succeeding criteria of community based on land transportation infrastructure for Johor innovation valley development

    NASA Astrophysics Data System (ADS)

    Redzuan, Amir A.; Aminudin, Eeydzah; Zakaria, Rozana; Ghazali, Farid Ezanee Mohamed; Baharudin, Nur Azwa Amyra; Siang, Lee Yong

    2017-10-01

    Developing countries around the world have established innovation centres, also known as innovation hubs, to meet the demands of today's changing, globally competitive environment. The shift of the economy from manufacturing to services has allowed numerous regions and cities around the world to undergo major structural changes. In Malaysia, the Skudai area is on its way to becoming a community-based innovation hub under the Johor State Economic Growth Strategic Plan, called the Johor Innovation Valley (JIV). Within this new-city concept, land transportation infrastructure is among the most important networks, providing the linkage that enhances the local innovation environment. This paper highlights the land transportation infrastructure criteria that would be effective in making Skudai a community-based innovation hub. Data were collected through survey questionnaires involving stakeholders with knowledge of land transportation infrastructure who also live within the area. Descriptive analysis was employed, with a further rank breakdown using Average Index analysis. The findings distinguish the differences between the land transportation infrastructure criteria. Changes in the traffic system, easier accessibility from one place to another and the attraction of outside investors are among the impacts of the growth of the JIV. The paper concludes that the selected land transportation infrastructure criteria are necessary for future contributions towards the growth of the JIV.
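
    A minimal sketch of the Average Index ranking mentioned above is given below, using AI = sum(weight × frequency) / sum(frequency) on a 1–5 Likert scale; the criteria names and response counts are invented for illustration only.

    ```python
    # Sketch of an Average Index (AI) ranking of survey criteria on a 1-5 Likert
    # scale. Criteria names and response counts are invented.
    import pandas as pd

    # Rows: criteria; columns: counts of respondents choosing Likert score 1..5.
    responses = pd.DataFrame(
        {1: [2, 5, 1], 2: [6, 9, 4], 3: [15, 20, 12], 4: [40, 30, 35], 5: [37, 36, 48]},
        index=["Road connectivity", "Public transport access", "Pedestrian facilities"],
    )

    weights = responses.columns.to_series()           # Likert weights 1..5
    average_index = (responses * weights).sum(axis=1) / responses.sum(axis=1)
    print(average_index.sort_values(ascending=False))  # ranked criteria
    ```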

  16. Trends in public infrastructure spending

    DOT National Transportation Integrated Search

    1999-05-01

    This Congressional Budget Office (CBO) paper highlights trends in public spending for infrastructure over the past 42 years. The analysis of those trends is based on data supplied by the Office of Management and Budget, the Bureau of the Census...

  17. Transportation Infrastructure Robustness : Joint Engineering and Economic Analysis

    DOT National Transportation Integrated Search

    2017-11-01

    The objectives of this study are to develop a methodology for assessing the robustness of transportation infrastructure facilities and to assess the effect of damage to such facilities on travel demand and on facility users' welfare. The robustness...

  18. Envisioning a Planetary Spatial Data Infrastructure

    NASA Astrophysics Data System (ADS)

    Laura, J. R.; Fergason, R. L.; Skinner, J.; Gaddis, L.; Hare, T.; Hagerty, J.

    2017-02-01

    We present a vision of a codified Planetary Spatial Data Infrastructure to support vertical and horizontal data integration and reduce the burden of spatial data expertise from the planetary science expert.

  19. A new algorithm for grid-based hydrologic analysis by incorporating stormwater infrastructure

    NASA Astrophysics Data System (ADS)

    Choi, Yosoon; Yi, Huiuk; Park, Hyeong-Dong

    2011-08-01

    We developed a new algorithm, the Adaptive Stormwater Infrastructure (ASI) algorithm, to incorporate ancillary data sets related to stormwater infrastructure into the grid-based hydrologic analysis. The algorithm simultaneously considers the effects of the surface stormwater collector network (e.g., diversions, roadside ditches, and canals) and underground stormwater conveyance systems (e.g., waterway tunnels, collector pipes, and culverts). The surface drainage flows controlled by the surface runoff collector network are superimposed onto the flow directions derived from a DEM. After examining the connections between inlets and outfalls in the underground stormwater conveyance system, the flow accumulation and delineation of watersheds are calculated based on recursive computations. Application of the algorithm to the Sangdong tailings dam in Korea revealed superior performance to that of a conventional D8 single-flow algorithm in terms of providing reasonable hydrologic information on watersheds with stormwater infrastructure.
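
    The sketch below illustrates the core idea in simplified form: D8 flow directions are computed from a DEM and then overridden at cells that host surface collectors so that flow follows the infrastructure rather than the terrain. The DEM and collector alignment are synthetic, and the published algorithm additionally handles underground inlet/outfall connectivity.

    ```python
    # Simplified sketch of terrain-based D8 flow directions with overrides at
    # stormwater collector cells. Not the published ASI algorithm.
    import numpy as np

    dem = np.array([[10.0, 9.5, 9.0, 8.5],
                    [ 9.8, 9.2, 8.6, 8.0],
                    [ 9.6, 8.9, 8.2, 7.5],
                    [ 9.4, 8.6, 7.8, 7.0]])

    # D8 neighbour offsets and their direction codes (powers of two).
    OFFSETS = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
               16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

    def d8_directions(dem):
        """Direction of steepest descent for every cell; 0 marks a sink/outlet."""
        rows, cols = dem.shape
        fdir = np.zeros_like(dem, dtype=int)
        for r in range(rows):
            for c in range(cols):
                best_drop, best_code = 0.0, 0
                for code, (dr, dc) in OFFSETS.items():
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        drop = dem[r, c] - dem[rr, cc]
                        if drop > best_drop:
                            best_drop, best_code = drop, code
                fdir[r, c] = best_code
        return fdir

    fdir = d8_directions(dem)

    # Override: cells along a roadside ditch drain east (code 1) regardless of terrain.
    ditch_cells = [(1, 0), (1, 1), (1, 2)]    # hypothetical collector alignment
    for r, c in ditch_cells:
        fdir[r, c] = 1

    print(fdir)
    ```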

  20. Development of a lunar infrastructure

    NASA Astrophysics Data System (ADS)

    Burke, J. D.

    If humans are to reside continuously and productively on the Moon, they must be surrounded and supported there by an infrastructure having some attributes of the support systems that have made advanced civilization possible on Earth. Building this lunar infrastructure will, in a sense, be an investment. Creating it will require large resources from Earth, but once it exists it can do much to limit the further demands of a lunar base for Earthside support. What is needed for a viable lunar infrastructure? This question can be approached from two directions. The first is to examine history, which is essentially a record of growing information structures among humans on Earth (tribes, agriculture, specialization of work, education, ethics, arts and sciences, cities and states, technology). The second approach is much less secure but may provide useful insights: it is to examine the minimal needs of a small human community - not just for physical survival but for a stable existence with a net product output. This paper presents a summary, based on present knowledge of the Moon and of the likely functions of a human community there, of some of these infrastructure requirements, and also discusses possible ways to proceed toward meeting early infrastructure needs.

  1. Guidelines for Technology Infrastructure in Connecticut Schools: An Implementation Guide for the Connecticut Statewide Educational Technology Plan.

    ERIC Educational Resources Information Center

    Center for Educational Leadership and Technology, Inc., Marlborough, MA.

    This document presents guidelines and recommendations for development of a technology infrastructure in Connecticut public schools that conforms to national industry standards for voice, video, and data communications. The guidelines present information on the state statutes regarding facilities implementation and describe industry standards.…

  2. UAS Integration in the NAS Project: Integrated Test and LVC Infrastructure

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Hoang, Ty

    2015-01-01

    Overview presentation of the Integrated Test and Evaluation sub-project of the Unmanned Aircraft System (UAS) in the National Airspace System (NAS). The emphasis of the presentation is the Live, Virtual, and Constructive (LVC) system (a broadly used name for classifying modeling and simulation) infrastructure and use of external assets and connection.

  3. CloudMan as a platform for tool, data, and analysis distribution.

    PubMed

    Afgan, Enis; Chapman, Brad; Taylor, James

    2012-11-27

    Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  4. Cooperative Drought Adaptation: Integrating Infrastructure Development, Conservation, and Water Transfers into Adaptive Policy Pathways

    NASA Astrophysics Data System (ADS)

    Zeff, H. B.; Characklis, G. W.; Reed, P. M.; Herman, J. D.

    2015-12-01

    Water supply policies that integrate portfolios of short-term management decisions with long-term infrastructure development enable utilities to adapt to a range of future scenarios. An effective mix of short-term management actions can augment existing infrastructure, potentially forestalling new development. Likewise, coordinated expansion of infrastructure such as regional interconnections and shared treatment capacity can increase the effectiveness of some management actions like water transfers. Highly adaptable decision pathways that mix long-term infrastructure options and short-term management actions require decision triggers capable of incorporating the impact of these time-evolving decisions on growing water supply needs. Here, we adapt risk-based triggers to sequence a set of potential infrastructure options in combination with utility-specific conservation actions and inter-utility water transfers. Individual infrastructure pathways can be augmented with conservation or water transfers to reduce the cost of meeting utility objectives, but they can also include cooperatively developed, shared infrastructure that expands regional capacity to transfer water. This analysis explores the role of cooperation among four water utilities in the 'Research Triangle' region of North Carolina by formulating three distinct categories of adaptive policy pathways: independent action (utility-specific conservation and supply infrastructure only), weak cooperation (utility-specific conservation and infrastructure development with regional transfers), and strong cooperation (utility-specific conservation and jointly developed regional infrastructure that supports transfers). Results suggest that strong cooperation aids the utilities in meeting their individual objectives at substantially lower costs and with fewer irreversible infrastructure options.
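
    A conceptual sketch of a risk-of-failure trigger driving such a pathway is shown below; the trigger levels, the risk series and the sequence of infrastructure options are invented and do not represent the Research Triangle analysis.

    ```python
    # Conceptual sketch: each simulated year a utility compares its estimated risk
    # of failure against trigger levels, first using short-term actions and then
    # committing the next infrastructure option in its pathway. All values invented.
    import numpy as np

    rng = np.random.default_rng(2)
    risk_of_failure = np.clip(np.cumsum(rng.normal(0.004, 0.01, 40)), 0, 1)  # rising risk

    SHORT_TERM_TRIGGER = 0.05      # activate transfers/conservation above this risk
    infra_trigger = 0.12           # commit next infrastructure option above this risk
    pathway = ["regional interconnection", "shared treatment expansion", "new reservoir"]

    for year, rof in enumerate(risk_of_failure):
        actions = []
        if rof > SHORT_TERM_TRIGGER:
            actions.append("transfers + conservation")
        if rof > infra_trigger and pathway:
            actions.append(f"build: {pathway.pop(0)}")
            infra_trigger += 0.08  # next option is triggered only at a higher risk level
        if actions:
            print(f"year {year:2d}  risk={rof:.3f}  ->  {', '.join(actions)}")
    ```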

  5. Maximizing the use of EO products: how to leverage the potential of open geospatial service architectures

    NASA Astrophysics Data System (ADS)

    Usländer, Thomas

    2012-10-01

    The demand for the rapid provision of EO products with well-defined characteristics in terms of temporal, spatial, image-specific and thematic criteria is increasing. Examples are products to support near real-time damage assessment after a natural disaster event, e.g. an earthquake. However, beyond the organizational and economic questions, there are technological and systemic barriers to enable a comfortable search, order, delivery or even combination of EO products. Most portals of space agencies and EO product providers require sophisticated satellite and product knowledge and, even worse, are all different and not interoperable. This paper gives an overview of the use cases and the architectural solutions that aim at an open and flexible EO mission infrastructure with application-oriented user interfaces and well-defined service interfaces based upon open standards. It presents corresponding international initiatives such as INSPIRE (Infrastructure for Spatial Information in the European Community), GMES (Global Monitoring for Environment and Security), GEOSS (Global Earth Observation System of Systems) and HMA (Heterogeneous Missions Accessibility) and their associated infrastructure approaches. The paper presents a corresponding analysis and design methodology and two examples of how such architectures are already successfully used in early warning systems for geo-hazards and toolsets for environmentally induced health risks. Finally, the paper concludes with an outlook on how these ideas relate to the vision of the Future Internet.

  6. A GeoNode-Based Multiscale Platform For Management, Visualization And Integration Of DInSAR Data With Different Geospatial Information Sources

    NASA Astrophysics Data System (ADS)

    Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo

    2017-04-01

    This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that handle temporal and spatial variability with high efficiency. To this end we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange using standard protocols [3]. Conventional SDI tools provide access to static datasets that capture only spatial variability. In this paper we use the open source project GeoNode as a framework to extend SDI functionality to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to build a comprehensive, distributed infrastructure for remote sensing data management, analysis and integration, following the standards of the Open Geospatial Consortium (OGC) [4,5]. We explain the methodology used to manage data complexity and data integration with GeoNode. The solution presented in this work for the ingestion of DInSAR products is a promising starting point for future development of an OGC-compliant, semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A new Algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40, 11, pp. 2375-2383. [2] Lanari, R., F. Casu, M. Manzo, G. Zeni, P. Berardino, M. Manunta and A. Pepe (2007), An overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis, P. Appl. Geophys., 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D.D. (ed.). 2000. Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (www.geonode.org). [5] Kolodziej, K. (ed.). 2004. OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, 1.0.2 edition.

  7. Green Infrastructure Research at EPA's Edison Environmental Center

    EPA Science Inventory

    The presentation outline includes: (1) Green infrastructure research objectives (2) Introduction to ongoing research projects - Aspects of design, construction, and maintenence that affect function - Real-world applications of GI research

  8. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location driven approach and is using the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of the user analysis which poses a special challenge for the infrastructure with its random distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between both GRID middlewares (resource broker vs. direct submission) will be discussed. In the end, an outlook for the 2007 data challenge is given.

  9. Policy model for space economy infrastructure

    NASA Astrophysics Data System (ADS)

    Komerath, Narayanan; Nally, James; Zilin Tang, Elizabeth

    2007-12-01

    Extraterrestrial infrastructure is key to the development of a space economy. Means for accelerating transition from today's isolated projects to a broad-based economy are considered. A large system integration approach is proposed. The beginnings of an economic simulation model are presented, along with examples of how interactions and coordination bring down costs. A global organization focused on space infrastructure and economic expansion is proposed to plan, coordinate, fund and implement infrastructure construction. This entity also opens a way to raise low-cost capital and solve the legal and public policy issues of access to extraterrestrial resources.

  10. The Water, Energy and Food Nexus: Finding the Balance in Infrastructure Investment

    NASA Astrophysics Data System (ADS)

    Huber-lee, A. T.; Wickel, B.; Kemp-Benedict, E.; Purkey, D. R.; Hoff, H.; Heaps, C.

    2013-12-01

    There is increasing evidence that single-sector infrastructure planning is leading to severely stressed human and ecological systems. There are a number of cross-sectoral impacts in these highly inter-linked systems. Examples include: - Promotion of biofuels that leads to conversion from food crops, reducing both food and water security. - Promotion of dams solely built for hydropower rather than multi-purpose uses, which deplete fisheries and affect saltwater intrusion dynamics in downstream deltas. - Historical use of water for cooling thermal power plants, with increasing pressure from other water uses, as well as problems of increased water temperatures that affect the ability to cool plants efficiently. This list can easily be expanded, as these inter-linkages are increasing over time. As developing countries see a need to invest in new infrastructure to improve the livelihoods of the poor, developed countries face conditions of deteriorating infrastructure with an opportunity for new investment. It is crucial, especially in the face of uncertainty of climate change and socio-political realities, that infrastructure planning factors in the influence of multiple sectors and the potential impacts from the perspectives of different stakeholders. There is a need for stronger linkages between science and policy as well. The Stockholm Environment Institute is developing and implementing practical and innovative nexus planning approaches in Latin America, Africa and Asia that bring together stakeholders and ways of integrating uncertainty in a cross-sectoral quantitative framework using the tools WEAP (Water Evaluation and Planning) and LEAP (Long-range Energy Alternatives Planning). The steps used include: 1. Identify key actors and stakeholders via social network analysis 2. Work with these actors to scope out priority issues and decision criteria in both the short and long term 3. Develop quantitative models to clarify options and balances between the needs and priorities of different stakeholders 4. Present and visualize results in ways easily comprehended by the general public, and 5. Identify current and potential future governance options to implement various infrastructure investments and institutional innovations. While this work is under active development, early results show the value of cross-sector integration. Perhaps the most crucial realization emerging from this body of work is that the current mode of single-sector infrastructure investment is resulting in tremendous risk, given the interdependence of water, energy, food, and the environment and the uncertainties associated with climate change. By looking at a wider scope of water, energy and food trajectories, and seeing how these affect each other over time, stakeholders and decision makers can take advantage of potential synergies between sectors, rather than look solely at tradeoffs. While climate change poses a tremendous challenge for infrastructure development, it is also emerging as a common concern among investors, developers, conservationists and others, presenting a unique opportunity for rethinking infrastructure development and balancing needs across sectors and including environmental needs. This paper will provide practical approaches to illustrate the value of balancing across sectors.

  11. Advanced space-based InSAR risk analysis of planned and existing transportation infrastructure.

    DOT National Transportation Integrated Search

    2017-03-21

    The purpose of this document is to summarize activities by Stanford University and MDA Geospatial Services Inc. (MDA) to estimate surface deformation and associated risk to transportation infrastructure using SAR Interferometric methods for the ...

  12. Green Infrastructure & Sustainable Urban Land Use Decision Analysis Workshop

    EPA Science Inventory

    Introduce green infrastructure, concepts and land use alternatives, to City of Cleveland operations staff. Discuss potential of green alternatives to impact daily operations and routine maintenance activities. Tie in sustainability concepts to long-term City planning and discu...

  13. Expanded Transportation Performance Measures to Supplement Level of Service (LOS) for Growth Management and Transportation Impact Analysis

    DOT National Transportation Integrated Search

    2012-10-01

    Florida's transportation infrastructure must continually evolve to meet the demands of its growing population. Many jurisdictions are moving toward multimodal transportation systems that utilize existing infrastructure more efficiently, providing u...

  14. U08 : finite element analysis crash model of tractor-trailers (Phase B).

    DOT National Transportation Integrated Search

    2009-08-01

    Improved understanding of truck-infrastructure crashes will enable the highway community to improve barrier design, to further reduce the likelihood of vehicle-infrastructure fatalities and injuries, and to reduce highway congestion resulting from se...

  15. Vulnerability assessment of the transportation infrastructure relying on global positioning system

    DOT National Transportation Integrated Search

    2001-08-29

    This report responds to Presidential Decision Directive 63 concerning assessing the risks to the transportation infrastructure resulting from the degradation or loss of the Global Positioning System (GPS) signal. This study includes analysis of civil...

  16. 6 CFR 29.5 - Requirements for protection.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... protected use regarding the security of critical infrastructure or protected systems, analysis, warning... expectation of protection from disclosure as provided by the provisions of the Critical Infrastructure... Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROTECTED CRITICAL...

  17. 6 CFR 29.5 - Requirements for protection.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... protected use regarding the security of critical infrastructure or protected systems, analysis, warning... expectation of protection from disclosure as provided by the provisions of the Critical Infrastructure... Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROTECTED CRITICAL...

  18. The Component Model of Infrastructure: A Practical Approach to Understanding Public Health Program Infrastructure

    PubMed Central

    Snyder, Kimberly; Rieker, Patricia P.

    2014-01-01

    Functioning program infrastructure is necessary for achieving public health outcomes. It is what supports program capacity, implementation, and sustainability. The public health program infrastructure model presented in this article is grounded in data from a broader evaluation of 18 state tobacco control programs and previous work. The newly developed Component Model of Infrastructure (CMI) addresses the limitations of a previous model and contains 5 core components (multilevel leadership, managed resources, engaged data, responsive plans and planning, networked partnerships) and 3 supporting components (strategic understanding, operations, contextual influences). The CMI is a practical, implementation-focused model applicable across public health programs, enabling linkages to capacity, sustainability, and outcome measurement. PMID:24922125

  19. Water Finance Forum - New Jersey

    EPA Pesticide Factsheets

    Presentations and materials from the Regional Finance Forum, Financing Resilient and Sustainable Water Infrastructure, held in Iselin, New Jersey, on December 2, 2015. The forum was co-sponsored by EPA's Water Infrastructure and Resiliency Finance Center,

  20. Conceptual Green Infrastructure Design for Washington Street, City of Sanford

    EPA Pesticide Factsheets

    Summary of how the Sanford Mill Yard Complex presents an opportunity to include green infrastructure practices in a land redevelopment initiative with relative ease while providing multiple benefits to the surrounding community.

  1. Benefits to Minnesotans of communications infrastructure public-private partnership

    DOT National Transportation Integrated Search

    1997-06-01

    This paper presents a summary of the benefits of a communications infrastructure public-private partnership between the Minnesota Department of Transportation and the team of International Communications Systems (ICS) and Stone & Webster.

  2. Opportunistic data locality for end user data analysis

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Heidecker, C.; Kuehn, E.; Quast, G.; Giffels, M.; Schnepf, M.; Heiss, A.; Petzold, A.

    2017-10-01

    With the increasing data volume of LHC Run2, user analyses are evolving towards increasing data throughput. This evolution translates to higher requirements for efficiency and scalability of the underlying analysis infrastructure. We approach this issue with a new middleware to optimise data access: a layer of coordinated caches transparently provides data locality for high-throughput analyses. We demonstrated the feasibility of this approach with a prototype used for analyses of the CMS working groups at KIT. In this paper, we present our experience both with the approach in general and with our prototype in particular.

  3. Perspectives on the use of green infrastructure for stormwater management in Cleveland and Milwaukee.

    PubMed

    Keeley, Melissa; Koburger, Althea; Dolowitz, David P; Medearis, Dale; Nickel, Darla; Shuster, William

    2013-06-01

    Green infrastructure is a general term referring to the management of landscapes in ways that generate human and ecosystem benefits. Many municipalities have begun to utilize green infrastructure in efforts to meet stormwater management goals. This study examines challenges to integrating gray and green infrastructure for stormwater management, informed by interviews with practitioners in Cleveland, OH and Milwaukee WI. Green infrastructure in these cities is utilized under conditions of extreme fiscal austerity and its use presents opportunities to connect stormwater management with urban revitalization and economic recovery while planning for the effects of negative- or zero-population growth. In this context, specific challenges in capturing the multiple benefits of green infrastructure exist because the projects required to meet federally mandated stormwater management targets and the needs of urban redevelopment frequently differ in scale and location.

  4. Perspectives on the Use of Green Infrastructure for Stormwater Management in Cleveland and Milwaukee

    NASA Astrophysics Data System (ADS)

    Keeley, Melissa; Koburger, Althea; Dolowitz, David P.; Medearis, Dale; Nickel, Darla; Shuster, William

    2013-06-01

    Green infrastructure is a general term referring to the management of landscapes in ways that generate human and ecosystem benefits. Many municipalities have begun to utilize green infrastructure in efforts to meet stormwater management goals. This study examines challenges to integrating gray and green infrastructure for stormwater management, informed by interviews with practitioners in Cleveland, OH and Milwaukee WI. Green infrastructure in these cities is utilized under conditions of extreme fiscal austerity and its use presents opportunities to connect stormwater management with urban revitalization and economic recovery while planning for the effects of negative- or zero-population growth. In this context, specific challenges in capturing the multiple benefits of green infrastructure exist because the projects required to meet federally mandated stormwater management targets and the needs of urban redevelopment frequently differ in scale and location.

  5. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
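
    The sketch below illustrates the general pattern of an ontology-driven provenance query with rdflib: a few triples record which run produced a result and what the run used, and a SPARQL query retrieves the lineage. The namespace and property names are hypothetical placeholders, not terms from the Parasite Experiment ontology or the PMS query operators.

    ```python
    # Illustrative sketch of a provenance lineage query over a tiny RDF graph.
    # Namespace, classes and properties are invented placeholders.
    from rdflib import Graph, Namespace, Literal

    EX = Namespace("http://example.org/provenance#")   # hypothetical namespace
    g = Graph()
    g.add((EX.result42, EX.wasGeneratedBy, EX.run7))
    g.add((EX.run7, EX.usedSample, EX.sample13))
    g.add((EX.run7, EX.usedInstrument, EX.massSpec1))
    g.add((EX.run7, EX.performedBy, Literal("Tarleton Research Group")))

    query = """
    PREFIX ex: <http://example.org/provenance#>
    SELECT ?run ?sample ?instrument ?agent WHERE {
        ex:result42 ex:wasGeneratedBy ?run .
        ?run ex:usedSample ?sample ;
             ex:usedInstrument ?instrument ;
             ex:performedBy ?agent .
    }
    """
    for row in g.query(query):
        print(row.run, row.sample, row.instrument, row.agent)
    ```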

  6. University Surroundings and Infrastructures That Are Accessible and Inclusive for All: Listening to Students with Disabilities

    ERIC Educational Resources Information Center

    Moriña, Anabel; Morgado, Beatriz

    2018-01-01

    The main topic of this article is architectural barriers and infrastructures as identified by university students with disabilities. The data presented is part of a much wider research project, sponsored by Spain's Ministry of Economy and Competition. A biographical-narrative methodology was used for this study. The results presented have been…

  7. Factors affecting long-term trends in surface-water quality in the Gwynns Falls watershed, Baltimore City and County, Maryland, 1998–2016

    USGS Publications Warehouse

    Majcher, Emily H.; Woytowitz, Ellen L.; Reisinger, Alexander J.; Groffman, Peter M.

    2018-03-30

    Factors affecting water-quality trends in urban streams are not well understood, despite current regulatory requirements and considerable ongoing investments in gray and green infrastructure. To address this gap, long-term water-quality trends and factors affecting these trends were examined in the Gwynns Falls, Maryland, watershed during 1998–2016 in cooperation with Blue Water Baltimore. Data on water-quality constituents and potential factors of influence were obtained from multiple sources and compiled for analysis, with a focus on data collected as part of the National Science Foundation funded Long-Term Ecological Research project, the Baltimore Ecosystem Study.Variability in climate (specifically, precipitation) and land cover can overwhelm actions taken to improve water quality and can present challenges for meeting regulatory goals. Analysis of land cover during 2001–11 in the Gwynns Falls watershed indicated minimal change during the study time frame; therefore, land-cover change is likely not a factor affecting trends in water quality. However, a modest increase in annual precipitation and a significant increase in winter precipitation were apparent in the region. A higher proportion of runoff producing storms was observed in the winter and a lower proportion in the summer, indicating that climate change may affect water quality in the watershed. The increase in precipitation was not reflected in annual or seasonal trends of streamflow in the watershed. Nonetheless, these precipitation changes may exacerbate the inflow and infiltration of water to gray infrastructure and reduce the effectiveness of green infrastructure. For streamflow and most water-quality constituents examined, no discernable trends were noted over the timeframe examined. Despite the increases in precipitation, no trends were observed for annual or seasonal discharge at the various sites within the study area. In some locations, nitrate, phosphate, and total nitrogen show downward trends, and total phosphorus and chloride show upward trends.Sanitary sewer overflows (gray infrastructure) and best management practices (green infrastructure) were identified as factors affecting water-quality change. The duration of sanitary sewer overflows was positively correlated with annual loads of nutrients and bacteria, and the drainage area of best management practices was negatively correlated with annual loads of phosphate and sulfate. Results of the study indicate that continued investments in gray and green infrastructure are necessary for urban water-quality improvement. Although this outcome is not unexpected, long-term datasets such as the one used in this study, allow the effects of gray and green infrastructures to be quantified.Results of this study have implications for the Gwynns Falls watershed and its residents and Baltimore City and County managers. Moreover, outcomes are relevant to other watersheds in the metropolitan region that do not have the same long-term dataset. Further, this study has established a framework for ongoing statistical analysis of primary factors affecting urban water-quality trends as regulatory programs mature.

  8. MmWave Vehicle-to-Infrastructure Communication: Analysis of Urban Microcellular Networks

    DOT National Transportation Integrated Search

    2017-05-01

    Vehicle-to-infrastructure (V2I) communication may provide high data rates to vehicles via millimeter-wave (mmWave) microcellular networks. This report uses stochastic geometry to analyze the coverage of urban mmWave microcellular networks. Prior work ...

  9. Water and Carbon Footprints for Sustainability Analysis of Urban Infrastructure

    EPA Science Inventory

    Water and transportation infrastructures define spatial distribution of urban population and economic activities. In this context, energy and water consumed per capita are tangible measures of how efficient water and transportation systems are constructed and operated. At a hig...

  10. Incentives for mobility : using market mechanisms to rebuild America's transportation infrastructure

    DOT National Transportation Integrated Search

    1989-08-01

    America's transportation infrastructure is inadequate, but the solution is not simply to spend more public money. A market-oriented analysis reveals that the problem is institutional. The incentives which operate in the public sector under current po...

  11. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. Each node also provides geospatial data processing services (WPS) built on a modular computing backend that implements statistical processing functionality and thus supports analysis of large datasets, with results available for visualization and export to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in a prototype of the system to provide capabilities to work with raster and vector geospatial data through OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
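
    As a hedged illustration of how a client might invoke one of the statistical-processing WPS services in such a node, the sketch below sends a plain OGC WPS 1.0.0 key-value-pair Execute request; the endpoint URL, process identifier and input names are hypothetical placeholders, not the system's actual interface.

    ```python
    # Sketch of a WPS 1.0.0 KVP Execute request to a hypothetical SDI node.
    import requests

    WPS_ENDPOINT = "https://example.org/wps"          # hypothetical node endpoint
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "climate.trend_analysis",       # hypothetical process id
        "datainputs": "dataset=era_interim_t2m;region=55,60,85,95;statistic=linear_trend",
    }
    response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
    print(response.status_code)
    print(response.text[:500])    # ExecuteResponse XML with a link to the result file
    ```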

  12. Collapse and pull - down analysis of high voltage electricity transmission towers subjected to cyclonic wind

    NASA Astrophysics Data System (ADS)

    Ahmed, Ammar; Arthur, Craig; Edwards, Mark

    2010-06-01

    Bulk electricity transmission lines are linear assets that can be very exposed to wind effects, particularly where they traverse steep topography or open coastal terrain in cyclonic regions. The interconnected nature of the lattice-type towers and conductors also presents complex vulnerabilities. These relate to the direction of wind attack on the conductors and to cascading failure mechanisms, in which the failure of a single tower has cascading effects on neighbouring towers. Such behaviour is exacerbated by the finely tuned nature of tower design, which serves to minimize cost and reserve strength at design wind speeds. There is a clear need to better quantify the interdependent vulnerabilities of these critical infrastructure assets in the context of the severe wind hazard. This paper presents a novel methodology developed for the Critical Infrastructure Protection Modelling and Analysis (CIPMA) capability for assessing local wind speeds and the likelihood of tower failure for a range of transmission tower and conductor types. CIPMA is a program managed by the Federal Attorney-General's Department, and Geoscience Australia is leading the technical development. The methodology involves the development of heuristically derived vulnerability models that are consistent with Australian industry experience and full-scale static tower testing results, considering isolated tower loss along with three interdependent failure mechanisms to give overall likelihoods of failure.
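
    By way of illustration only, a heuristic tower vulnerability model of the kind referred to above can be expressed as a lognormal fragility curve relating gust wind speed to failure probability; the median capacity and dispersion below are invented and are not the CIPMA parameters or the full-scale test results.

    ```python
    # Illustrative fragility-curve sketch for a single tower: probability of
    # failure at a given gust wind speed, lognormal capacity assumption.
    import numpy as np
    from scipy.stats import norm

    def tower_failure_probability(gust_speed, median_capacity=65.0, beta=0.10):
        """P(failure | gust) with lognormal capacity (speeds in m/s)."""
        return norm.cdf(np.log(gust_speed / median_capacity) / beta)

    for v in (50, 60, 65, 70, 80):
        print(f"gust {v} m/s -> P(failure) = {tower_failure_probability(v):.3f}")
    ```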

  13. Managing water resources infrastructure in the face of different values

    NASA Astrophysics Data System (ADS)

    Mostert, Erik

    Water resources infrastructure (WRI) plays a key role in water management. It can serve or negatively affect some seven to ten different and sometimes conflicting values. WRI management is therefore not a purely technical issue. Economic analyses can help to some extent, but only for values related to current human use. Multi-criteria analysis can cover all values, but in the end WRI management is not an analytical issue, but a governance issue. Different governance paradigms exist: markets, hierarchies and “third alternatives”, such as common pool resources management and network management. This article presents social learning as the most promising paradigm. Positive experiences with social learning have been described and guidance on putting social learning into practice exists. Nonetheless, there are no magic solutions for managing WRI in the face of different values.

  14. 'Anyone can edit', not everyone does: Wikipedia's infrastructure and the gender gap.

    PubMed

    Ford, Heather; Wajcman, Judy

    2017-08-01

    Feminist STS has long established that science's provenance as a male domain continues to define what counts as knowledge and expertise. Wikipedia, arguably one of the most powerful sources of information today, was initially lauded as providing the opportunity to rebuild knowledge institutions by providing greater representation of multiple groups. However, less than ten percent of Wikipedia editors are women. At one level, this imbalance in contributions and therefore content is yet another case of the masculine culture of technoscience. This is an important argument and, in this article, we examine the empirical research that highlights these issues. Our main objective, however, is to extend current accounts by demonstrating that Wikipedia's infrastructure introduces new and less visible sources of gender disparity. In sum, our aim here is to present a consolidated analysis of the gendering of Wikipedia.

  15. Reactivated faulting near Cushing, Oklahoma: Increased potential for a triggered earthquake in an area of United States strategic infrastructure

    USGS Publications Warehouse

    McNamara, Daniel E.; Hayes, Gavin; Benz, Harley M.; Williams, Robert; McMahon, Nicole D; Aster, R.C.; Holland, Austin F.; Sickbert, T; Herrmann, Robert B.; Briggs, Richard; Smoczyk, Gregory M.; Bergman, Eric; Earle, Paul S.

    2015-01-01

    In October 2014 two moderate-sized earthquakes (Mw 4.0 and 4.3) struck south of Cushing, Oklahoma, below the largest crude oil storage facility in the world. Combined analysis of the spatial distribution of earthquakes and regional moment tensor focal mechanisms indicate reactivation of a subsurface unnamed and unmapped left-lateral strike-slip fault. Coulomb failure stress change calculations using the relocated seismicity and slip distribution determined from regional moment tensors, allow for the possibility that the Wilzetta-Whitetail fault zone south of Cushing, Oklahoma, could produce a large, damaging earthquake comparable to the 2011 Prague event. Resultant very strong shaking levels (MMI VII) in the epicentral region present the possibility of this potential earthquake causing moderate to heavy damage to national strategic infrastructure and local communities.
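
    For reference, the Coulomb failure stress change used in this type of analysis takes the form dCFS = d_tau + mu_eff * d_sigma_n, where d_tau is the shear stress change in the slip direction, d_sigma_n the normal stress change (positive for unclamping), and mu_eff the effective friction. The sketch below applies it with invented numbers, not the values computed in the study.

    ```python
    # Minimal sketch of a Coulomb failure stress change calculation (values invented).
    def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
        """Return the Coulomb failure stress change in MPa."""
        return d_tau_mpa + mu_eff * d_sigma_n_mpa

    # 0.05 MPa of shear loading plus 0.02 MPa of unclamping on the receiver fault
    # gives a positive dCFS, i.e. the fault is moved toward failure.
    print(f"dCFS = {coulomb_stress_change(0.05, 0.02):.3f} MPa")
    ```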

  16. Unified messaging solution for biosurveillance and disease surveillance.

    PubMed

    Abellera, John P; Srinivasan, Arunkumar; Danos, C Scott; McNabb, Scott; Rhodes, Barry

    2007-10-11

    Biosurveillance and disease surveillance systems serve different purposes. However, the richness and quality of an existing data stream and infrastructure used in biosurveillance may prove beneficial for any state-based electronic disease surveillance system, especially if an electronic laboratory data feed does not exist between a hospital and the state-based system. The use of an Enterprise Application Integration (EAI) engine, such as the BioSense Integrator, will be necessary to map heterogeneous messages into standard representations, then validate and route them [1] to a disparate system. This poster illustrates the use of an existing BioSense Integrator to create a unified message supporting the exchange of electronic lab messages necessary for reportable disease notification. An evaluation of the infrastructure for data messaging will be examined and presented, along with a cost and benefit analysis between hospital and state-based systems.

  17. Bioinformatics clouds for big data manipulation

    PubMed Central

    2012-01-01

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers: This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475

  18. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  19. Implementing Liberia's poverty reduction strategy: An assessment of emergency and essential surgical care.

    PubMed

    Sherman, Lawrence; Clement, Peter T; Cherian, Meena N; Ndayimirije, Nestor; Noel, Luc; Dahn, Bernice; Gwenigale, Walter T; Kushner, Adam L

    2011-01-01

    To document infrastructure, personnel, procedures performed, and supplies and equipment available at all county hospitals in Liberia using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Survey of county hospitals using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Sixteen county hospitals in Liberia. Infrastructure, personnel, procedures performed, and supplies and equipment available. Uniformly, gross deficiencies in infrastructure, personnel, and supplies and equipment were identified. The World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care was useful in identifying baseline emergency and surgical conditions for evidence-based planning. To achieve the Poverty Reduction Strategy and delivery of the Basic Package of Health and Social Welfare Services, additional resources and manpower are needed to improve surgical and anesthetic care.

  20. MaTrace: tracing the fate of materials over time and across products in open-loop recycling.

    PubMed

    Nakamura, Shinichiro; Kondo, Yasushi; Kagawa, Shigemi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya

    2014-07-01

    Even for metals, open-loop recycling is more common than closed-loop recycling due, among other factors, to the degradation of quality in the end-of-life (EoL) phase. Open-loop recycling is subject to loss of functionality of the original materials, dissipation in forms that are difficult to recover, and recovered metals might need dilution with primary metals to meet quality requirements. Sustainable management of metal resources calls for the minimization of these losses. Imperative to this is quantitative tracking of the fate of materials across different stages, products, and losses. A new input-output analysis (IO) based model of dynamic material flow analysis (MFA) is presented that can trace the fate of materials over time and across products in open-loop recycling, explicitly taking losses and the quality of scrap into account. Application to car steel recovered from EoL vehicles (ELV) showed that after 50 years around 80% of the steel is used in products, mostly buildings and civil engineering (infrastructure), with the rest mostly residing in unrecovered obsolete infrastructure and refinery losses. Sensitivity analysis was conducted to evaluate the effects of changes in product lifespan and the quality of scrap.
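
    A cohort-style tracing of this kind can be sketched as a transition matrix that redistributes end-of-life outflows to the next product group each year while booking losses. The minimal Python sketch below only illustrates the idea; the product groups, lifetimes, loss rate, and allocation shares are hypothetical placeholders, not the MaTrace parameterization.

    ```python
    import numpy as np

    # Hypothetical product groups receiving recycled car steel.
    products = ["cars", "buildings/infrastructure", "machinery"]

    lifespan = np.array([15.0, 50.0, 25.0])   # assumed mean product lifetimes (years)
    loss_rate = 0.10                          # assumed share lost at each EoL/remelting event
    # Assumed allocation of recovered scrap from product i (rows) to product j (columns).
    alloc = np.array([[0.2, 0.5, 0.3],
                      [0.1, 0.7, 0.2],
                      [0.2, 0.4, 0.4]])

    stock = np.array([1.0, 0.0, 0.0])         # 1 unit of steel starts in ELV cars
    lost = 0.0
    for year in range(50):
        eol = stock / lifespan                # crude end-of-life outflow this year
        stock = stock - eol
        lost += loss_rate * eol.sum()         # dissipation and refinery losses
        stock += (1.0 - loss_rate) * (eol @ alloc)   # open-loop reallocation to new products

    print(dict(zip(products, stock.round(3))), {"lost": round(lost, 3)})
    ```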

  1. Idaho National Laboratory’s Analysis of ARRA-Funded Plug-in Electric Vehicle and Charging Infrastructure Projects: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francfort, Jim; Bennett, Brion; Carlson, Richard

    2015-09-01

    Battelle Energy Alliance, LLC, managing and operating contractor for the U.S. Department of Energy’s (DOE) Idaho National Laboratory (INL), is the lead laboratory for the U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA). INL’s conduct of the AVTA resulted in a significant base of knowledge and experience in testing light-duty vehicles that reduced transportation-related petroleum consumption. Due to this experience, INL was tasked by DOE to develop agreements with companies that were recipients of American Recovery and Reinvestment Act of 2009 (ARRA) grants; these agreements would allow INL to collect raw data from light-duty vehicles and charging infrastructure. INL developed non-disclosure agreements (NDAs) with several companies and their partners that resulted in INL being able to receive raw data via server-to-server connections from the partner companies. These raw data allowed INL to independently conduct data quality checks, perform analysis, and report publicly to DOE, partners, and stakeholders how drivers used both new vehicle technologies and the deployed charging infrastructure. The ultimate goal was not the deployment of vehicles and charging infrastructure, but rather to create real-world laboratories of vehicles, charging infrastructure and drivers that would aid in the design of future electric drive transportation systems. The five projects that INL collected data from, and their partners, are: • ChargePoint America - Plug-in Electric Vehicle Charging Infrastructure Demonstration • Chrysler Ram PHEV Pickup - Vehicle Demonstration • General Motors Chevrolet Volt - Vehicle Demonstration • The EV Project - Plug-in Electric Vehicle Charging Infrastructure Demonstration • EPRI / Via Motors PHEVs – Vehicle Demonstration This document benchmarks the performance science involved in the execution, analysis and reporting for the five projects above, which provided lessons learned based on drivers’ use of the vehicles and their recharging decisions. Data are reported for the use of more than 25,000 vehicles and charging units.

  2. The Mais Médicos (More Doctors) Program, the infrastructure of Primary Health Units and the Municipal Human Development Index.

    PubMed

    Soares, Joaquim José; Machado, Maria Helena; Alves, Cecília Brito

    2016-09-01

    The main objective of this article was to examine the context in which professionals working within the Mais Médicos (More Doctors) Program operate. This study used the infrastructure scale of primary health units (PHUs), which was recently developed by Soares Neto and colleagues to provide more information regarding the relationship between the infrastructure of PHUs and the Municipal Human Development Index (MHDI) of municipalities that received Mais Médicos Program doctors. Using exploratory and inferential statistics, the article shows that 65.2% of the PHUs that received Mais Médicos Program doctors had medium-quality infrastructure and only 5.8% of them had low-quality infrastructure. The correlation of 0.50 between the infrastructure indicator and the MHDI points to a moderate tendency for municipalities with low MHDIs to have more precarious PHUs. Using multiple linear regression analysis it can be inferred that the main factor that contributed to the increase in the infrastructure indicator of the PHUs was the average municipal income. On the other hand, the factor that negatively affected the infrastructure of the PHUs was being located in the north or northeast regions.
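
    The multiple linear regression step described above can be sketched with an ordinary least squares fit in NumPy, as below. The simulated income values, the north/northeast dummy, and the implied coefficients are hypothetical placeholders, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500

    # Hypothetical municipality-level data (not the study's): average income and a
    # dummy for north/northeast location, driving a PHU infrastructure score.
    income = rng.normal(800.0, 250.0, n)          # average municipal income
    north_ne = rng.integers(0, 2, n)              # 1 if north/northeast, else 0
    infra = 0.004 * income - 0.8 * north_ne + rng.normal(0.0, 0.5, n)

    # Ordinary least squares with an intercept, via numpy's least-squares solver.
    X = np.column_stack([np.ones(n), income, north_ne])
    beta, *_ = np.linalg.lstsq(X, infra, rcond=None)
    print(dict(zip(["intercept", "income", "north_ne"], beta.round(4))))
    ```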

  3. Transport Traffic Analysis for Abusive Infrastructure Characterization

    DTIC Science & Technology

    2012-12-14

    Abusive traffic abounds on the Internet, in the form of email, malware, vulnerability scanners, worms, denial-of-service, drive-by-downloads, scam ... insight is two-fold. First, attackers have a basic requirement to source large amounts of data, be it denial-of-service, scam-hosting, spam, or other ... the network core. This paper explores the power of transport-layer traffic analysis to detect and characterize scam-hosting infrastructure, including

  4. The European Network of Analytical and Experimental Laboratories for Geosciences

    NASA Astrophysics Data System (ADS)

    Freda, Carmela; Funiciello, Francesca; Meredith, Phil; Sagnotti, Leonardo; Scarlato, Piergiorgio; Troll, Valentin R.; Willingshofer, Ernst

    2013-04-01

    Integrating Earth Sciences infrastructures in Europe is the mission of the European Plate Observing System (EPOS). The integration of European analytical, experimental, and analogue laboratories plays a key role in this context and is the task of the EPOS Working Group 6 (WG6). Despite the presence in Europe of high performance infrastructures dedicated to geosciences, there is still limited collaboration in sharing facilities and best practices. The EPOS WG6 aims to overcome this limitation by pushing towards national and trans-national coordination, efficient use of current laboratory infrastructures, and future aggregation of facilities not yet included. This will be attained through the creation of common access and interoperability policies to foster and simplify personnel mobility. The EPOS ambition is to orchestrate European laboratory infrastructures with diverse, complementary tasks and competences into a single, but geographically distributed, infrastructure for rock physics, palaeomagnetism, analytical and experimental petrology and volcanology, and tectonic modeling. The WG6 is presently organizing its thematic core services within the EPOS distributed research infrastructure with the goal of joining the other EPOS communities (geologists, seismologists, volcanologists, etc.) and stakeholders (engineers, risk managers and other geosciences investigators) to: 1) develop tools and services to enhance visitor programs that will mutually benefit visitors and hosts (transnational access); 2) improve support and training activities to make facilities equally accessible to students, young researchers, and experienced users (training and dissemination); 3) collaborate in sharing technological and scientific know-how (transfer of knowledge); 4) optimize interoperability of distributed instrumentation by standardizing data collection, archiving, and quality control (data preservation and interoperability); 5) implement a unified e-Infrastructure for data analysis, numerical modelling, and joint development and standardization of numerical tools (e-science implementation); 6) collect and store data in a flexible inventory database accessible within and beyond the Earth Sciences community (open access and outreach); 7) connect to environmental and hazard protection agencies, stakeholders, and the public to raise awareness of geo-hazards and geo-resources (innovation for society). We will inform scientists and industrial stakeholders on the most recent WG6 achievements in EPOS and we will show how our community is proceeding to design the thematic core services.

  5. Decision analysis and risk models for land development affecting infrastructure systems.

    PubMed

    Thekdi, Shital A; Lambert, James H

    2012-07-01

    Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates into a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development on the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.

  6. Evaluation of Urban Drainage Infrastructure: New York City Case Study

    NASA Astrophysics Data System (ADS)

    Hamidi, A.; Grossberg, M.; Khanbilvardi, R.

    2017-12-01

    Flood response in an urban area is the product of interactions of spatially and temporally varying rainfall and infrastructure. In urban areas, however, the complex sub-surface networks of tunnels, waste and storm water drainage systems are often inaccessible, which poses challenges for modeling and predicting drainage infrastructure performance. The increased availability of open data in cities is an emerging information asset for a better understanding of the dynamics of urban water drainage infrastructure. This includes crowd sourced data and community reporting. A well-known source of this type of data is the non-emergency hotline "311" which is available in many US cities, and may contain information pertaining to the performance of physical facilities, condition of the environment, or residents' experience, comfort and well-being. In this study, seven years of New York City 311 (NYC311) calls during 2010-2016 are employed as an alternative approach for identifying the areas of the city most prone to sewer backup flooding. These zones are compared with the hydrologic analysis of runoff flooding zones to provide a predictive model for the City. The proposed methodology is an example of urban system phenomenology using crowd sourced, open data. A novel algorithm for calculating the spatial distribution of flooding complaints across NYC's five boroughs is presented in this study. In this approach, the features that represent reporting bias are separated from those that relate to actual infrastructure system performance. The sewer backup results are assessed against the spatial distribution of runoff in NYC during 2010-2016. With advances in radar technologies, a high spatial-temporal resolution data set for precipitation is available for most of the United States that can be implemented in hydrologic analysis of dense urban environments. High resolution gridded Stage IV radar rainfall data along with high resolution spatially distributed land cover data are employed to investigate urban pluvial flooding. The monthly results of excess runoff are compared with the sewer backups in NYC to build a predictive model of flood zones according to the 311 phone calls.
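
    The core bookkeeping step, counting sewer-backup complaints by zone and month and comparing them with modeled runoff, can be sketched with pandas as below. The zone codes, months, column names, and values are illustrative placeholders, not the NYC311 or Stage IV schemas.

    ```python
    import pandas as pd

    # Hypothetical extracts of complaint records and modeled monthly excess runoff.
    calls = pd.DataFrame({
        "zone":  ["BK01", "BK01", "QN05", "QN05", "QN05"],
        "month": ["2014-08", "2014-09", "2014-08", "2014-08", "2014-09"],
    })
    runoff = pd.DataFrame({
        "zone":      ["BK01", "BK01", "QN05", "QN05"],
        "month":     ["2014-08", "2014-09", "2014-08", "2014-09"],
        "runoff_mm": [42.0, 18.0, 55.0, 21.0],
    })

    # Count sewer-backup complaints per zone-month and join with modeled runoff.
    counts = calls.groupby(["zone", "month"]).size().reset_index(name="backups")
    joined = runoff.merge(counts, on=["zone", "month"], how="left").fillna({"backups": 0})

    print(joined)
    print("runoff vs. backups correlation:", joined["runoff_mm"].corr(joined["backups"]))
    ```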

  7. Water and Carbon Footprints for Sustainability Analysis of Urban Infrastructure - abstract

    EPA Science Inventory

    Water and transportation infrastructures define spatial distribution of urban population and economic activities. In this context, energy and water consumed per capita are tangible measures of how efficient water and transportation systems are constructed and operated. At a hig...

  8. 78 FR 56869 - Nuclear Infrastructure Programmatic Environmental Impact Statement Supplement Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... DEPARTMENT OF ENERGY Nuclear Infrastructure Programmatic Environmental Impact Statement Supplement... Statement for Accomplishing Expanded Civilian Nuclear Energy Research and Development and Isotope Production...), Office of Nuclear Energy, U.S. Department of Energy, 1000 Independence Ave. SW., Washington, DC 20585...

  9. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing a data-centric over a message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
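
    The data-centric idea can be illustrated with a toy in-process bus in which publishers write samples to named topics and subscribers, including late joiners, receive the latest sample. This is only a conceptual sketch in Python, not the OMG DDS API used in the testbed, and the topic name and sample fields are invented.

    ```python
    from collections import defaultdict

    class DataBus:
        """Toy in-process, topic-keyed data bus illustrating the data-centric idea.
        A conceptual sketch only, not the OMG DDS API used in the testbed."""

        def __init__(self):
            self._subscribers = defaultdict(list)
            self._last_sample = {}                 # topic -> most recent sample

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)
            if topic in self._last_sample:         # late joiners still see current state
                callback(self._last_sample[topic])

        def publish(self, topic, sample):
            self._last_sample[topic] = sample
            for callback in self._subscribers[topic]:
                callback(sample)

    bus = DataBus()
    bus.subscribe("feeder1/voltage", lambda s: print("controller sees", s))
    bus.publish("feeder1/voltage", {"rms": 239.6, "unit": "V"})
    bus.subscribe("feeder1/voltage", lambda s: print("late-joining monitor sees", s))
    ```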

  10. The GEOSS solution for enabling data interoperability and integrative research.

    PubMed

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on such an ambitious objective. This manuscript presents and combines the studies and the experiences carried out by three relevant projects, focusing on the heavy metal domain: Global Mercury Observation System, Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standards, models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, and a possible roadmap is introduced for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observation community of practice and to develop a research infrastructure for carrying out integrative research in its specific domain.

  11. CLIMB (the Cloud Infrastructure for Microbial Bioinformatics): an online resource for the medical microbiology community

    PubMed Central

    Smith, Andy; Southgate, Joel; Poplawski, Radoslaw; Bull, Matthew J.; Richardson, Emily; Ismail, Matthew; Thompson, Simon Elwood-; Kitchen, Christine; Guest, Martyn; Bakke, Marius

    2016-01-01

    The increasing availability and decreasing cost of high-throughput sequencing has transformed academic medical microbiology, delivering an explosion in available genomes while also driving advances in bioinformatics. However, many microbiologists are unable to exploit the resulting large genomics datasets because they do not have access to relevant computational resources and to an appropriate bioinformatics infrastructure. Here, we present the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) facility, a shared computing infrastructure that has been designed from the ground up to provide an environment where microbiologists can share and reuse methods and data. PMID:28785418

  12. CLIMB (the Cloud Infrastructure for Microbial Bioinformatics): an online resource for the medical microbiology community.

    PubMed

    Connor, Thomas R; Loman, Nicholas J; Thompson, Simon; Smith, Andy; Southgate, Joel; Poplawski, Radoslaw; Bull, Matthew J; Richardson, Emily; Ismail, Matthew; Thompson, Simon Elwood-; Kitchen, Christine; Guest, Martyn; Bakke, Marius; Sheppard, Samuel K; Pallen, Mark J

    2016-09-01

    The increasing availability and decreasing cost of high-throughput sequencing has transformed academic medical microbiology, delivering an explosion in available genomes while also driving advances in bioinformatics. However, many microbiologists are unable to exploit the resulting large genomics datasets because they do not have access to relevant computational resources and to an appropriate bioinformatics infrastructure. Here, we present the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) facility, a shared computing infrastructure that has been designed from the ground up to provide an environment where microbiologists can share and reuse methods and data.

  13. MFC Communications Infrastructure Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Cannon; Terry Barney; Gary Cook

    2012-01-01

    Unprecedented growth of required telecommunications services and applications is changing the way INL does business today. High-speed connectivity combined with a high demand for telephony and network services requires a robust communications infrastructure. The current state of the MFC communication infrastructure limits growth opportunities for current and future communication infrastructure services. This limitation is largely due to equipment capacity issues, aging cabling infrastructure (external/internal fiber and copper cable) and inadequate space for telecommunication equipment. While some communication infrastructure improvements have been implemented over time, these projects have been completed without a clear overall plan or technology standard. This document identifies critical deficiencies with the current state of the communication infrastructure in operation at the MFC facilities and provides an analysis to identify needs and deficiencies to be addressed in order to achieve target architectural standards as defined in STD-170. The intent of STD-170 is to provide a robust, flexible, long-term solution to make communications capabilities align with the INL mission and fit the various programmatic growth and expansion needs.

  14. SeaDataCloud - further developing the pan-European SeaDataNet infrastructure for marine and ocean data management

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Fichaut, Michele

    2017-04-01

    SeaDataCloud marks the third phase of developing the pan-European SeaDataNet infrastructure for marine and ocean data management. The SeaDataCloud project is funded by EU and runs for 4 years from 1st November 2016. It succeeds the successful SeaDataNet II (2011 - 2015) and SeaDataNet (2006 - 2011) projects. SeaDataNet has set up and operates a pan-European infrastructure for managing marine and ocean data and is undertaken by National Oceanographic Data Centres (NODC's) and oceanographic data focal points from 34 coastal states in Europe. The infrastructure comprises a network of interconnected data centres and central SeaDataNet portal. The portal provides users a harmonised set of metadata directories and controlled access to the large collections of datasets, managed by the interconnected data centres. The population of directories has increased considerably in cooperation with and involvement in many associated EU projects and initiatives such as EMODnet. SeaDataNet at present gives overview and access to more than 1.9 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres from 34 countries riparian to European seas. SeaDataNet is also active in setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139), and OGC (WMS, WFS, CS-W and SWE). Standards and associated SeaDataNet tools are made available at the SeaDataNet portal for wide uptake by data handling and managing organisations. SeaDataCloud aims at further developing standards, innovating services & products, adopting new technologies, and giving more attention to users. Moreover, it is about implementing a cooperation between the SeaDataNet consortium of marine data centres and the EUDAT consortium of e-infrastructure service providers. SeaDataCloud aims at considerably advancing services and increasing their usage by adopting cloud and High Performance Computing technology. SeaDataCloud will empower researchers with a packaged collection of services and tools, tailored to their specific needs, supporting research and enabling generation of added-value products from marine and ocean data. Substantial activities will be focused on developing added-value services, such as data subsetting, analysis, visualisation, and publishing workflows for users, both regular and advanced users, as part of a Virtual Research Environment (VRE). SeaDataCloud aims at a number of leading user communities that have new challenges for upgrading and expanding the SeaDataNet standards and services: Science, EMODnet, Copernicus Marine Environmental Monitoring Service (CMEMS) and EuroGOOS, and International scientific programmes. The presentation will give information on present services of the SeaDataNet infrastructure and services, and the new challenges in SeaDataCloud, and will highlight a number of key achievements in SeaDataCloud so far.

  15. EPA Office of Research and Development Green Infrastructure Research

    EPA Science Inventory

    This presentation provides an overview introduction to the USEPA Office of Research and Development (ORD)'s ongoing green infrastructure (GI) research efforts for stormwater management. GI approaches that increase infiltration, evapotranspiration, and rainwater harvesting offer ...

  16. Infrastructure Vulnerability Assessment Model (I-VAM).

    PubMed

    Ezell, Barry Charles

    2007-06-01

    Quantifying vulnerability to critical infrastructure has not been adequately addressed in the literature. Thus, the purpose of this article is to present a model that quantifies vulnerability. Vulnerability is defined as a measure of system susceptibility to threat scenarios. This article asserts that vulnerability is a condition of the system and it can be quantified using the Infrastructure Vulnerability Assessment Model (I-VAM). The model is presented and then applied to a medium-sized clean water system. The model requires subject matter experts (SMEs) to establish value functions and weights, and to assess protection measures of the system. Simulation is used to account for uncertainty in measurement, aggregate expert assessment, and to yield a vulnerability (Omega) density function. Results demonstrate that I-VAM is useful to decisionmakers who prefer quantification to qualitative treatment of vulnerability. I-VAM can be used to quantify vulnerability to other infrastructures, supervisory control and data acquisition systems (SCADA), and distributed control systems (DCS).
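
    The simulation step described above can be sketched as a Monte Carlo aggregation: sample uncertain SME protection-measure scores, combine them with swing weights, and take vulnerability as the complement of protection to obtain an Omega density. The measure names, triangular ranges, and weights below are hypothetical, not values from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000

    # Hypothetical SME assessments (0-100 value scale) of three protection measures
    # for a small water system, with triangular uncertainty around each assessment.
    scores = np.vstack([
        rng.triangular(40, 55, 70, n),    # physical security
        rng.triangular(30, 45, 65, n),    # SCADA hardening
        rng.triangular(50, 60, 80, n),    # redundancy
    ])
    weights = np.array([0.40, 0.35, 0.25])      # assumed swing weights, summing to 1

    protection = weights @ scores               # weighted protection score per draw
    omega = 100.0 - protection                  # vulnerability as the complement

    print("mean vulnerability:", round(omega.mean(), 1))
    print("90% interval:", np.percentile(omega, [5, 95]).round(1))
    ```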

  17. RecceMan: an interactive recognition assistance for image-based reconnaissance: synergistic effects of human perception and computational methods for object recognition, identification, and infrastructure analysis

    NASA Astrophysics Data System (ADS)

    El Bekri, Nadia; Angele, Susanne; Ruckhäberle, Martin; Peinsipp-Byma, Elisabeth; Haelke, Bruno

    2015-10-01

    This paper introduces an interactive recognition assistance system for imaging reconnaissance. This system supports aerial image analysts on missions during two main tasks: object recognition and infrastructure analysis. Object recognition concentrates on the classification of one single object. Infrastructure analysis deals with the description of the components of an infrastructure and the recognition of the infrastructure type (e.g. military airfield). Based on satellite or aerial images, aerial image analysts are able to extract single object features and thereby recognize different object types. This is one of the most challenging tasks in imaging reconnaissance. Currently, there are no high-potential ATR (automatic target recognition) applications available; as a consequence, the human observer cannot be replaced entirely. State-of-the-art ATR applications cannot match human perception and interpretation in equal measure. Why is this still such a critical issue? First, cluttered and noisy images make it difficult to automatically extract, classify and identify object types. Second, due to changed warfare and the rise of asymmetric threats, it is nearly impossible to create an underlying data set containing all features, objects or infrastructure types. Many other factors, such as environmental parameters or aspect angles, further complicate the application of ATR. Due to the lack of suitable ATR procedures, the human factor is still important and so far irreplaceable. In order to use the potential benefits of human perception and computational methods in a synergistic way, both are unified in an interactive assistance system. RecceMan® (Reconnaissance Manual) offers two different modes for aerial image analysts on missions: the object recognition mode and the infrastructure analysis mode. The aim of the object recognition mode is to recognize a certain object type based on the object features that originated from the image signatures. The infrastructure analysis mode pursues the goal of analyzing the function of the infrastructure. The image analyst visually extracts certain target object signatures, assigns them to corresponding object features and is finally able to recognize the object type. The system offers the analyst the possibility to assign the image signatures to features given by sample images. The underlying data set contains a wide range of object features and object types for different domains like ships or land vehicles. Each domain has its own feature tree developed by aerial image analyst experts. By selecting the corresponding features, the possible solution set of objects is automatically reduced and matches only the objects that contain the selected features. Moreover, we give an outlook on current research in the field of ground target analysis, in which we deal with partly automated methods to extract image signatures and assign them to the corresponding features. This research includes methods for automatically determining the orientation of an object and geometric features like the width and length of the object. This step makes it possible to automatically reduce the possible object types offered to the image analyst by the interactive recognition assistance system.
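
    The interactive narrowing of candidate object types by selected features amounts to set containment over a feature catalogue, as in the small sketch below; the object types and feature names are invented placeholders, not the RecceMan data set.

    ```python
    # A tiny, invented stand-in for a feature catalogue: object types and the
    # image-derived features that characterize them.
    catalogue = {
        "frigate":        {"hull", "superstructure_aft", "gun_forward"},
        "container_ship": {"hull", "superstructure_aft", "deck_containers"},
        "tanker":         {"hull", "superstructure_aft", "deck_pipework"},
    }

    def candidates(selected_features, catalogue):
        """Keep only object types whose feature set contains every selected feature."""
        return [name for name, feats in catalogue.items() if selected_features <= feats]

    print(candidates({"hull", "superstructure_aft"}, catalogue))   # all three remain
    print(candidates({"hull", "deck_containers"}, catalogue))      # only the container ship
    ```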

  18. Analysis of the cadastral data published in the Polish Spatial Data Infrastructure

    NASA Astrophysics Data System (ADS)

    Izdebski, Waldemar

    2017-12-01

    The cadastral data, including land parcels, are the basic reference data for presenting various objects collected in spatial databases. Easy access to up-to-date records is a very important matter for the individuals and institutions using spatial data infrastructure. The primary objective of the study was to check the current accessibility of cadastral data as well as to verify how current and complete they are. The author started researching this topic in 2007, i.e. from the moment the Team for National Spatial Data Infrastructure developed documentation concerning the standard of publishing cadastral data with the use of the WMS. For ten years, the author has been monitoring the status of cadastral data publishing in various districts and has participated in data publishing in many of them. In 2017, when only half of the districts published WMS services from cadastral data, the questions arise: why is this so, and how can this unfavourable situation be changed? As a result of the tests performed, it was found that the status of publishing cadastral data is still far from perfect. The quality of the offered web services varies and, unfortunately, many services offer poor performance; moreover, there are plenty of services that do not operate at all.
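
    Monitoring whether district WMS endpoints respond, and how quickly, can be scripted along the lines below. The endpoint URLs are hypothetical placeholders, and the capability check is a simplification of the kind of availability and performance testing described above.

    ```python
    import requests

    # Hypothetical district WMS endpoints; real addresses differ per district.
    endpoints = {
        "district_A": "https://wms.example-district-a.pl/wms",
        "district_B": "https://wms.example-district-b.pl/wms",
    }
    params = {"SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.3.0"}

    for name, url in endpoints.items():
        try:
            r = requests.get(url, params=params, timeout=10)
            ok = r.status_code == 200 and "WMS_Capabilities" in r.text
            print(name, "OK" if ok else f"HTTP {r.status_code}",
                  f"{r.elapsed.total_seconds():.2f}s")
        except requests.RequestException as exc:
            print(name, "unreachable:", exc)
    ```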

  19. Mesh infrastructure for coupled multiprocess geophysical simulations

    DOE PAGES

    Garimella, Rao V.; Perkins, William A.; Buksas, Mike W.; ...

    2014-01-01

    We have developed a sophisticated mesh infrastructure capability to support large scale multiphysics simulations such as subsurface flow and reactive contaminant transport at storage sites as well as the analysis of the effects of a warming climate on the terrestrial arctic. These simulations involve a wide range of coupled processes including overland flow, subsurface flow, freezing and thawing of ice rich soil, accumulation, redistribution and melting of snow, biogeochemical processes involving plant matter and finally, microtopography evolution due to melting and degradation of ice wedges below the surface. In addition to supporting the usual topological and geometric queries about the mesh, the mesh infrastructure adds capabilities such as identifying columnar structures in the mesh, enabling deforming of the mesh subject to constraints and enabling the simultaneous use of meshes of different dimensionality for subsurface and surface processes. The generic mesh interface is capable of using three different open source mesh frameworks (MSTK, MOAB and STKmesh) under the hood allowing the developers to directly compare them and choose one that is best suited for the application's needs. We demonstrate the results of some simulations using these capabilities as well as present a comparison of the performance of the different mesh frameworks.

  20. EVALUATING MACROINVERTEBRATE COMMUNITY ...

    EPA Pesticide Factsheets

    Since 2010, new construction in California is required to include stormwater detention and infiltration that is designed to capture rainfall from the 85th percentile of storm events in the region, preferably through green infrastructure. This study used recent macroinvertebrate community monitoring data to determine the ecological threshold for percent impervious cover prior to large scale adoption of green infrastructure using Threshold Indicator Taxa Analysis (TITAN). TITAN uses an environmental gradient and biological community data to determine individual taxa change points with respect to changes in taxa abundance and frequency across that gradient. Individual taxa change points are then aggregated to calculate the ecological threshold. This study used impervious cover data from National Land Cover Datasets and macroinvertebrate community data from California Environmental Data Exchange Network and Southern California Coastal Water Research Project. Preliminary TITAN runs for California’s Chaparral region indicated that both increasing and decreasing taxa had ecological thresholds of <1% watershed impervious cover. Next, TITAN will be used to determine shifts in the ecological threshold after the implementation of green infrastructure on a large scale. This presentation for the Society for Freshwater Scientists will discuss initial evaluation of community and taxa-specific thresholds of impairment for macroinvertebrates in California streams along
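
    As a rough illustration of the change-point idea behind such an analysis (not the actual TITAN algorithm, which uses indicator-value scores, permutation tests, and bootstrapping), the sketch below finds, for one synthetic taxon, the impervious-cover value that best separates high from low abundance; the data and the roughly 1% break are simulated.

    ```python
    import numpy as np

    def taxon_change_point(impervious, abundance):
        """Impervious-cover value where splitting sites into low/high cover maximizes
        the difference in mean abundance -- a greatly simplified stand-in for a
        TITAN-style taxon change point (no permutation tests, no bootstrapping)."""
        order = np.argsort(impervious)
        x, y = impervious[order], abundance[order]
        best_gap, best_cp = -np.inf, None
        for i in range(1, len(x)):
            gap = abs(y[:i].mean() - y[i:].mean())
            if gap > best_gap:
                best_gap, best_cp = gap, x[i]
        return best_cp

    # Simulated sites: a sensitive taxon that declines above ~1% impervious cover.
    rng = np.random.default_rng(0)
    impervious = rng.uniform(0.0, 5.0, 200)
    abundance = np.where(impervious < 1.0, 8, 2) + rng.poisson(1, 200)
    print(taxon_change_point(impervious, abundance))
    ```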

  1. A case study analysis to examine motorcycle crashes in Bogota, Colombia.

    PubMed

    Jimenez, Adriana; Bocarejo, Juan Pablo; Zarama, Roberto; Yerpez, Joël

    2015-02-01

    Contributory factors to motorcycle crashes vary among populations depending on several aspects such as the users' profiles, the composition and density of traffic, and the infrastructure features. A better understanding of local motorcycle crashes can be reached in those places where a comprehensive analysis is performed. This paper presents the results obtained from a case study analysis of 400 police records of accidents involving motorcycles in Bogota. To achieve a deeper level of understanding of how these accidents occur, we propose a systemic approach that uses available crash data. The methodology is inspired by accident prototypical scenarios, a tool for analysis developed in France. When grouping cases we identified three categories: solo motorcycle accidents, motorcyclist and pedestrian accidents, and accidents involving a motorcycle and another vehicle. Within these categories we undertook in-depth analyses of 32 groups of accidents, obtaining valuable information to better comprehend motorcyclists' road crashes in a local context. Recurrent contributory factors in the groups of accidents include: inexperienced motorcyclists, wide urban roads that incite speeding and risky overtaking maneuvers, flowing urban roads that encourage high speed and increased interaction between vehicles, and lack of infrastructure maintenance. The results obtained are a valuable asset for defining measures that can be conveniently adapted to the group of accidents on which we want to act. The methodology presented in this paper is applicable to the study of road crashes that involve all types of actors, not only motorcyclists, and in contexts different from those presented in Bogota. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.

  2. Assessing Terrorist Motivations for Attacking Critical Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackerman, G; Abhayaratne, P; Bale, J

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at ''operationalizing'' its findings for use by analysts who work on issues of critical infrastructure protection. Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist organization will attack critical infrastructure. In other words, this research investigates: (1) why terrorists choose to attack critical infrastructure rather than other targets; (2) how groups make such decisions; (3) what, if any, types of groups are most inclined to attack critical infrastructure targets; and (4) which types of critical infrastructure terrorists prefer to attack and why. In an effort to address the above questions as comprehensively as possible, the project team employed four discrete investigative approaches in its research design. These include: (1) a review of existing terrorism and threat assessment literature to glean expert consensus regarding terrorist target selection, as well as to identify theoretical approaches that might be valuable to analysts and decision-makers who are seeking to understand such terrorist group decision-making processes; (2) the preparation of several concise case studies to help identify internal group factors and contextual influences that have played significant roles in leading some terrorist groups to attack critical infrastructure; (3) the creation of a new database--the Critical Infrastructure Terrorist Incident Catalog (CrITIC)--to capture a large sample of empirical CI attack data that might be used to illuminate the nature of such attacks to date; and (4) the development of a new analytical framework--the Determinants Effecting Critical Infrastructure Decisions (DECIDe) Framework--designed to make the factors and dynamics identified by the study more ''usable'' in any future efforts to assess terrorist intentions to target critical infrastructure. Although each is addressed separately in the following chapters, none of the four aspects of this study were developed in isolation. Rather, all the constituent elements of the project informed--and were informed by--the others. For example, the review of the available literature on terrorist target selection made possible the identification of several target selection factors that were both important in the development of the analytical framework and subsequently validated by the case studies. Similarly, statistical analysis of the CrITIC data yielded measurable evidence that supported hypotheses derived from the framework, the case studies, and the writings of various experts. Besides providing an important mechanism of self-reinforcement and validation, the project's multifaceted nature made it possible to discern aspects of CI attack motivations that would likely have been missed if any single approach had been adopted.

  3. Transmission Infrastructure | Energy Analysis | NREL

    Science.gov Websites

    aggregating geothermal with other complementary generating technologies, in renewable energy zones infrastructure planning and expansion to enable large-scale deployment of renewable energy in the future. Large Energy, FERC, NERC, and the regional entities, transmission providers, generating companies, utilities

  4. A game theory analysis of green infrastructure stormwater management policies

    NASA Astrophysics Data System (ADS)

    William, Reshmina; Garg, Jugal; Stillwell, Ashlynn S.

    2017-09-01

    Green stormwater infrastructure has been demonstrated as an innovative water resources management approach that addresses multiple challenges facing urban environments. However, there is little consensus on what policy strategies can be used to best incentivize green infrastructure adoption by private landowners. Game theory, an analysis framework that has historically been under-utilized within the context of stormwater management, is uniquely suited to address this policy question. We used a cooperative game theory framework to investigate the potential impacts of different policy strategies used to incentivize green infrastructure installation. The results indicate that municipal regulation leads to the greatest reduction in pollutant loading. However, the choice of the "best" regulatory approach will depend on a variety of different factors including politics and financial considerations. Large, downstream agents have a disproportionate share of bargaining power. Results also reveal that policy impacts are highly dependent on agents' spatial position within the stormwater network, leading to important questions of social equity and environmental justice.
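
    Cooperative-game reasoning about how much each landowner contributes can be illustrated with a Shapley-value calculation over a toy characteristic function; the coalition payoffs (pollutant-load reductions) below are invented numbers, not results from the study.

    ```python
    from itertools import permutations

    # Invented coalition payoffs: pollutant-load reduction (kg/yr) when subsets of
    # three landowners A, B, C install green infrastructure together.
    v = {
        frozenset(): 0.0,
        frozenset("A"): 10.0, frozenset("B"): 14.0, frozenset("C"): 6.0,
        frozenset("AB"): 30.0, frozenset("AC"): 20.0, frozenset("BC"): 24.0,
        frozenset("ABC"): 44.0,
    }
    players = ["A", "B", "C"]

    def shapley(players, v):
        """Average each player's marginal contribution over all join orders."""
        phi = {p: 0.0 for p in players}
        orders = list(permutations(players))
        for order in orders:
            coalition = frozenset()
            for p in order:
                phi[p] += v[coalition | {p}] - v[coalition]
                coalition = coalition | {p}
        return {p: round(phi[p] / len(orders), 2) for p in players}

    print(shapley(players, v))   # shares of the 44 kg/yr grand-coalition reduction
    ```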

  5. Planetary plasma data analysis and 3D visualisation tools of the CDPP in the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Gangloff, Michel; Génot, Vincent; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Scherf, Manuel; André, Nicolas; Bourrel, Nataliya; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent

    2015-04-01

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP developed services like AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of a large amount of data through dedicated functionalities such as visualization, conditional search, and cataloguing, and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations in planetary environments and is being further developed to include simulation and observational data. Both tools provide an interface to the IMPEx infrastructure (http://impexfp7.oeaw.ac.at), which facilitates joint access to outputs of simulations (MHD or Hybrid models) in planetary sciences from providers like LATMOS and FMI as well as planetary plasma observational data provided by the CDPP. Several magnetospheric models are implemented in 3DView (e.g. Tsyganenko for the Earth, and Cain for Mars). Magnetospheric models provided by SINP for the Earth, Jupiter, Saturn and Mercury as well as Hess models for Jupiter can also be used in 3DView, through the IMPEx infrastructure. A use case demonstrating the new capabilities offered by these tools and their interaction, including magnetospheric models, will be presented together with the IMPEx simulation metadata model used for the interface to simulation databases and model providers.

  6. NFFA-Europe: enhancing European competitiveness in nanoscience research and innovation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Carsughi, Flavio; Fonseca, Luis

    2017-06-01

    NFFA-EUROPE is a European open access resource for experimental and theoretical nanoscience and sets out a platform to carry out comprehensive projects for multidisciplinary research at the nanoscale, extending from synthesis to nanocharacterization to theory and numerical simulation. Advanced infrastructures specialized in growth, nano-lithography, nano-characterization, theory and simulation, and fine-analysis with Synchrotron, FEL and Neutron radiation sources are integrated in a multi-site combination to develop frontier research on methods for reproducible nanoscience research and to enable European and international researchers from diverse disciplines to carry out advanced proposals impacting science and innovation. NFFA-EUROPE will enable coordinated access to infrastructures covering different aspects of nanoscience research that is not currently available at single specialized facilities, without duplicating their specific scopes. Approved user projects will have access to the best suited instruments and support competences for performing the research, including access to analytical large scale facilities, theory and simulation, and high-performance computing facilities. Access is offered free of charge to European users, who will receive a financial contribution for their travel, accommodation and subsistence costs. The user access will include several "installations" and will be coordinated through a single entry point portal that will activate an advanced user-infrastructure dialogue to build up a personalized access programme with an increasing return on science and innovation production. NFFA-EUROPE's own research activity will address key bottlenecks of nanoscience research: nanostructure traceability, protocol reproducibility, in-operando nano-manipulation and analysis, and open data.

  7. Infrastructure and social tie: Spatial model approach on understanding poverty in Malang regency, Indonesia

    NASA Astrophysics Data System (ADS)

    Ari, I. R. D.; Hasyim, A. W.; Pratama, B. A.; Helmy, M.; Sheilla, M. N.

    2017-06-01

    Poverty is a problem that requires attention from the government, especially in developing countries such as Indonesia. This research takes place in Kasembon District because 53.19% of families in the region are below the poverty line. The purpose of this research is to measure poverty based on three poverty indicators published by the World Bank and one multidimensional poverty index. Furthermore, this research investigates the relationship between poverty and social ties and infrastructure in Kasembon District. This study uses social network analysis, hot spot analysis, and regression analysis with ordinary least squares. The poverty indicators show that Pondokagung Village has the highest poverty rate in the region. Results from the regression model indicate that social and infrastructure factors affect poverty in Kasembon District. The social parameter that affects poverty is density. The infrastructure parameter that affects poverty is the length of paved road. The coefficient of density is the largest in the model. Therefore, it can be concluded that social factors offer the greatest opportunity to reduce poverty rates in Kasembon District. In the local model, the paved-road coefficient for each village does not differ much from the global model.
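
    The social-tie density used as a predictor can be computed directly from a village's household-tie graph, for example with networkx as sketched below; the village names and edge lists are hypothetical examples, not the survey data.

    ```python
    import networkx as nx

    # Hypothetical household-tie networks for two villages (edges = social ties).
    villages = {
        "Pondokagung": [(1, 2), (2, 3)],
        "Kasembon":    [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)],
    }

    for name, edges in villages.items():
        g = nx.Graph()
        g.add_edges_from(edges)
        # Density = existing ties / possible ties; a candidate "social" regressor.
        print(name, "tie density =", round(nx.density(g), 2))
    ```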

  8. CloudMan as a platform for tool, data, and analysis distribution

    PubMed Central

    2012-01-01

    Background Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507

  9. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    NASA Astrophysics Data System (ADS)

    Read, A.; Taga, A.; O-Saada, F.; Pajchel, K.; Samset, B. H.; Cameron, D.

    2008-07-01

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation.

  10. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of the existing and future infrastructures. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climatic model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
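
    The stationary building block of such an analysis, fitting a GEV distribution to annual maxima and reading off a return level, can be sketched with SciPy as below. The synthetic "historical" series and the 20% intensification factor are placeholders for CMIP5-driven inputs, and the study itself uses a Bayesian, non-stationary treatment rather than this simple point fit.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(42)

    # Synthetic "historical" annual-maximum 24-h rainfall depths (mm) for one city,
    # and a crude 20% intensification as a stand-in for a CMIP5-driven projection.
    hist = genextreme.rvs(-0.1, loc=60.0, scale=15.0, size=50, random_state=rng)
    proj = hist * 1.2

    def return_level(sample, T=100):
        """Fit a stationary GEV to annual maxima and return the T-year return level."""
        shape, loc, scale = genextreme.fit(sample)
        return genextreme.isf(1.0 / T, shape, loc, scale)

    print("100-yr depth, historical:", round(return_level(hist), 1), "mm")
    print("100-yr depth, projected: ", round(return_level(proj), 1), "mm")
    ```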

  11. An ICT Adoption Framework for Education: A Case Study in Public Secondary School of Indonesia

    NASA Astrophysics Data System (ADS)

    Nurjanah, S.; Santoso, H. B.; Hasibuan, Z. A.

    2017-01-01

    This paper presents preliminary research findings on an ICT adoption framework for education. Although many studies have been conducted on ICT adoption frameworks in education in various countries, they lack analysis of the degree to which each component contributes to the success of the framework. In this paper, a set of components linked to ICT adoption in education is observed based on the literature and explorative analysis. The components are Infrastructure, Application, User Skills, Utilization, Finance, and Policy. The components are used as a basis to develop a questionnaire to capture the current ICT adoption condition in schools. The data from the questionnaire are processed using a Structural Equation Model (SEM). The results show that each component contributes differently to the ICT adoption framework. Finance has the strongest effect on Infrastructure readiness, whilst User Skills has the strongest effect on Utilization. The study concludes that the development of an ICT adoption framework should consider the contribution weights among components, which can be used to guide the implementation of ICT adoption in education.

  12. Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate

    NASA Astrophysics Data System (ADS)

    Samaras, C.; Cook, L.

    2015-12-01

    Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable for a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: Involve all stakeholders (owners, financers, insurance, regulators, affected public, climate/weather scientists, etc.) in key decisions; Use low regret, adaptive strategies, such as robust decision making and the observational method, comply with relevant standards and regulations, and exceed their requirements where appropriate; Publish design studies and performance/failure investigations to extend the body of knowledge for advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decisionmaking under uncertainty for climate resilient infrastructure design.

  13. Analysis on Transportation Infrastructure Availability to Achieve Environmental and Social Sustainability in Karawang

    NASA Astrophysics Data System (ADS)

    Rarasati, A. D.; Octoria, N. B.

    2018-03-01

    Sustainable infrastructure is the key to development success. At the same time, transportation infrastructure development will involve social and environmental conditions of the local surroundings. Assessment of the availability of such transport infrastructure is one of the solutions adapted from social and environmental impacts. By conducting a correlation test, the presence of transportation infrastructure and the social conditions of the environment can be identified. The results obtained show that the accessibility, the level of security, and the level of equality are correlated to social and environmental sustainability in Karawang. In terms of environment, the availability of transportation infrastructure is not directly related to the impact of environmental sustainability. The impact of the perceived environment also has no effect on the journey. Correlation results indicate that the length of travel time and congestion level do not make the perceived impact greater. The impact of the perceived environment is merely due to the high utilization of private vehicles in Karawang which subsequently leads to higher energy consumption.

  14. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saffer, Shelley

    2014-12-01

    This is the final report of DOE award DE-SC0001132, Advanced Artificial Science: the development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievement of the project goals and the resulting research made possible by this award.

  15. Where to go? Strategic modelling of access to emergency shelters in Mozambique.

    PubMed

    Gall, Melanie

    2004-03-01

    This paper, through spatial-analysis techniques, examines the accessibility of emergency shelters for vulnerable populations, and outlines the benefits of an extended and permanently established shelter network in central Mozambique. The raster-based modelling approach considers data on land cover, locations of accommodation centres in 2000, settlements and infrastructure. The shelter analysis is a two-step process determining access for vulnerable communities first, followed by a suitability analysis for additional emergency shelter sites. The results indicate the need for both retrofitting existing infrastructure (schools, health posts) to function as shelters during an emergency, and constructing new facilities - at best multi-purpose facilities that can serve as social infrastructure and shelter. Besides assessing the current situation in terms of availability and accessibility of emergency shelters, this paper provides an example of evaluating the effectiveness of humanitarian assistance without conventional mechanisms like food tonnage and number of beneficiaries.
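
    The first step of the two-step raster analysis described above could, for instance, build a cost-distance accessibility surface like the hedged sketch below; the friction raster, shelter locations and threshold are synthetic assumptions, not the paper's data:

    ```python
    # Cost-distance accessibility from existing shelter cells over a synthetic
    # friction raster; cells with high cumulative cost are underserved and feed
    # into a subsequent suitability analysis for new shelter sites.
    import numpy as np
    from skimage.graph import MCP_Geometric

    friction = np.ones((200, 200))       # travel cost per cell (synthetic)
    friction[80:120, :] = 5.0            # e.g. flooded or rough terrain band

    shelters = [(10, 10), (150, 170)]    # row/col of existing shelter cells (hypothetical)

    mcp = MCP_Geometric(friction)
    cum_cost, _ = mcp.find_costs(starts=shelters)

    underserved = cum_cost > 100.0       # illustrative accessibility threshold
    print("underserved cells:", int(underserved.sum()))
    ```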

  16. Adapting to climate change : the public policy response - public infrastructure

    DOT National Transportation Integrated Search

    2009-06-01

    This paper assesses the threats and needs that multidimensional climate change imposes for public infrastructure, reviews the existing adaptive capacity that could be applied to respond to these threats and needs, and presents options for enhanci...

  17. Integrating sea floor observatory data: the EMSO data infrastructure

    NASA Astrophysics Data System (ADS)

    Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph

    2013-04-01

    The European research infrastructure EMSO is a European network of fixed-point, deep-seafloor and water-column observatories deployed at key sites of the European continental margin and the Arctic. It aims to provide the technological and scientific framework for investigating environmental processes related to the interaction between the geosphere, biosphere, and hydrosphere, and for sustainable management through long-term monitoring, including real-time data transmission. EMSO has been on the ESFRI (European Strategy Forum on Research Infrastructures) roadmap since 2006 and entered its construction phase in 2012. Within this framework, EMSO is contributing to large infrastructure integration projects such as ENVRI and COOPEUS. The EMSO infrastructure is geographically distributed across key sites in European waters, spanning from the Arctic through the Atlantic and the Mediterranean Sea to the Black Sea. It presently consists of thirteen sites identified by the scientific community according to their importance with respect to marine ecosystems, climate change and marine geohazards. The data infrastructure for EMSO is being designed as a distributed system. Presently, EMSO data collected during experiments at each EMSO site are locally stored and organized in catalogues or relational databases run by the responsible regional EMSO nodes. Three major institutions and their data centers currently offer access to EMSO data: PANGAEA, INGV and IFREMER. Continuing the IT activities performed during EMSO's twin project ESONET, EMSO is now implementing the ESONET data architecture within an operational EMSO data infrastructure. EMSO aims to be compliant with relevant marine initiatives such as MyOceans, EUROSITES, EuroARGO, SEADATANET and EMODNET, as well as to meet the requirements of international and interdisciplinary projects such as COOPEUS, ENVRI, EUDAT and iCORDI. A major focus is therefore set on standardization and interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch and OAI-PMH, EMSO has chosen to implement core standards of the Open Geospatial Consortium (OGC), such as the Catalogue Service for the Web (CS-W), and the Sensor Web Enablement (SWE) standards Sensor Observation Service (SOS) and Observations and Measurements (O&M). Further, strong integration efforts are currently being undertaken to harmonize data formats (e.g. NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.
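
    As an illustration of how a client might pull observations from a Sensor Observation Service of the kind mentioned above, the hedged sketch below issues an SOS 2.0 GetObservation request over HTTP KVP; the endpoint URL, offering and observed property are hypothetical placeholders, not actual EMSO identifiers:

    ```python
    # Illustrative OGC SOS GetObservation request via HTTP key-value-pair encoding.
    import requests

    SOS_URL = "https://example.org/emso/sos"  # hypothetical endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "seafloor-observatory-1",          # hypothetical offering id
        "observedProperty": "sea_water_temperature",   # hypothetical property
        "responseFormat": "http://www.opengis.net/om/2.0",
    }

    resp = requests.get(SOS_URL, params=params, timeout=30)
    resp.raise_for_status()
    print(resp.text[:500])  # O&M XML document containing the observations
    ```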

  18. Mock Data Challenge for the MPD/NICA Experiment on the HybriLIT Cluster

    NASA Astrophysics Data System (ADS)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-02-01

    Simulating data processing before the first experimental data are received is an important issue in high-energy physics experiments. This article presents the current Event Data Model and the Mock Data Challenge for the MPD experiment at the NICA accelerator complex, which uses ongoing simulation studies to stress-test the distributed computing infrastructure and experiment software in the full production environment, from simulated data through to physics analysis.

  19. Assessment of municipal infrastructure development and its critical influencing factors in urban China: A FA and STIRPAT approach.

    PubMed

    Li, Yu; Zheng, Ji; Li, Fei; Jin, Xueting; Xu, Chen

    2017-01-01

    Municipal infrastructure is fundamental to the normal operation and development of a city and is significant for the stable progress of sustainable urbanization around the world, especially in developing countries. Based on municipal infrastructure data for prefecture-level cities in China, municipal infrastructure development is assessed comprehensively using a FA (factor analysis) model, and the stochastic model STIRPAT (stochastic impacts by regression on population, affluence and technology) is then used to investigate the key factors that influence the municipal infrastructure of cities in various stages of urbanization and economic development. This study indicates that municipal infrastructure development in urban China exhibits typical characteristics of regional differentiation, in line with the economic development pattern. Municipal infrastructure development in cities is primarily influenced by income, industrialization and investment. For China and similar developing countries under transformation, national public investment remains the primary driving force of the economy as well as the key influencing factor for municipal infrastructure. The contribution from urbanization, the relative consumption level and the tertiary industry is still scanty, which is a crucial issue for many developing countries under transformation. With economic growth and the requirements of transformation, the influence of conventional factors such as public investment and industrialization on municipal infrastructure development is expected to decline; meanwhile, other factors, such as a consumption- and tertiary-industry-driven model and an innovation society, can become key contributors to municipal infrastructure sustainability.
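
    The STIRPAT model is usually estimated in log-linear form, ln(I) = a + b ln(P) + c ln(A) + d ln(T) + e, where I is the impact (here a municipal infrastructure index) and P, A, T are population, affluence and technology proxies. A minimal sketch of such a regression on synthetic city data (not the study's data set) might look like:

    ```python
    # Log-log STIRPAT-style regression on synthetic prefecture-level data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 150  # synthetic "cities"
    population = rng.lognormal(mean=13, sigma=0.5, size=n)
    gdp_per_capita = rng.lognormal(mean=10, sigma=0.4, size=n)
    tech_proxy = rng.lognormal(mean=1, sigma=0.3, size=n)
    infrastructure_index = (population ** 0.4) * (gdp_per_capita ** 0.6) \
        * (tech_proxy ** 0.2) * rng.lognormal(sigma=0.1, size=n)

    df = pd.DataFrame({"I": infrastructure_index, "P": population,
                       "A": gdp_per_capita, "T": tech_proxy})

    y = np.log(df["I"])
    X = sm.add_constant(np.log(df[["P", "A", "T"]]))
    result = sm.OLS(y, X).fit()
    print(result.params)  # estimated elasticities for population, affluence, technology
    ```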

  20. Assessment of municipal infrastructure development and its critical influencing factors in urban China: A FA and STIRPAT approach

    PubMed Central

    Li, Yu; Zheng, Ji; Li, Fei; Jin, Xueting; Xu, Chen

    2017-01-01

    Municipal infrastructure is fundamental to the normal operation and development of a city and is significant for the stable progress of sustainable urbanization around the world, especially in developing countries. Based on municipal infrastructure data for prefecture-level cities in China, municipal infrastructure development is assessed comprehensively using a FA (factor analysis) model, and the stochastic model STIRPAT (stochastic impacts by regression on population, affluence and technology) is then used to investigate the key factors that influence the municipal infrastructure of cities in various stages of urbanization and economic development. This study indicates that municipal infrastructure development in urban China exhibits typical characteristics of regional differentiation, in line with the economic development pattern. Municipal infrastructure development in cities is primarily influenced by income, industrialization and investment. For China and similar developing countries under transformation, national public investment remains the primary driving force of the economy as well as the key influencing factor for municipal infrastructure. The contribution from urbanization, the relative consumption level and the tertiary industry is still scanty, which is a crucial issue for many developing countries under transformation. With economic growth and the requirements of transformation, the influence of conventional factors such as public investment and industrialization on municipal infrastructure development is expected to decline; meanwhile, other factors, such as a consumption- and tertiary-industry-driven model and an innovation society, can become key contributors to municipal infrastructure sustainability. PMID:28787031

  1. Rapid assessment of infrastructure of primary health care facilities - a relevant instrument for health care systems management.

    PubMed

    Scholz, Stefan; Ngoli, Baltazar; Flessa, Steffen

    2015-05-01

    Health care infrastructure constitutes a major component of the structural quality of a health system. Infrastructural deficiencies of health services are reported in literature and research. A number of instruments exist for the assessment of infrastructure. However, no easy-to-use instruments to assess health facility infrastructure in developing countries are available. Present tools are not applicable for a rapid assessment by health facility staff. Therefore, health information systems lack data on facility infrastructure. A rapid assessment tool for the infrastructure of primary health care facilities was developed by the authors and pilot-tested in Tanzania. The tool measures the quality of all infrastructural components comprehensively and with high standardization. Ratings use a 2-1-0 scheme which is frequently used in Tanzanian health care services. Infrastructural indicators and indices are obtained from the assessment and serve for reporting and tracing of interventions. The tool was pilot-tested in Tanga Region (Tanzania). The pilot test covered seven primary care facilities in the range between dispensary and district hospital. The assessment encompassed the facilities as entities as well as 42 facility buildings and 80 pieces of technical medical equipment. A full assessment of facility infrastructure was undertaken by health care professionals while the rapid assessment was performed by facility staff. Serious infrastructural deficiencies were revealed. The rapid assessment tool proved a reliable instrument of routine data collection by health facility staff. The authors recommend integrating the rapid assessment tool in the health information systems of developing countries. Health authorities in a decentralized health system are thus enabled to detect infrastructural deficiencies and trace the effects of interventions. The tool can lay the data foundation for district facility infrastructure management.
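
    A 2-1-0 rating scheme of this kind can be turned into a simple facility infrastructure index; the sketch below uses hypothetical rating items and an unweighted index, which may differ from the published instrument:

    ```python
    # Turning 2-1-0 component ratings into a facility infrastructure index.
    import pandas as pd

    ratings = pd.DataFrame({
        "facility": ["Dispensary A", "Health Centre B"],
        "water_supply": [2, 1],
        "power_supply": [1, 0],
        "building_state": [2, 2],
        "medical_equipment": [1, 1],
    })

    items = ["water_supply", "power_supply", "building_state", "medical_equipment"]
    # Index = achieved points / maximum achievable points (max 2 per item)
    ratings["infrastructure_index"] = ratings[items].sum(axis=1) / (2 * len(items))
    print(ratings[["facility", "infrastructure_index"]])
    ```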

  2. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    NASA Astrophysics Data System (ADS)

    Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.

    2012-12-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies, and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects of the NSC virtualized computing infrastructure and the experience gained while using it to run production data analysis jobs related to HEP experiments carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  3. An infrastructure for accurate characterization of single-event transients in digital circuits.

    PubMed

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-11-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure.
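
    The standard double-exponential current injection model referred to above has the form I(t) = (Q/(tau_f - tau_r)) (exp(-t/tau_f) - exp(-t/tau_r)); the sketch below evaluates it with illustrative parameter values, not the calibrated values from the micro-beam experiments:

    ```python
    # Double-exponential single-event transient (SET) current pulse model.
    import numpy as np

    def set_current(t, q=50e-15, tau_r=10e-12, tau_f=200e-12):
        """SET current (A) for collected charge q (C), rise/fall time constants (s)."""
        return (q / (tau_f - tau_r)) * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

    t = np.linspace(0, 2e-9, 1000)
    i = set_current(t)
    print(f"peak current: {i.max()*1e3:.2f} mA, "
          f"collected charge: {np.trapz(i, t)*1e15:.1f} fC")
    ```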

  4. An infrastructure for accurate characterization of single-event transients in digital circuits☆

    PubMed Central

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-01-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure. PMID:24748694

  5. On the Storm Surge and Sea Level Rise Projections for Infrastructure Risk Analysis and Adaptation

    EPA Science Inventory

    Storm surge can cause coastal hydrology changes, flooding, water quality changes, and even inundation of low-lying terrain. Strong wave actions and disruptive winds can damage water infrastructure and other environmental assets (hazardous and solid waste management facilities, w...

  6. 75 FR 30460 - Notice of Funding Availability for the Department of Transportation's National Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-01

    ... provide quantitative information regarding expected reductions in emissions of CO2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...

  7. The Diamond Model of Intrusion Analysis

    DTIC Science & Technology

    2013-07-05

    infrastructure pivot) which were then “sinkholed” to identify global victims (infrastructure-to-victim pivot). Each victim was then further identified... which would have matching social-political needs using cyber-victimology (§5.1.2) [43]. “Sinkholing” is an aggressive defender technique to take over

  8. Investing in Alternative Fuel Infrastructure: Insights for California from Stakeholder Interviews: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Muratori, Matteo; McLaren, Joyce

    Increased interest in the use of alternative transportation fuels, such as natural gas, hydrogen, and electricity, is being driven by heightened concern about the climate impacts of gasoline and diesel emissions and our dependence on finite oil resources. A key barrier to widespread adoption of low- and zero-emission passenger vehicles is the availability of refueling infrastructure. Recalling the 'chicken and egg' conundrum, limited adoption of alternative fuel vehicles increases the perceived risk of investments in refueling infrastructure, while lack of refueling infrastructure inhibits vehicle adoption. In this paper, we present the results of a study of the perceived risks and barriers to investment in alternative fuels infrastructure, based on interviews with industry experts and stakeholders. We cover barriers to infrastructure development for three alternative fuels for passenger vehicles: compressed natural gas, hydrogen, and electricity. As an early-mover in zero emission passenger vehicles, California provides the early market experience necessary to map the alternative fuel infrastructure business space. Results and insights identified in this study can be used to inform investment decisions, formulate incentive programs, and guide deployment plans for alternative fueling infrastructure in the U.S. and elsewhere.

  9. Online & Offline data storage and data processing at the European XFEL facility

    NASA Astrophysics Data System (ADS)

    Gasthuber, Martin; Dietrich, Stefan; Malka, Janusz; Kuhn, Manuela; Ensslin, Uwe; Wrona, Krzysztof; Szuba, Janusz

    2017-10-01

    For the upcoming experiments at the European XFEL light source facility, a new online and offline data processing and storage infrastructure is currently being built and verified. Based on the experience with the system developed for the Petra III light source at DESY, presented at the last CHEP conference, we further develop the system to cope with much higher volumes and rates (50 GB/sec) together with more complex data analysis and infrastructure conditions (i.e. long-range InfiniBand connections). This work is carried out in collaboration between DESY/IT and European XFEL, with technology support from IBM Research. This presentation will briefly summarize the experience from one year of running the Petra III ([3]) system, continue with a short description of the challenges for the European XFEL ([2]) experiments and, in the main section, show the proposed system for online and offline processing with initial results from the real implementation (HW & SW). This will cover the selected cluster filesystem GPFS ([5]), including Quality of Service (QOS), the extensive use of flash-based subsystems, and other new and unique features this architecture will benefit from.

  10. Critical Infrastructure Interdependencies Assessment

    DOE PAGES

    Petit, Frederic; Verner, Duane

    2016-11-01

    Throughout the world there is strong recognition that critical infrastructure security and resilience needs to be improved. In the United States, the National Infrastructure Protection Plan (NIPP) provides the strategic vision to guide the national effort to manage risk to the Nation’s critical infrastructure. The achievement of this vision is challenged by the complexity of critical infrastructure systems and their inherent interdependencies. The update to the NIPP presents an opportunity to advance the nation’s efforts to further understand and analyze interdependencies. Such an important undertaking requires the involvement of public and private sector stakeholders and the reinforcement of existing partnerships and collaborations within the U.S. Department of Homeland Security (DHS) and other Federal agencies, including national laboratories; State, local, tribal, and territorial governments; and nongovernmental organizations.

  11. The Contribution for Improving GNSS Data and Derived Products for Solid Earth Sciences Promoted by EPOS-IP

    NASA Astrophysics Data System (ADS)

    Fernandes, R. M. S.; Bos, M. S.; Bruyninx, C.; Crocker, P.; Dousa, J.; Walpersdorf, A.; Socquet, A.; Avallone, A.; Ganas, A.; Ionescu, C.; Kenyeres, A.; Ofeigsson, B.; Ozener, H.; Vergnolle, M.; Lidberg, M.; Liwosz, T.; Soehne, W.; Bezdeka, P.; Cardoso, R.; Cotte, N.; Couto, R.; D'Agostino, N.; Deprez, A.; Fabian, A.; Gonçalves, H.; Féres, L.; Legrand, J.; Menut, J. L.; Nastase, E.; Ngo, K. M.; Sigurðarson, F.; Vaclavovic, P.

    2017-12-01

    The GNSS working group of the EPOS-IP (European Plate Observing System - Implementation Phase) project oversees the implementation of services focused on GNSS data and derived products for the geosciences community. The objective is to serve primarily the Solid Earth community, but other scientific and technical communities will also be able to benefit from the efforts carried out to access the data (and derived products) of the European geodetic infrastructures. The geodetic component of EPOS deals essentially with implementing an e-infrastructure to store and disseminate continuous GNSS data (and derived solutions) from existing research infrastructures and new dedicated services. Present efforts focus on developing an integrated software package, called GLASS, that will permit the seamless dissemination of quality-controlled data (using dedicated tools) from dozens of geodetic research infrastructures in Europe. Conceptually, GLASS can be used by a single research infrastructure or by hundreds of cooperating ones. We present and discuss the implementation status of these services, including the generation of products: time series, velocity fields and strain-rate fields. Concretely, we will present the results of the current validation phase of these services and discuss in detail the technical and cooperative efforts being implemented. EPOS-IP is an ESFRI-roadmap project funded by the European Union.

  12. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    The paper presents the !CHAOS open-source project, aimed at developing a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large High Energy Physics (HEP) experiments. The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  13. Materials research at CMAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zucchiatti, Alessandro

    2013-07-18

    The Centro de Micro Analisis de Materiales (CMAM) is a research centre of the Universidad Autonoma de Madrid dedicated to the modification and analysis of materials using ion beam techniques. The infrastructure, based on an HVEE 5 MV tandem accelerator equipped with a coaxial Cockcroft-Walton charging system, is fully open to research groups of the UAM, to other public research institutions and to private enterprises. CMAM research covers a few important lines such as advanced materials, surface science, biomedical materials, cultural heritage, and materials for energy production. The Centre also supports university teaching and technical training. A detailed description of the research infrastructures and their usage statistics will be given. Some of the main research results will be presented to show the progress of research at the Centre over the past few years and to motivate the strategic plans for the coming years.

  14. Defining a successful commercial asteroid mining program

    NASA Astrophysics Data System (ADS)

    Andrews, Dana G.; Bonner, K. D.; Butterworth, A. W.; Calvert, H. R.; Dagang, B. R. H.; Dimond, K. J.; Eckenroth, L. G.; Erickson, J. M.; Gilbertson, B. A.; Gompertz, N. R.; Igbinosun, O. J.; Ip, T. J.; Khan, B. H.; Marquez, S. L.; Neilson, N. M.; Parker, C. O.; Ransom, E. H.; Reeve, B. W.; Robinson, T. L.; Rogers, M.; Schuh, P. M.; Tom, C. J.; Wall, S. E.; Watanabe, N.; Yoo, C. J.

    2015-03-01

    This paper summarizes a commercial Asteroid Mining Architecture synthesized by the Senior Space Design Class at the University of Washington in Winter/Spring Quarters of 2013. The main author was the instructor for that class. These results use design-to-cost development methods and focused infrastructure advancements to identify and characterize a workable space industrialization architecture including space transportation elements, asteroid exploration and mining equipment, and the earth orbit infrastructure needed to make it all work. Cost analysis predicts that for an initial investment in time and money equivalent to that for the US North Slope Oil Field, the yearly world supply of Platinum Group Metals could be increased by 50%, roughly 1500 t of LOX/LH2 propellant/year would be available in LEO, and very low cost solar panels could be assembled at GEO using asteroidal materials. The investment also would have a discounted net present value return on investment of 22% over twenty years.
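
    The quoted 22% discounted return can be illustrated with a simple discounted-cash-flow check; the investment and cash-flow figures below are placeholders chosen only to make the arithmetic visible, not the study's actual cost model:

    ```python
    # Rough NPV check: at the quoted discount rate, a consistent cash-flow
    # profile should discount to roughly zero net present value.
    import numpy as np

    discount_rate = 0.22                    # the study quotes a 22% discounted ROI
    years = np.arange(1, 21)                # twenty-year horizon
    investment = 40e9                       # hypothetical up-front investment ($)
    annual_net_revenue = np.full(20, 9e9)   # hypothetical net cash flow per year ($)

    npv = -investment + np.sum(annual_net_revenue / (1 + discount_rate) ** years)
    print(f"NPV at {discount_rate:.0%} discount rate: ${npv/1e9:.1f}B")
    ```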

  15. Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.

    PubMed

    Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael

    2018-01-01

    The availability of semantically enriched and interoperable clinical information models is crucial for reusing once-collected data across institutions, as aspired to in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. The objective is the design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. The methods comprise an analysis of successful practices from international projects, published ideas on archetype governance and our own modelling experience, as well as the modelling of BPMN processes. We designed a framework by presenting archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make the openEHR modelling efforts manageable. Because practical experience is rare, our work is well placed to prospectively evaluate the benefits of such structured governance approaches.

  16. Implementation status of the extreme light infrastructure - nuclear physics (ELI-NP) project

    NASA Astrophysics Data System (ADS)

    Gales, S.; Zamfir, N. V.

    2015-02-01

    The Project Extreme Light Infrastructure (ELI) is part of the European Strategy Forum on Research Infrastructures (ESFRI) Roadmap. ELI will be built as a network of three complementary pillars at the frontier of laser technologies. The ELI-NP pillar (NP for Nuclear Physics) is under construction near Bucharest (Romania) and will develop a scientific program using two 10 PW lasers and a Compton back-scattering high-brilliance and intense gamma beam, a marriage of laser and accelerator technology at the frontier of knowledge. In the present paper, a technical description of the facility and the current status of the project, as well as the science, applications and future perspectives, are discussed.

  17. Strategic behaviors and governance challenges in social-ecological systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, Rachata; Anderies, John M.

    2017-08-01

    The resource management and environmental policy literature focuses on devising regulations and incentive structures to achieve desirable goals. It often presumes the existence of public infrastructure that actualizes these incentives and regulations through a process loosely referred to as "governance." In many cases, it is not clear if and how such governance infrastructure can be created and supported. Here, we take a complex systems view in which "governance" is an emergent phenomenon generated by interactions between social, economic, and environmental (both built and natural) factors. We present a framework and formal stylized model to explore under what circumstances stable governance structures may emerge endogenously in coupled infrastructure systems comprising shared natural, social, and built infrastructures, of which social-ecological systems are specific examples. The model allows us to derive general conditions for a sustainable coupled infrastructure system in which critical infrastructure (e.g., canals) is provided by a governing entity that enables resource users (e.g., farmers) to produce outputs from natural infrastructure (e.g., water) to meet their needs while supporting the governing entity.

  18. Sea level rise impacts on wastewater treatment systems along the U.S. coasts

    NASA Astrophysics Data System (ADS)

    Hummel, M.; Berry, M.; Stacey, M. T.

    2017-12-01

    As sea levels rise, coastal communities will experience more frequent and persistent nuisance flooding, and some low-lying areas may be permanently inundated. Critical components of lifeline infrastructure networks in these areas are also at risk of flooding, which could cause significant service disruptions that extend beyond the flooded zone. Thus, identifying critical infrastructure components that are vulnerable to sea level rise is an important first step in developing targeted investment in protective actions and enhancing the overall resilience of coastal communities. Wastewater treatment plants are typically located at low elevations near the coastline to minimize the cost of collecting consumed water and discharging treated effluent, which makes them particularly susceptible to coastal flooding. For this analysis, we used geographic information systems to assess the vulnerability of wastewater infrastructure to various sea level rise projections at the national level. We then estimated the number of people who would lose wastewater services, which could be more than three times as high as previous predictions of the number of people at risk of direct flooding due to sea level rise. We also considered several case studies of wastewater infrastructure in mid-sized cities to determine how topography and system configuration (centralized versus distributed) impact vulnerability. Overall, this analysis highlights the widespread vulnerability of wastewater infrastructure in the U.S. and demonstrates that local disruptions to infrastructure networks may have far-ranging impacts on areas that do not experience direct flooding.

  19. Importance of biometrics to addressing vulnerabilities of the U.S. infrastructure

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.; Hall, Nathaniel A.

    2004-08-01

    Human identification technologies are important threat countermeasures in minimizing select infrastructure vulnerabilities. Properly targeted countermeasures should be selected and integrated into an overall security solution based on disciplined analysis and modeling. Available data on infrastructure value, threat intelligence, and system vulnerabilities are carefully organized, analyzed and modeled. Prior to design and deployment of an effective countermeasure, the proper role and appropriateness of technology in addressing the overall set of vulnerabilities is established. Deployment of biometrics systems, as with other countermeasures, introduces potentially heightened vulnerabilities into the system. Heightened vulnerabilities may arise from both the newly introduced system complexities and an unfocused understanding of the set of vulnerabilities impacted by the new countermeasure. The countermeasure's own inherent vulnerabilities and those introduced by the system's integration with the existing system are analyzed and modeled to determine the overall vulnerability impact. The United States infrastructure is composed of government and private assets. These assets are valued by their potential impact on several components: human physical safety, physical/information replacement/repair cost, potential contribution to future loss (criticality in weapons production), direct productivity output, national macro-economic output/productivity, and information integrity. These components must be considered in determining the overall impact of an infrastructure security breach. Cost/benefit analysis is then incorporated in the security technology deployment decision process. Overall security risks based on system vulnerabilities and threat intelligence determine areas of potential benefit. Biometric countermeasures are often considered when additional security at intended points of entry would minimize vulnerabilities.

  20. Analysis of Instrumentation to Monitor the Hydrologic Performance of Green Infrastructure at the Edison Environmental Center

    EPA Science Inventory

    Infiltration is one of the primary functional mechanisms of green infrastructure stormwater controls, so this study explored selection and placement of embedded soil moisture and water level sensors to monitor surface infiltration and infiltration into the underlying soil for per...

  1. Is the Infrastructure of EHDI Programs Working?

    ERIC Educational Resources Information Center

    Houston, K. Todd; Hoffman, Jeff; Munoz, Karen F.; Bradham, Tamala S.

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the EHDI program infrastructure area, 47 coordinators responded with a total of 292 items, and themes were identified in each…

  2. Geospatial analysis of bicycle network "level of traffic stress", bicycle mode choice behavior, and bicycle crashes for risk factor identification.

    DOT National Transportation Integrated Search

    2015-08-01

    Small and medium-sized cities need publicly acceptable criteria for bicycle infrastructure improvements. This report explores the effectiveness of one proposed system of bicycle infrastructure criteria using data from a state-of-the-art travel surv...

  3. Electronic Business Transaction Infrastructure Analysis Using Petri Nets and Simulation

    ERIC Educational Resources Information Center

    Feller, Andrew Lee

    2010-01-01

    Rapid growth in eBusiness has made industry and commerce increasingly dependent on the hardware and software infrastructure that enables high-volume transaction processing across the Internet. Large transaction volumes at major industrial-firm data centers rely on robust transaction protocols and adequately provisioned hardware capacity to ensure…

  4. A genome-wide association study platform built on iPlant cyber-infrastructure

    USDA-ARS?s Scientific Manuscript database

    We demonstrated a flexible Genome-Wide Association (GWA) Study (GWAS) platform built upon the iPlant Collaborative Cyber-infrastructure. The platform supports big data management, sharing, and large scale study of both genotype and phenotype data on clusters. End users can add their own analysis too...

  5. An Analysis of the Relationship between Professional Development, School Leadership, Technology Infrastructure, and Technology Use

    ERIC Educational Resources Information Center

    Mishnick, Nicole

    2017-01-01

    This dissertation study investigates the relationship between professional development, school leadership, technology infrastructure and technology use. Three research questions were developed. The first examines the relationship between professional development and technology use, the second examines the relationship between school leadership and…

  6. Analysis of Malicious Traffic in Modbus/TCP Communications

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tiago H.; Batista, Aguinaldo B.; Medeiros, João Paulo S.; Filho, José Macedo F.; Brito, Agostinho M.; Pires, Paulo S. Motta

    This paper presents the results of our analysis of the influence of Information Technology (IT) malicious traffic on an IP-based automation environment. We utilized a traffic generator, called MACE (Malicious trAffic Composition Environment), to inject malicious traffic into a Modbus/TCP communication system, and a sniffer to capture and analyze network traffic. The tests performed show that malicious traffic represents a serious risk to critical information infrastructures. We show that this kind of traffic can increase the latency of Modbus/TCP communication and that, in some cases, it can put Modbus/TCP devices out of communication.
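
    One way to measure the kind of Modbus/TCP latency effect described above is to time raw read-holding-registers requests; the hedged sketch below builds the MBAP frame by hand, with a hypothetical PLC address and register range:

    ```python
    # Timing Modbus/TCP "read holding registers" (function 0x03) requests.
    import socket, struct, time

    HOST, PORT, UNIT_ID = "192.168.1.10", 502, 1   # hypothetical PLC address

    def read_holding_registers(sock, tid, address=0, count=4):
        # MBAP header: transaction id, protocol id (0), length, unit id,
        # followed by the PDU: function 0x03, start address, register count.
        request = struct.pack(">HHHBBHH", tid, 0, 6, UNIT_ID, 0x03, address, count)
        t0 = time.perf_counter()
        sock.sendall(request)
        response = sock.recv(256)
        return time.perf_counter() - t0, response

    with socket.create_connection((HOST, PORT), timeout=2) as s:
        latencies = [read_holding_registers(s, tid)[0] for tid in range(100)]
        print(f"mean latency: {1000 * sum(latencies) / len(latencies):.2f} ms")
    ```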

  7. Internet infrastructures and health care systems: a qualitative comparative analysis on networks and markets in the British National Health Service and Kaiser Permanente.

    PubMed

    Séror, Ann C

    2002-12-01

    The Internet and emergent telecommunications infrastructures are transforming the future of health care management. The costs of health care delivery systems, products, and services continue to rise everywhere, but the performance of health care delivery is associated with institutional and ideological considerations as well as the availability of financial and technological resources. The objective was to identify the effects of ideological differences on health care market infrastructures, including the Internet and telecommunications technologies, through a comparative case analysis of two large health care organizations: the British National Health Service and the California-based Kaiser Permanente health maintenance organization. The method was a qualitative comparative analysis focusing on the British National Health Service and the Kaiser Permanente health maintenance organization to show how system infrastructures vary according to market dynamics dominated by health care institutions ("push") or by consumer demand ("pull"). System control mechanisms may be technologically embedded, institutional, or behavioral. The analysis suggests that telecommunications technologies and the Internet may contribute significantly to health care system performance in a context of ideological diversity. The study offers evidence to validate alternative models of health care governance: the national constitution model and the enterprise business contract model. This evidence also suggests important questions for health care policy makers as well as researchers in telecommunications, organizational theory, and health care management.

  8. Internet Infrastructures and Health Care Systems: a Qualitative Comparative Analysis on Networks and Markets in the British National Health Service and Kaiser Permanente

    PubMed Central

    2002-01-01

    Background The Internet and emergent telecommunications infrastructures are transforming the future of health care management. The costs of health care delivery systems, products, and services continue to rise everywhere, but performance of health care delivery is associated with institutional and ideological considerations as well as availability of financial and technological resources. Objective To identify the effects of ideological differences on health care market infrastructures including the Internet and telecommunications technologies by a comparative case analysis of two large health care organizations: the British National Health Service and the California-based Kaiser Permanente health maintenance organization. Methods A qualitative comparative analysis focusing on the British National Health Service and the Kaiser Permanente health maintenance organization to show how system infrastructures vary according to market dynamics dominated by health care institutions ("push") or by consumer demand ("pull"). System control mechanisms may be technologically embedded, institutional, or behavioral. Results The analysis suggests that telecommunications technologies and the Internet may contribute significantly to health care system performance in a context of ideological diversity. Conclusions The study offers evidence to validate alternative models of health care governance: the national constitution model, and the enterprise business contract model. This evidence also suggests important questions for health care policy makers as well as researchers in telecommunications, organizational theory, and health care management. PMID:12554552

  9. Accelerators for society: succession of European infrastructural projects: CARE, EuCARD, TIARA, EuCARD2

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2013-10-01

    Accelerator science and technology is one of the key enablers of developments in particle physics, photon physics and also of applications in medicine and industry. The paper presents a digest of research results in the domain of accelerator science and technology in Europe, obtained during the realization of CARE (Coordinated Accelerator R&D) and EuCARD (European Coordination of Accelerator R&D) and during the annual national review meeting of TIARA - Test Infrastructure of European Research Area in Accelerator R&D. The European projects on accelerator technology started in 2003 with CARE. TIARA is a European collaboration on accelerator technology which, by running research, technical, networking and infrastructural projects, has the duty to integrate the research and technical communities and infrastructures across Europe. The collaboration gathers all research centres with large accelerator infrastructures. Other institutions, such as universities, are affiliated as associate members. TIARA-PP (preparatory phase) is a European infrastructural project run by this consortium and realized within EU FP7. The paper presents a general overview of CARE, EuCARD and especially TIARA activities, with an introduction containing a portrait of contemporary accelerator technology and a digest of its applications in modern society. The CARE, EuCARD and TIARA activities have integrated the European accelerator community in a very effective way. These projects are very much expected to be continued.

  10. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundational infrastructure work on our Dyninst binary analysis and instrumentation toolkits and the MRNet scalability infrastructure.

  11. Addressing Data Access Needs of the Long-tail Distribution of Geoscientists

    NASA Astrophysics Data System (ADS)

    Malik, T.; Foster, I.

    2012-12-01

    Geoscientists must increasingly consider data from multiple disciplines and make intelligent connections between the data in order to advance research frontiers in mission-critical problems. As a first step towards making timely and relevant connections, scientists require data and resource access, made available through simple and efficient protocols and web services that allow them to conveniently transmit, acquire, process, and inspect data and metadata. The last decade witnessed some vital data and resource access barriers being crossed. "Big iron" data infrastructures provided geoscientists with large volumes of simulation and observational datasets, protocols made data access convenient, and strong governing bodies ensured standards for interoperability, repeatability and auditability. All this remarkable growth in access, however, addresses the needs of publishers of large data and ignores consumers of that data. To date, limited access mechanisms exist for the consumers, who fetch subsets, analyze them, and, more often than not, generate new data and analysis, which finally gets published in scientific articles. In this session, we will highlight the data access needs of the long-tail distribution of geoscientists and the state-of-the-art cyber-infrastructure approaches proposed to address those needs. The needs and the state of the art arose from discussions held with geoscientists as part of the EarthCube Data Access Workshop, which was coordinated by the authors. Our presentation will summarize the proceedings of the Data Access workshop. It will present qualifying characteristics of solutions that will continue to serve the needs of these scientists in the long term. Finally, we will present some cyber-infrastructure efforts in building such solutions and also provide a vision of the future CI in which such solutions can be useful.

  12. Modeling track access charge to enhance railway industry performance

    NASA Astrophysics Data System (ADS)

    Berawi, Mohammed Ali; Miraj, Perdana; Berawi, Abdur Rohim Boy; Susantono, Bambang; Leviakangas, Pekka; Radiansyah, Hendra

    2017-11-01

    Indonesia is attempting to improve the nation's competitiveness by increasing the quality and availability of its railway network. However, the infrastructure is not properly managed by the operator from a technical point of view. One of the reasons for this problem is an unbalanced infrastructure charge. In the 2000s, the track access charge and the infrastructure maintenance and operation cost for Indonesian railways were set equal, and despite the current formula for the infrastructure charge, issues of transparency and accountability remain in question. This research aims to produce an alternative track access charge scheme based on a marginal cost plus markup (MC+) approach. The research combines qualitative and quantitative methods through in-depth interviews and financial analysis. The result is an alternative formula for the infrastructure charge in Indonesia's railway industry. A simulation is also conducted to estimate the track access charge for the operator and to forecast government support in the form of a subsidy. The result is expected to enhance railway industry performance and competitiveness.
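
    A marginal cost plus markup (MC+) charge can be sketched with a few lines of arithmetic; the unit costs, markup rate and traffic volume below are illustrative assumptions, not the study's estimates:

    ```python
    # Simplified MC+ track access charge and implied subsidy; all figures illustrative.
    marginal_cost_per_train_km = 11_000      # IDR, hypothetical wear-and-tear cost
    markup_rate = 0.25                       # hypothetical markup toward fixed costs
    annual_train_km = 60_000_000             # hypothetical operator traffic volume
    maintenance_and_operation = 1.2e12       # IDR, hypothetical yearly IMO cost

    charge_per_train_km = marginal_cost_per_train_km * (1 + markup_rate)
    operator_payment = charge_per_train_km * annual_train_km
    subsidy_needed = max(0.0, maintenance_and_operation - operator_payment)

    print(f"track access charge: {charge_per_train_km:,.0f} IDR per train-km")
    print(f"implied government support: {subsidy_needed/1e12:.2f} trillion IDR")
    ```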

  13. Geospatial decision support framework for critical infrastructure interdependency assessment

    NASA Astrophysics Data System (ADS)

    Shih, Chung Yan

    Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems and emergency services are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruptions in the supply chain may cause profound cascading effect to other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and the availability of electric power could be reduced in a matter of one to two months. Therefore, this research aimed at representing and assessing the interdependencies between coal supply, transportation and energy production. A proposed geospatial decision support framework was established and applied to analyze interdependency related disruption impact. By utilizing the data warehousing approach, geospatial and non-geospatial data were retrieved, integrated and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state, etc.) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usages; finding alternative coal suppliers, etc.).
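
    The disruption idea described above, interrupting a rail link and tracing the cascading effect on coal-fired plants, can be sketched on a toy supply network; the nodes, edges and the removed link below are hypothetical:

    ```python
    # Toy coal-supply network: remove a rail link and find plants with no supply path.
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("mine_A", "rail_hub_1"), ("mine_B", "rail_hub_1"),
        ("rail_hub_1", "rail_hub_2"), ("rail_hub_2", "plant_X"),
        ("mine_B", "barge_port"), ("barge_port", "plant_Y"),
    ])

    def unsupplied_plants(graph, plants, mines):
        return [p for p in plants
                if not any(nx.has_path(graph, m, p) for m in mines)]

    disrupted = g.copy()
    disrupted.remove_edge("rail_hub_1", "rail_hub_2")   # simulated rail outage
    print(unsupplied_plants(disrupted, ["plant_X", "plant_Y"], ["mine_A", "mine_B"]))
    ```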

  14. Technology Assessment On Stressor Impacts To Green Infrastructure BMP Performance, Monitoring And Integration

    EPA Science Inventory

    This presentation will document, benchmark and evaluate state-of-the-science research and implementation on BMP performance, monitoring, and integration for green infrastructure applications, to manage wet weather flow, storm-water-runoff stressor relief and remedial sustainable w...

  15. US EPA/ORD Condition Assessment Research for Drinking Water Conveyance Infrastructure

    EPA Science Inventory

    This presentation describes research on condition assessment for drinking water transmission and distribution systems that EPA is conducting under the U.S. Environmental Protection Agency’s Aging Water Infrastructure (AWI) Research Program. This research program will help U.S. ...

  16. A new Geo-Information Architecture for Risk Management in the Alps

    NASA Astrophysics Data System (ADS)

    Baruffini, Mi.; Thuering, M.

    2009-04-01

    During the last decades, land use has increased significantly in the Swiss (and European) mountain regions. Due to the scarcity of areas suitable for development, anthropic activities were extended into areas prone to natural hazards such as avalanches, debris flows and rockfalls (Smith 2001). Furthermore, the transalpine transport system's need to develop effective links in an important area collides with the need to ensure the safety of travelers and the health of the population. Consequently, an increase in losses due to hazards can be observed. To mitigate these associated losses, both traditional protective measures and land-use planning policies are to be developed and implemented to optimize future investments. Efficient protection alternatives can be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. As part of the Swiss National Science Foundation Project 54 "Evaluation of the optimal resilience for vulnerable infrastructure networks - An interdisciplinary pilot study on the transalpine transportation corridors" we study the vulnerability of infrastructures to natural hazards. The project aims to study various natural hazards (and later, even man-made ones) and to obtain an evaluation of resilience according to an interdisciplinary approach, considering the possible damage by means of risk criteria and pointing out the feasibility of conceivable measures to reduce potential damage. The project consists of a geoscientific part and an application. The first part consists of studying the (natural) hazards and related risks in terms of infrastructure vulnerability. The application considers different types of hazard (logically intersected with the transport infrastructure) and compares them with fixed values to obtain a so-called deficit. As a framework we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). In this way the project develops a methodology that makes a risk analysis possible, aiming to optimize infrastructure vulnerability, and therefore allows us to obtain a model designed to optimize the functionality of the network infrastructure. A simulation environment, RiskBox, is developed within the open-source GIS environment GRASS (Geographic Resources Analysis Support System) together with a database (PostgreSQL) in order to manage an infrastructure data catalog. The targeted simulation environment includes the elements that identify the consecutive steps of risk analysis: hazard - vulnerability - risk. The initial results of the experimental case study show how useful a GIS-based system, which identifies the risk of any single vulnerable element in the corridor and assesses the risk to the global system on the basis of the priorities of the actors involved, can be for effective and efficient disaster response management, as explained in (ARMONIA Project 2007). In our work we wanted to highlight the complexity of the risk analysis methodology, a difficulty that is amplified by many peculiarities of mountain areas.
In particular, the illustrative process performed here gives an overview of the interests at stake and of the need to act to reduce the vulnerability and the hazardous nature of the Gotthard corridor. We present the concept and current state of development of our project and our application to the testbed, the Alps-crossing corridor of St. Gotthard. REFERENCES ARMONIA Project 2007: Land use plans in Risky areas from Unwise to Wise Practices - Materials 2nd conference. Politecnico di Milano. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL). Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk Management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds) 2002: Colloque Arche de la Défense 22-24 octobre 2002, dans Risques naturels et aménagement en Europe, 108-120. Smith, K. 2001: Environmental hazards. Assessing the risk and reducing disaster. Third edition. London.
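    To make the hazard - vulnerability - risk chain concrete, the sketch below computes an annual risk and a protection "deficit" for individual corridor elements. The field names, example values, and acceptance threshold are illustrative assumptions, not the actual RiskBox data model.

```python
# Minimal sketch of the hazard-vulnerability-risk chain described above.
# All names and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Element:
    name: str             # infrastructure element (e.g., a road segment)
    hazard_prob: float    # annual probability of the hazard scenario
    vulnerability: float  # expected degree of loss given the event (0..1)
    exposure: float       # value of the element (monetary units)

def annual_risk(e: Element) -> float:
    """Risk = probability x vulnerability x exposure."""
    return e.hazard_prob * e.vulnerability * e.exposure

def protection_deficit(e: Element, acceptable_risk: float) -> float:
    """Positive values indicate risk above the accepted level (a 'deficit')."""
    return annual_risk(e) - acceptable_risk

corridor = [
    Element("Gotthard road segment A", 0.01, 0.4, 5_000_000),
    Element("Rail section B", 0.005, 0.7, 12_000_000),
]
for e in corridor:
    print(e.name, round(protection_deficit(e, acceptable_risk=10_000), 1))
```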

  17. VERCE, Virtual Earthquake and Seismology Research Community in Europe, a new ESFRI initiative integrating data infrastructure, Grid and HPC infrastructures for data integration, data analysis and data modeling in seismology

    NASA Astrophysics Data System (ADS)

    van Hemert, Jano; Vilotte, Jean-Pierre

    2010-05-01

    Earthquake and seismology research addresses fundamental problems in understanding Earth's internal wave sources and structures, and augments applications to societal concerns about natural hazards, energy resources and environmental change. This community is central to the European Plate Observing System (EPOS)—the ESFRI initiative in solid Earth Sciences. Global and regional seismology monitoring systems are continuously operated and are transmitting a growing wealth of data from Europe and from around the world. These tremendous volumes of seismograms, i.e., records of ground motions as a function of time, have a definite multi-use attribute, which puts a great premium on open-access data infrastructures that are integrated globally. In Europe, the earthquake and seismology community is part of the European Integrated Data Archives (EIDA) infrastructure and is structured as "horizontal" data services. On top of this distributed data archive system, the community has recently developed, within the EC project NERIES, advanced SOA-based web services and a unified portal system. Enabling advanced analysis of these data by utilising a data-aware distributed computing environment is instrumental to fully exploit the cornucopia of data and to guarantee optimal operation of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of data-intensive applications in data mining and modelling and will be illustrated through a set of applications. It aims to provide a comprehensive architecture and framework adapted to the scale and the diversity of these applications, and to integrate the community data infrastructure with Grid and HPC infrastructures. A first novel aspect is a service-oriented architecture that provides well-equipped integrated workbenches, with an efficient communication layer between data and Grid infrastructures, augmented with bridges to the HPC facilities. A second novel aspect is the coupling between Grid data analysis and HPC data modelling applications through workflow and data sharing mechanisms. VERCE will develop important interactions with the European infrastructure initiatives in Grid and HPC computing. The VERCE team: CNRS-France (IPG Paris, LGIT Grenoble), UEDIN (UK), KNMI-ORFEUS (Holland), EMSC, INGV (Italy), LMU (Germany), ULIV (UK), BADW-LRZ (Germany), SCAI (Germany), CINECA (Italy)
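    As an illustration of the kind of "horizontal" data service the community already operates, the following sketch pulls a short seismogram window from a European FDSN web service with ObsPy; the data-centre key and station codes are example values, and the snippet is not part of the VERCE architecture itself.

```python
# Illustrative only: fetching seismograms from a European FDSN web service
# with ObsPy. Data-centre key and station codes are example values.

from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("ORFEUS")                # ORFEUS/EIDA data centre
t0 = UTCDateTime("2010-01-01T00:00:00")
# network, station, location, channel, start, end
stream = client.get_waveforms("CH", "DAVOX", "*", "HHZ", t0, t0 + 600)
stream.detrend("demean")                 # simple pre-processing step
print(stream)
```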

  18. Infrastructure Systems for Advanced Computing in E-science applications

    NASA Astrophysics Data System (ADS)

    Terzo, Olivier

    2013-04-01

    In the e-science field there is a growing need for computing infrastructure that is more dynamic and customizable, with an "on demand" model of use that follows the exact request in terms of resources and storage capacities. The integration of grid and cloud infrastructure solutions allows us to offer services that adapt their availability by scaling resources up and down. The main challenge for e-science domains is to implement infrastructure solutions for scientific computing that adapt dynamically to the demand for computing resources, with a strong emphasis on optimizing the use of those resources in order to reduce investment costs. Instrumentation, data volumes, algorithms and analysis all contribute to the increasing complexity of applications that require high processing power and storage for a limited time, and such applications often exceed the computational resources that equip the majority of laboratories and research units in an organization. Very often it is necessary to adapt, or even rethink, tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to adapt them to deployment on a cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next generation sequencing, computational electromagnetics and radio occultation, the complexity of the analysis raises several issues such as processing time, the scheduling of processing tasks, storage of results, and a multi-user environment. For these reasons, it is necessary to rethink the way e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the use of the IaaS, PaaS and SaaS layers. Another important focus is on creating and using hybrid infrastructures, typically a federation between private and public clouds: when all resources owned by the organization are in use, a federated cloud infrastructure makes it easy to add resources from the public cloud to follow the demand for computation and storage, and to release them when the processes are finished. In this hybrid model, the scheduling approach is important for managing both cloud models. Thanks to this infrastructure model, resources are always available for additional requests for IT capacity, which can be used "on demand" for a limited time without having to purchase additional servers.
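    A minimal sketch of the "on demand" up- and down-scaling idea is given below, using Apache Libcloud against a public IaaS provider; the provider, credentials and sizing policy are placeholders, and a real hybrid federation would add a scheduler deciding when to burst from the private to the public cloud.

```python
# Sketch of on-demand up-scaling against an IaaS provider with Apache Libcloud.
# Provider, credentials and sizing policy are placeholder assumptions.

from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

Driver = get_driver(Provider.EC2)            # public-cloud side of the federation
driver = Driver("ACCESS_KEY", "SECRET_KEY")  # hypothetical credentials

def scale_up(n_nodes: int):
    """Provision extra worker nodes when local resources are exhausted."""
    size = driver.list_sizes()[0]            # smallest available flavour
    image = driver.list_images()[0]          # assumed pre-built worker image
    return [driver.create_node(name=f"worker-{i}", image=image, size=size)
            for i in range(n_nodes)]

def scale_down(nodes):
    """Release the borrowed public-cloud resources once processing finishes."""
    for node in nodes:
        driver.destroy_node(node)
```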

  19. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  20. Geothermal power development in Hawaii. Volume 1. Review and analysis

    NASA Astrophysics Data System (ADS)

    1982-06-01

    The history of geothermal exploration in Hawaii is reviewed briefly. The nature and occurrences of geothermal resources are presented island by island. An overview of geothermal markets is presented. Other topics covered are: potential markets of the identified geothermal areas, well drilling technology, hydrothermal fluid transport, overland and submarine electrical transmission, community aspects of geothermal development, legal and policy issues associated with mineral and land ownership, logistics and infrastructure, legislation and permitting, land use controls, Regulation 8, public utilities commission, political climate and environment, state plans, county plans, geothermal development risks, and business planning guidelines.

  1. The statistical geoportal and the ``cartographic added value'' - creation of the spatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Fiedukowicz, Anna; Gasiorowski, Jedrzej; Kowalski, Paweł; Olszewski, Robert; Pillich-Kolipinska, Agata

    2012-11-01

    The wide access to source data published by numerous websites results in a situation where acquiring information is no longer a problem; the real problem is how to transform information into useful knowledge. The cartographic method of research, dealing with spatial data, has served this purpose for many years. Nowadays, it allows analyses to be conducted at a high level of complexity, thanks to the intense development of IT technologies. The vast majority of analytic methods utilizing so-called data mining and data enrichment techniques, however, concern non-spatial data. According to the authors, utilizing those techniques in spatial data analysis (including analysis based on statistical data with spatial reference) would allow the evolution of the Spatial Information Infrastructure (SII) into the Spatial Knowledge Infrastructure (SKI). The development of the SKI would benefit from the existence of a statistical geoportal. Its proposed functionality, consisting of data analysis as well as visualization, is outlined in the article. Examples of geostatistical analyses (ANOVA and a regression model considering the spatial neighborhood), possible to implement in such a portal and capable of producing the "cartographic added value", are also presented here.
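    As a hint of what such a geoportal service could return, the snippet below runs a one-way ANOVA of a statistical indicator grouped by region with statsmodels; the column names and values are hypothetical.

```python
# Illustrative one-way ANOVA of a statistical indicator grouped by region,
# the kind of analysis the proposed geoportal could expose as a service.
# Column names and data are hypothetical.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "region":    ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "indicator": [10.2, 11.1, 9.8, 14.5, 15.2, 13.9, 8.1, 7.6, 8.4],
})

model = ols("indicator ~ C(region)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test: does the mean differ by region?
```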

  2. Assessing large-scale wildlife responses to human infrastructure development

    PubMed Central

    Torres, Aurora; Jaeger, Jochen A. G.; Alonso, Juan Carlos

    2016-01-01

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future. PMID:27402749

  3. Assessing large-scale wildlife responses to human infrastructure development.

    PubMed

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.

  4. Infrastructure Retrofit Design via Composite Mechanics

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.

    1998-01-01

    Select applications are described to illustrate the concept of retrofitting reinforced concrete infrastructure with fiber reinforced plastic laminates. The concept is first illustrated using an axially loaded reinforced concrete column. A reinforced concrete arch and a dome are then used to illustrate the versatility of the concept. Advanced methods such as finite element structural analysis and progressive structural fracture are then used to evaluate the adequacy of the retrofitting laminate. Results obtained show that retrofits can be designed to double and even triple the as-designed load of the selected reinforced concrete infrastructures.

  5. Development of Bioinformatics Infrastructure for Genomics Research.

    PubMed

    Mulder, Nicola J; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Ahmed, Azza; Ahmed, Rehab; Akanle, Bola; Alibi, Mohamed; Armstrong, Don L; Aron, Shaun; Ashano, Efejiro; Baichoo, Shakuntala; Benkahla, Alia; Brown, David K; Chimusa, Emile R; Fadlelmola, Faisal M; Falola, Dare; Fatumo, Segun; Ghedira, Kais; Ghouila, Amel; Hazelhurst, Scott; Isewon, Itunuoluwa; Jung, Segun; Kassim, Samar Kamal; Kayondo, Jonathan K; Mbiyavanga, Mamana; Meintjes, Ayton; Mohammed, Somia; Mosaku, Abayomi; Moussa, Ahmed; Muhammd, Mustafa; Mungloo-Dilmohamud, Zahra; Nashiru, Oyekanmi; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Osamor, Victor; Oyelade, Jellili; Sadki, Khalid; Salifu, Samson Pandam; Soyemi, Jumoke; Panji, Sumir; Radouani, Fouzia; Souiai, Oussama; Tastan Bishop, Özlem

    2017-06-01

    Although pockets of bioinformatics excellence have developed in Africa, generally, large-scale genomic data analysis has been limited by the availability of expertise and infrastructure. H3ABioNet, a pan-African bioinformatics network, was established to build capacity specifically to enable H3Africa (Human Heredity and Health in Africa) researchers to analyze their data in Africa. Since the inception of the H3Africa initiative, H3ABioNet's role has evolved in response to changing needs from the consortium and the African bioinformatics community. H3ABioNet set out to develop core bioinformatics infrastructure and capacity for genomics research in various aspects of data collection, transfer, storage, and analysis. Various resources have been developed to address genomic data management and analysis needs of H3Africa researchers and other scientific communities on the continent. NetMap was developed and used to build an accurate picture of network performance within Africa and between Africa and the rest of the world, and Globus Online has been rolled out to facilitate data transfer. A participant recruitment database was developed to monitor participant enrollment, and data is being harmonized through the use of ontologies and controlled vocabularies. The standardized metadata will be integrated to provide a search facility for H3Africa data and biospecimens. Because H3Africa projects are generating large-scale genomic data, facilities for analysis and interpretation are critical. H3ABioNet is implementing several data analysis platforms that provide a large range of bioinformatics tools or workflows, such as Galaxy, the Job Management System, and eBiokits. A set of reproducible, portable, and cloud-scalable pipelines to support the multiple H3Africa data types is also being developed and dockerized to enable execution on multiple computing infrastructures. In addition, new tools have been developed for analysis of the uniquely divergent African data and for downstream interpretation of prioritized variants. To provide support for these and other bioinformatics queries, an online bioinformatics helpdesk backed by broad consortium expertise has been established. Further support is provided by means of various modes of bioinformatics training. For the past 4 years, the development of infrastructure support and human capacity through H3ABioNet has significantly contributed to the establishment of African scientific networks, data analysis facilities, and training programs. Here, we describe the infrastructure and how it has affected genomics and bioinformatics research in Africa. Copyright © 2017 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.

  6. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure using SimProcess and MATLAB software. The simulation spans from the top level all the way down to the computation needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed using queueing theory.
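    The queueing-theory check mentioned above can be illustrated with a back-of-the-envelope M/M/1 calculation for a certificate-signing server; the arrival and service rates below are invented numbers, not measurements from the NASA PKI model.

```python
# Back-of-the-envelope M/M/1 queueing estimate of the kind used to check
# simulated PKI performance. Arrival and service rates are made-up numbers.

def mm1_metrics(arrival_rate: float, service_rate: float):
    """Return utilisation, mean requests in system, and mean response time."""
    rho = arrival_rate / service_rate           # utilisation, must be < 1
    if rho >= 1:
        raise ValueError("queue is unstable (arrival rate >= service rate)")
    l = rho / (1 - rho)                         # mean number in system (L)
    w = l / arrival_rate                        # mean time in system (W), Little's law
    return rho, l, w

# e.g., certificate-signing requests: 40 req/s arriving, server signs 50 req/s
rho, l, w = mm1_metrics(40.0, 50.0)
print(f"utilisation={rho:.2f}, in system={l:.1f}, response time={w*1000:.0f} ms")
```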

  7. Innovative neuro-fuzzy system of smart transport infrastructure for road traffic safety

    NASA Astrophysics Data System (ADS)

    Beinarovica, Anna; Gorobetz, Mikhail; Levchenkov, Anatoly

    2017-09-01

    The proposed study describes the application of neural networks and fuzzy logic in transport control for safety improvement through the evaluation of accident risk by intelligent infrastructure devices. Risk evaluation follows multiple criteria: danger, changeability, and the influence of changes on risk increase. Neuro-fuzzy algorithms are described and proposed for the task. The novelty of the proposed system is supported by a deep analysis of known studies in the field. The structure of the neuro-fuzzy system for risk evaluation and its mathematical model are described in the paper. A simulation model of the intelligent devices for transport infrastructure is proposed to simulate different situations, assess the risks and propose possible actions for infrastructure or vehicles to minimize the risk of possible accidents.
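    The sketch below shows, in simplified form, how the three named criteria could be fuzzified and aggregated into a single risk score; the membership functions and weights are illustrative and do not reproduce the paper's neuro-fuzzy model.

```python
# Simplified fuzzy-style aggregation of the three criteria named above
# (danger, changeability, influence of changes). Membership functions and
# weights are illustrative assumptions, not the paper's model.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_score(danger, changeability, influence):
    """Fuzzify each criterion as 'high' and aggregate with weights."""
    high_danger = triangular(danger, 0.3, 1.0, 1.7)
    high_change = triangular(changeability, 0.3, 1.0, 1.7)
    high_influence = triangular(influence, 0.3, 1.0, 1.7)
    weights = (0.5, 0.25, 0.25)
    return (weights[0] * high_danger +
            weights[1] * high_change +
            weights[2] * high_influence)

print(risk_score(danger=0.9, changeability=0.6, influence=0.8))  # ~0.71 -> act
```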

  8. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    NASA Astrophysics Data System (ADS)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce the manpower needs and to assure a constant, high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies coming from different disciplines; in particular it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker to centralize all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data to provide real-time feedback to human experts, who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as an automated monitoring assistant for the ATLAS data taking infrastructure.
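    A toy version of the correlation idea, detecting an error burst per application over a sliding time window rather than reacting to single messages, is sketched below; the thresholds and message format are invented for illustration and are not the AAL query language.

```python
# Toy sliding-window correlation: the interesting signal is often not a single
# message but an aggregated behaviour over a time window. Thresholds and
# message format are invented for illustration.

from collections import deque

WINDOW_S = 60          # look-back window in seconds
THRESHOLD = 100        # alert if an application emits > 100 errors per window

class ErrorBurstDetector:
    def __init__(self):
        self.events = {}   # application name -> deque of error timestamps

    def on_message(self, app: str, severity: str, ts: float):
        if severity != "ERROR":
            return None
        q = self.events.setdefault(app, deque())
        q.append(ts)
        while q and ts - q[0] > WINDOW_S:   # drop events outside the window
            q.popleft()
        if len(q) > THRESHOLD:
            return f"ALERT: {app} produced {len(q)} errors in the last {WINDOW_S}s"
        return None
```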

  9. Vista-LA: Mapping methane-emitting infrastructure in the Los Angeles megacity

    NASA Astrophysics Data System (ADS)

    Carranza, Valerie; Rafiq, Talha; Frausto-Vicencio, Isis; Hopkins, Francesca M.; Verhulst, Kristal R.; Rao, Preeti; Duren, Riley M.; Miller, Charles E.

    2018-03-01

    Methane (CH4) is a potent greenhouse gas (GHG) and a critical target of climate mitigation efforts. However, actionable emission reduction efforts are complicated by large uncertainties in the methane budget on relevant scales. Here, we present Vista, a Geographic Information System (GIS)-based approach to map potential methane emissions sources in the South Coast Air Basin (SoCAB) that encompasses Los Angeles, an area with a dense, complex mixture of methane sources. The goal of this work is to provide a database that, together with atmospheric observations, improves methane emissions estimates in urban areas with complex infrastructure. We aggregated methane source location information into three sectors (energy, agriculture, and waste) following the frameworks used by the State of California GHG Inventory and the Intergovernmental Panel on Climate Change (IPCC) Guidelines for GHG Reporting. Geospatial modeling was applied to publicly available datasets to precisely geolocate facilities and infrastructure comprising major anthropogenic methane source sectors. The final database, Vista-Los Angeles (Vista-LA), is presented as maps of infrastructure known or expected to emit CH4. Vista-LA contains over 33 000 features concentrated on < 1 % of land area in the region. Currently, Vista-LA is used as a planning and analysis tool for atmospheric measurement surveys of methane sources, particularly for airborne remote sensing, and methane hotspot detection using regional observations. This study represents a first step towards developing an accurate, spatially resolved methane flux estimate for point sources in SoCAB, with the potential to address discrepancies between bottom-up and top-down methane emissions accounting in this region. The Vista-LA datasets and associated metadata are available from the Oak Ridge National Laboratory Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC; https://doi.org/10.3334/ORNLDAAC/1525).

  10. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  11. Technology Assessment On Stressor Impacts to Green Infrastructure BMP Performance, Monitoring, and Integration (Cincinnati, OH)

    EPA Science Inventory

    This poster presentation will document, benchmark and evaluate state-of-the-science research and implementation on BMP performance, monitoring and integration for green infrastructure applications, to manage wet weather flow, storm-water runoff stressor relief and remedial sustai...

  12. Cloud Environment Automation: from infrastructure deployment to application monitoring

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.

    2017-10-01

    The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for infrastructure maintainers. In this paper the research activity, carried out during the Open City Platform (OCP) research project [1] and aimed at designing and developing an automatic tool for cloud-based IaaS deployment, is presented. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It intends to research, develop and test new open, interoperable, on-demand technological solutions in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.

  13. Structural Monitoring of Metro Infrastructure during Shield Tunneling Construction

    PubMed Central

    Ran, L.; Ye, X. W.; Ming, G.; Dong, X. B.

    2014-01-01

    Shield tunneling construction of metro infrastructure continuously disturbs the soil. The ground surface is subjected to uplift or subsidence due to the deep excavation and the extrusion and consolidation of the soil. Monitoring carried out simultaneously with shield tunnel construction provides an effective reference for controlling the shield driving, so designing and implementing a safe, economical, and effective structural monitoring system for metro infrastructure is of great importance and necessity. This paper presents the general architecture of the shield construction of metro tunnels as well as the procedure of the artificial ground freezing construction of the metro-tunnel cross-passages. The design principles for metro infrastructure monitoring of the shield tunnel intervals in the Hangzhou Metro Line 1 are introduced. The detailed monitoring items and the specified alarming indices for construction monitoring of the shield tunneling are addressed, and the measured settlement variations at different monitoring locations are also presented. PMID:25032238
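    A minimal sketch of checking measured settlements against alarming indices is shown below; the threshold values and monitoring-point labels are placeholders, not the indices adopted for Hangzhou Metro Line 1.

```python
# Illustrative check of measured settlements against alarming indices.
# Threshold values and monitoring-point labels are placeholders.

ALARM_MM = {"warning": 20.0, "alarm": 30.0}   # cumulative surface settlement

def classify(settlement_mm: float) -> str:
    if settlement_mm >= ALARM_MM["alarm"]:
        return "alarm"
    if settlement_mm >= ALARM_MM["warning"]:
        return "warning"
    return "normal"

measurements = {"DK10+250": 12.4, "DK10+300": 23.8, "DK10+350": 31.2}
for point, value in measurements.items():
    print(point, value, "mm ->", classify(value))
```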

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows controlling, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  15. Network and computing infrastructure for scientific applications in Georgia

    NASA Astrophysics Data System (ADS)

    Kvatadze, R.; Modebadze, Z.

    2016-09-01

    The status of the network and computing infrastructure and the services available for the research and education community of Georgia are presented. The Research and Educational Networking Association - GRENA provides the following network services: Internet connectivity, network services, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid Infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA is participating.

  16. Critical Infrastructure Protection- Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bofman, Ryan K.

    Los Alamos National Laboratory (LANL) has been a key facet of Critical National Infrastructure since the nuclear bombing of Hiroshima exposed the nature of the Laboratory's work in 1945. The widely known sensitivity of the information held here makes protecting this critical infrastructure a matter of national security. This protection takes multiple forms, beginning with physical security, followed by cybersecurity and the safeguarding of classified information, and concluding with the missions of the National Nuclear Security Administration.

  17. Mesoscale carbon sequestration site screening and CCS infrastructure analysis.

    PubMed

    Keating, Gordon N; Middleton, Richard S; Stauffer, Philip H; Viswanathan, Hari S; Letellier, Bruce C; Pasqualini, Donatella; Pawar, Rajesh J; Wolfsberg, Andrew V

    2011-01-01

    We explore carbon capture and sequestration (CCS) at the meso-scale, a level of study between regional carbon accounting and highly detailed reservoir models for individual sites. We develop an approach to CO2 sequestration site screening for industries or energy development policies that involves identification of an appropriate sequestration basin, analysis of geologic formations, definition of surface sites, design of infrastructure, and analysis of CO2 transport and storage costs. Our case study involves carbon management for potential oil shale development in the Piceance-Uinta Basin, CO and UT. This study uses new capabilities of the CO2-PENS model for site screening, including reservoir capacity, injectivity, and cost calculations for simple reservoirs at multiple sites. We couple this with a model of optimized source-sink-network infrastructure (SimCCS) to design pipeline networks and minimize CCS cost for a given industry or region. The CLEAR(uff) dynamical assessment model calculates the CO2 source term for various oil production levels. Nine sites in a 13,300 km2 area have the capacity to store 6.5 GtCO2, corresponding to shale-oil production of 1.3 Mbbl/day for 50 years (about 1/4 of U.S. crude oil production). Our results highlight the complex, nonlinear relationship between the spatial deployment of CCS infrastructure and the oil-shale production rate.
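    The source-sink-network optimisation can be illustrated as a min-cost-flow problem in the spirit of SimCCS (SimCCS itself is a more elaborate model); the nodes, capacities, and unit costs below are invented.

```python
# Toy source-sink-network optimisation: route captured CO2 to storage sites at
# minimum transport cost as a min-cost-flow problem. All numbers are invented.

import networkx as nx

G = nx.DiGraph()
# demand < 0 means supply (Mt CO2/yr captured), demand > 0 means storage
G.add_node("plant_A", demand=-5)
G.add_node("plant_B", demand=-3)
G.add_node("site_1", demand=4)
G.add_node("site_2", demand=4)
# edges: candidate pipeline corridors with capacity (Mt/yr) and unit cost
G.add_edge("plant_A", "site_1", capacity=5, weight=12)
G.add_edge("plant_A", "site_2", capacity=5, weight=20)
G.add_edge("plant_B", "site_1", capacity=3, weight=15)
G.add_edge("plant_B", "site_2", capacity=3, weight=9)

flow = nx.min_cost_flow(G)
print(flow)                      # pipeline utilisation per corridor
print(nx.cost_of_flow(G, flow))  # total transport cost
```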

  18. Hurricane Harvey Rainfall, Did It Exceed PMP and What are the Implications?

    NASA Astrophysics Data System (ADS)

    Kappel, B.; Hultstrand, D.; Muhlestein, G.

    2017-12-01

    Rainfall resulting from Hurricane Harvey reached historic levels over the coastal regions of Texas and Louisiana during the last week of August 2017. Although extreme rainfall from a landfalling tropical system is not uncommon in the region, Harvey was unique in that it persisted over the same general location for several days, producing volumes of rainfall not previously observed in the United States. Devastating flooding and severe stress to infrastructure in the region were the result. Coincidentally, Applied Weather Associates had recently completed an updated statewide Probable Maximum Precipitation (PMP) study for Texas. This storm proved to be a real-time test of the adequacy of those values. AWA calculates PMP following a storm-based approach; this same approach was used in the HMRs. Therefore, the inclusion of all PMP-type storms is critically important to ensuring that appropriate PMP values are produced. This presentation will discuss the analysis of the Harvey rainfall using the Storm Precipitation Analysis System (SPAS) program used to analyze all storms used in PMP development, compare the results of the Harvey rainfall analysis against previous similar storms, and provide comparisons of the Harvey rainfall against previous and current PMP depths. Discussion will be included regarding the implications of the storm for previous and future PMP estimates, dam safety design, and infrastructure vulnerable to extreme flooding.

  19. Current status of the EPOS WG4 - GNSS and Other Geodetic Data

    NASA Astrophysics Data System (ADS)

    Fernandes, Rui; Bastos, Luísa; Bruyninx, Carine; D'Agostino, Nicola; Dousa, Jan; Ganas, Athanassios; Lidberg, Martin; Nocquet, Jean-Mathieu

    2013-04-01

    WG4 - "EPOS Geodetic Data and Other Geodetic Data" is the Working Group of the EPOS project in charge of defining and preparing the integration of the existing Pan-European Geodetic Infrastructures that will support the European Geosciences, which is the ultimate goal of the EPOS project. The WG4 is formed by representatives of the participating EPOS countries (23) but it is also open to the entire geodetic community. In fact, WG4 also includes members from countries that formally are not part of the current phase of EPOS. In an ongoing effort, the majority of existing GNSS Research Infrastructures in Europe were identified. The current database, available at http://epos-couch.cloudant.com/epos-couch/_design/epos-couch/, lists a total of 50 Research Infrastructures managing a total of 1534 GNSS CORS sites. This presentation intends to detail the work being produced within the working group WG4 related with the definition of strategies towards the implementation of the best solutions that will permit to the end-users, and in particular geo-scientists, to access the geodetic data, derived solutions, and associated metadata using transparent and uniform processes. The first step toward the design of an implementation and business plan is the definition of the core services for geodetic data within EPOS. In this talk, we will present the current status of the discussion about the content of core services. Three levels of core services could be distinguished, for which their content need to be defined. The 3 levels are: (1) the core services associated to data (diffusion, archive, long-term preservation, quality check, rapid analysis) (2) core services associated to geodetic products (analysis, products definition like position time series, velocity field and Zenithal Total Delay) (3) User oriented services (reference frames, real-time solutions for early warning systems, strain rate maps, meteorology, space weather, …). Current propositions and remaining open questions will be discussed.

  20. A Cloud-based Infrastructure and Architecture for Environmental System Research

    NASA Astrophysics Data System (ADS)

    Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.

    2016-12-01

    The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provide a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community, and it provides unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data and even data submission workflows in a straightforward fashion. The infrastructure minimizes large disruptions to current project-based data submission workflows for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. The infrastructure eliminates the scalability problems of current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.

  1. Exposure of coastal built assets in the South Pacific to climate risks

    NASA Astrophysics Data System (ADS)

    Kumar, Lalit; Taylor, Subhashni

    2015-11-01

    Pacific island countries (PICs) are situated in a highly dynamic ocean-atmosphere interface, are dispersed over a large ocean area, and have highly populated urban centres located on the coastal margin. The built infrastructure associated with urban centres is also located within close proximity to the coastlines, exposing such infrastructure to a variety of natural and climate change-related hazards. In this research we undertake a comprehensive analysis of the exposure of built infrastructure assets to climate risk for 12 PICs. We show that 57% of the assessed built infrastructure for the 12 PICs is located within 500 m of their coastlines, amounting to a total replacement value of US$21.9 billion. Eight of the 12 PICs have 50% or more of their built infrastructure located within 500 m of their coastlines. In particular, Kiribati, Marshall Islands and Tuvalu have over 95% of their built infrastructure located within 500 m of their coastlines. Coastal adaptation costs will require substantial financial resources, which may not be available in developing countries such as the PICs, leaving them to face very high impacts but lacking the adaptive capacity.
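    A simplified version of the exposure calculation, flagging assets within 500 m of a schematic coastline and summing their replacement value, is sketched below; the geometries and values are toy data in a projected, metre-based coordinate system.

```python
# Simplified coastal-exposure calculation: flag built assets within 500 m of
# the coastline and sum their replacement value. Toy data, metre-based CRS.

from shapely.geometry import LineString, Point

coastline = LineString([(0, 0), (10_000, 0)])        # schematic coast
coastal_zone = coastline.buffer(500)                 # 500 m exposure band

assets = [
    {"name": "hospital", "geom": Point(2_000, 300),  "value_usd": 40e6},
    {"name": "airport",  "geom": Point(6_000, 1200), "value_usd": 250e6},
    {"name": "port",     "geom": Point(9_000, 50),   "value_usd": 120e6},
]

exposed = [a for a in assets if coastal_zone.contains(a["geom"])]
share = len(exposed) / len(assets)
print(f"{share:.0%} of assets exposed, "
      f"US${sum(a['value_usd'] for a in exposed) / 1e6:.0f}M replacement value")
```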

  2. Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks.

    PubMed

    Thacker, Scott; Kelly, Scott; Pant, Raghav; Hall, Jim W

    2018-01-01

    Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the "do nothing" case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial. © 2017 Society for Risk Analysis.
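    A hedged sketch of the appraisal logic, computing expected annual damage from a probability-damage curve with and without protection and discounting the benefit over the appraisal period, follows; all numbers are illustrative rather than taken from the England and Wales case study.

```python
# Sketch of the risk-reduction appraisal: expected annual damage (EAD) with and
# without the adaptation measure, and the discounted benefit. Numbers invented.

import numpy as np

# exceedance probabilities and damages (GBP) for a flood hazard at a substation
probs             = np.array([0.1, 0.02, 0.01, 0.002])
damage_do_nothing = np.array([0.0, 2e6, 5e6, 20e6])
damage_protected  = np.array([0.0, 0.0, 1e6, 10e6])   # after raising flood defences

def ead(probs, damages):
    """Expected annual damage: trapezoidal integral of damage over probability."""
    order = np.argsort(probs)
    return float(np.trapz(np.asarray(damages)[order], np.asarray(probs)[order]))

annual_benefit = ead(probs, damage_do_nothing) - ead(probs, damage_protected)

discount_rate, years, capex = 0.035, 40, 1.5e6        # appraisal assumptions
npv_benefit = sum(annual_benefit / (1 + discount_rate) ** t
                  for t in range(1, years + 1))
print(f"annual benefit: {annual_benefit:,.0f}  NPV of benefits: {npv_benefit:,.0f}  "
      f"benefit-cost ratio: {npv_benefit / capex:.2f}")
```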

  3. Fostering incidental experiences of nature through green infrastructure planning.

    PubMed

    Beery, Thomas H; Raymond, Christopher M; Kyttä, Marketta; Olafsson, Anton Stahl; Plieninger, Tobias; Sandberg, Mattias; Stenseke, Marie; Tengö, Maria; Jönsson, K Ingemar

    2017-11-01

    Concern for a diminished human experience of nature and subsequent decreased human well-being is addressed via a consideration of green infrastructure's potential to facilitate unplanned or incidental nature experience. Incidental nature experience is conceptualized and illustrated in order to consider this seldom addressed aspect of human interaction with nature in green infrastructure planning. Special attention has been paid to the ability of incidental nature experience to redirect attention from a primary activity toward an unplanned focus (in this case, nature phenomena). The value of such experience for human well-being is considered. The role of green infrastructure to provide the opportunity for incidental nature experience may serve as a nudge or guide toward meaningful interaction. These ideas are explored using examples of green infrastructure design in two Nordic municipalities: Kristianstad, Sweden, and Copenhagen, Denmark. The outcome of the case study analysis coupled with the review of literature is a set of sample recommendations for how green infrastructure can be designed to support a range of incidental nature experiences with the potential to support human well-being.

  4. 75 FR 16080 - Notice of Intent To Prepare an Environmental Impact Statement for Basewide Water Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ...- design and to develop additional alternatives for analysis. These two water infrastructure projects are... carbon, and reverse osmosis. The facility would be designed in modular form for ease of expandability... lighting, asphalt pavement, and pavement marking and signs. The project includes ``100-year storm'' flood...

  5. Cost Comparison of Conventional Gray Combined Sewer Overflow Control Infrastructure versus a Green/Gray Combination

    EPA Science Inventory

    This paper outlines a life-cycle cost analysis comparing a green (rain gardens) and gray (tunnels) infrastructure combination to a gray-only option to control combined sewer overflow in the Turkey Creek Combined Sewer Overflow Basin, in Kansas City, MO. The plan area of this Bas...

  6. Incorporating intelligent transportation systems into planning analysis : summary of key findings from a 2020 case study -- improving travel time reliability with ITS

    DOT National Transportation Integrated Search

    2002-05-01

    ITS is typically considered an operational detail to be worked out after infrastructure planning is complete. This approach ignores the potential for the introduction of ITS to change the decisions made during infrastructure planning, or even the ove...

  7. Evaluating Effectiveness of Green Infrastructure Application of Stormwater Best Management Practices in Protecting Stream Habitat and Biotic Condition in New England

    EPA Science Inventory

    The US EPA is developing assessment tools to evaluate the effectiveness of green infrastructure (GI) applied in stormwater best management practices (BMPs) at the small watershed (HUC12 or finer) scale. Based on analysis of historical monitoring data using boosted regression tre...

  8. Modeling the impacts of green infrastructure land use changes on air quality and meteorology case study and sensitivity analysis in Kansas City

    EPA Science Inventory

    Changes in vegetation cover associated with urban planning efforts may affect regional meteorology and air quality. Here we use a comprehensive coupled meteorology-air quality model (WRF-CMAQ) to simulate the influence of planned land use changes from green infrastructure impleme...

  9. The ELIXIR channel in F1000Research.

    PubMed

    Blomberg, Niklas; Oliveira, Arlindo; Mons, Barend; Persson, Bengt; Jonassen, Inge

    2015-01-01

    ELIXIR, the European life science infrastructure for biological information, is a unique initiative to consolidate Europe's national centres, services, and core bioinformatics resources into a single, coordinated infrastructure. ELIXIR brings together Europe's major life-science data archives and connects these with national bioinformatics infrastructures - the ELIXIR Nodes. This editorial introduces the ELIXIR channel in F1000Research; the aim of the channel is to collect and present ELIXIR's scientific and operational output, engage with the broad life science community and encourage discussion on proposed infrastructure solutions. Submissions will be assessed by the ELIXIR channel Advisory Board to ensure they are relevant to the ELIXIR community, and subjected to the F1000Research open peer review process.

  10. The ELIXIR channel in F1000Research

    PubMed Central

    Blomberg, Niklas; Oliveira, Arlindo; Mons, Barend; Persson, Bengt; Jonassen, Inge

    2016-01-01

    ELIXIR, the European life science infrastructure for biological information, is a unique initiative to consolidate Europe's national centres, services, and core bioinformatics resources into a single, coordinated infrastructure. ELIXIR brings together Europe's major life-science data archives and connects these with national bioinformatics infrastructures - the ELIXIR Nodes. This editorial introduces the ELIXIR channel in F1000Research; the aim of the channel is to collect and present ELIXIR's scientific and operational output, engage with the broad life science community and encourage discussion on proposed infrastructure solutions. Submissions will be assessed by the ELIXIR channel Advisory Board to ensure they are relevant to the ELIXIR community, and subjected to the F1000Research open peer review process. PMID:26913192

  11. CanvasDB: a local database infrastructure for analysis of targeted- and whole genome re-sequencing projects

    PubMed Central

    Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf

    2014-01-01

    CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB PMID:25281234
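    The cross-sample filtering idea can be illustrated with a few lines of Python against an in-memory SQLite table (canvasDB itself stores calls in a local database queried from R); the schema and sample names below are invented.

```python
# Toy version of family-based variant filtering: keep variants carried by all
# affected members and absent from the unaffected ones. Schema and data invented.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE variant_calls (sample TEXT, chrom TEXT, pos INTEGER, alt TEXT);
INSERT INTO variant_calls VALUES
  ('child',  'chr1', 12345, 'T'), ('mother', 'chr1', 12345, 'T'),
  ('father', 'chr1', 12345, 'T'), ('child',  'chr2', 555,   'G'),
  ('sibling','chr2', 555,   'G');
""")

def variants_of(sample):
    rows = con.execute("SELECT chrom, pos, alt FROM variant_calls WHERE sample=?",
                       (sample,))
    return set(rows.fetchall())

affected, unaffected = ["child", "mother", "father"], ["sibling"]
shared = set.intersection(*(variants_of(s) for s in affected))
candidates = shared - set.union(*(variants_of(s) for s in unaffected))
print(candidates)   # variants carried by all affected but by no unaffected member
```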

  12. CanvasDB: a local database infrastructure for analysis of targeted- and whole genome re-sequencing projects.

    PubMed

    Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf

    2014-01-01

    CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB. © The Author(s) 2014. Published by Oxford University Press.

  13. Neural Network Based Intrusion Detection System for Critical Infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd Vollmer; Ondrej Linda; Milos Manic

    2009-07-01

    Resiliency and security in control systems such as SCADA and nuclear plants are a relevant concern in today's world of hackers and malware. Computer systems used within critical infrastructures to control physical functions are not immune to the threat of cyber attacks and may be potentially vulnerable. Tailoring an intrusion detection system to the specifics of critical infrastructures can significantly improve the security of such systems. The IDS-NNM (Intrusion Detection System using Neural Network based Modeling) is presented in this paper. The main contributions of this work are: 1) the use and analysis of real network data (data recorded from an existing critical infrastructure); 2) the development of a specific window-based feature extraction technique; 3) the construction of the training dataset using randomly generated intrusion vectors; 4) the use of a combination of two neural network learning algorithms - the Error-Back Propagation and Levenberg-Marquardt - for normal behavior modeling. The presented algorithm was evaluated on previously unseen network data. The IDS-NNM algorithm proved capable of capturing all intrusion attempts present in the network communication while not generating any false alerts.
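    The window-based feature extraction idea can be sketched as follows: each time window of SCADA traffic is summarised into a fixed-length vector that a model of normal behaviour can score; the feature choice and window length are assumptions, not the exact IDS-NNM features.

```python
# Sketch of window-based feature extraction: summarise each time window of
# network traffic into a fixed-length vector. Features and window length are
# assumptions, not the IDS-NNM ones.

from statistics import mean

WINDOW_S = 10.0

def window_features(packets):
    """packets: list of dicts with 'ts', 'size', 'src', 'dst_port'."""
    if not packets:
        return [0, 0.0, 0, 0]
    return [
        len(packets),                             # packet count
        mean(p["size"] for p in packets),         # mean packet size
        len({p["src"] for p in packets}),         # distinct source hosts
        len({p["dst_port"] for p in packets}),    # distinct destination ports
    ]

def windows(packets):
    """Group a time-ordered packet list into fixed-length windows."""
    start, current = packets[0]["ts"], []
    for p in packets:
        if p["ts"] - start >= WINDOW_S:
            yield window_features(current)
            start, current = p["ts"], []
        current.append(p)
    yield window_features(current)
```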

  14. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.
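    As an illustration of such a dependency tree, the snippet below fits a shallow regression tree that explains a KPI from lower-level process and QoS metrics using scikit-learn; the metric names and data are invented.

```python
# Illustrative dependency tree: a shallow regression tree explaining a KPI
# (order fulfilment time) from process and QoS metrics. Data and names invented.

import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(200, 50, n),    # payment_service_response_ms (QoS metric)
    rng.integers(0, 2, n),     # warehouse_out_of_stock (process metric)
    rng.normal(50, 10, n),     # shipping_queue_length (process metric)
])
kpi = 1.0 + 0.004 * X[:, 0] + 2.5 * X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 0.2, n)

tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
print(export_text(tree, feature_names=[
    "payment_response_ms", "out_of_stock", "shipping_queue_len"]))
```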

  15. Implementation status of the extreme light infrastructure - nuclear physics (ELI-NP) project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gales, S., E-mail: sydney.gales@eli-np.ro; Zamfir, N. V., E-mail: sydney.gales@eli-np.ro

    2015-02-24

    The Project Extreme Light Infrastructure (ELI) is part of the European Strategic Forum for Research Infrastructures (ESFRI) Roadmap. ELI will be built as a network of three complementary pillars at the frontier of laser technologies. The ELI-NP pillar (NP for Nuclear Physics) is under construction near Bucharest (Romania) and will develop a scientific program using two 10 PW lasers and a Compton back-scattering high-brilliance and intense gamma beam, a marriage of laser and accelerator technology at the frontier of knowledge. In the present paper, the technical description of the facility, the present status of the project as well as the science, applications and future perspectives will be discussed.

  16. Benefits and Challenges of Linking Green Infrastructure and Highway Planning in the United States

    NASA Astrophysics Data System (ADS)

    Marcucci, Daniel J.; Jordan, Lauren M.

    2013-01-01

    Landscape-level green infrastructure creates a network of natural and semi-natural areas that protects and enhances ecosystem services, regenerative capacities, and ecological dynamism over long timeframes. It can also enhance quality of life and certain economic activity. Highways create a network for moving goods and services efficiently, enabling commerce, and improving mobility. A fundamentally profound conflict exists between transportation planning and green infrastructure planning because they both seek to create connected, functioning networks across the same landscapes and regions, but transportation networks, especially in the form of highways, fragment and disconnect green infrastructure networks. A key opportunity has emerged in the United States during the last ten years with the promotion of measures to link transportation and environmental concerns. In this article we examined the potential benefits and challenges of linking landscape-level green infrastructure planning and implementation with integrated transportation planning and highway project development in the United States policy context. This was done by establishing a conceptual model that identified logical flow lines from planning to implementation as well as the potential interconnectors between green infrastructure and highway infrastructure. We analyzed the relationship of these activities through literature review, policy analysis, and a case study of a suburban Maryland, USA landscape. We found that regionally developed and adopted green infrastructure plans can be instrumental in creating more responsive regional transportation plans and streamlining the project environmental review process while enabling better outcomes through more targeted mitigation. In order for benefits to occur, however, landscape-scale green infrastructure assessments and plans must be in place before integrated transportation planning and highway project development occurs. It is in the transportation community's interests to actively facilitate green infrastructure planning because it creates a more predictable environmental review context. On the other hand, for landscape-level green infrastructure, transportation planning and development is much more established and better funded and can provide a means of supporting green infrastructure planning and implementation, thereby enhancing conservation of ecological function.

  17. Benefits and challenges of linking green infrastructure and highway planning in the United States.

    PubMed

    Marcucci, Daniel J; Jordan, Lauren M

    2013-01-01

    Landscape-level green infrastructure creates a network of natural and semi-natural areas that protects and enhances ecosystem services, regenerative capacities, and ecological dynamism over long timeframes. It can also enhance quality of life and certain economic activity. Highways create a network for moving goods and services efficiently, enabling commerce, and improving mobility. A fundamentally profound conflict exists between transportation planning and green infrastructure planning because they both seek to create connected, functioning networks across the same landscapes and regions, but transportation networks, especially in the form of highways, fragment and disconnect green infrastructure networks. A key opportunity has emerged in the United States during the last ten years with the promotion of measures to link transportation and environmental concerns. In this article we examined the potential benefits and challenges of linking landscape-level green infrastructure planning and implementation with integrated transportation planning and highway project development in the United States policy context. This was done by establishing a conceptual model that identified logical flow lines from planning to implementation as well as the potential interconnectors between green infrastructure and highway infrastructure. We analyzed the relationship of these activities through literature review, policy analysis, and a case study of a suburban Maryland, USA landscape. We found that regionally developed and adopted green infrastructure plans can be instrumental in creating more responsive regional transportation plans and streamlining the project environmental review process while enabling better outcomes through more targeted mitigation. In order for benefits to occur, however, landscape-scale green infrastructure assessments and plans must be in place before integrated transportation planning and highway project development occurs. It is in the transportation community's interests to actively facilitate green infrastructure planning because it creates a more predictable environmental review context. On the other hand, for landscape-level green infrastructure, transportation planning and development is much more established and better funded and can provide a means of supporting green infrastructure planning and implementation, thereby enhancing conservation of ecological function.

  18. The Communications Competitiveness and Infrastructure Modernization Act of 1991 (S. 1200).

    ERIC Educational Resources Information Center

    Sikes, Alfred C.; Verveer, Philip L.

    1992-01-01

    Two papers present arguments for and against the Communications Competitiveness and Infrastructure Act of 1991 (S. 1200). Topics addressed include earlier policy recommendations, competition in the telecommunications industry, benefits of video dial tone availability, telephone company participation in video services, restrictions on telephone…

  19. Role of EPA in Asset Management Research – The Aging Water Infrastructure Research Program

    EPA Science Inventory

    This slide presentation provides an overview of the EPA Office of Research and Development’s Aging Water infrastructure Research Program (AWIRP). The research program origins, goals, products, and plans are described. The research program focuses on four areas: condition asses...

  20. Research Challenges in Water Infrastructure Condition Assessment, Rehabilitation and System Optimization – The U.S. Perspective

    EPA Science Inventory

    This presentation first provides an overview of U.S.EPA research activities on water infrastructure condition assessment, system rehabilitation, and asset management. It then describes in detail specific activities in pipe leak detection, water conservation and the advanced wate...

  1. Educational Infrastructure Using Virtualization Technologies: Experience at Kaunas University of Technology

    ERIC Educational Resources Information Center

    Miseviciene, Regina; Ambraziene, Danute; Tuminauskas, Raimundas; Pažereckas, Nerijus

    2012-01-01

    Many factors influence education nowadays. Educational institutions are faced with budget cuttings, outdated IT, data security management and the willingness to integrate remote learning at home. Virtualization technologies provide innovative solutions to the problems. The paper presents an original educational infrastructure using virtualization…

  2. Ohio Department of Transportation State Infrastructure Bank Annual Financial Report : Federal Fiscal Year 2009

    DOT National Transportation Integrated Search

    2009-01-01

    The Ohio Department of Transportation is pleased to present the Federal Fiscal Year (FFY) 2009 State Infrastructure Bank (SIB) Annual Financial Report. The portfolio of the FFY 2009 SIB had a total of nine loans totaling $9.0 million and one ...

  3. An integrated approach to infrastructure.

    PubMed

    Hayes, Stewart

    2010-02-01

    In an edited version of a paper presented at the IHEA (Institute of Hospital Engineering Australia) 60th National Conference 2009, Stewart Hayes, principal consultant at Jakeman Business Solutions, argues that, with "traditional" means of purchasing and maintaining critical hospital infrastructure systems "becoming less viable", a more integrated, strategic approach to procuring and providing essential hospital services is becoming ever more important, one that looks not just to the present but equally to the facility's anticipated future needs.

  4. Urban underground infrastructure mapping and assessment

    NASA Astrophysics Data System (ADS)

    Huston, Dryver; Xia, Tian; Zhang, Yu; Fan, Taian; Orfeo, Dan; Razinger, Jonathan

    2017-04-01

    This paper outlines and discusses a few associated details of a smart cities approach to the mapping and condition assessment of urban underground infrastructure. Underground utilities are critical infrastructure for all modern cities. They carry drinking water, storm water, sewage, natural gas, electric power, telecommunications, steam, etc. In most cities, the underground infrastructure reflects the growth and history of the city. Many components are aging, in unknown locations with congested configurations, and in unknown condition. The technique uses sensing and information technology to determine the state of infrastructure and provide it in an appropriate, timely and secure format for managers, planners and users. The sensors include ground penetrating radar and buried sensors for persistent sensing of localized conditions. Signal processing and pattern recognition techniques convert the data into information-laden databases for use in analytics, graphical presentations, metering and planning. The presented data are from construction of the St. Paul St. CCTA Bus Station Project in Burlington, VT; utility replacement sites in Winooski, VT; and laboratory tests of smart phone position registration and magnetic signaling. The soil conditions encountered are favorable for GPR sensing and make it possible to locate buried pipes and soil layers. The present state of the art is that the data collection and processing procedures are manual and somewhat tedious, but solutions for automating these procedures appear to be viable. Magnetic signaling with moving permanent magnets has the potential for sending low-frequency telemetry signals through soils that are largely impenetrable by other electromagnetic waves.

  5. Upgrade of the cryogenic infrastructure of SM18, CERN main test facility for superconducting magnets and RF cavities

    NASA Astrophysics Data System (ADS)

    Perin, A.; Dhalla, F.; Gayet, P.; Serio, L.

    2017-12-01

    SM18 is CERN's main facility for testing superconducting accelerator magnets and superconducting RF cavities. Its cryogenic infrastructure will have to be significantly upgraded in the coming years, starting in 2019, to meet the testing requirements for the LHC High Luminosity project and for the R&D program for superconducting magnets and RF equipment until 2023 and beyond. This article presents the assessment of the cryogenic needs based on the foreseen test program and on past testing experience. The current configuration of the cryogenic infrastructure is presented and several possible upgrade scenarios are discussed. The chosen upgrade configuration is then described, and the characteristics of the main newly required cryogenic equipment, in particular a new 35 g/s helium liquefier, are presented. The upgrade implementation strategy and plan to meet the required schedule are then described.

  6. Cafe: A Generic Configurable Customizable Composite Cloud Application Framework

    NASA Astrophysics Data System (ADS)

    Mietzner, Ralph; Unger, Tobias; Leymann, Frank

    In this paper we present Cafe (Composite Application Framework), an approach for describing configurable composite service-oriented applications and automatically provisioning them across different providers. Cafe enables independent software vendors to describe their composite service-oriented applications and the components that are used to assemble them. Components can be internal or external to the application and can be deployed in any of the delivery models present in the cloud. The components are annotated with requirements for the infrastructure they will later run on. Providers, on the other hand, advertise their infrastructure services by describing them as infrastructure capabilities. The separation of software vendors and providers enables end users and providers to follow a best-of-breed strategy by combining arbitrary applications with arbitrary providers. We show how such applications can be automatically provisioned and present an architecture and a prototype that implement the concepts.

  7. Barriers to the conduct of randomised clinical trials within all disease areas.

    PubMed

    Djurisic, Snezana; Rath, Ana; Gaber, Sabrina; Garattini, Silvio; Bertele, Vittorio; Ngwabyt, Sandra-Nadia; Hivert, Virginie; Neugebauer, Edmund A M; Laville, Martine; Hiesmayr, Michael; Demotes-Mainard, Jacques; Kubiak, Christine; Jakobsen, Janus C; Gluud, Christian

    2017-08-01

    Randomised clinical trials are key to advancing medical knowledge and to enhancing patient care, but major barriers to their conduct exist. The present paper presents some of these barriers. We performed systematic literature searches and internal European Clinical Research Infrastructure Network (ECRIN) communications during face-to-face meetings and telephone conferences from 2013 to 2017 within the context of the ECRIN Integrating Activity (ECRIN-IA) project. The following barriers to randomised clinical trials were identified: inadequate knowledge of clinical research and trial methodology; lack of funding; excessive monitoring; restrictive privacy law and lack of transparency; complex regulatory requirements; and inadequate infrastructures. There is a need for more pragmatic randomised clinical trials conducted with low risks of systematic and random errors, and multinational cooperation is essential. The present paper presents major barriers to randomised clinical trials. It also underlines the value of using a pan-European-distributed infrastructure to help investigators overcome barriers for multi-country trials in any disease area.

  8. Quantitative Analysis of the Educational Infrastructure in Colombia Through the Use of a Georeferencing Software and Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Cala Estupiñan, Jose Luis; María González Bernal, Lina; Ponz Tienda, Jose Luis; Gutierrez Bucheli, Laura Andrea; Alejandro Arboleda, Carlos

    2017-10-01

    The distribution policies of the national budget have shown an increasing trend of investment in education infrastructure. This makes it necessary to identify the territories with the greatest number of facilities (such as schools, colleges, universities and libraries) and those lacking this type of infrastructure, in order to know where possible government intervention is required. This work is not intended to give a judgment on the qualitative state of the national infrastructure. It focuses, in terms of infrastructure, on Colombia's quantitative status in the educational sector, by identifying the territories with more facilities, such as schools, colleges, universities and public libraries. To do this, a quantitative index is created to determine whether the coverage of educational infrastructure at the departmental level is sufficient, taking into account not only the number of facilities but also the population and the area of influence each one has. The study is framed within a project of the University of the Andes called "Visible Infrastructure". The index is obtained through an analytic hierarchy process (AHP) and subsequently a linear equation that reflects the variables investigated. The validation of this index is performed through correlations and regressions with social, economic and cultural indicators determined by official entities. All the information on which the analysis is based is official and public. With the end of the armed conflict, it is necessary to focus public policy planning on closing the social gaps affecting the most vulnerable populations.
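    A hedged illustration of the AHP weighting step mentioned above: criterion weights are taken as the normalized principal eigenvector of a pairwise comparison matrix, with a consistency-ratio check. The criteria and comparison values below are invented and do not come from the study.

      # Analytic hierarchy process: priority weights from a pairwise comparison matrix.
      import numpy as np

      # Criteria (hypothetical): number of facilities, population served, area of influence
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                           # normalized priority weights

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
      ri = 0.58                              # Saaty's random index for n = 3
      print("weights:", np.round(w, 3), " consistency ratio:", round(ci / ri, 3))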

  9. Large-scale parallel genome assembler over cloud computing environment.

    PubMed

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications have started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of a traditional HPC cluster.
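    As a minimal, single-machine sketch of the de Bruijn graph construction that underlies assemblers of this kind (GiGA itself distributes this over Hadoop and Giraph; the toy code below only illustrates the graph structure and is not the GiGA implementation):

      # Build a de Bruijn graph: each (k-1)-mer prefix maps to the set of
      # (k-1)-mer suffixes that follow it in the reads.
      from collections import defaultdict

      def de_bruijn(reads, k=4):
          graph = defaultdict(set)
          for read in reads:
              for i in range(len(read) - k + 1):
                  kmer = read[i:i + k]
                  graph[kmer[:-1]].add(kmer[1:])
          return graph

      reads = ["ACGTACGTGACG", "GTACGTGACGTT"]
      for node, successors in sorted(de_bruijn(reads).items()):
          print(node, "->", ", ".join(sorted(successors)))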

  10. [Sanitation and racial inequality conditions in urban Brazil: an analysis focused on the indigenous population based on the 2010 Population Census].

    PubMed

    Raupp, Ludimila; Fávaro, Thatiana Regina; Cunha, Geraldo Marcelo; Santos, Ricardo Ventura

    2017-01-01

    The aims of this study were to analyze and describe the presence and infrastructure of basic sanitation in the urban areas of Brazil, contrasting indigenous with non-indigenous households. A cross-sectional study based on microdata from the 2010 Census was conducted. The analyses were based on descriptive statistics (prevalence) and the construction of multiple logistic regression models (adjusted by socioeconomic and demographic covariates). The odds ratios were estimated for the association between the explanatory variables (covariates) and the outcome variables (water supply, sewage, garbage collection, and adequate sanitation). The statistical significance level established was 5%. Among the analyzed services, sewage proved to be the most precarious. Regarding race or color, indigenous households presented the lowest rate of sanitary infrastructure in urban Brazil. The adjusted regression showed that, in general, indigenous households were at a disadvantage when compared to other categories of race or color, especially in terms of the presence of garbage collection services. These inequalities were much more pronounced in the South and Southeastern regions. The analyses of this study not only confirm the profile of poor basic sanitation conditions and infrastructure among indigenous households in urban areas, but also demonstrate the persistence of inequalities associated with race or color in the country.
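    A hedged sketch of the kind of adjusted odds-ratio estimate described above, using a logistic regression fitted with statsmodels on synthetic data; the variable names and effect sizes are invented and do not reproduce the census analysis.

      # Adjusted odds ratio from a logistic regression (synthetic example).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 5000
      indigenous = rng.binomial(1, 0.05, n)
      income = rng.normal(0, 1, n)
      # Toy model: lower odds of garbage collection for indigenous households.
      logit_p = 1.0 - 1.2 * indigenous + 0.8 * income
      has_collection = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      X = sm.add_constant(np.column_stack([indigenous, income]))
      fit = sm.Logit(has_collection, X).fit(disp=False)
      print("adjusted odds ratios (indigenous, income):",
            np.round(np.exp(fit.params[1:]), 2))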

  11. Role of Housing Reconstruction Between Years 2000 - 2014 in Merging of Urban Structure of The North East Wrocław / Rola Zabudowy Mieszkaniowej z Lat 2000-2014 w Scalaniu Struktury Urbanistycznej Północno-Wschodniego Wrocławia

    NASA Astrophysics Data System (ADS)

    Masztalski, Robert; Michalski, Marcin

    2016-03-01

    The article presents the urban structure of north-eastern Wroclaw, where, in the vicinity of and between the historic buildings and the residential buildings of the 1970s, new buildings have been erected over the last 20 years as binding material of the urban structure. The new multifamily housing development of 2000-2014 in Psie Pole, a housing district of Wroclaw, closes the gap between the historic residential buildings of old Psie Pole and the large-panel buildings of the 1970s. The contemporary residential development uses the existing social infrastructure of the old Psie Pole district centre, as well as the social infrastructure of the 1970s housing development. In the first part, the authors analyze the spatial development of these areas on the basis of historical materials. In the following, based on an analysis of the urban structure created by the development of the last 15 years, they analyze the existing conditions, context and value (in terms of urban planning, the wealth of social infrastructure) of the contemporary housing development of Psie Pole.

  12. Landfills as critical infrastructures: analysis of observational datasets after 12 years of non-invasive monitoring

    NASA Astrophysics Data System (ADS)

    Scozzari, Andrea; Raco, Brunella; Battaglini, Raffaele

    2016-04-01

    This work presents the results of more than ten years of observations, performed on a regular basis, at a municipal solid waste disposal site located in Italy. Observational data are generated by the combination of non-invasive techniques, involving the direct measurement of biogas release to the atmosphere and thermal infrared imaging. In fact, part of the generated biogas tends to escape from the landfill surface even when collecting systems are installed and properly working. Thus, methodologies for estimating the behaviour of a landfill system by means of direct and/or indirect measurement systems have been developed in the last decades. It is nowadays known that these infrastructures produce more than 20% of the total anthropogenic methane released to the atmosphere, justifying the need for systematic and efficient monitoring of such infrastructures. During the last 12 years, observational data regarding a solid waste disposal site located in Tuscany (Italy) have been collected on a regular basis. The collected datasets consist of direct measurements of gas flux with the accumulation chamber method, combined with the detection of thermal anomalies by infrared radiometry. This work discusses the evolution of the estimated performance of the landfill system, its trends, the benefits and the critical aspects of such relatively long-term monitoring activity.

  13. A Watershed Scale Life Cycle Assessment Framework for Hydrologic Design

    NASA Astrophysics Data System (ADS)

    Tavakol-Davani, H.; Tavakol-Davani, PhD, H.; Burian, S. J.

    2017-12-01

    Sustainable hydrologic design has recently received attention from researchers with different backgrounds, including hydrologists and sustainability experts. On one hand, hydrologists have been analyzing ways to achieve hydrologic goals through implementation of recent environmentally friendly approaches, e.g. Green Infrastructure (GI), without quantifying the life cycle environmental impacts of the infrastructure through the ISO Life Cycle Assessment (LCA) method. On the other hand, sustainability experts have been applying LCA to study the life cycle impacts of water infrastructure without considering the important hydrologic aspects through hydrologic and hydraulic (H&H) analysis. In fact, defining proper system elements for a watershed-scale urban water sustainability study requires both H&H and LCA specialties, which reveals the necessity of performing an integrated, interdisciplinary study. Therefore, the present study developed a watershed-scale coupled H&H-LCA framework to bring hydrology and sustainability expertise together and help move the currently vague definition of sustainable hydrologic design toward a globally standardized concept. The proposed framework was employed to study GIs for an urban watershed in Toledo, OH. Lastly, uncertainties associated with the proposed method and parameters were analyzed through a robust Monte Carlo simulation using parallel processing. Results indicated the necessity of both hydrologic and LCA components in the design procedure in order to achieve sustainability.
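    A minimal sketch of the Monte Carlo uncertainty step mentioned above: sample uncertain life-cycle parameters and propagate them to a single impact score. The distributions and emission factors are invented for illustration and are unrelated to the Toledo case study.

      # Monte Carlo propagation of parameter uncertainty to a life-cycle impact.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000
      concrete_m3 = rng.normal(1200, 150, n)           # material quantity
      ef_concrete = rng.triangular(250, 300, 380, n)   # kg CO2e per m3
      pump_kwh    = rng.normal(40_000, 8_000, n)       # annual pumping energy
      ef_grid     = rng.normal(0.5, 0.05, n)           # kg CO2e per kWh

      impact = concrete_m3 * ef_concrete + 30 * pump_kwh * ef_grid  # 30-year horizon
      p5, p95 = np.percentile(impact, [5, 95]) / 1e3
      print("GWP (t CO2e): median %.0f, 5-95%% range %.0f-%.0f"
            % (np.median(impact) / 1e3, p5, p95))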

  14. Synergizing green and gray infrastructures to increase water supply resilience in the Brazos River basin in Texas

    NASA Astrophysics Data System (ADS)

    Gao, H.; Yamazaki, D.; Finley, T.; Bohn, T. J.; Low, G.; Sabo, J. L.

    2017-12-01

    Water infrastructure lies at the heart of the challenges and opportunities of Integrated Water Resource Management (IWRM). Green infrastructure (e.g., wetlands restoration) presents an alternative to its hard-path counterpart, gray infrastructure, which often has external, economic and unmeasured ecological costs. But the science framework to prioritize green infrastructure buildout is nascent. In this study, we addressed this gap in the Brazos River basin in Texas, in the context of corporate decisions to secure water supplies for various water stewardship objectives. We developed a physically-based tool to quantify the potential for wetland restoration to restore desired flows (hydrology), and a financial framework for comparing its cost-benefit with heightening an existing dam (conservation finance). Our framework has three components. First, we harnessed a topographic index (HAND) to identify the potential wetland sites. Second, we coupled a land surface model (VIC) with a hydrodynamic model (CaMa-Flood) to investigate the effects of wetland size, location, and vegetation on hydrology. Finally, we estimated the net present value, internal rate of return and payback period for green (wetlands) vs. gray (reservoir expansion) infrastructure. We found that wetlands have a more substantial impact on peak flow than on baseflow. Interestingly, wetlands can improve baseflow reliability, though not directly, except with the largest (>400 km2) projects. Peak flow reduction volumes of wetlands, if used as credits towards reservoir flood-control storage, provide adequate conservation storage to deliver guaranteed baseflow reliability. Hence, the synergy of existing dams with newly created wetlands offers a promising natural solution to increase water supply resilience, while green projects also generate revenue compared to their gray counterparts. This study demonstrates the possibility of using innovative engineering design to synergize green and gray infrastructures to convert water conflict into opportunity.
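    An illustrative net-present-value comparison of a green (wetland restoration) versus a gray (dam heightening) option, in the spirit of the conservation-finance component above; all cash flows and the discount rate are invented to show the calculation only, not the Brazos results.

      # Compare NPV of green vs. gray infrastructure options (toy numbers).
      def npv(rate, cashflows):
          """Discount yearly cash flows (year 0 first) to present value."""
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

      rate  = 0.05
      green = [-40e6] + [3.5e6] * 30   # restoration cost, then yearly benefits/revenue
      gray  = [-60e6] + [4.0e6] * 30   # dam heightening cost, then yearly benefits

      print("green NPV: %.1f M$" % (npv(rate, green) / 1e6))
      print("gray  NPV: %.1f M$" % (npv(rate, gray)  / 1e6))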

  15. Connecting the Empire: New Research Perspectives on Infrastructures and the Environment in the (Post)Colonial World.

    PubMed

    van der Straeten, Jonas; Hasenöhrl, Ute

    2016-12-01

    In the academic debate on infrastructures in the Global South, there is a broad consensus that (post)colonial legacies present a major challenge for a transition towards more inclusive, sustainable and adapted modes of providing services. Yet, relatively little is known about the emergence and evolution of infrastructures in former colonies. Until a decade ago, most historical studies followed Daniel Headrick's (1981) "tools of empire" thesis, painting, with broad brush strokes, a picture of infrastructures as instruments for advancing the colonial project of exploitation and subordination of non-European peoples and environments. This paper explores new research perspectives beyond this straightforward, 'diffusionist' perspective on technology transfer. In order to do so, it presents and discusses more recent studies which focus on interactive transfer processes as well as mechanisms of appropriation, and which increasingly combine approaches from imperial history, environmental history, and history of technology. There is much to gain from unpacking the changing motives and ideologies behind technology transfer; tracing the often contested and negotiated flows of ideas, technologies and knowledge within multilayered global networks; investigating the manifold ways in which infrastructures reflected and (re)produced colonial spaces and identities; critically reflecting on the utility of large (socio)technical systems (LTS) for the Global South; and approaching infrastructures in the (post)colonial world through entangled histories of technology and the environment. Following David Arnold's (2005) plea for a "more interactive, culturally-nuanced, multi-sited debate" on technology in the non-Western world, the paper offers fresh insights for a broader debate about how infrastructures work within specific parameters of time, place and culture.

  16. Current and future flood risk to railway infrastructure in Europe

    NASA Astrophysics Data System (ADS)

    Bubeck, Philip; Kellermann, Patric; Alfieri, Lorenzo; Feyen, Luc; Dillenardt, Lisa; Thieken, Annegret H.

    2017-04-01

    Railway infrastructure plays an important role in the transportation of freight and passengers across the European Union. According to Eurostat, more than four billion passenger-kilometres were travelled on national and international railway lines of the EU28 in 2014. To further strengthen transport infrastructure in Europe, the European Commission will invest another € 24.05 billion in the transnational transport network until 2020 as part of its new transport infrastructure policy (TEN-T), including railway infrastructure. Floods pose a significant risk to infrastructure elements. Damage data of recent flood events in Europe show that infrastructure losses can make up a considerable share of overall losses. For example, damage to state and municipal infrastructure in the federal state of Saxony (Germany) accounted for nearly 60% of overall losses during the large-scale event in June 2013. Especially in mountainous areas with little usable space available, roads and railway lines often follow floodplains or are located along steep and unsteady slopes. In Austria, for instance, the flood of 2013 caused € 75 million of direct damage to railway infrastructure. Despite the importance of railway infrastructure and its exposure to flooding, assessments of potential damage and risk (i.e. probability * damage) are still in their infancy compared with other sectors, such as the residential or industrial sector. Infrastructure-specific assessments at the regional scale are largely lacking. Regional assessment of potential damage to railway infrastructure has been hampered by a lack of infrastructure-specific damage models and data availability. The few available regional approaches have used damage models that assess damage to various infrastructure elements (e.g. roads, railway, airports and harbours) using one aggregated damage function and cost estimate. Moreover, infrastructure elements are often considerably underrepresented in regional land cover data, such as CORINE, due to their line shapes. To assess current and future damage and risk to railway infrastructure in Europe, we apply the damage model RAIL ('RAilway Infrastructure Loss'), which was specifically developed for railway infrastructure using empirical damage data. To adequately and comprehensively capture the line-shaped features of railway infrastructure, the assessment makes use of the open-access data set of openrailway.org. Current and future flood hazard in Europe is obtained with the LISFLOOD-based pan-European flood hazard mapping procedure combined with ensemble projections of extreme streamflow for the current century based on EURO-CORDEX RCP 8.5 climate scenarios. The presentation shows first results of the combination of the hazard data and the model RAIL for Europe.
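    A hedged sketch of the risk definition quoted above (risk = probability x damage): expected annual damage can be approximated by integrating event damage over annual exceedance probability for a set of flood return periods. All numbers below are invented and are not RAIL results.

      # Expected annual damage from a damage-vs-return-period relation (toy data).
      import numpy as np

      return_periods = np.array([10, 30, 100, 300, 1000])   # years
      damages = np.array([2, 15, 60, 140, 260]) * 1e6        # EUR per event

      prob = 1.0 / return_periods                            # annual exceedance probability
      order = np.argsort(prob)                               # integrate over increasing probability
      ead = np.trapz(damages[order], prob[order])
      print("expected annual damage: %.1f M EUR" % (ead / 1e6))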

  17. Explorations Around "Graceful Failure" in Transportation Infrastructure: Lessons Learned By the Infrastructure and Climate Network (ICNet)

    NASA Astrophysics Data System (ADS)

    Jacobs, J. M.; Thomas, N.; Mo, W.; Kirshen, P. H.; Douglas, E. M.; Daniel, J.; Bell, E.; Friess, L.; Mallick, R.; Kartez, J.; Hayhoe, K.; Croope, S.

    2014-12-01

    Recent events have demonstrated that the United States' transportation infrastructure is highly vulnerable to extreme weather events which will likely increase in the future. In light of the 60% shortfall of the $900 billion investment needed over the next five years to maintain this aging infrastructure, hardening of all infrastructures is unlikely. Alternative strategies are needed to ensure that critical aspects of the transportation network are maintained during climate extremes. Preliminary concepts around multi-tier service expectations of bridges and roads with reference to network capacity will be presented. Drawing from recent flooding events across the U.S., specific examples for roads/pavement will be used to illustrate impacts, disruptions, and trade-offs between performance during events and subsequent damage. This talk will also address policy and cultural norms within the civil engineering practice that will likely challenge the application of graceful failure pathways during extreme events.

  18. Women's health nursing in the context of the National Health Information Infrastructure.

    PubMed

    Jenkins, Melinda L; Hewitt, Caroline; Bakken, Suzanne

    2006-01-01

    Nurses must be prepared to participate in the evolving National Health Information Infrastructure and the changes that will consequently occur in health care practice and documentation. Informatics technologies will be used to develop electronic health records with integrated decision support features that will likely lead to enhanced health care quality and safety. This paper provides a summary of the National Health Information Infrastructure and highlights electronic health records and decision support systems within the context of evidence-based practice. Activities at the Columbia University School of Nursing designed to prepare nurses with the necessary informatics competencies to practice in a National Health Information Infrastructure-enabled health care system are described. Data are presented from electronic (personal digital assistant) encounter logs used in our Women's Health Nurse Practitioner program to support evidence-based advanced practice nursing care. Implications for nursing practice, education, and research in the evolving National Health Information Infrastructure are discussed.

  19. Risk assessment of sewer condition using artificial intelligence tools: application to the SANEST sewer system.

    PubMed

    Sousa, V; Matos, J P; Almeida, N; Saldanha Matos, J

    2014-01-01

    Operation, maintenance and rehabilitation comprise the main concerns of wastewater infrastructure asset management. Given the nature of the service provided by a wastewater system and the characteristics of the supporting infrastructure, technical issues are relevant to support asset management decisions. In particular, in densely urbanized areas served by large, complex and aging sewer networks, the sustainability of the infrastructures largely depends on the implementation of an efficient asset management system. The efficiency of such a system may be enhanced with technical decision support tools. This paper describes the role of artificial intelligence tools such as artificial neural networks and support vector machines for assisting the planning of operation and maintenance activities of wastewater infrastructures. A case study of the application of this type of tool to the wastewater infrastructures of Sistema de Saneamento da Costa do Estoril is presented.
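    A hedged sketch of a support-vector classifier for sewer condition, in the spirit of the artificial intelligence tools mentioned above; the features, labels and data are synthetic, and the model is not the one applied to the SANEST system.

      # Classify sewer segments as poor/acceptable condition with an SVM (toy data).
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      n = 600
      age      = rng.uniform(0, 80, n)       # years
      diameter = rng.uniform(150, 1200, n)   # mm
      depth    = rng.uniform(1, 6, n)        # m
      # Toy rule: older, shallower, smaller pipes are more often in poor condition.
      poor = (0.03 * age - 0.001 * diameter - 0.3 * depth + rng.normal(0, 1, n)) > 0

      X = np.column_stack([age, diameter, depth])
      X_tr, X_te, y_tr, y_te = train_test_split(X, poor, random_state=0)
      model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)
      print("held-out accuracy: %.2f" % model.score(X_te, y_te))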

  20. A hybrid method for protection against threats to a network infrastructure for an electronic warfare management system

    NASA Astrophysics Data System (ADS)

    Byłak, Michał; RóŻański, Grzegorz

    2017-04-01

    The article presents the concept of ensuring the security of network information infrastructure for the management of Electronic Warfare (EW) systems. The concept takes into account the reactive and proactive tools against threats. An overview of the methods used to support the safety of IT networks and information sources about threats is presented. Integration of mechanisms that allow for effective intrusion detection and rapid response to threats in a network has been proposed. The architecture of the research environment is also presented.

  1. Arctic cities and climate change: climate-induced changes in stability of Russian urban infrastructure built on permafrost

    NASA Astrophysics Data System (ADS)

    Shiklomanov, Nikolay; Streletskiy, Dmitry; Swales, Timothy

    2014-05-01

    Planned socio-economic development during the Soviet period promoted migration into the Arctic and work force consolidation in urbanized settlements to support mineral resources extraction and transportation industries. These policies have resulted in a very high level of urbanization in the Soviet Arctic. Despite the mass migration from the northern regions during the 1990s following the collapse of the Soviet Union and the diminishing government support, the Russian Arctic population remains predominantly urban. In five Russian administrative regions underlain by permafrost and bordering the Arctic Ocean, 66 to 82% (depending on region) of the total population is living in Soviet-era urban communities. The political, economic and demographic changes in the Russian Arctic over the last 20 years are further complicated by climate change, which is greatly amplified in the Arctic region. One of the most significant impacts of climate change on arctic urban landscapes is the warming and degradation of permafrost, which negatively affects the structural integrity of infrastructure. The majority of structures in the Russian Arctic are built according to the passive principle, which promotes equilibrium between the permafrost thermal regime and infrastructure foundations. This presentation is focused on quantitative assessment of potential changes in stability of Russian urban infrastructure built on permafrost in response to ongoing and future climatic changes, using a permafrost-geotechnical model forced by GCM-projected climate. To address the uncertainties in GCM projections we have utilized results from six models that participated in the most recent IPCC model inter-comparison project. The analysis was conducted for the entire extent of the Russian permafrost-affected area and on several representative urban communities. Our results demonstrate that the significant observed reduction in urban infrastructure stability throughout the Russian Arctic can be attributed to climatic changes and that projected future climatic changes will further negatively affect communities on permafrost. However, the uncertainties in magnitude and spatial and temporal patterns of projected climate change produced by individual GCMs translate to substantial variability of the future state of infrastructure built on permafrost.

  2. Water Resources Sustainability in Northwest Mexico: Analysis of Regional Infrastructure Plans under Historical and Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Che, D.; Robles-Morua, A.; Mayer, A. S.; Vivoni, E. R.

    2012-12-01

    The arid state of Sonora, Mexico, has embarked on a large water infrastructure project to provide additional water supply and improved sanitation to the growing capital of Hermosillo. The main component of the Sonora SI project involves an interbasin transfer from rural to urban water users that has generated conflicts over water among different social sectors. Through interactions with regional stakeholders from agricultural and water management agencies, we ascertained the need for a long-term assessment of the water resources of one of the system components, the Sonora River Basin (SRB). A semi-distributed, daily watershed model that includes current and proposed reservoir infrastructure was applied to the SRB. This simulation framework allowed us to explore alternative scenarios of water supply from the SRB to Hermosillo under historical (1980-2010) and future (2031-2040) periods that include the impact of climate change. We compared three precipitation forcing scenarios for the historical period: (1) a network of ground observations from Mexican water agencies; (2) gridded fields from the North America Land Data Assimilation System (NLDAS) at 12 km resolution; and (3) gridded fields from the Weather Research and Forecasting (WRF) model at 10 km resolution. These were compared to daily historical observations at two stream gauging stations and two reservoirs to generate confidence in the simulation tools. We then tested the impact of climate change through the use of the A2 emissions scenario and HadCM3 boundary forcing on the WRF simulations of a future period. Our analysis is focused on the combined impact of existing and proposed reservoir infrastructure at two new sites on water supply management in the SRB under historical and future climate conditions. We also explore the impact of climate variability and change on the bimodal precipitation pattern from winter frontal storms and the summertime North American monsoon and its consequences on water management. Our results are presented in the form of flow duration, reliability and exceedance frequency curves that are commonly used by water management agencies. Through this effort, we anticipate building confidence among regional stakeholders in utilizing hydrological models in the development of water infrastructure plans and fostering conversations that address water sustainability issues.
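    A minimal sketch of the flow-duration / exceedance-frequency curves mentioned above, computed from a synthetic daily streamflow series (the distribution parameters are invented; they are not SRB data).

      # Flow duration curve: flow magnitude vs. fraction of time it is exceeded.
      import numpy as np

      rng = np.random.default_rng(4)
      flows = rng.lognormal(mean=2.0, sigma=0.8, size=365 * 30)   # m3/s, synthetic

      sorted_flows = np.sort(flows)[::-1]                          # descending
      exceedance = np.arange(1, len(sorted_flows) + 1) / (len(sorted_flows) + 1)

      for p in (0.05, 0.50, 0.95):
          q = np.interp(p, exceedance, sorted_flows)
          print("flow exceeded %d%% of the time: %.1f m3/s" % (int(p * 100), q))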

  3. The Jericho Option: Al-Qa'ida and Attacks on Critical Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackerman, G; Blair, C; Bale, J

    There is no doubt that al-Qaida and its affiliates have displayed, and continue to display, an acute interest in attacking targets that are considered to be important components of the infrastructure of the United States. What has not thus far been carried out, however, is an in-depth examination of the basic nature, historical evolution, and present scope of the organization's objectives that might help government personnel develop sound policy recommendations and analytical indicators to assist in detecting and interdicting plots of this nature. This study was completed with the financial support of the Lawrence Livermore National Laboratory, through a project sponsored by the U.S. Department of Homeland Security, Science and Technology Directorate. It is specifically intended to increase counterterrorism analysts' understanding of certain features of al-Qaida's strategy and operations in order to facilitate the anticipation and prevention of attacks directed against our most critical infrastructures. The procedure adopted herein has involved consulting a wide variety of source materials that bear on the topic, ranging from sacred religious texts and historical accounts to al-Qaida-linked materials and the firsthand testimony of captured members of the group. It has also intentionally combined multiple approaches, including exploring the more esoteric religion-historical referents that have served to influence al-Qaida's behavior, providing a strategic analysis of its objectives and targeting rationales, closely examining the statements and writings of al-Qaida leaders and spokesmen (in part on the basis of material translated from primary sources), offering a descriptive analysis of its past global attack patterns, and producing concise but nonetheless in-depth case studies of its previous "infrastructural" attacks on U.S. soil. The analyses contained herein tend to support the preliminary assessment made by some of the authors in an earlier report, namely, that transnational jihadist organizations are amongst the extremist groups that are most likely to carry out successful attacks against targets that U.S. officials would categorize as elements of this country's critical infrastructure. These networks clearly have the operational capabilities to conduct these types of attacks, even on a large scale, and they display a number of ideological proclivities that may incline them to attack such targets. Although this seems self-evident, this study has also yielded more detailed insights into the behavior and orientation of al-Qaida and its affiliated networks.

  4. Against Infrastructure: Curating Community Literacy in a Jail Writing Program

    ERIC Educational Resources Information Center

    Jacobi, Tobi

    2016-01-01

    This essay argues that while fostering individual and collaborative literacy can indeed promote self-awareness, confidence, and political awareness, the threat of emotional and material retribution is ever-present in jail, making the development of infrastructure challenging. Such reality compels engaged teacher-researchers to develop tactical…

  5. EPA-WERF Cooperative Agreement: Innovation and Research for Water Infrastructure for the 21st Century

    EPA Science Inventory

    This is a brief slide presentation that will provide an overview of several projects that are being conducted in EPA-WERF Cooperative Agreement, Innovation and Research for Water Infrastructure for the 21st Century. The cooperative agreement objectives are to produce, evaluate, &...

  6. Adaptation of irrigation infrastructure on irrigation demands under future drought in the USA

    USDA-ARS?s Scientific Manuscript database

    More severe droughts in the United States will bring great challenges to irrigation water supply. Here, the authors assessed the potential adaptive effects of irrigation infrastructure under present and more extensive droughts. Based on data over 1985–2005, this study established a statistical model...

  7. Ohio Department of Transportation State Infrastructure Bank Annual Financial Report : Federal Fiscal Year 2004

    DOT National Transportation Integrated Search

    2004-01-01

    The Ohio Department of Transportation is pleased to present the Federal Fiscal Year 2004 State Infrastructure Bank (SIB) Annual Financial Report. The portfolio of the FFY 04 SIB had a total of nineteen loans in the amount of $47,340,891. A comp...

  8. Software Engineering Infrastructure in a Large Virtual Campus

    ERIC Educational Resources Information Center

    Cristobal, Jesus; Merino, Jorge; Navarro, Antonio; Peralta, Miguel; Roldan, Yolanda; Silveira, Rosa Maria

    2011-01-01

    Purpose: The design, construction and deployment of a large virtual campus are a complex issue. Present virtual campuses are made of several software applications that complement e-learning platforms. In order to develop and maintain such virtual campuses, a complex software engineering infrastructure is needed. This paper aims to analyse the…

  9. The Classroom Infrastructure and the Early Learner: Reducing Aggression during Transition Times

    ERIC Educational Resources Information Center

    Guardino, Caroline; Fullerton, Elizabeth Kirby

    2012-01-01

    High levels of aggressive behaviors were observed during transition times in two self-contained special education classrooms: a kindergarten and a pre-kindergarten. The present case studies examine how modifying the classroom infrastructure impacts students' aggressive behavior. Teachers were assisted in the use of select modifications (visual…

  10. Early Intervention Service Coordination Policies: National Policy Infrastructure

    ERIC Educational Resources Information Center

    Harbin, Gloria L.; Bruder, Mary Beth; Adams, Candace; Mazzarella, Cynthia; Whitbread, Kathy; Gabbard, Glenn; Staff, Ilene

    2004-01-01

    Effective implementation of service coordination in early intervention, as mandated by the Individuals with Disabilities Education Act, remains a challenge for most states. The present study provides a better understanding of the various aspects of the policy infrastructure that undergird service coordination across the United States. Data from a…

  11. 49 CFR 260.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... appropriations and which must be paid by Applicant or its non-Federal infrastructure partner before that direct... provisions of this part. (j) Including means including but not limited to. (k) Infrastructure partner means... subtitle IV of title 49, United States Code. (s) Subsidy cost of a direct loan means the net present value...

  12. Map of Water Infrastructure and Homes Without Access to Safe Drinking Water and Basic Sanitation on the Navajo Nation - October 2010

    EPA Pesticide Factsheets

    This document presents the results of completed work using existing geographic information system (GIS) data to map existing water and sewer infrastructure and homes without access to safe drinking water and basic sanitation on the Navajo Nation.

  13. Ohio Department of Transportation State Infrastructure Bank Annual Financial Report : Federal Fiscal Year 2008

    DOT National Transportation Integrated Search

    2008-01-01

    The Ohio Department of Transportation is pleased to present the Federal Fiscal Year (FFY) 2008 State Infrastructure Bank (SIB) Annual Financial Report. The portfolio of the FFY 2008 SIB had a total of five loans totaling $22.1 million. Since the begi...

  14. Ohio Department of Transportation State Infrastructure Bank Annual Financial Report : Federal Fiscal Year 2007

    DOT National Transportation Integrated Search

    2007-01-01

    The Ohio Department of Transportation is pleased to present the Federal Fiscal Year (FFY) 2007 State Infrastructure Bank (SIB) Annual Financial Report. The portfolio of the FFY 2007 SIB had a total of 13 loans and 1 bond in the amount of $17....

  15. Autonomous low-power magnetic data collection platform to enable remote high latitude array deployment.

    PubMed

    Musko, Stephen B; Clauer, C Robert; Ridley, Aaron J; Arnett, Kenneth L

    2009-04-01

    A major driver in the advancement of geophysical sciences is improvement in the quality and resolution of data for use in scientific analysis, discovery, and for assimilation into or validation of empirical and physical models. The need for more and better measurements together with improvements in technical capabilities is driving the ambition to deploy arrays of autonomous geophysical instrument platforms in remote regions. This is particularly true in the southern polar regions where measurements are presently sparse due to the remoteness, lack of infrastructure, and harshness of the environment. The need for the acquisition of continuous long-term data from remote polar locations exists across geophysical disciplines and is a generic infrastructure problem. The infrastructure, however, to support autonomous instrument platforms in polar environments is still in the early stages of development. We report here the development of an autonomous low-power magnetic variation data collection system. Following 2 years of field testing at the south pole station, the system is being reproduced to establish a dense chain of stations on the Antarctic plateau along the 40 degrees magnetic meridian. The system is designed to operate for at least 5 years unattended and to provide data access via satellite communication. The system will store 1 s measurements of the magnetic field variation (<0.2 nT resolution) in three vector components plus a variety of engineering status and environment parameters. We believe that the data collection platform can be utilized by a variety of low-power instruments designed for low-temperature operation. The design, technical characteristics, and operation results are presented here.

  16. Evolution of the pilot infrastructure of CMS: towards a single glideinWMS pool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belforte, S.; Gutsche, O.; Letts, J.

    2014-01-01

    CMS production and analysis job submission is based largely on glideinWMS and pilot submissions. The transition from multiple different submission solutions like gLite WMS and HTCondor-based implementations was carried out over several years and is now coming to a conclusion. The historically separate glideinWMS pools for different types of production jobs and analysis jobs are being unified into a single global pool. This enables CMS to benefit from global prioritization and scheduling possibilities. It also presents the sites with only one kind of pilot and eliminates the need to make scheduling decisions at the CE level. This paper provides an analysis of the benefits of a unified resource pool, as well as a description of the resulting global policy. It will explain the technical challenges moving forward and present solutions to some of them.

  17. Infrastructure stability surveillance with high resolution InSAR

    NASA Astrophysics Data System (ADS)

    Balz, Timo; Düring, Ralf

    2017-02-01

    The construction of new infrastructure in largely unknown and difficult environments, as is necessary for the construction of the New Silk Road, can lead to decreased stability along the construction site, leading to an increase in landslide risk and deformation caused by surface motion. This generally requires a thorough pre-analysis and consecutive surveillance of the deformation patterns to ensure the stability and safety of the infrastructure projects. Interferometric SAR (InSAR) and the derived techniques of multi-baseline InSAR are very powerful tools for large-area observation of surface deformation patterns. With InSAR and derived techniques, the topographic height and the surface motion can be estimated for large areas, making them an ideal tool for supporting the planning, construction, and safety surveillance of new infrastructure elements in remote areas.
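    A hedged numeric illustration of the basic relation behind InSAR deformation monitoring: an unwrapped differential phase maps to line-of-sight displacement through the radar wavelength. The wavelength and phase value below are example numbers, and the sign convention varies between processors.

      # Convert interferometric phase to line-of-sight displacement.
      import numpy as np

      wavelength = 0.031                     # m, C-band (~3.1 cm)
      delta_phi = np.deg2rad(90.0)           # unwrapped differential phase, rad

      d_los = -(wavelength / (4 * np.pi)) * delta_phi
      print("line-of-sight displacement: %.2f mm" % (d_los * 1e3))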

  18. Sea Level Rise Impacts on Wastewater Treatment Systems Along the U.S. Coasts

    NASA Astrophysics Data System (ADS)

    Hummel, Michelle A.; Berry, Matthew S.; Stacey, Mark T.

    2018-04-01

    As sea levels rise, coastal communities will experience more frequent and persistent nuisance flooding, and some low-lying areas may be permanently inundated. Critical components of lifeline infrastructure networks in these areas are also at risk of flooding, which could cause significant service disruptions that extend beyond the flooded zone. Thus, identifying critical infrastructure components that are exposed to sea level rise is an important first step in developing targeted investment in protective actions and enhancing the overall resilience of coastal communities. Wastewater treatment plants are typically located at low elevations near the coastline to minimize the cost of collecting consumed water and discharging treated effluent, which makes them particularly susceptible to coastal flooding. For this analysis, we used geographic information systems to assess the exposure of wastewater infrastructure to various sea level rise projections at the national level. We then estimated the number of people who would lose wastewater services, which could be more than five times as high as previous predictions of the number of people at risk of direct flooding due to sea level rise. We also performed a regional comparison of wastewater exposure to marine and groundwater flooding in the San Francisco Bay Area. Overall, this analysis highlights the widespread exposure of wastewater infrastructure in the United States and demonstrates that local disruptions to infrastructure networks may have far-ranging impacts on areas that do not experience direct flooding.
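    A hedged sketch of the exposure screening described above: count treatment plants, and the populations they serve, whose site elevation falls below a set of sea level rise scenarios. The plant table is synthetic and the thresholds are illustrative only.

      # Screen wastewater plants for exposure under sea level rise scenarios (toy data).
      import pandas as pd

      plants = pd.DataFrame({
          "plant": ["A", "B", "C", "D"],
          "elevation_m": [0.8, 1.6, 2.9, 0.4],          # site elevation above current high water
          "population_served": [120_000, 45_000, 300_000, 80_000],
      })

      for slr in (0.5, 1.0, 2.0):                        # metres of sea level rise
          exposed = plants[plants["elevation_m"] <= slr]
          print("SLR %.1f m: %d plants exposed, %d people lose service"
                % (slr, len(exposed), exposed["population_served"].sum()))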

  19. Human Planetary Landing System (HPLS) Capability Roadmap NRC Progress Review

    NASA Technical Reports Server (NTRS)

    Manning, Rob; Schmitt, Harrison H.; Graves, Claude

    2005-01-01

    Presentation outline: Capability Roadmap Team; Capability Description, Scope and Capability Breakdown Structure; Benefits of the HPLS; Roadmap Process and Approach; Current State-of-the-Art, Assumptions and Key Requirements; Top Level HPLS Roadmap; Capability Presentations by Leads; Mission Drivers Requirements; "AEDL" System Engineering; Communication & Navigation Systems; Hypersonic Systems; Super to Subsonic Decelerator Systems; Terminal Descent and Landing Systems; A Priori In-Situ Mars Observations; AEDL Analysis, Test and Validation Infrastructure; Capability Technical Challenges; Capability Connection Points to other Roadmaps/Crosswalks; Summary of Top Level Capability; Forward Work.

  20. Design and Implementation of a Computation Server for Optimization with Application to the Analysis of Critical Infrastructure

    DTIC Science & Technology

    2013-06-01

    for crop irrigation. The disruptions also idled key industries, led to billions of dollars of lost productivity, and stressed the entire Western... modify the super-system, and to resume the super-system run. 2.2 Requirements: An important step in the software development life cycle is to capture... detects the .gms file and associated files in the remote directory that is allocated to the user. 4. If all of the files are present, the files are

  1. Jobs and Economic Development from New Transmission and Generation in Wyoming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lantz, E.; Tegen, S.

    2011-03-01

    This report is intended to inform policymakers, local government officials, and Wyoming residents about the jobs and economic development activity that could occur should new infrastructure investments in Wyoming move forward. The report and analysis presented is not a projection or a forecast of what will happen. Instead, the report uses a hypothetical deployment scenario and economic modeling tools to estimate the jobs and economic activity likely associated with these projects if or when they are built.

  2. Jobs and Economic Development from New Transmission and Generation in Wyoming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lantz, Eric; Tegen, Suzanne

    2011-03-31

    This report is intended to inform policymakers, local government officials, and Wyoming residents about the jobs and economic development activity that could occur should new infrastructure investments in Wyoming move forward. The report and analysis presented is not a projection or a forecast of what will happen. Instead, the report uses a hypothetical deployment scenario and economic modeling tools to estimate the jobs and economic activity likely associated with these projects if or when they are built.

  3. State of the South: Building an Infrastructure of Opportunity for the Next Generation

    ERIC Educational Resources Information Center

    Guillory, Ferrel; Mitchell, Kate; Parcell, Abby; Hart, Richard; Zandt, Alyson; Caldwell, Beth; Robertson, Shun; Rose, Max; Dodson, David

    2014-01-01

    The 2014 edition of "State of the South" features analysis of state and regional data and calls on the region to develop and implement purposeful policies and systemic practices--an "infrastructure of opportunity"--to bolster the prospects for its 15- to 24-year-olds to achieve economic resilience as adults and a fulfilling…

  4. The impact of green stormwater infrastructure installation on surrounding health and safety

    Treesearch

    Michelle C. Kondo; Sarah C. Low; Jason Henning; Charles C. Branas

    2015-01-01

    We investigated the health and safety effects of urban green stormwater infrastructure (GSI) installments. We conducted a difference-in-differences analysis of the effects of GSI installments on health (e.g., blood pressure, cholesterol and stress levels) and safety (e.g., felonies, nuisance and property crimes, narcotics crimes) outcomes from 2000 to 2012 in...

  5. Organizing English Learner Instruction in New Immigrant Destinations: District Infrastructure and Subject-Specific School Practice

    ERIC Educational Resources Information Center

    Hopkins, Megan; Lowenhaupt, Rebecca; Sweet, Tracy M.

    2015-01-01

    In the context of shifting demographics and standards-based reform, school districts in new immigrant destinations are charged with designing infrastructures that support teaching and learning for English learners (ELs) in core academic subjects. This article uses qualitative data and social network analysis to examine how one district in the…

  6. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
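
    As a purely illustrative sketch of this kind of land-use optimization (not the paper's formulation), the snippet below allocates a hypothetical floodplain area among residential, commercial, and flood channel uses to maximize net benefit, with expected flood damages already folded into the per-hectare coefficients; all numbers and constraints are assumptions.

      from scipy.optimize import linprog

      # Hypothetical per-hectare values: development benefit minus probability-weighted
      # expected annual flood damage ($k/ha).
      net_benefit = {"residential": 40.0, "commercial": 65.0, "channel": -5.0}
      uses = list(net_benefit)

      total_area = 100.0      # ha available in the floodplain
      min_channel = 15.0      # ha reserved for conveyance (zoning rule)
      max_commercial = 30.0   # ha cap on commercial zoning

      c = [-net_benefit[u] for u in uses]          # linprog minimizes, so negate benefits
      A_ub = [[1.0, 1.0, 1.0],                     # total area constraint
              [0.0, 1.0, 0.0],                     # commercial cap
              [0.0, 0.0, -1.0]]                    # -channel <= -min_channel
      b_ub = [total_area, max_commercial, -min_channel]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
      print(dict(zip(uses, res.x)), "net benefit ($k):", -res.fun)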

  7. Common Badging and Access Control System (CBACS)

    NASA Technical Reports Server (NTRS)

    Dischinger, Portia

    2005-01-01

    This slide presentation describes NASA's Common Badging and Access Control System. NASA began a Smart Card implementation in January 2004. Following site surveys, it was determined that NASA's badging and access control systems required upgrades to common infrastructure in order to provide flexibility, usability, and return on investment prior to a smart card implementation. Common Badging and Access Control System (CBACS) provides the common infrastructure from which FIPS-201 compliant processes, systems, and credentials can be developed and used.

  8. Primary care access barriers as reported by nonurgent emergency department users: implications for the US primary care infrastructure.

    PubMed

    Hefner, Jennifer L; Wexler, Randy; McAlearney, Ann Scheck

    2015-01-01

    The objective was to explore variation by insurance status in patient-reported barriers to accessing primary care. The authors fielded a brief, anonymous, voluntary survey of nonurgent emergency department (ED) visits at a large academic medical center and conducted descriptive analysis and thematic coding of 349 open-ended survey responses. The privately insured predominantly reported primary care infrastructure barriers: wait time in clinic and for an appointment, constraints related to conventional business hours, and difficulty finding a primary care provider (because of geography or lack of new patient openings). Half of those insured by Medicaid and/or Medicare also reported these infrastructure barriers. In contrast, the uninsured predominantly reported insurance, income, and transportation barriers. Given that insured nonurgent ED users frequently report infrastructure barriers, these should be the focus of patient-level interventions to reduce nonurgent ED use and of health system-level policies to enhance the capacity of the US primary care infrastructure. © 2014 by the American College of Medical Quality.

  9. LIBS-LIF-Raman: a new tool for the future E-RIHS

    NASA Astrophysics Data System (ADS)

    Detalle, Vincent; Bai, Xueshi; Bourguignon, Elsa; Menu, Michel; Pallot-Frossard, Isabelle

    2017-07-01

    France is one of the countries involved in the future E-RIHS - European Research Infrastructure for Heritage Science. This research infrastructure, dedicated to the study of materials of cultural and natural heritage, will provide transnational access to state-of-the-art technologies (synchrotron, ion beams, lasers, portable methods, etc.) and scientific archives. E-RIHS addresses the experimental problems of understanding and conserving heritage materials (collections of art and natural history museums, monuments, archaeological sites, archives, libraries, etc.). Cultural artefacts are characterized by complementary methods at multiple scales. Their variety and hybrid nature are specific to these artefacts and raise complex problems not encountered in traditional natural science: paints, ceramics and glasses, metals, palaeontological specimens, lithic materials, graphic documents, etc. For that purpose, E-RIHS develops transnational access to distributed platforms in many European countries. Five complementary forms of access are available: FIXLAB (access to fixed platforms for synchrotron, neutrons, ion beams, lasers, etc.), MOLAB (access to mobile examination and analytical methods to study the works in situ), ARCHLAB (access to scientific archives kept in the cultural institutions), DIGILAB (access to a digital infrastructure for the processing of quantitative data, implementing a policy on (re)use of data, choice of data formats, etc.) and finally EXPERTLAB (panels of experts for the implementation of collaborative and multidisciplinary projects for the study, the analysis and the conservation of heritage works). Thus E-RIHS is specifically involved in complex studies for the development of advanced high-resolution analytical and imaging tools. The privileged field of intervention of the infrastructure is the study of large corpora, collections and architectural ensembles. Building on previous I3 European programmes, and especially the IPERION-CH programme that supports the creation of new mobile instrumentation, the French institutions are involved in the development of LIBS/LIF/Raman portable instrumentation. After a presentation of the challenge and the multiple advantages of building the European infrastructure and of the French E-RIHS hub, we discuss the major benefits of combining the three laser-based analytical methods for a more global and precise characterization of heritage objects, taking into account their precious character and specific constraints. Lastly, some preliminary results will be presented to give a first idea of the power of this analytical tool.

  10. Challenges for Data Archival Centers in Evolving Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Cook, R. B.; Gu, L.; Santhana Vannan, S. K.; Beaty, T.

    2015-12-01

    Environmental science has entered a big data era as enormous amounts of data about the Earth environment are continuously collected through field and airborne missions, remote sensing observations, model simulations, sensor networks, etc. An open-access, open-management data infrastructure for data-intensive science is a major grand challenge in global environmental research (BERAC, 2010). Such an infrastructure, as exemplified in EOSDIS, GEOSS, and NSF EarthCube, will support the complete lifecycle of environmental data and ensure that data flow smoothly among the phases of collection, preservation, integration, and analysis. Data archival centers, as the data integration units closest to data providers, serve as the driving force for compiling and integrating heterogeneous environmental data into this global infrastructure. This presentation discusses the interoperability challenges and practices of the geosciences from the perspective of data archival centers, based on the operational experiences of the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and related environmental data management activities. Specifically, we will discuss the challenges of 1) encouraging and helping scientists to more actively share data with the broader scientific community, so that valuable environmental data, especially dark data collected by individual scientists in small independent projects, can be shared and integrated into the infrastructure to tackle big science questions; 2) curating heterogeneous multi-disciplinary data, focusing on the key aspects of identification, format, metadata, data quality, and semantics to make them ready to be plugged into a global data infrastructure, highlighting data curation practices at the ORNL DAAC for global campaigns such as BOREAS, LBA, and SAFARI 2000; and 3) enhancing the capabilities to more effectively and efficiently expose and deliver "big" environmental data to a broad range of users and systems. Experiences and challenges with integrating large data sets via the ORNL DAAC's data discovery and delivery Web services will be discussed.

  11. Coupling Adaptation Tipping Points and Engineering Options: New Insights for Resilient Water Infrastructure Replacement Planning

    NASA Astrophysics Data System (ADS)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2017-12-01

    This work presents an innovative approach to replacement planning for aging water infrastructure given uncertain future conditions. We draw upon two existing methodologies to develop an integrated long-term replacement planning framework. We first expand the concept of Adaptation Tipping Points to generate long-term planning timelines that incorporate drivers of investment related both to internal structural processes and to changes in external operating conditions. Then, we use Engineering Options to explore different actions taken at key moments in this timeline. In contrast to the traditionally more static approach to infrastructure design, designing the next generation of infrastructure so that it can be changed incrementally is a promising method to safeguard current investments given future uncertainty. This up-front inclusion of structural options in the system actively facilitates future adaptation, transforming uncertainty management in infrastructure planning from reactive to more proactive. A two-part model underpins this approach. A simulation model generates diverse future conditions, allowing development of timelines of intervention moments in the structure's life. This feeds into an economic model, evaluating the lifetime performance of different replacement strategies, making explicit the value of different designs and their flexibility. A proof of concept study demonstrates this approach for a pumping station. The strategic planning timelines for this structure demonstrate that moments when capital interventions become necessary due to reduced functionality from structural degradation or changed operating conditions are widely spread over the structure's life. The disparate timing of these necessary interventions supports an incremental, adaptive mindset when considering end-of-life and replacement decisions. The analysis then explores different replacement decisions, varying the size and specific options included in the proposed new structure. Results show that incremental adaptive designs and incorporating options can improve economic performance, as compared to traditional, "build it once & build it big" designs. The benefit from incorporating flexibility varies with structural functionality, future conditions and the specific options examined.
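
    To make the two-part (simulation plus economic) model concrete, the sketch below simulates uncertain demand on a pumping station and compares the expected discounted cost of a single large design against an incremental design holding an expansion option that is exercised only when demand requires it. The demand process, costs, discount rate, and decision rule are hypothetical and are not taken from the study.

      import numpy as np

      rng = np.random.default_rng(0)
      YEARS, SIMS, RATE = 50, 10_000, 0.04
      disc = 1.0 / (1.0 + RATE) ** np.arange(YEARS)

      # Uncertain future demand for pumping capacity (random walk, hypothetical units).
      demand = np.cumsum(rng.normal(0.5, 1.5, size=(SIMS, YEARS)), axis=1) + 20.0

      def npv_cost(capacity0, expand_to=None, expand_cost=0.0, build_cost=0.0,
                   shortfall_penalty=2.0):
          """Expected discounted cost of a design; the expansion option is exercised only when needed."""
          cap = np.full((SIMS, YEARS), float(capacity0))
          cost = np.zeros((SIMS, YEARS))
          cost[:, 0] += build_cost
          if expand_to is not None:
              trigger = demand > capacity0
              first = np.where(trigger.any(axis=1), trigger.argmax(axis=1), -1)
              for s in range(SIMS):
                  if first[s] >= 0:                    # expand in the first year demand exceeds capacity
                      cap[s, first[s]:] = expand_to
                      cost[s, first[s]] += expand_cost
          cost += shortfall_penalty * np.maximum(demand - cap, 0.0)  # penalty for unmet demand
          return (cost * disc).mean(axis=0).sum()

      big_once = npv_cost(capacity0=60, build_cost=100.0)
      flexible = npv_cost(capacity0=35, expand_to=60, expand_cost=45.0, build_cost=70.0)
      print(f"Build big once: {big_once:.1f}   Incremental with option: {flexible:.1f}")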

  12. Landscape of the EU-US Research Infrastructures and actors: Moving towards international interoperability of earth system data

    NASA Astrophysics Data System (ADS)

    Asmi, Ari; Powers, Lindsay

    2015-04-01

    Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, due to practical and historical reasons, RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides necessary diversity to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially between the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of efforts and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataOne, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these, and other, entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures to accomplish the following: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for development of future research infrastructures; and identify opportunities for Research Infrastructures (RIs) to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. We present the current status of the landscape analysis, aiming to create a sustainable effort towards removing barriers to interoperability on a global scale.

  13. Impacts of Bicycle Infrastructure in Mid-Sized Cities (IBIMS): protocol for a natural experiment study in three Canadian cities

    PubMed Central

    Winters, Meghan; Branion-Calles, Michael; Therrien, Suzanne; Fuller, Daniel; Gauvin, Lise; Whitehurst, David G T; Nelson, Trisalyn

    2018-01-01

    Introduction Bicycling is promoted as a transportation and population health strategy globally. Yet bicycling has low uptake in North America (1%–2% of trips) compared with European bicycling cities (15%–40% of trips) and shows marked sex and age trends. Safety concerns due to collisions with motor vehicles are primary barriers. To attract the broader population to bicycling, many cities are making investments in bicycle infrastructure. These interventions hold promise for improving population health given the potential for increased physical activity and improved safety, but such outcomes have been largely unstudied. In 2016, the City of Victoria, Canada, committed to build a connected network of infrastructure that separates bicycles from motor vehicles, designed to attract people of ‘all ages and abilities’ to bicycling. This natural experiment study examines the impacts of the City of Victoria’s investment in a bicycle network on active travel and safety outcomes. The specific objectives are to (1) estimate changes in active travel, perceived safety and bicycle safety incidents; (2) analyse spatial inequities in access to bicycle infrastructure and safety incidents; and (3) assess health-related economic benefits. Methods and analysis The study is in three Canadian cities (intervention: Victoria; comparison: Kelowna, Halifax). We will administer population-based surveys in 2016, 2018 and 2021 (1000 people/city). The primary outcome is the proportion of people reporting bicycling. Secondary outcomes are perceived safety and bicycle safety incidents. Spatial analyses will compare the distribution of bicycle infrastructure and bicycle safety incidents across neighbourhoods and across time. We will also calculate the economic benefits of bicycling using WHO’s Health Economic Assessment Tool. Ethics and dissemination This study received approval from the Simon Fraser University Office of Research Ethics (study no. 2016s0401). Findings will be disseminated via a website, presentations to stakeholders, at academic conferences and through peer-reviewed journal articles. PMID:29358440

  14. Advanced European Network of E-Infrastructures for Astronomy with the SKA

    NASA Astrophysics Data System (ADS)

    Massardi, Marcella

    2017-11-01

    Here, I present the AENEAS (Advanced European Network of E-infrastructures for Astronomy with the SKA) project, which has been funded under the Horizon 2020 Work Programme call "Research and Innovation Actions for International Co-operation on high-end e-infrastructure requirements" supporting the Square Kilometre Array (SKA). INAF is contributing to all the AENEAS work packages and leading WP5 - Access and Knowledge Creation (WP leader M. Massardi, IRA-ARC), with participants from IRA (Brand, Nanni, Venturi), OACT (Becciani, Costa, Umana) and OATS (Smareglia, Knapic, Taffoni).

  15. Brokering Capabilities for EarthCube - supporting Multi-disciplinary Earth Science Research

    NASA Astrophysics Data System (ADS)

    Jodha Khalsa, Siri; Pearlman, Jay; Nativi, Stefano; Browdy, Steve; Parsons, Mark; Duerr, Ruth; Pearlman, Francoise

    2013-04-01

    The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Brokering of data and improvements in discovery and access are key to data exchange and promotion of collaboration across the geosciences. In this presentation, we describe an evolutionary process of infrastructure and interoperability development focused on the participation of existing science research infrastructures and on augmenting them for improved access. All geosciences communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for leveraging these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. Brokers connect disparate systems with only minimal burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is a governance issue, but is facilitated by infrastructure capabilities that can impact the uptake of new interdisciplinary collaborations and exchange. Thus brokering must address both the cyberinfrastructure and computer technology requirements and also the social issues to allow improved cross-domain collaborations. This is best done through use-case-driven requirements and agile, iterative development methods. It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation as to how well science-driven use cases are being met. There is a near term, and possibly unique, opportunity through EarthCube and European e-Infrastructure projects to increase the impact and interconnectivity of projects. In the developments described in this presentation, brokering has been demonstrated to be an essential part of a robust, adaptive technical infrastructure, and demonstrations and user scenarios can address both the governance and the detailed implementation paths forward. The EarthCube Brokering roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.

  16. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for a probabilistic scenario-based seismic risk analysis has been developed based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information which is necessary to perform any meaningful seismic hazard analysis. The method is based on the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
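
    As a simplified, hypothetical sketch of steps (1), (2), and (4) of such a method (the bounding-scenario optimization of step (3) is omitted), the snippet below combines assumed annual occurrence frequencies per magnitude class with assumed conditional exceedance probabilities into an annual exceedance frequency and a lifetime exceedance probability under a Poisson assumption; none of the numbers come from the study.

      import numpy as np

      # Magnitude classes with hypothetical annual occurrence frequencies (1/yr)
      # and conditional probabilities that the critical design parameter is exceeded
      # given an event in that class (from a vulnerability/fragility analysis).
      classes = ["M5.0-5.9", "M6.0-6.9", "M7.0+"]
      annual_freq = np.array([2.0e-2, 3.0e-3, 4.0e-4])
      p_exceed_given_event = np.array([0.001, 0.05, 0.40])

      lifetime_years = 20.0  # residual lifetime of the structure, used as a risk criterion

      annual_exceed_freq = float(np.sum(annual_freq * p_exceed_given_event))
      # Poisson assumption for occurrences over the residual lifetime.
      p_lifetime = 1.0 - np.exp(-annual_exceed_freq * lifetime_years)

      print(f"Annual frequency of exceeding the design parameter: {annual_exceed_freq:.2e} /yr")
      print(f"Probability of at least one exceedance in {lifetime_years:.0f} years: {p_lifetime:.3%}")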

  17. Communicating Pacific Rim Risk: A GIS Analysis of Hazard, Vulnerability, Population, and Infrastructure

    NASA Astrophysics Data System (ADS)

    Yurkovich, E. S.; Howell, D. G.

    2002-12-01

    Exploding population and unprecedented urban development within the last century helped fuel an increase in the severity of natural disasters. Not only has the world become more populated, but people, information and commodities now travel greater distances to service larger concentrations of people. While many of the earth's natural hazards remain relatively constant, understanding the risk to increasingly interconnected and large populations requires an expanded analysis. To improve mitigation planning we propose a model that is accessible to planners and implemented with public domain data and industry standard GIS software. The model comprises 1) the potential impact of five significant natural hazards: earthquake, flood, tropical storm, tsunami and volcanic eruption, assessed by a comparative index of risk, 2) population density, 3) infrastructure distribution represented by a proxy, 4) the vulnerability of the elements at risk (population density and infrastructure distribution) and 5) the connections and dependencies of our increasingly 'globalized' world, portrayed by a relative linkage index. We depict this model with the equation Risk = f(H, E, V, I), where H is an index normalizing the impact of five major categories of natural hazards; E is one element at risk, population or infrastructure; V is a measure of the vulnerability of the elements at risk; and I is a measure of the interconnectivity of the elements at risk as a result of economic and social globalization. We propose that future risk analyses include the variable I to better define and quantify risk. Each assessment reflects different repercussions from natural disasters: losses of life or economic activity. Because population and infrastructure are distributed heterogeneously across the Pacific region, two contrasting representations of risk emerge from this study.
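
    As a purely illustrative sketch of combining the terms of Risk = f(H, E, V, I) on a grid, the snippet below normalizes each component and multiplies them; the product form and the random example grids are assumptions, since the record above does not specify the functional form.

      import numpy as np

      def normalize(x):
          """Rescale a grid to [0, 1] so hazard, exposure, vulnerability and linkage are comparable."""
          x = np.asarray(x, dtype=float)
          return (x - x.min()) / (x.max() - x.min() + 1e-12)

      # Hypothetical coarse grids over a small region (random values for illustration).
      H = normalize(np.random.default_rng(1).random((4, 4)))   # composite hazard index
      E = normalize(np.random.default_rng(2).random((4, 4)))   # population density (element at risk)
      V = normalize(np.random.default_rng(3).random((4, 4)))   # vulnerability of the element at risk
      I = normalize(np.random.default_rng(4).random((4, 4)))   # interconnectivity / linkage index

      risk = H * E * V * I   # one simple choice of f(H, E, V, I); other forms are possible
      print(risk.round(3))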

  18. Using mental mapping to unpack perceived cycling risk.

    PubMed

    Manton, Richard; Rau, Henrike; Fahy, Frances; Sheahan, Jerome; Clifford, Eoghan

    2016-03-01

    Cycling is the most energy-efficient mode of transport and can bring extensive environmental, social and economic benefits. Research has highlighted negative perceptions of safety as a major barrier to the growth of cycling. Understanding these perceptions through the application of novel place-sensitive methodological tools such as mental mapping could inform measures to increase cyclist numbers and consequently improve cyclist safety. Key steps to achieving this include: (a) the design of infrastructure to reduce actual risks and (b) targeted work on improving safety perceptions among current and future cyclists. This study combines mental mapping, a stated-preference survey and a transport infrastructure inventory to unpack perceptions of cycling risk and to reveal both overlaps and discrepancies between perceived and actual characteristics of the physical environment. Participants translate mentally mapped cycle routes onto hard-copy base-maps, colour-coding road sections according to risk, while a transport infrastructure inventory captures the objective cycling environment. These qualitative and quantitative data are matched using Geographic Information Systems and exported to statistical analysis software to model the individual and (infra)structural determinants of perceived cycling risk. This method was applied to cycling conditions in Galway City (Ireland). Participants' (n=104) mental maps delivered data-rich perceived safety observations (n=484) and initial comparison with locations of cycling collisions suggests some alignment between perception and reality, particularly relating to danger at roundabouts. Attributing individual and (infra)structural characteristics to each observation, a Generalised Linear Mixed Model statistical analysis identified segregated infrastructure, road width, the number of vehicles as well as gender and cycling experience as significant, and interactions were found between individual and infrastructural variables. The paper concludes that mental mapping is a highly useful tool for assessing perceptions of cycling risk with a strong visual aspect and significant potential for public participation. This distinguishes it from more traditional cycling safety assessment tools that focus solely on the technical assessment of cycling infrastructure. Further development of online mapping tools is recommended as part of bicycle suitability measures to engage cyclists and the general public and to inform 'soft' and 'hard' cycling policy responses. Copyright © 2015 Elsevier Ltd. All rights reserved.
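
    As an illustrative sketch of a mixed model of perceived risk with a random intercept per participant (the study used a generalised linear mixed model; this linear version, the data file, and all column names are assumptions), using statsmodels:

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical observation table: one row per colour-coded road section per participant.
      df = pd.read_csv("perceived_risk_observations.csv")
      # Columns assumed: risk_rating, segregated, road_width_m, vehicles_per_hr,
      # gender, cycling_experience_yrs, participant_id

      model = smf.mixedlm(
          "risk_rating ~ segregated + road_width_m + vehicles_per_hr"
          " + gender + cycling_experience_yrs",
          data=df,
          groups=df["participant_id"],   # random intercept per participant
      )
      result = model.fit()
      print(result.summary())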

  19. Assessing the vulnerability of infrastructure to climate change on the Islands of Samoa

    NASA Astrophysics Data System (ADS)

    Fakhruddin, S. H. M.

    2015-03-01

    Pacific Islanders have been exposed to risks associated with climate change. Samoa, as one of the Pacific Islands, is prone to climatic hazards that will likely increase in coming decades, affecting coastal communities and infrastructure around the islands. Climate models do not predict a reduction of such disaster events in the future in Samoa; indeed, most predict an increase in such events. This paper identifies key infrastructure and their functions and status in order to provide an overall picture of the relative vulnerability of such infrastructure on the island to climate-related stresses. By reviewing existing reports as well as holding a series of consultation meetings, a list of critical infrastructure was developed and shared with stakeholders for their consideration. An indicator-based vulnerability model (SIVM) was developed in collaboration with stakeholders to assess the vulnerability of selected infrastructure systems on the Samoan Islands. Damage costs were extracted from the Cyclone Evan recovery needs document. Additionally, data on criticality and capacity to repair were collected from stakeholders. Having stakeholder perspectives on these two issues was important because (a) the criticality of a given infrastructure could be viewed differently among different stakeholders, and (b) stakeholders were the best available source (in this study) to estimate the capacity to repair non-physical damage to such infrastructure. Analysis of the results suggested a ranking of sectors from most vulnerable to least vulnerable: the transportation sector, the power sector, the water supply sector and the sewerage system.

  20. Assessing the vulnerability of infrastructure to climate change on the Islands of Samoa

    NASA Astrophysics Data System (ADS)

    Fakhruddin, S. H. M.; Babel, M. S.; Kawasaki, A.

    2015-06-01

    Pacific Islanders have been exposed to risks associated with climate change. Samoa, as one of the Pacific Islands, is prone to climatic hazards that will likely increase in the coming decades, affecting coastal communities and infrastructure around the islands. Climate models do not predict a reduction of such disaster events in the future in Samoa; indeed, most predict an increase. This paper identifies key infrastructure and their functions and status in order to provide an overall picture of relative vulnerability to climate-related stresses of such infrastructure on the island. By reviewing existing reports as well as holding a series of consultation meetings, a list of critical infrastructure was developed and shared with stakeholders for their consideration. An indicator-based vulnerability model (SIVM) was developed in collaboration with stakeholders to assess the vulnerability of selected infrastructure systems on the Samoan Islands. Damage costs were extracted from the Cyclone Evan recovery needs document. Additionally, data on criticality and capacity to repair damage were collected from stakeholders. Having stakeholder perspectives on these two issues was important because (a) criticality of a given infrastructure could be viewed differently among different stakeholders, and (b) stakeholders were the best available source (in this study) to estimate the capacity to repair non-physical damage to such infrastructure. Analysis of the results suggested a ranking of sectors from the most vulnerable to the least vulnerable: the transportation sector, the power sector, the water supply sector and the sewerage system.
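
    As an illustrative sketch of how an indicator-based vulnerability ranking of this kind can be assembled, the snippet below combines damage cost, stakeholder-rated criticality, and capacity to repair into a weighted index. The indicator values, weights, and aggregation rule are assumptions for illustration and are not the published SIVM.

      import pandas as pd

      # Hypothetical indicator values per sector, each already rescaled to [0, 1]:
      # higher damage and criticality raise vulnerability, higher repair capacity lowers it.
      sectors = pd.DataFrame(
          {
              "damage_cost": [0.9, 0.6, 0.5, 0.3],
              "criticality": [0.8, 0.7, 0.6, 0.4],
              "repair_capacity": [0.3, 0.5, 0.6, 0.7],
          },
          index=["transport", "power", "water_supply", "sewerage"],
      )

      weights = {"damage_cost": 0.4, "criticality": 0.4, "repair_capacity": 0.2}
      sectors["vulnerability"] = (
          weights["damage_cost"] * sectors["damage_cost"]
          + weights["criticality"] * sectors["criticality"]
          + weights["repair_capacity"] * (1.0 - sectors["repair_capacity"])
      )
      print(sectors.sort_values("vulnerability", ascending=False))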

  1. Web-GIS platform for green infrastructure in Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Sercaianu, Mihai; Petrescu, Florian; Aldea, Mihaela; Oana, Luca; Rotaru, George

    2015-06-01

    In the last decade, reducing urban pollution and improving the quality of public spaces became an increasingly important issue for public administration authorities in Romania. The paper describes the development of a web-GIS solution dedicated to the monitoring of the green infrastructure in Bucharest, Romania. The system allows urban residents (citizens) to collect and directly report relevant information themselves regarding the current status of the green infrastructure of the city. Consequently, the citizens become an active component of the decision-support process within the public administration. Besides the usual technical characteristics of such geo-information processing systems, due to the complex legal and organizational problems that arise in collecting information directly from the citizens, additional analysis was required concerning, for example, local government involvement, environmental protection agency regulations or public entity requirements. Designing and implementing the whole information exchange process, based on the active interaction between the citizens and public administration bodies, required the use of the "citizen-sensor" concept deployed with GIS tools. The information collected and reported from the field is related to many factors, which are not always limited to the city level, making it possible to consider the green infrastructure as a whole. The "citizen-request" web-GIS solution for green infrastructure monitoring handles very diverse urban information, because the green infrastructure itself is conditioned by many urban elements, such as urban infrastructures, urban infrastructure works and construction density.

  2. Utilizing an integrated infrastructure for outcomes research: a systematic review.

    PubMed

    Dixon, Brian E; Whipple, Elizabeth C; Lajiness, John M; Murray, Michael D

    2016-03-01

    To explore the ability of an integrated health information infrastructure to support outcomes research. A systematic review of articles published from 1983 to 2012 by Regenstrief Institute investigators using data from an integrated electronic health record infrastructure involving multiple provider organisations was performed. Articles were independently assessed and classified by study design, disease and other metadata including bibliometrics. A total of 190 articles were identified. Diseases included cognitive (16), cardiovascular (16), infectious (15), chronic illness (14) and cancer (12). Publications grew steadily (26 in the first decade vs. 100 in the last) as did the number of investigators (from 15 in 1983 to 62 in 2012). The proportion of articles involving non-Regenstrief authors also expanded from 54% in the first decade to 72% in the last decade. During this period, the infrastructure grew from a single health system into a health information exchange network covering more than 6 million patients. Analysis of journal and article metrics reveals high impact for clinical trials and comparative effectiveness research studies that utilised data available in the integrated infrastructure. Integrated information infrastructures support growth in high quality observational studies and diverse collaboration consistent with the goals for the learning health system. More recent publications demonstrate growing external collaborations facilitated by greater access to the infrastructure and improved opportunities to study broader disease and health outcomes. Integrated information infrastructures can stimulate learning from electronic data captured during routine clinical care but require time and collaboration to reach full potential. © 2015 Health Libraries Group.

  3. A compilation of lunar and Mars exploration strategies utilizing indigenous propellants

    NASA Technical Reports Server (NTRS)

    Linne, Diane L.; Meyer, Michael L.

    1992-01-01

    The use of propellants manufactured from indigenous space materials has the potential to significantly reduce the amount of mass required to be launched from the Earth's surface. The extent of the leverage, however, along with the cost for developing the infrastructure necessary to support such a process, is unclear. Many mission analyses have been performed that have attempted to quantify the potential benefits of in situ propellant utilization. Because the planning of future space missions includes many unknowns, the presentation of any single study on the use of in situ propellants is often met with critics' claims of the inaccuracy of assumptions or omission of infrastructure requirements. The results of many such mission analyses are presented in one format. Each summarized mission analysis used different assumptions and baseline mission scenarios. The conclusion from the studies is that the use of in situ produced propellants will provide significant reductions in Earth launch requirements. This result is consistent among all of the analyses regardless of the assumptions used to obtain the quantitative results. The determination of the best propellant combination and the amount of savings will become clearer and more apparent as the technology work progresses.

  4. Collaborative Development of e-Infrastructures and Data Management Practices for Global Change Research

    NASA Astrophysics Data System (ADS)

    Samors, R. J.; Allison, M. L.

    2016-12-01

    An e-infrastructure that supports data-intensive, multidisciplinary research is being organized under the auspices of the Belmont Forum consortium of national science funding agencies to accelerate the pace of science to address 21st century global change research challenges. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. The five action themes adopted by the Belmont Forum: 1. Adopt and make enforceable Data Principles that establish a global, interoperable e-infrastructure. 2. Foster communication, collaboration and coordination between the wider research community and Belmont Forum and its projects through an e-Infrastructure Coordination, Communication, & Collaboration Office. 3. Promote effective data planning and stewardship in all Belmont Forum agency-funded research with a goal to make it enforceable. 4. Determine international and community best practice to inform Belmont Forum research e-infrastructure policy through identification and analysis of cross-disciplinary research case studies. 5. Support the development of a cross-disciplinary training curriculum to expand human capacity in technology and data-intensive analysis methods. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. In 2016, members of the 23-nation Belmont Forum began a collaborative implementation phase. Four multi-national teams are undertaking Action Themes based on the recommendations above. Tasks include mapping the landscape, identifying and documenting existing data management plans, and scheduling a series of workshops that analyse trans-disciplinary applications of existing Belmont Forum projects to identify best practices and critical gaps that may be uniquely or best addressed by the Belmont Forum funding model. Concurrent work will define challenges in conducting international and interdisciplinary data management implementation plans and identify sources of relevant expertise and knowledge.

  5. Inaugural Genomics Automation Congress and the coming deluge of sequencing data.

    PubMed

    Creighton, Chad J

    2010-10-01

    Presentations at Select Biosciences's first 'Genomics Automation Congress' (Boston, MA, USA) in 2010 focused on next-generation sequencing and the platforms and methodology around them. The meeting provided an overview of sequencing technologies, both new and emerging. Speakers shared their recent work on applying sequencing to profile cells for various levels of biomolecular complexity, including DNA sequences, DNA copy, DNA methylation, mRNA and microRNA. With sequencing time and costs continuing to drop dramatically, a virtual explosion of very large sequencing datasets is at hand, which will probably present challenges and opportunities for high-level data analysis and interpretation, as well as for information technology infrastructure.

  6. Case study and lessons learned for the Great Lakes ITS Program, Airport ITS Integration and the Road Infrastructure Management System projects, final report, Wayne County, Michigan

    DOT National Transportation Integrated Search

    2007-03-02

    This report presents the case study and lessons learned for the national evaluation of the Great Lakes Intelligent Transportation Systems (GLITS) Airport ITS Integration and Road Infrastructure Management System (RIMS) projects. The Airport ITS Integ...

  7. An Open and Scalable Learning Infrastructure for Food Safety

    ERIC Educational Resources Information Center

    Manouselis, Nikos; Thanopoulos, Charalampos; Vignare, Karen; Geith, Christine

    2013-01-01

    In the last several years, a variety of approaches and tools have been developed for giving access to open educational resources (OER) related to food safety, security, and food standards, as well as to various targeted audiences (e.g., farmers, agronomists). The aim of this paper is to present a technology infrastructure currently in demonstration…

  8. Integrating child welfare, juvenile justice, and other agencies in a continuum of services.

    PubMed

    Howell, James C; Kelly, Marion R; Palmer, James; Mangum, Ronald L

    2004-01-01

    This article presents a comprehensive strategy framework for integrating mental health, child welfare, education, substance abuse, and juvenile justice system services. It proposes an infrastructure of information exchange, cross-agency client referrals, a networking protocol, interagency councils, and service integration models. This infrastructure facilitates integrated service delivery.

  9. Enabling a Secure Environment for Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) Transactions : April 2012 Public Workshop Proceedings

    DOT National Transportation Integrated Search

    2012-06-08

    This report provides a summary and overview of the Public Workshop entitled, Enabling a Secure Environment for Vehicle-to-Vehicle and Vehicle-to-Infrastructure Transactions, presented by USDOT. The workshop took place on April 19-20, 2012 at th...

  10. Webinar November 18: An Overview of the Hydrogen Fueling Infrastructure

    Science.gov Websites

    November 12, 2014. The Energy Department will present a live webinar entitled "An Overview of the Hydrogen Fueling Infrastructure Research and Station Technology (H2FIRST) Project" on November 18, 2014.

  11. Quantifying Security Threats and Their Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper we illustrate this infrastructure by means of an example involving an e-commerce application.

  12. Quantifying Security Threats and Their Potential Impacts: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we illustrate this infrastructure by means of an e-commerce application.
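
    As a simplified, hypothetical sketch of the kind of stakeholder loss estimate described in these two records (not the authors' computational infrastructure), the snippet below combines each stakeholder's stake in a set of security requirements, the probability that each requirement fails under each threat, and annual threat probabilities into an expected annual loss per stakeholder; all matrices are invented for illustration.

      import numpy as np

      stakeholders = ["customer", "merchant", "bank"]
      requirements = ["confidentiality", "integrity", "availability"]
      threats = ["phishing", "dos", "insider"]

      # stake[i, j]: cost ($k/yr) stakeholder i sustains if requirement j fails.
      stake = np.array([[10.0,  5.0,  2.0],
                        [20.0, 30.0, 15.0],
                        [50.0, 40.0,  8.0]])

      # fail_given_threat[j, k]: P(requirement j fails | threat k materialises).
      fail_given_threat = np.array([[0.6, 0.0, 0.3],
                                    [0.1, 0.2, 0.5],
                                    [0.0, 0.9, 0.1]])

      # Annual probability that each threat materialises.
      p_threat = np.array([0.30, 0.10, 0.05])

      expected_loss = stake @ fail_given_threat @ p_threat   # $k/yr per stakeholder
      for name, loss in zip(stakeholders, expected_loss):
          print(f"{name:>8}: expected annual loss of about ${loss:.1f}k")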

  13. Cyber threat model for tactical radio networks

    NASA Astrophysics Data System (ADS)

    Kurdziel, Michael T.

    2014-05-01

    The shift to a full information-centric paradigm on the battlefield has allowed ConOps to be developed that are only possible using modern network communications systems. Securing these Tactical Networks without impacting their capabilities has been a challenge. Tactical networks with fixed infrastructure have vulnerabilities similar to those of their commercial counterparts (although they need to be secure against adversaries with greater capabilities, resources and motivation). However, networks with mobile infrastructure components and Mobile Ad hoc Networks (MANETs) have additional unique vulnerabilities that must be considered. It is useful to examine Tactical Network-based ConOps and use them to construct a threat model and baseline cyber security requirements for Tactical Networks with fixed infrastructure, mobile infrastructure and/or ad hoc modes of operation. This paper will present an introduction to threat model assessment. A definition and detailed discussion of a Tactical Network threat model is also presented. Finally, the model is used to derive baseline requirements that can be used to design or evaluate a cyber security solution that can be scaled and adapted to the needs of specific deployments.

  14. Dynamic linkages between road transport energy consumption, economic growth, and environmental quality: evidence from Pakistan.

    PubMed

    Danish; Baloch, Muhammad Awais

    2018-03-01

    The focus of the present research work is to investigate the dynamic relationship between economic growth, road transport energy consumption, and environmental quality. To this end, we rely on time series data for the period 1971 to 2014 in the context of Pakistan. Using sulfur dioxide (SO2) emissions from the transport sector as a new proxy for environmental quality, the present work employs the ARDL time series technique, which allows energy consumption from the transport sector, urbanization, and road infrastructure to be linked through symmetric relationships with SO2 emissions and economic growth. From the statistical results, we confirm that road infrastructure boosts economic growth. At the same time, road infrastructure and urbanization hamper environmental quality and accelerate the emission of SO2 into the atmosphere. Furthermore, economic growth has a diminishing negative impact on total SO2 emissions. Moreover, we did not find any evidence of the expected role of transport energy consumption in SO2 emissions. The results indicate that care should be taken in the expansion of road infrastructure and that green city policies and planning are required in the country.
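
    As a purely illustrative sketch of an ARDL specification of this kind (not the authors' exact model), the snippet below fits an autoregressive distributed lag model of SO2 emissions on economic growth, transport energy use, urbanization, and road infrastructure using statsmodels; the data file, column names, and lag orders are assumptions.

      import pandas as pd
      from statsmodels.tsa.ardl import ARDL

      # Hypothetical annual series, 1971-2014: columns so2, gdp, transport_energy,
      # urbanization, road_infrastructure (all in logs).
      data = pd.read_csv("pakistan_transport.csv", index_col="year")

      model = ARDL(
          endog=data["so2"],
          lags=1,                                        # lags of the dependent variable
          exog=data[["gdp", "transport_energy", "urbanization", "road_infrastructure"]],
          order=1,                                       # one lag of each regressor
          trend="c",
      )
      result = model.fit()
      print(result.summary())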

  15. Determining critical infrastructure for ocean research and societal needs in 2030

    NASA Astrophysics Data System (ADS)

    Glickson, Deborah; Barron, Eric; Fine, Rana

    2011-06-01

    The United States has jurisdiction over 3.4 million square miles of ocean—an expanse greater than the land area of all 50 states combined. This vast marine area offers researchers opportunities to investigate the ocean's role in an integrated Earth system but also presents challenges to society, including damaging tsunamis and hurricanes, industrial accidents, and outbreaks of waterborne diseases. The 2010 Gulf of Mexico Deepwater Horizon oil spill and 2011 Japanese earthquake and tsunami are vivid reminders that a broad range of infrastructure is needed to advance scientists' still incomplete understanding of the ocean. The National Research Council's (NRC) Ocean Studies Board was asked by the National Science and Technology Council's Subcommittee on Ocean Science and Technology, comprising 25 U.S. government agencies, to examine infrastructure needs for ocean research in the year 2030. This request reflects concern, among a myriad of marine issues, over the present state of aging and obsolete infrastructure, insufficient capacity, growing technological gaps, and declining national leadership in marine technological development; these issues were brought to the nation's attention in 2004 by the U.S. Commission on Ocean Policy.

  16. Quality of service provision assessment in the healthcare information and telecommunications infrastructures.

    PubMed

    Babulak, Eduard

    2006-01-01

    The continuous increase in the complexity and heterogeneity of corporate and healthcare telecommunications infrastructures will require new methods for assessing quality of service (QoS) provision that can address all engineering and social issues much more quickly. The demand for speed and access to any information at any time from anywhere will create global communications infrastructures with significant performance bottlenecks that may endanger human lives, power supplies, the national economy and security. Regardless of the technology supporting the information flows, the final verdict on the QoS is made by the end user. The users' perception of telecommunications network infrastructure QoS provision is critical to the successful business management operation of any organization. As a result, it is essential to assess QoS provision in the light of the user's perception. This article presents a cost-effective methodology for assessing the user's perception of quality of service provision, utilizing the existing Staffordshire University Network (SUN) and adding a measurement component to the existing model presented by Walker. This paper presents real examples of Cisco networking solutions for healthcare providers and offers a cost-effective approach to assessing QoS provision within the campus network, which could easily be adapted to any healthcare organization or campus network in the world.

  17. ParaChoice Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimer, Brandon Walter; Levinson, Rebecca Sobel; West, Todd H.

    Analysis with the ParaChoice model addresses three barriers from the VTO Multi-Year Program Plan: availability of alternative fuels and electric charging station infrastructure, availability of AFVs and electric drive vehicles, and consumer reluctance to purchase new technologies. In this fiscal year, we first examined the relationship between the availability of alternative fuels and station infrastructure. Specifically, we studied how electric vehicle charging infrastructure affects the ability of EVs to compete with vehicles that rely on mature, conventional petroleum-based fuels. Second, we studied how the availability of less costly AFVs promotes their representation in the LDV fleet. Third, we used ParaChoice trade space analyses to help inform which consumers are reluctant to purchase new technologies. Last, we began analysis of impacts of alternative energy technologies on Class 8 trucks to isolate those that may most efficaciously advance HDV efficiency and petroleum use reduction goals.

  18. Changing Perceptions of Flooding and Stormwater as a Driver of Urban Hydrology and Biogeochemistry

    NASA Astrophysics Data System (ADS)

    Hale, R. L.

    2015-12-01

    Urbanization can have detrimental impacts on downstream ecosystems due to its effects on hydrological and biogeochemical cycles. In particular, how urban stormwater systems are designed has implications for flood regimes and biogeochemical transformations. Flood and stormwater management paradigms have shifted over time at large scales, but patterns and drivers of local stormwater infrastructure designs are unknown. We describe patterns of infrastructure design and use over the 20th century in three cities along an urbanization gradient in Utah: Salt Lake, Logan, and Heber City. To understand changes in stormwater management paradigms we conducted a historical media content analysis of newspaper articles related to flooding and stormwater in Salt Lake City from 1900 to 2012. Stormwater infrastructure design varied spatially and temporally, both within and among cities. All three cities transitioned from agriculture to urban land use, and legacies were evident in the use of agricultural canals for stormwater conveyance. Salt Lake City infrastructure transitioned from centralized storm sewers during early urbanization to decentralized detention systems in the 1970s. In contrast, the newer cities, Logan and Heber, saw parallel increases in conveyance and detention systems with urbanization. The media analysis revealed significant changes in flood and stormwater management paradigms over the 20th century that were driven by complex factors including top-down regulations, local disturbances, and funding constraints. Early management paradigms focused on infrastructural solutions to address problems with private and public property damage, whereas more recent paradigms focus on behavioral solutions to flooding and green infrastructure solutions to prevent negative impacts of urban stormwater on local ecosystems. Changes in human perceptions of the environment can affect how we design urban ecosystems, with important implications for ecological functions.

  19. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  20. Theorising and testing environmental pathways to behaviour change: natural experimental study of the perception and use of new infrastructure to promote walking and cycling in local communities

    PubMed Central

    Panter, Jenna; Ogilvie, David

    2015-01-01

    Objective Some studies have assessed the effectiveness of environmental interventions to promote physical activity, but few have examined how such interventions work. We investigated the environmental mechanisms linking an infrastructural intervention with behaviour change. Design Natural experimental study. Setting Three UK municipalities (Southampton, Cardiff and Kenilworth). Participants Adults living within 5 km of new walking and cycling infrastructure. Intervention Construction or improvement of walking and cycling routes. Exposure to the intervention was defined in terms of residential proximity. Outcome measures Questionnaires at baseline and 2-year follow-up assessed perceptions of the supportiveness of the environment, use of the new infrastructure, and walking and cycling behaviours. Analysis proceeded via factor analysis of perceptions of the physical environment (step 1) and regression analysis to identify plausible pathways involving physical and social environmental mediators and refine the intervention theory (step 2) to a final path analysis to test the model (step 3). Results Participants who lived near and used the new routes reported improvements in their perceptions of provision and safety. However, path analysis (step 3, n=967) showed that the effects of the intervention on changes in time spent walking and cycling were largely (90%) explained by a simple causal pathway involving use of the new routes, and other pathways involving changes in environmental cognitions explained only a small proportion of the effect. Conclusions Physical improvement of the environment itself was the key to the effectiveness of the intervention, and seeking to change people's perceptions may be of limited value. Studies of how interventions lead to population behaviour change should complement those concerned with estimating their effects in supporting valid causal inference. PMID:26338837
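
    As a minimal sketch of the product-of-coefficients logic behind such a mediation check (a simplification of the full factor and path analysis described above, with hypothetical variable and file names), the snippet below estimates how much of the intervention effect on active travel is carried through use of the new routes.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical follow-up data: change_in_active_travel (min/wk), used_new_routes (0/1),
      # proximity_km (exposure to the intervention), plus baseline covariates.
      df = pd.read_csv("walking_cycling_cohort.csv")

      # Total effect of exposure on change in walking and cycling time.
      total = smf.ols("change_in_active_travel ~ proximity_km + age + sex", df).fit()

      # Pathway through use of the new infrastructure (simple product-of-coefficients mediation).
      a = smf.ols("used_new_routes ~ proximity_km + age + sex", df).fit()
      b = smf.ols("change_in_active_travel ~ used_new_routes + proximity_km + age + sex", df).fit()

      indirect = a.params["proximity_km"] * b.params["used_new_routes"]
      proportion_mediated = indirect / total.params["proximity_km"]
      print(f"Proportion of the effect explained by route use: {proportion_mediated:.0%}")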
