Sample records for OGC sensor web

  1. Discovery Mechanisms for the Sensor Web

    PubMed Central

    Jirka, Simon; Bröring, Arne; Stasch, Christoph

    2009-01-01

    This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported through catalogue services, the field of sensor networks and the corresponding discovery mechanisms still poses a challenge. The focus of this article is on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach is presented that was developed within the EU-funded project “OSIRIS”. This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata and integrate sensor discovery into already existing catalogues. PMID:22574038

  2. SWE-based Observation Data Delivery from the Instrument to the User - Sensor Web Technology in the NeXOS Project

    NASA Astrophysics Data System (ADS)

    Jirka, Simon; del Rio, Joaquin; Toma, Daniel; Martinez, Enoc; Delory, Eric; Pearlman, Jay; Rieke, Matthes; Stasch, Christoph

    2017-04-01

    The rapidly evolving technology for building Web-based (spatial) information infrastructures and Sensor Webs creates new opportunities to improve how ocean data are collected and managed. A central element in this development is the suite of Sensor Web Enablement (SWE) standards specified by the Open Geospatial Consortium (OGC). This framework of standards comprises on the one hand data models and formats for measurement data (ISO/OGC Observations and Measurements, O&M) and metadata describing measurement processes and sensors (OGC Sensor Model Language, SensorML). On the other hand, the SWE standards comprise (Web service) interface specifications for pull-based access to observation data (OGC Sensor Observation Service, SOS) and for controlling or configuring sensors (OGC Sensor Planning Service, SPS). Within the European INSPIRE framework, too, the SWE standards play an important role, as the SOS is the recommended download service interface for O&M-encoded observation data sets. In the context of the EU-funded Oceans of Tomorrow initiative, the NeXOS (Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management) project is developing a new generation of in-situ sensors that make use of the SWE standards to facilitate the data publication process and the integration into Web-based information infrastructures. This includes the development of a dedicated firmware for instruments and sensor platforms (SEISI, Smart Electronic Interface for Sensors and Instruments) maintained by the Universitat Politècnica de Catalunya (UPC). Among other features, SEISI makes use of OGC SWE standards such as OGC PUCK to enable a plug-and-play mechanism for sensors based on SensorML-encoded metadata. Thus, if a new instrument is attached to a SEISI-based platform, the platform automatically configures the connection to the instrument, generates data files compliant with the ISO/OGC Observations and Measurements standard, and initiates the data transmission into the NeXOS Sensor Web infrastructure. Besides these platform-related developments, NeXOS has realised the full path of data transmission from the sensor to the end user application. The conceptual architecture design is implemented by a series of open source SWE software packages provided by 52°North. This comprises in particular different SWE server components (i.e. the OGC Sensor Observation Service), tools for data visualisation (e.g. the 52°North Helgoland SOS viewer), and an editor for providing SensorML-based metadata (52°North smle). As a result, NeXOS has demonstrated how the SWE standards help to improve marine observation data collection. Within this presentation, we will present the experiences and findings of the NeXOS project and provide recommendations for future work directions.
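
    A minimal sketch of what such an automatically generated O&M observation could look like, built here with Python's standard library. The URNs, property URI and value are hypothetical placeholders, and the encoding is simplified (a full O&M result would also carry an xsi:type):

      # Sketch: a simplified O&M 2.0 observation, of the kind a SEISI-based
      # platform might generate automatically. All identifiers are made up.
      import xml.etree.ElementTree as ET

      OM = "http://www.opengis.net/om/2.0"
      GML = "http://www.opengis.net/gml/3.2"
      XLINK = "http://www.w3.org/1999/xlink"
      for prefix, uri in (("om", OM), ("gml", GML), ("xlink", XLINK)):
          ET.register_namespace(prefix, uri)

      obs = ET.Element(f"{{{OM}}}OM_Observation", {f"{{{GML}}}id": "obs_1"})
      phen = ET.SubElement(obs, f"{{{OM}}}phenomenonTime")
      inst = ET.SubElement(phen, f"{{{GML}}}TimeInstant", {f"{{{GML}}}id": "t_1"})
      ET.SubElement(inst, f"{{{GML}}}timePosition").text = "2017-03-01T12:00:00Z"
      prop = ET.SubElement(obs, f"{{{OM}}}observedProperty")
      prop.set(f"{{{XLINK}}}href", "http://example.org/phenomena/sea_water_temperature")
      ET.SubElement(obs, f"{{{OM}}}result", {"uom": "Cel"}).text = "11.7"

      print(ET.tostring(obs, encoding="unicode"))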

  3. Applying Sensor Web Technology to Marine Sensor Data

    NASA Astrophysics Data System (ADS)

    Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric

    2015-04-01

    In this contribution we present two activities illustrating how Sensor Web technology helps to enable a flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview:
    - Pull-based observation data download: this is achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard (see the sketch after this abstract).
    - Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant for them: for this purpose several specification activities are currently under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group).
    - (Web-based) visualisation of marine observation data: implemented through SOS client applications.
    - Configuration and controlling of sensor devices: this is ensured through the OGC Sensor Planning Service 2.0 interface.
    - Bridging between sensors/data loggers and Sensor Web components: for this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format).
    To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented with additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available but also documentation and information resources that help to understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
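
    As a concrete illustration of the first item in the overview above, the following sketch performs a pull-based download through the SOS 2.0 KVP binding. The endpoint, offering and observed-property identifiers are hypothetical; only the KVP parameter names come from the SOS 2.0 standard:

      # Sketch: pull-based observation download via SOS 2.0 (KVP binding).
      # Service URL and identifiers are invented for illustration.
      import requests

      SOS = "https://sensorweb.example.org/sos/kvp"
      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "ctd_platform_1",
          "observedProperty": "http://example.org/phenomena/sea_water_salinity",
          "temporalFilter": "om:phenomenonTime,2015-01-01T00:00:00Z/2015-01-02T00:00:00Z",
      }
      resp = requests.get(SOS, params=params, timeout=30)
      resp.raise_for_status()
      print(resp.text[:500])  # O&M-encoded GetObservationResponse (XML)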

  4. Advances on Sensor Web for Internet of Things

    NASA Astrophysics Data System (ADS)

    Liang, S.; Bermudez, L. E.; Huang, C.; Jazayeri, M.; Khalafbeigi, T.

    2013-12-01

    'In much the same way that HTML and HTTP enabled the WWW, the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE), envisioned in 2001 [1], will allow sensor webs to become a reality.' Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not a simple task. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. SWE standardizes web service interfaces, sensor descriptions and data encodings as building blocks for a Sensor Web. SWE standards are now mature specifications (version 2.0) with approved OGC compliance test suites and tens of independent implementations. Many earth and space science organizations and government agencies are using the SWE standards to publish and share their sensors and observations. While SWE has been demonstrated to be very effective for scientific sensors, its complexity and computational overhead may not be suitable for resource-constrained tiny sensors. In June 2012, a new OGC Standards Working Group (SWG) was formed, called the Sensor Web Interface for Internet of Things (SWE-IoT) SWG. This SWG focuses on developing one or more OGC standards for resource-constrained sensors and actuators (e.g., Internet of Things devices) while leveraging the existing OGC SWE standards. In the near future, billions to trillions of small sensors and actuators will be embedded in real-world objects and connected to the Internet, facilitating a concept called the Internet of Things (IoT). By populating our environment with real-world sensor-based devices, the IoT is opening the door to exciting possibilities for a variety of application domains, such as environmental monitoring, transportation and logistics, urban informatics, smart cities, as well as personal and social applications. The current SWE-IoT development aims at modeling the IoT components and defining a standard web service that makes the observations captured by IoT devices easily accessible and allows users to task the actuators on the IoT devices. The SWE-IoT model links things with sensors and reuses the OGC Observations and Measurements (O&M) model to link sensors with features of interest and observed properties. Unlike most SWE standards, SWE-IoT defines a RESTful web interface for users to perform CRUD (i.e., create, read, update, and delete) functions on resources, including Things, Sensors, Actuators, Observations, Tasks, etc. Inspired by the OASIS Open Data Protocol (OData), the SWE-IoT web service provides multi-faceted queries, meaning that users can query different entity collections and link from one entity to other related entities. This presentation will introduce the latest development of the OGC SWE-IoT standards. Potential applications and implications in Earth and Space science will also be discussed. [1] Mike Botts, Sensor Web Enablement White Paper, Open GIS Consortium, Inc., 2002.
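
    The SWE-IoT work described above later matured into the OGC SensorThings API. Below is a sketch of the RESTful, OData-style interaction pattern against a hypothetical service; the entity names (Things, Datastreams, Sensors, Observations) follow the model, while the base URL and filter values are invented:

      # Sketch: REST/OData-style queries on a SWE-IoT/SensorThings-like service.
      import requests

      BASE = "https://iot.example.org/v1.0"  # hypothetical endpoint

      # Read: the five most recent observations of one datastream.
      obs = requests.get(f"{BASE}/Datastreams(42)/Observations",
                         params={"$top": 5, "$orderby": "phenomenonTime desc"},
                         timeout=30).json()

      # Multi-faceted query: things expanded with their sensors, filtered
      # by an entity property, as OData's $expand/$filter allow.
      things = requests.get(f"{BASE}/Things",
                            params={"$expand": "Datastreams/Sensor",
                                    "$filter": "properties/room eq 'lab-1'"},
                            timeout=30).json()
      for thing in things.get("value", []):
          print(thing["name"])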

  5. Born semantic: linking data from sensors to users and balancing hardware limitations with data standards

    NASA Astrophysics Data System (ADS)

    Buck, Justin; Leadbetter, Adam

    2015-04-01

    New users of the growing volume of ocean data, for purposes such as 'big data' data products and operational data assimilation/ingestion, require data to be readily ingestible. This can be achieved via the application of World Wide Web Consortium (W3C) Linked Data and Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) standards to data management. As part of several Horizon 2020 European projects (SenseOCEAN, ODIP, AtlantOS), the British Oceanographic Data Centre (BODC) is working on combining existing data centre architecture and SWE software, such as Sensor Observation Services, with a Linked Data front end. The standards to enable data delivery are proven and well documented [1,2]. There are practical difficulties when SWE standards are applied to real-time data because of internal hardware bandwidth restrictions and a requirement to constrain data transmission costs. A pragmatic approach is proposed where sensor metadata and data output in OGC standards are implemented "shore-side", with sensors and instruments transmitting unique resolvable web linkages to persistent OGC SensorML records published at the BODC. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 8th October 2014. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
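
    A sketch of the proposed pragmatic pattern: the instrument transmits only a compact record carrying a resolvable URI, and the full SensorML record is fetched "shore-side". The URI, record layout and media type are illustrative assumptions, not the project's actual formats:

      # Sketch: shore-side expansion of a compact sensor message whose first
      # field is a persistent, resolvable URI (all values invented).
      import requests

      raw = "http://linkedsystems.example.org/sensors/SBE37-1234,2015-01-01T00:00:00Z,35.12"

      def expand(record: str) -> dict:
          uri, timestamp, value = record.split(",")
          # Resolve the persistent URI to the full SensorML metadata record.
          meta = requests.get(uri, headers={"Accept": "application/xml"}, timeout=30)
          return {"sensor": uri, "time": timestamp, "value": float(value),
                  "sensorml": meta.text}

      print(expand(raw)["time"])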

  6. A flexible geospatial sensor observation service for diverse sensor data based on Web service

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min

    Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes the development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services: Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with SWE is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using the Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
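
    A sketch of the "transaction" side mentioned above: a producer pushing a new observation into the SOS. The endpoint and identifiers are hypothetical and the XML payload is abbreviated; the operation and element names follow the SOS 1.0 transactional profile:

      # Sketch: producer-to-SOS interaction via the transactional profile.
      import requests

      SOS = "https://example.org/sos"  # hypothetical endpoint
      insert_doc = """<?xml version="1.0" encoding="UTF-8"?>
      <InsertObservation xmlns="http://www.opengis.net/sos/1.0"
                         service="SOS" version="1.0.0">
        <AssignedSensorId>urn:example:sensor:gauge-07</AssignedSensorId>
        <!-- om:Observation payload (time, property, result) omitted here -->
      </InsertObservation>"""
      resp = requests.post(SOS, data=insert_doc.encode("utf-8"),
                           headers={"Content-Type": "text/xml"}, timeout=30)
      print(resp.status_code, resp.text[:200])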

  7. Automated Data Quality Assurance using OGC Sensor Web Enablement Frameworks for Marine Observatories

    NASA Astrophysics Data System (ADS)

    Toma, Daniel; Bghiel, Ikram; del Rio, Joaquin; Hidalgo, Alberto; Carreras, Normandino; Manuel, Antoni

    2014-05-01

    Over the past years, environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent. Therefore, many sensor networks are increasingly deployed to monitor our environment. But due to the large number of sensor manufacturers, accompanying protocols and data encodings, automated integration and data quality assurance of diverse sensors in observing systems is not straightforward, requiring the development of data management code and tedious manual configuration. However, over the past few years it has been demonstrated that Open Geospatial Consortium (OGC) frameworks can enable web services with fully described sensor systems, including data processing, sensor characteristics, and quality control tests and results. So far, the SWE framework does not describe how to integrate sensors on-the-fly with minimal human intervention. The data management software which enables access to sensors, data processing and quality control tests has to be implemented, and the results have to be manually mapped to the SWE models. In this contribution, we describe a Sensor Plug & Play infrastructure for the Sensor Web combining (1) the OGC PUCK protocol, a simple standard embedded instrument protocol to store on and retrieve directly from the devices the declarative description of sensor characteristics and quality control tests, (2) an automatic mechanism for data processing and quality control tests underlying the Sensor Web, the Sensor Interface Descriptor (SID) concept, as well as (3) a model for the declarative description of sensors which serves as a generic data management mechanism, designed as a profile and extension of OGC SWE's SensorML standard. We implement and evaluate our approach by applying it to the OBSEA Observatory, demonstrating the ability to assess data quality for temperature, salinity, air pressure, and wind speed and direction observations off the coast of Garraf in north-eastern Spain.
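
    The declaratively described QC tests stored with the sensor description boil down to simple, automatable checks at run time. Below is a sketch of a range test of the kind such metadata could declare; the thresholds and the flag scheme (1 = good, 4 = bad, 2 = not evaluated, as in common oceanographic flag schemes) are illustrative assumptions:

      # Sketch: applying a declaratively described range test to observations.
      qc_ranges = {  # per-phenomenon valid ranges, as declared in metadata
          "sea_water_temperature": (-2.0, 35.0),  # degrees Celsius
          "wind_speed": (0.0, 75.0),              # metres per second
      }

      def range_test(phenomenon: str, value: float) -> str:
          """Return a quality flag: 1 = good, 4 = bad, 2 = not evaluated."""
          if phenomenon not in qc_ranges:
              return "2"
          lo, hi = qc_ranges[phenomenon]
          return "1" if lo <= value <= hi else "4"

      print(range_test("sea_water_temperature", 18.4))  # -> 1
      print(range_test("wind_speed", 120.0))            # -> 4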

  8. NASA SensorWeb and OGC Standards for Disaster Management

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    I. Goal: Enable users to cost-effectively find and create customized data products to help manage disasters: a) on-demand; b) with low-cost and non-specialized tools such as Google Earth and browsers; c) with access via an open network but with sufficient security. II. Use standards to interface various sensors and resultant data: a) wrap sensors in Open Geospatial Consortium (OGC) standards; b) wrap data processing algorithms and servers with OGC standards; c) use standardized workflows to orchestrate and script the creation of these data products. III. Target the Web 2.0 mass market: a) make it simple and easy to use; b) leverage new capabilities and tools that are emerging; c) improve speed and responsiveness.

  9. Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and the SDI, and conduct case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web, the "Geospatial Web" or "Geosemantic Web", by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata on the one hand, and the 'Sensor Observation Service' (SOS), 'Sensor Planning Service' (SPS), 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services and between two OGC-SWE services, on the other. In conclusion, integrating SDI with the Sensor Web is of importance to geospatial studies. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  10. Sensor Webs and Virtual Globes: Enabling Understanding of Changes in a partially Glaciated Watershed

    NASA Astrophysics Data System (ADS)

    Heavner, M.; Fatland, D. R.; Habermann, M.; Berner, L.; Hood, E.; Connor, C.; Galbraith, J.; Knuth, E.; O'Brien, W.

    2008-12-01

    The University of Alaska Southeast is currently implementing a sensor web identified as the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research (SEAMONSTER). SEAMONSTER is operating in the partially glaciated Mendenhall and Lemon Creek watersheds in the Juneau area, on the margins of the Juneau Icefield. These watersheds are studied both for (1) long-term monitoring of changes and (2) detection and analysis of transient events (such as glacier lake outburst floods). The heterogeneous sensors (meteorological, dual-frequency GPS, water quality, lake level, etc.), power and bandwidth constraints, and competing time scales of interest require autonomous reactivity of the sensor web. They also present challenges for operational management of the sensor web. The harsh conditions on the glaciers provide additional operating constraints. The tight integration of the sensor web with virtual globe technology enhances the project in multiple ways. We are utilizing virtual globe infrastructures to enhance both sensor web management and data access. SEAMONSTER utilizes virtual globes for education and public outreach, sensor web management, data dissemination, and enabling collaboration. Using a PostgreSQL database with GIS extensions coupled to GeoServer, we generate near-real-time, auto-updating geobrowser files of the data in multiple Open Geospatial Consortium (OGC) standard formats (e.g. KML, WCS). Additionally, embedding wiki pages in this database allows the development of a geospatially aware wiki describing the projects for better public outreach and education. In this presentation we will describe how we have implemented these technologies to date, the lessons learned, and our efforts towards greater OGC standard implementation. A major focus will be on demonstrating how geobrowsers and virtual globes have made this project possible.
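
    The "near-real-time auto-updating" behaviour rests on a standard KML mechanism: a NetworkLink that periodically re-fetches a server-generated layer. A minimal sketch, with a hypothetical GeoServer KML endpoint and layer name:

      # Sketch: writing a KML NetworkLink that refreshes a server-generated
      # layer every five minutes (endpoint and layer name invented).
      kml = """<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <NetworkLink>
          <name>SEAMONSTER lake level (auto-updating)</name>
          <Link>
            <href>https://geoserver.example.edu/wms/kml?layers=seamonster:lake_level</href>
            <refreshMode>onInterval</refreshMode>
            <refreshInterval>300</refreshInterval>
          </Link>
        </NetworkLink>
      </kml>"""
      with open("seamonster_lake_level.kml", "w", encoding="utf-8") as f:
          f.write(kml)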

  11. The OGC Sensor Web Enablement framework

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Botts, M.

    2006-12-01

    Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces, and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS): parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS): tasking a sensor system to undertake future observations; Sensor Alert Service (SAS): subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses. The first request obtains a general description of the service capabilities, the second obtains the detail required to formulate a data request, and the final one requests a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML application schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
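
    The capabilities, detail, data request sequence described above can be sketched concretely against a hypothetical SOS 1.0 endpoint; the parameter names come from the standard's KVP binding, while the URL and identifiers are invented:

      # Sketch: the standard three-step SWE client interaction.
      import requests

      SOS = "https://example.org/sos"

      # 1. General description of the service's capabilities.
      caps = requests.get(SOS, params={"service": "SOS",
                                       "request": "GetCapabilities"}, timeout=30)

      # 2. Detail needed to formulate a data request: the sensor description.
      desc = requests.get(SOS, params={
          "service": "SOS", "version": "1.0.0", "request": "DescribeSensor",
          "procedure": "urn:example:sensor:ws-01",
          "outputFormat": 'text/xml;subtype="sensorML/1.0.1"'}, timeout=30)

      # 3. The data instance itself.
      data = requests.get(SOS, params={
          "service": "SOS", "version": "1.0.0", "request": "GetObservation",
          "offering": "WEATHER",
          "observedProperty": "urn:example:property:temperature",
          "responseFormat": 'text/xml;subtype="om/1.0.0"'}, timeout=30)

      print(caps.status_code, desc.status_code, data.status_code)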

  12. Service Oriented Architecture for Wireless Sensor Networks in Agriculture

    NASA Astrophysics Data System (ADS)

    Sawant, S. A.; Adinarayana, J.; Durbha, S. S.; Tripathy, A. K.; Sudharsan, D.

    2012-08-01

    Rapid advances in Wireless Sensor Networks (WSNs) for agricultural applications have provided a platform for better decision making for crop planning and management, particularly in precision agriculture. Due to the ever-increasing spread of WSNs there is a need for standards, i.e. a set of specifications and encodings, to bring multiple sensor networks onto a common platform. Distributed sensor systems, when brought together, can facilitate better decision making in the agricultural domain. The Open Geospatial Consortium (OGC), through Sensor Web Enablement (SWE), provides guidelines for the semantic and syntactic standardization of sensor networks. In this work two distributed sensing systems (Agrisens and FieldServer) were selected to implement the OGC SWE standards through a Service Oriented Architecture (SOA) approach. Online interoperable data processing was developed through SWE components such as the Sensor Model Language (SensorML) and the Sensor Observation Service (SOS). An integrated web client was developed to visualize the sensor observations and measurements, enabling the retrieval of crop water resource availability and requirements in a systematic manner for both sensing devices. Further, the client also has the ability to operate in an interoperable manner with any other OGC-standardized WSN system. The study of WSN systems has shown that there is a need to augment the operations/processing capabilities of the SOS in order to better understand the collected sensor data and implement modelling services. Also, given the increasingly low cost of WSN systems, it will be possible in future to implement the OGC-standardized SWE framework for agricultural applications with open source software tools.

  13. Adding Processing Functionality to the Sensor Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt

    2017-04-01

    The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors on the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher-level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing, and the integration differs from application to application. Standardizing not only the provision of sensor data but also its processing facilitates sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview of ongoing research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. First, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, profiles of the OGC Web Processing Service (WPS) for linear regression and spatio-temporal interpolation are introduced, which allow consuming sensor data from, and uploading predictions to, Sensor Observation Services. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and geoprocessing services, and present an example of how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as open source software.
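
    A sketch of how such a WPS profile might be invoked once deployed, with the process drawing its input from an SOS. The process identifier and input names are hypothetical stand-ins for the interpolation profile described above; only the WPS 1.0 KVP Execute syntax is standard:

      # Sketch: invoking a (hypothetical) spatio-temporal interpolation
      # process via the WPS 1.0 KVP binding, reading input from an SOS.
      import requests

      WPS = "https://processing.example.org/wps"
      datainputs = ";".join([
          "sos-url=https://sensorweb.example.org/sos/kvp",
          "offering=gauge_network",
          "observed-property=water_level",
          "method=spatio-temporal-kriging",
      ])
      resp = requests.get(WPS, params={
          "service": "WPS", "version": "1.0.0", "request": "Execute",
          "identifier": "org.example.SpatioTemporalInterpolation",
          "DataInputs": datainputs,
      }, timeout=60)
      print(resp.text[:300])  # ExecuteResponse (status or result reference)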

  14. An Updated Status of the Experiments with Sensor Webs and OGC Service-Oriented Architectures to Enable Global Earth Observing System of Systems (GEOSS)

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Sohlberg, Rob; Frye, Stu; Cappelaere, P.; Derezinski, L.; Ungar, Steve; Ames, Troy; Chien, Steve; Tran, Danny

    2007-01-01

    A viewgraph presentation on experiments with sensor webs and service oriented architectures is shown. The topics include: 1) Problem; 2) Basic Service Oriented Architecture Approach; 3) Series of Experiments; and 4) Next Experiments.

  15. Evolution of System Architectures: Where Do We Need to Fail Next?

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Alameh, Nadine; Percivall, George

    2013-04-01

    Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. The Common Architecture was a cross-thread theme, to ensure that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, a clear distinction was made between the different service types: data services (e.g. WMS), application services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead, the emphasis was on brokering information models, securing them and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.

  16. Using Sensor Web Processes and Protocols to Assimilate Satellite Data into a Forecast Model

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire

    2008-01-01

    The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounder (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecasting (WRF) model over the southeastern United States.

  17. Marine Profiles for OGC Sensor Web Enablement Standards

    NASA Astrophysics Data System (ADS)

    Jirka, Simon

    2016-04-01

    The use of OGC Sensor Web Enablement (SWE) standards in oceanology is increasing. Several projects are developing SWE-based infrastructures to ease the sharing of marine sensor data. This work ranges from developments on the sensor level to efforts addressing the interoperability of data flows between observatories and organisations. The broad range of activities using SWE standards leads to a risk of diverging approaches to how the SWE specifications are applied. Because the SWE standards are designed in a domain-independent manner, they intentionally offer a high degree of flexibility, enabling implementation across different domains and usage scenarios. At the same time this flexibility allows one to achieve similar goals in different ways. To avoid interoperability issues, an agreement is needed on how to apply SWE concepts and how to use vocabularies in a common way that will be shared by different projects, implementations, and users. To address this need, partners from several projects and initiatives (AODN, BRIDGES, ENVRI+, EUROFLEETS/EUROFLEETS2, FixO3, FRAM, IOOS, Jerico/Jerico-Next, NeXOS, ODIP/ODIP II, RITMARE, SeaDataNet, SenseOCEAN, X-DOMES) have teamed up to develop marine profiles of the OGC SWE standards that can serve as a common basis for developments in multiple projects and organisations. The following aspects will be especially considered:
    1.) Provision of metadata: for discovering sensors/instruments as well as observation data, for facilitating the interpretation of observations, and for integrating instruments in sensor platforms, the provision of metadata is crucial. Thus, a marine profile of the OGC Sensor Model Language 2.0 (SensorML 2.0) will be developed, allowing metadata to be provided at different levels (e.g. observatory, instrument, and detector) and for different sensor types. The latter will enable metadata of a specific type to be automatically inherited by all devices/sensors of the same type. The application of further standards such as OGC PUCK will benefit from this encoding, too, by facilitating the communication with instruments.
    2.) Encoding and modelling of observation data: for delivering observation data, the ISO/OGC Observations and Measurements 2.0 (O&M 2.0) standard serves as a good basis. Within an O&M profile, recommendations will be given on the observation types needed to cover different aspects of marine sensing (trajectory, stationary, or profile measurements, etc.). Besides XML, further O&M encodings (e.g. JSON-based) will be considered.
    3.) Data access: a profile of the OGC Sensor Observation Service 2.0 (SOS 2.0) standard will be specified to offer a common way of using this web service interface for requesting marine observations and metadata. At the same time this will offer a common interface to cross-domain applications based upon tools such as the GEOSS DAB. Lightweight approaches such as REST will be considered as further bindings for the SOS interface.
    4.) Backward compatibility: the profile will consider existing observation systems so that migration paths towards the specified profiles can be offered.
    We will present the current state of the profile development. In particular, a comparative analysis of SWE usage in different projects, an outline of the requirements, and fundamental aspects of profiles of SWE standards will be shown.
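
    To make the first aspect (provision of metadata) concrete, the sketch below shows a minimal SensorML 2.0 identification block of the kind a marine profile might declare mandatory, with the term definition pointing at a controlled vocabulary. All URIs and identifiers are hypothetical placeholders:

      # Sketch: a SensorML 2.0 identification excerpt (invented identifiers).
      sensorml_excerpt = """
      <sml:PhysicalComponent xmlns:sml="http://www.opengis.net/sensorml/2.0"
                             xmlns:gml="http://www.opengis.net/gml/3.2"
                             gml:id="ctd_0042">
        <sml:identification>
          <sml:IdentifierList>
            <sml:identifier>
              <sml:Term definition="http://vocab.example.org/ident/serialNumber">
                <sml:label>Serial Number</sml:label>
                <sml:value>CTD-0042</sml:value>
              </sml:Term>
            </sml:identifier>
          </sml:IdentifierList>
        </sml:identification>
      </sml:PhysicalComponent>
      """
      print(sensorml_excerpt.strip())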

  18. Sensor Webs with a Service-Oriented Architecture for On-demand Science Products

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Ungar, Stephen; Ames, Troy; Justice, Chris; Frye, Stuart; Chien, Steve; Tran, Daniel; Cappelaere, Patrice; Derezinski, Linda; Paules, Granville

    2007-01-01

    This paper describes the work being managed by the NASA Goddard Space Flight Center (GSFC) Information System Division (ISD) under a NASA Earth Science Technology Office (ESTO) Advanced Information System Technology (AIST) grant to develop a modular sensor web architecture which enables discovery of sensors and workflows that can create customized science products via a high-level service-oriented architecture based on Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) web service standards. These capabilities serve as a prototype of a user-centric architecture for the Global Earth Observing System of Systems (GEOSS). This work builds on and extends previous sensor web efforts conducted at NASA/GSFC using the Earth Observing 1 (EO-1) satellite and other low-earth-orbiting satellites.

  19. A Web 2.0 and OGC Standards Enabled Sensor Web Architecture for Global Earth Observing System of Systems

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Ungar, Stephen; Ames, Troy; Frye, Stuart; Chien, Steve; Cappelaere, Pat; Tran, Danny; Derezinski, Linda; Paules, Granville

    2007-01-01

    This paper will describe the progress of a 3-year research award from the NASA Earth Science Technology Office (ESTO) that began October 1, 2006, in response to a NASA Announcement of Research Opportunity on the topic of sensor webs. The key goal of this research is to prototype an interoperable sensor architecture that will enable interoperability between a heterogeneous set of space-based, Unmanned Aerial System (UAS)-based and ground-based sensors. Among the key capabilities being pursued is the ability to automatically discover and task the sensors via the Internet and to automatically discover and assemble the necessary science processing algorithms into workflows in order to transform the sensor data into valuable science products. Our first set of sensor web demonstrations will prototype science products useful in managing wildfires, and will use such assets as the Earth Observing 1 spacecraft, managed out of NASA/GSFC, a UAS-based instrument, managed out of Ames, and some automated ground weather stations, managed by the Forest Service. Also, we are collaborating with some of the other ESTO awardees to expand this demonstration and create synergy between our research efforts. Finally, we are making use of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) suite of standards and some Web 2.0 capabilities to leverage emerging technologies and standards. This research will demonstrate and validate a path for rapid, low-cost sensor integration which is not tied to a particular system, and which is thus able to absorb new assets in an easily evolvable, coordinated manner. This in turn will help to facilitate the United States' contribution to the Global Earth Observation System of Systems (GEOSS), as agreed by the U.S. and 60 other countries at the third Earth Observation Summit held in February 2005.

  20. Accessing near real-time Antarctic meteorological data through an OGC Sensor Observation Service (SOS)

    NASA Astrophysics Data System (ADS)

    Kirsch, Peter; Breen, Paul

    2013-04-01

    We wish to highlight the outputs of a project conceived from a science requirement to improve discovery of and access to Antarctic meteorological data in near real-time. Given that the data are distributed in both the spatial and temporal domains and are to be accessed across several science disciplines, the creation of an interoperable, OGC-compliant web service was deemed the most appropriate approach. We will demonstrate an implementation of the OGC SOS Interface Standard to discover, browse, and access Antarctic meteorological data sets. A selection of programmatic (R, Perl) and web client interfaces utilizing open technologies (e.g. jQuery, Flot, OpenLayers) will be demonstrated. In addition we will show how high-level abstractions can be constructed to allow users flexible and straightforward access to SOS-retrieved data.
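
    A sketch of the kind of high-level abstraction mentioned above, wrapping the raw SOS request in a single function. The endpoint, station and property names are hypothetical; the KVP parameters follow the SOS 1.0 binding:

      # Sketch: a thin convenience wrapper over SOS GetObservation.
      import requests

      def get_met_series(station: str, prop: str, start: str, end: str,
                         sos: str = "https://example.org/antarctic-sos") -> str:
          """Fetch an O&M time series for one station and phenomenon."""
          params = {
              "service": "SOS", "version": "1.0.0", "request": "GetObservation",
              "offering": station,
              "observedProperty": prop,
              "eventTime": f"{start}/{end}",
              "responseFormat": 'text/xml;subtype="om/1.0.0"',
          }
          resp = requests.get(sos, params=params, timeout=30)
          resp.raise_for_status()
          return resp.text  # O&M payload, left to the caller to parse

      xml = get_met_series("station_a", "air_temperature",
                           "2012-07-01T00:00:00Z", "2012-07-02T00:00:00Z")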

  21. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for the effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of some of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for the full use of photogrammetry and remote sensing.

  22. Using URIs to effectively transmit sensor data and metadata

    NASA Astrophysics Data System (ADS)

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise; Gardner, Thomas

    2017-04-01

    Autonomous ocean observation is massively increasing the number of sensors in the ocean. The resulting increase in the datasets produced makes selecting sensors that are fit for purpose a growing challenge. Decision making on selecting quality sensor data is based on the sensor's metadata, i.e. manufacturer specifications, history of calibrations, etc. The Open Geospatial Consortium (OGC) has developed the Sensor Web Enablement (SWE) standards to facilitate the integration and interoperability of sensor data and metadata. The World Wide Web Consortium (W3C) Semantic Web technologies enable machine comprehensibility, promoting sophisticated linking and processing of data published on the web. Linking the sensor's data and metadata according to the above-mentioned standards can pose practical difficulties, because of internal hardware bandwidth restrictions and a requirement to constrain data transmission costs. Our approach addresses these practical difficulties by uniquely identifying sensor and platform models and instances through URIs, which resolve via content negotiation to either OGC's sensor metadata language, SensorML, or W3C's Linked Data. Data transmitted by a sensor incorporate the sensor's unique URI to refer to its metadata. Sensor and platform model URIs and descriptions are created and hosted by the British Oceanographic Data Centre (BODC) linked systems service. The sensor owner creates the sensor and platform instance URIs prior to and during sensor deployment, through an updatable web form, the Sensor Instance Form (SIF). The SIF enables model and instance URI association as well as platform and sensor linking. The use of URIs, which are dynamically generated through the SIF, offers both practical and economic benefits to the implementation of SWE and Linked Data standards in near-real-time systems. Data can be linked to metadata dynamically in situ while saving on the costs associated with the transmission of long metadata descriptions. The transmission of short URIs also enables the implementation of standards on systems where it would otherwise be impractical, such as legacy hardware.
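
    A sketch of the resolution step: one sensor-instance URI served under content negotiation, returning SensorML to one client and Linked Data to another. The URI and the exact media types are illustrative assumptions:

      # Sketch: resolving a sensor URI via HTTP content negotiation.
      import requests

      uri = "http://linkedsystems.example.org/sensors/inst/TOB-771"  # invented

      sensorml = requests.get(uri, headers={"Accept": "application/xml"},
                              timeout=30)   # SensorML description
      linked = requests.get(uri, headers={"Accept": "text/turtle"},
                            timeout=30)     # Linked Data (RDF) description

      print(sensorml.headers.get("Content-Type"))
      print(linked.headers.get("Content-Type"))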

  23. A Query Language for Handling Big Observation Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon; Koppe, Roland

    2017-04-01

    The Sensor Web provides a framework for the standardized Web-based sharing of environmental observations and sensor metadata. While the issue of varying data formats and protocols is addressed by these standards, the fast-growing size of observational data is imposing new challenges for their application. Most solutions for handling big observational datasets currently focus on remote sensing applications, while big in-situ datasets relying on vector features still lack a solid approach. Conventional Sensor Web technologies may not be adequate, as the sheer size of the data transmitted and the amount of metadata accumulated may render traditional OGC Sensor Observation Services (SOS) unusable. Besides novel approaches to storing and processing observation data in place, e.g. by harnessing big data technologies from mainstream IT, the access layer has to be amended to utilize and integrate these large observational data archives into applications and to enable analysis. For this, an extension to the SOS will be discussed that establishes a query language to dynamically process and filter observations at the storage level, similar to the OGC Web Coverage Service (WCS) and its Web Coverage Processing Service (WCPS) extension. This will enable applications to request, for example, spatially or temporally aggregated data sets at a resolution they are able to display or require. The approach will be developed and implemented in cooperation with the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, whose catalogue of data comprises marine observations of physical, chemical and biological phenomena from a wide variety of sensors, including mobile platforms (like research vessels, aircraft or underwater vehicles) and stationary ones (like buoys or research stations). Observations are made with a high temporal resolution, and the resulting time series may span multiple decades.
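
    By analogy with WCPS, such an extension might let a client push filtering and aggregation into the service. The sketch below is purely hypothetical: the "query" parameter and its syntax are an illustration of the idea under discussion, not part of any adopted SOS standard:

      # Sketch: a hypothetical query-language extension to GetObservation,
      # down-sampling a long series to daily means on the server side.
      import requests

      SOS = "https://example.org/big-obs-sos"  # invented endpoint
      resp = requests.get(SOS, params={
          "service": "SOS", "version": "2.0.0", "request": "GetObservation",
          "offering": "research_vessel_ctd",
          "query": ("aggregate(mean, P1D) where phenomenonTime "
                    "during 1995-01-01/2015-12-31"),  # hypothetical syntax
      }, timeout=60)
      print(resp.status_code)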

  24. A System to Provide Real-Time Collaborative Situational Awareness by Web Enabling a Distributed Sensor Network

    NASA Technical Reports Server (NTRS)

    Panangadan, Anand; Monacos, Steve; Burleigh, Scott; Joswig, Joseph; James, Mark; Chow, Edward

    2012-01-01

    In this paper, we describe the architecture of both the PATS and SAP systems and how these two systems interoperate with each other, forming a unified capability for deploying intelligence in hostile environments with the objective of providing actionable situational awareness of individuals. The SAP system works in concert with the UICDS information sharing middleware to provide data fusion from multiple sources. UICDS can then publish the sensor data using the OGC's Web Map Service, Web Feature Service, and Sensor Observation Service standards. The system described in the paper is able to integrate a spatially distributed sensor system, operating without the benefit of the Web infrastructure, with a remote monitoring and control system that is equipped to take advantage of SWE.

  25. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples

    PubMed Central

    2011-01-01

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world. PMID:22188675

  26. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples.

    PubMed

    Kamel Boulos, Maged N; Resch, Bernd; Crowley, David N; Breslin, John G; Sohn, Gunho; Burtner, Russ; Pike, William A; Jezierski, Eduardo; Chuang, Kuo-Yu Slayer

    2011-12-21

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  27. Semantically-enabled sensor plug & play for the sensor web.

    PubMed

    Bröring, Arne; Maúe, Patrick; Janowicz, Krzysztof; Nüst, Daniel; Malewski, Christian

    2011-01-01

    Environmental sensors have continuously improved over the past years by becoming smaller, cheaper, and more intelligent. As a consequence of these technological advancements, sensors are increasingly deployed to monitor our environment. The large variety of available sensor types with often incompatible protocols complicates the integration of sensors into observing systems. The standardized Web service interfaces and data encodings defined within OGC's Sensor Web Enablement (SWE) framework make sensors available over the Web and hide the heterogeneous sensor protocols from applications. So far, the SWE framework does not describe how to integrate sensors on-the-fly with minimal human intervention. The driver software which enables access to sensors has to be implemented, and the measured sensor data have to be manually mapped to the SWE models. In this article we introduce a Sensor Plug & Play infrastructure for the Sensor Web by combining (1) semantic matchmaking functionality, (2) a publish/subscribe mechanism underlying the Sensor Web, as well as (3) a model for the declarative description of sensor interfaces which serves as a generic driver mechanism. We implement and evaluate our approach by applying it to an oil spill scenario. The matchmaking is realized using existing ontologies and reasoning engines, and provides a strong case for the semantic integration capabilities provided by Semantic Web research.

  28. A semantically rich and standardised approach enhancing discovery of sensor data and metadata

    NASA Astrophysics Data System (ADS)

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise

    2016-04-01

    The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state-of-the-art sensor technology. As the number of sensors deployed is continually increasing, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above-mentioned challenges. The Open Geospatial Consortium (OGC) has created the Sensor Web Enablement (SWE) standards that enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich, and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), the Resource Description Framework (RDF) and the RDF query language SPARQL. As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5-star Linked Data and in the OGC SWE (SensorML, Observations & Measurements) standards. Thus sensors become readily discoverable, accessible and usable via the web. Content- and context-based searching is also enabled, since sensor descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done at BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor Network (SSN) ontology to the Linked Sensor Ontology (LSO) and the steps taken to combine OGC SWE with the Linked Data approach through alignment and embodiment of other ontologies. It will then explain how data and models were annotated with controlled vocabularies to establish unambiguous semantics and interconnect them with data from different sources. Finally, it will introduce the RDF triple store where the sensor descriptions and metadata are stored and can be queried through the standard query language SPARQL. Providing different flavours of machine-readable interpretations of sensors, sensor data and metadata enhances discoverability but, most importantly, allows seamless aggregation of information from different networks that will finally produce knowledge.
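
    A sketch of the discovery step this enables: querying the triple store for sensors that observe a given property. The endpoint and vocabulary URIs are hypothetical; the ssn: terms come from the (pre-W3C-Recommendation) SSN ontology the abstract builds on:

      # Sketch: SPARQL discovery of sensors by observed property.
      import requests

      ENDPOINT = "https://example.org/sparql"  # invented endpoint
      query = """
      PREFIX ssn: <http://purl.oclc.org/NET/ssnx/ssn#>
      SELECT ?sensor WHERE {
        ?sensor a ssn:Sensor ;
                ssn:observes <http://vocab.example.org/property/salinity> .
      } LIMIT 20
      """
      resp = requests.get(ENDPOINT, params={"query": query},
                          headers={"Accept": "application/sparql-results+json"},
                          timeout=30)
      for row in resp.json()["results"]["bindings"]:
          print(row["sensor"]["value"])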

  29. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors, interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service, along with the user interface, follows the Open Geospatial Consortium (OGC) standard called the Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
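
    A sketch of the kind of request such a user interface could generate behind the scenes: a WCPS query shipping a small processing expression to the data. The endpoint and coverage/band names are hypothetical; the query text follows the OGC WCPS language:

      # Sketch: averaging one band of a (hypothetical) EO-1 scene via WCPS.
      import requests

      WCPS = "https://example.org/wcps"
      query = ('for $c in (EO1_HYPERION_SCENE) '
               'return encode(avg($c.band_042), "csv")')
      resp = requests.get(WCPS, params={
          "service": "WCS", "version": "2.0.1",
          "request": "ProcessCoverages", "query": query,
      }, timeout=60)
      print(resp.text[:200])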

  30. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts to provide unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g. WaterML and the Observation Data Model) for users to access the data (e.g. HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near-real-time. OGC SWE proposes a revolutionary concept towards web-connected and controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration/fusion among heterogeneous sources, data filtering, or support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g. Microsoft Virtual Earth). This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.

  11. Sensor Webs in Digital Earth

    NASA Astrophysics Data System (ADS)

    Heavner, M. J.; Fatland, D. R.; Moeller, H.; Hood, E.; Schultz, M.

    2007-12-01

    The University of Alaska Southeast is currently implementing a sensor web identified as the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research (SEAMONSTER). From power systems and instrumentation through data management, visualization, education, and public outreach, SEAMONSTER is designed with modularity in mind. We are utilizing virtual earth infrastructures to enhance both sensor web management and data access. We will describe how the design philosophy of using open, modular components contributes to the exploration of different virtual earth environments. We will also describe the sensor web physical implementation and how the many components have corresponding virtual earth representations. This presentation will provide an example of the integration of sensor webs into a virtual earth. We suggest that IPY sensor networks and sensor webs may integrate into virtual earth systems and provide an IPY legacy easily accessible to both scientists and the public. SEAMONSTER utilizes geobrowsers for education and public outreach, sensor web management, data dissemination, and enabling collaboration. We generate near-real-time auto-updating geobrowser files of the data. In this presentation we will describe how we have implemented these technologies to date, the lessons learned, and our efforts towards greater OGC standard implementation. A major focus will be on demonstrating how geobrowsers have made this project possible.
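
    The auto-updating geobrowser files mentioned above typically rely on KML NetworkLinks, which make clients such as Google Earth re-fetch a layer on a timer. A minimal Python sketch of that mechanism follows; the linked data URL is a hypothetical placeholder:

        # Minimal sketch: writing a KML NetworkLink that a geobrowser
        # refreshes on a fixed interval. The linked URL is hypothetical.
        KML = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <NetworkLink>
            <name>{name}</name>
            <Link>
              <href>{href}</href>
              <refreshMode>onInterval</refreshMode>
              <refreshInterval>{interval}</refreshInterval>
            </Link>
          </NetworkLink>
        </kml>
        """

        def write_network_link(path, name, href, interval_s=300):
            """Write a NetworkLink file that refreshes every interval_s seconds."""
            with open(path, "w", encoding="utf-8") as f:
                f.write(KML.format(name=name, href=href, interval=interval_s))

        write_network_link("live_layer.kml", "Live observations",
                           "https://example.org/data/latest.kml")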

  12. A Walk through TRIDEC's intermediate Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Reißland, S.; Lendholt, M.

    2012-04-01

    The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC builds on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS), providing a service platform for both sensor integration and warning dissemination. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. The demonstrators successively address challenges such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors, unconventional sensors and sensor networks also play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using the OGC Web Map Service (WMS) and Web Feature Service (WFS), spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and potential of the implemented approach are demonstrated, covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. Developments of the system are based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been and will continue to be placed on leveraging open source technologies that support mature system architecture models wherever appropriate. All open source software produced is planned to be published in a publicly available software repository, allowing others to reuse the results achieved and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
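
    As a rough illustration of the CAP encoding step, the sketch below builds a skeletal OASIS CAP 1.2 alert with the Python standard library; all field values are illustrative and not actual TRIDEC output:

        # Minimal sketch: a skeletal CAP 1.2 alert. Values are illustrative.
        import xml.etree.ElementTree as ET
        from datetime import datetime, timezone

        NS = "urn:oasis:names:tc:emergency:cap:1.2"
        ET.register_namespace("", NS)

        def cap_alert(identifier, sender, headline):
            alert = ET.Element(f"{{{NS}}}alert")
            for tag, text in [
                ("identifier", identifier),
                ("sender", sender),
                ("sent", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")),
                ("status", "Exercise"),  # simulated threat, as in the demonstrator
                ("msgType", "Alert"),
                ("scope", "Public"),
            ]:
                ET.SubElement(alert, f"{{{NS}}}{tag}").text = text
            info = ET.SubElement(alert, f"{{{NS}}}info")
            for tag, text in [
                ("category", "Geo"),
                ("event", "Tsunami"),
                ("urgency", "Immediate"),
                ("severity", "Extreme"),
                ("certainty", "Observed"),
                ("headline", headline),
            ]:
                ET.SubElement(info, f"{{{NS}}}{tag}").text = text
            return ET.tostring(alert, encoding="unicode")

        print(cap_alert("DEMO-001", "demo@example.org", "Tsunami exercise alert"))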

  13. Kinota: An Open-Source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring

    NASA Astrophysics Data System (ADS)

    Miles, B.; Chepudira, K.; LaBar, W.

    2017-12-01

    The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queuing Telemetry Transport (MQTT) support, STA is also well-suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful in environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing for customization by adopters, who can choose to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers such as Azure, Amazon and Google. The scalable, flexible and cloud-friendly architecture of Kinota makes it ideal for use in next-generation large-scale and high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as management applications such as flood early warning and regulatory enforcement.
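
    A hedged sketch of the two access patterns described above, against a hypothetical SensorThings endpoint; the host, datastream id and MQTT topic are placeholders, and the MQTT part assumes the paho-mqtt 1.x client API:

        # Minimal sketch: pull via OData query, push via MQTT subscription.
        import requests
        import paho.mqtt.client as mqtt

        STA = "https://example.org/SensorThings/v1.0"  # hypothetical service

        # Pull: the five newest observations above a threshold.
        r = requests.get(
            f"{STA}/Datastreams(1)/Observations",
            params={"$filter": "result gt 20.0",
                    "$orderby": "phenomenonTime desc",
                    "$top": "5"},
        )
        r.raise_for_status()
        for obs in r.json()["value"]:
            print(obs["phenomenonTime"], obs["result"])

        # Push: subscribe to new observations on the same datastream.
        def on_message(client, userdata, msg):
            print("new observation:", msg.payload.decode())

        client = mqtt.Client()
        client.on_message = on_message
        client.connect("example.org", 1883)
        client.subscribe("v1.0/Datastreams(1)/Observations")
        client.loop_forever()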

  14. SCHeMA web-based observation data information system

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise

    2016-04-01

    It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives that are built upon international standards will contribute to simplifying data processing and dissemination, improve user accessibility, including through web browsers, facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled with master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download as well as interoperability with other projects through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach will ensure data accessibility in compliance with major European directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as an SOS (Sensor Observation Service), provides standard interoperability and access to sensor observations through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, to describe data in an already harmonized common standard, is a prerequisite for consistency and interoperability. Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration; development of a Core Profile interface for data access via OGC standards; external services such as web services, WMS and WFS; and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open sea waters.
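
    A client of an SOS-based portal such as this one would typically retrieve observations with a KVP GetObservation request; the hedged Python sketch below uses a hypothetical endpoint, offering and observed-property URI (a real deployment would use SeaVox/SeaDataNet vocabulary terms):

        # Minimal sketch: SOS 2.0 KVP GetObservation. Endpoint, offering
        # and property URI are hypothetical placeholders.
        import requests

        SOS = "https://example.org/sos"

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "probe_1_offering",
            "observedProperty": "http://vocab.example.org/seawater_temperature",
            "responseFormat": "http://www.opengis.net/om/2.0",
        }

        resp = requests.get(SOS, params=params, timeout=60)
        resp.raise_for_status()
        print(resp.text[:500])  # O&M-encoded observations (XML)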

  15. Research of marine sensor web based on SOA and EDA

    NASA Astrophysics Data System (ADS)

    Jiang, Yongguo; Dou, Jinfeng; Guo, Zhongwen; Hu, Keyong

    2015-04-01

    A great deal of ocean sensor observation data exists, for a wide range of marine disciplines, derived from in situ and remote observing platforms, in real-time, near-real-time and delayed mode. Ocean monitoring is routinely completed using sensors and instruments. Standardization is the key requirement for exchanging information about ocean sensors and sensor data and for comparing and combining information from different sensor networks. One or more sensors are often physically integrated into a single ocean 'instrument' device, which brings many challenges related to diverse sensor data formats, parameter units, different spatiotemporal resolutions, application domains, data quality and sensor protocols. Facing these challenges requires standardization efforts aimed at facilitating the so-called Sensor Web, which makes it easy to provide public access to sensor data and metadata. In this paper, a Marine Sensor Web, based on SOA and EDA and integrating the MBARI PUCK protocol, IEEE 1451 and OGC SWE 2.0, is illustrated with a five-layer architecture. The Web Service layer and Event Process layer are described in detail with an actual example. The demonstration study has shown that a standards-based system can be built to access sensors and marine instruments distributed globally, using common web browsers, for monitoring the environment and oceanic conditions. Beyond serving marine sensor data on the Web, this Marine Sensor Web framework can also play an important role in information integration in many other domains.

  16. SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy; hide

    2010-01-01

    This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user controls into the flight software modules associated with the on-orbit sensor, and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, including a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will also be presented, along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails with efforts to develop "cloud computing" methods and standards.

  17. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In a Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbeds will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches that would improve the application of climate science research results to urban climate resilience.

  18. ISTIMES Integrated System for Transport Infrastructures Surveillance and Monitoring by Electromagnetic Sensing

    NASA Astrophysics Data System (ADS)

    Argenti, M.; Giannini, V.; Averty, R.; Bigagli, L.; Dumoulin, J.

    2012-04-01

    The EC FP7 ISTIMES project has the goal of realizing an ICT-based system exploiting distributed and local sensors for non-destructive electromagnetic monitoring, in order to make critical transport infrastructures more reliable and safe. Higher situation awareness, thanks to real-time, detailed information and images of the monitored infrastructure status, improves decision capabilities for emergency management stakeholders. Web-enabled sensors and a service-oriented approach are used as the core of the architecture, providing a system that adopts open standards (e.g. OGC SWE, OGC CSW) and strives for full interoperability with other GMES and European Spatial Data Infrastructure initiatives as well as compliance with INSPIRE. The system exploits an open, easily scalable network architecture to accommodate a wide range of sensors, integrated with a set of tools for handling, analyzing and processing large data volumes from different organizations with different data models. Situation awareness tools are also integrated in the system. The definition of sensor observations and services follows a metadata model based on the ISO 19115 core set of metadata elements and the O&M model of OGC SWE. The ISTIMES infrastructure is based on an e-Infrastructure for geospatial data sharing, comprising: a Data Catalog that implements the discovery services for sensor data retrieval, acting as a broker through static connections based on standard SOS and WNS interfaces; a Decision Support component that helps decision makers by providing support for data fusion and inference and the generation of situation indexes; a Presentation component that implements system-user interaction services for information publication and rendering, by means of a Web portal using SOA design principles; and a security framework using the Shibboleth open source middleware, based on the Security Assertion Markup Language, supporting Single Sign-On (SSO). ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.

  19. Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii

    2013-04-01

    Over the last decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Floods Directive). In order to enable operational flood monitoring and assessment of flood risk, an infrastructure with standardized interfaces and services is required. Grid and Sensor Web technologies can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, an integration of the Grid and Sensor Web approaches is proposed [1]. A Grid represents a distributed environment that integrates heterogeneous computing and storage resources administered by multiple organizations. The Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. Basic Sensor Web functionality includes sensor discovery, the triggering of events by observed or predicted conditions, remote data access, and processing capabilities to generate and deliver data products. The Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding the integration of the Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from integration with a Grid platform like the Globus Toolkit. The proposed approach is implemented within a Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for the rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecasts derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by the Grid and cloud infrastructure it was possible to generate flood maps within 24-48 h after a trigger was alerted. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.

  20. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time-series data have been freely available around the globe from Earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes in the environment, but for end users it is often too complex to extract this information from the original time-series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis and monitoring tools for global spatial time-series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Besides data access tools, users can also perform further time-series analyses such as trend calculations, breakpoint detection or the derivation of phenological parameters from vegetation time-series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline use. Automated monitoring and alerting for the time-series data integrated by the user is provided by an OGC Sensor Observation Service coupled with an OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., a precipitation value higher than x millimetres per day, the occurrence of a MODIS fire point, or the detection of a time-series anomaly). Datasets integrated in the SOS service are updated in near-real-time based on the linked data providers mentioned above. An alert is automatically pushed to the user if new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
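
    The filter expressions described above amount to simple predicates evaluated whenever new data arrive; the following Python sketch shows the general idea, with hypothetical names and a pluggable notification hook rather than the EOM's actual implementation:

        # Minimal sketch of a threshold filter evaluated on new observations.
        from dataclasses import dataclass

        @dataclass
        class FilterExpression:
            parameter: str    # e.g. "precipitation"
            operator: str     # "gt", "lt" or "eq"
            threshold: float  # e.g. millimetres per day

            def matches(self, parameter, value):
                if parameter != self.parameter:
                    return False
                return {"gt": value > self.threshold,
                        "lt": value < self.threshold,
                        "eq": value == self.threshold}[self.operator]

        def check_new_observation(expr, parameter, value, notify):
            """Push an alert if a newly ingested value satisfies the filter."""
            if expr.matches(parameter, value):
                notify(f"{parameter} = {value} triggered alert ({expr})")

        # Example: alert when daily precipitation exceeds 50 mm.
        expr = FilterExpression("precipitation", "gt", 50.0)
        check_new_observation(expr, "precipitation", 63.2, notify=print)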

  1. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructures. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists that can disseminate messages after event evaluation in real time. The research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geospatial data under a popular usage platform. Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructures to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms by utilizing sensor web and spatial information, achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  2. Crowdsourcing, citizen sensing and Sensor Web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel Boulos, Maged; Resch, Bernd; Crowley, David N.

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three-dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, 'noise', misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  3. Datacube Interoperability, Encoding Independence, and Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistical and OLAP datacubes, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally fall under the concept of coverages, which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable yet format-independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract's authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard, so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as its Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both the OGC and the INSPIRE Reference Implementation. In the global EarthServer initiative, rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are the values?), a range set (what are these values?), and a range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled representations. Further, CIS 1.1 offers a unified model for any kind of regular and irregular grids, also allowing sensor models as per SensorML. Encodings include ASCII formats like GML, JSON and RDF as well as binary formats like GeoTIFF, NetCDF, JPEG2000 and GRIB2; further, a container concept allows mixed representations within one coverage file utilizing zip or other convenient package formats. Through the tight integration with Sensor Web Enablement (SWE), a lossless "transport" from the sensor world into the coverage world is ensured. The corresponding service model of WCS supports datacube operations ranging from simple data extraction to complex ad-hoc analytics with WCPS. Notably, the W3C has set out on a coverage model as well; it has been designed relatively independently from the abovementioned standards, but there is informal agreement to link it into the CIS universe (which allows for different, yet interchangeable, representations). Particularly interesting in the W3C proposal is the detailed semantic modeling of metadata; as CIS 1.1 supports RDF, a tight coupling seems feasible.
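
    To make the datacube extraction operation concrete, the hedged sketch below trims a cube along space and time with a WCS 2.0 KVP GetCoverage request; the endpoint, coverage id and axis labels are hypothetical (axis names depend on the coverage's CIS description):

        # Minimal sketch: WCS 2.0 GetCoverage with trimming and slicing.
        # Endpoint, coverage id and axis labels are hypothetical.
        import requests

        WCS = "https://example.org/wcs"

        params = [
            ("service", "WCS"),
            ("version", "2.0.1"),
            ("request", "GetCoverage"),
            ("coverageId", "SST_CUBE"),
            ("subset", "Lat(40,50)"),
            ("subset", "Long(-10,5)"),
            ("subset", 'ansi("2017-01-01T00:00:00Z")'),  # time slice
            ("format", "application/netcdf"),
        ]

        resp = requests.get(WCS, params=params, timeout=120)
        resp.raise_for_status()
        with open("sst_slice.nc", "wb") as f:
            f.write(resp.content)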

  4. Open Standards in Practice: An OGC China Forum Initiative

    NASA Astrophysics Data System (ADS)

    Yue, Peng; Zhang, Mingda; Taylor, Trevor; Xie, Jibo; Zhang, Hongping; Tong, Xiaochong; Yu, Jinsongdi; Huang, Juntao

    2016-11-01

    Open standards like OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standards-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China Forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could change traditional manual interactions in their business processes, and provide on-demand decision support results by online service integration. In the initiative, six organizations are involved in two "MashUp" scenarios on disaster management: one derives flood maps for Poyang Lake, Jiangxi, and the other generates turbidity maps on demand for the East Lake, Wuhan, China. The two scenarios engage different organizations from the Chinese community by integrating sensor observations, data, and processing services from them, and improve the automation of the data analysis process using open standards.

  5. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    A lot of water information and tools exist in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice); modelling floods using a WPS 2.0 with WaterML 2.0 data and weather forecast models as input; evaluation of the applicability of Sensor Notification Services in water emergencies; and open distribution of the input and output data as OGC web services (WaterML, WCS, WFS) with visualization utilities (WMS). The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert, as sketched below. The data are translated into the OGC WaterML 2.0 time-series data format and ingested into an SOS 2.0 service. SOS data are visualized in an SOS client able to handle time series. The meteorological forecast data (under the supervision of an operator manipulating the WPS user interface), together with the WaterML 2.0 time series and terrain data, are input to a flood modelling algorithm. The WPS produces flood datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored from an emergency control response service. Acronyms - AS: Alert Service; ES: Event Service; ICT: Information and Communication Technology; NS: Notification Service; OGC: Open Geospatial Consortium; RIBASE: River Basin Standards Interoperability Pilot; SOS: Sensor Observation Service; WaterML: Water Markup Language; WCS: Web Coverage Service; WMS: Web Map Service; WPS: Web Processing Service.
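
    A hedged sketch of the flood-modelling trigger, shown for brevity as a WPS 1.0-style KVP Execute (the pilot targets WPS 2.0, whose requests are typically XML); the endpoint, process identifier and inputs are hypothetical:

        # Minimal sketch: invoking a flood model through a WPS.
        # Endpoint, process id and inputs are hypothetical placeholders.
        import requests

        WPS = "https://example.org/wps"

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "flood_model",
            # KVP DataInputs are semicolon-separated key=value pairs.
            "DataInputs": ("gauges=https://example.org/gauges.wml2;"
                           "forecast=https://example.org/forecast.nc;"
                           "return_period=100"),
        }

        resp = requests.get(WPS, params=params, timeout=300)
        resp.raise_for_status()
        print(resp.text[:500])  # ExecuteResponse XML referencing the flood coverage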

  6. Moving Beyond the 10,000 Ways That Don't Work

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Arctur, D. K.; Rueda, C.

    2009-12-01

    From his research in developing light bulb filaments, Thomas Edison provided us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) for publishing data and its descriptions, and different vocabularies for naming things (e.g. parameters, sensor types). Simplifying this heterogeneity of technologies is accomplished not only by adopting standards, but by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison could provide us a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration transformed into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance the interoperability of ocean observing systems by using OGC standards. Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms. Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned, and in particular the strategy of experimenting with technologies to advance standards for publishing data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.

  7. Sharing Human-Generated Observations by Integrating HMI and the Semantic Sensor Web

    PubMed Central

    Sigüenza, Álvaro; Díaz-Pardo, David; Bernat, Jesús; Vancea, Vasile; Blanco, José Luis; Conejero, David; Gómez, Luis Hernández

    2012-01-01

    Current “Internet of Things” concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound. PMID:22778643

  8. Sharing human-generated observations by integrating HMI and the Semantic Sensor Web.

    PubMed

    Sigüenza, Alvaro; Díaz-Pardo, David; Bernat, Jesús; Vancea, Vasile; Blanco, José Luis; Conejero, David; Gómez, Luis Hernández

    2012-01-01

    Current "Internet of Things" concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound.

  9. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data and limit data users' ability to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as the Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools publicizes the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a roughly ten-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
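
    As an example of how end users can consume such services, the hedged sketch below requests a map rendering through a WMS endpoint with the OWSLib client library; the service URL and layer name are hypothetical placeholders:

        # Minimal sketch: fetching a WMS map image with OWSLib.
        # Service URL and layer name are hypothetical.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/wms", version="1.1.1")
        print(list(wms.contents))  # layer names advertised by the service

        img = wms.getmap(
            layers=["example_layer"],
            srs="EPSG:4326",
            bbox=(-180, -90, 180, 90),
            size=(1024, 512),
            format="image/png",
        )
        with open("preview.png", "wb") as f:
            f.write(img.read())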

  10. Autonomous Mission Operations for Sensor Webs

    NASA Astrophysics Data System (ADS)

    Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.

    2008-12-01

    We present interim results of a 2005 ROSES AIST project entitled "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents intends to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of the future behavior of the modeled subsystems and components to which they are assigned. The software agents use Semantic Web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. Ontological conceptualizations of the agents are needed to enable autonomous and autonomic operation of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end-user science requests. The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.

  11. Multifunctional Web Enabled Ocean Sensor Systems for the Monitoring of a Changing Ocean

    NASA Astrophysics Data System (ADS)

    Pearlman, Jay; Castro, Ayoze; Corrandino, Luigi; del Rio, Joaquin; Delory, Eric; Garello, Rene; Heuermann, Rudinger; Martinez, Enoc; Pearlman, Francoise; Rolin, Jean-Francois; Toma, Daniel; Waldmann, Christoph; Zielinski, Oliver

    2016-04-01

    As stated in the 2010 "Ostend Declaration", a major challenge in the coming years is the development of a truly integrated and sustainably funded European Ocean Observing System supporting major policy initiatives such as the Integrated Maritime Policy and the Marine Strategy Framework Directive. This will be achieved with more long-term measurements of key parameters supported by a new generation of sensors whose cost and reliability will enable broad and consistent observations. Within the NeXOS project, a framework including new sensor capabilities and interface software has been put together that embraces the key technical aspects needed to improve the temporal and spatial coverage, resolution and quality of marine observations. The developments include new, low-cost, compact and integrated sensors with multiple functionalities that will allow measurements useful for a number of objectives, ranging from more precise monitoring and modeling of the marine environment to an improved assessment of fisheries. The project is entering its third year and will demonstrate initial capabilities of optical and acoustic sensor prototypes that will become available for a number of platforms. For fisheries management, there is also a series of sensors that support an Ecosystem Approach to Fisheries (EAF). The greatest capabilities for comprehensive operations will occur when these sensors can be integrated into a multi-sensor capability on a single platform or on multiple interconnected and coordinated platforms. Within NeXOS, the full processing chain, from the sensor signal all the way up to distribution of the collected environmental information, will be encapsulated into standardized, state-of-the-art smart sensor interface and web components to provide both improved integration and a flexible interface for scientists to control sensor operation. The use of the OGC SWE (Sensor Web Enablement) set of standards, such as OGC PUCK and SensorML, at the instrument-to-platform integration phase will provide standard mechanisms for a truly plug'n'work connection. Through this, NeXOS instruments will maintain within themselves specific information about how a platform (buoy controller, AUV controller, observatory controller) has to configure and communicate with the instrument, without the platform needing previous knowledge about the instrument. This mechanism is now being evaluated on real platforms such as a Slocum glider from Teledyne Webb Research, the SeaExplorer glider from Alseamar, the Provor float from NKE, and others, including non-commercial platforms like the OBSEA seafloor cabled observatory. The latest developments in the NeXOS sensors and their integration into an observation system will be discussed, addressing demonstration plans both for a variety of platforms and for scientific objectives supporting marine management.

  12. OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems

    NASA Astrophysics Data System (ADS)

    Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.

    2009-12-01

    An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open, collaborative way requires that the process match the expected code functionality to the developers' personal expertise and organizational needs, as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source PERL, PYTHON, JAVA and ASP toolkits and reference implementations are helping the marine community publish near-real-time observation data in interoperable standard formats. In some cases, publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files, a database or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalog Service for the Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the "GetCapabilities" response of an SOS. OPENIOOS is the web client, developed in PERL, used to visualize the sensors in the SOS services. While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing, and the ease of use has played a large role in spreading interoperable, standards-compliant web services widely in the marine community.

  13. Design and Development of a Framework Based on OGC Web Services for the Visualization of Three-Dimensional Large-Scale Geospatial Data Over the Web

    NASA Astrophysics Data System (ADS)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes have traditionally been bound to a two-dimensional, simplified representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  14. Proteus - A Free and Open Source Sensor Observation Service (SOS) Client

    NASA Astrophysics Data System (ADS)

    Henriksson, J.; Satapathy, G.; Bermudez, L. E.

    2013-12-01

    The Earth's 'electronic skin' is becoming ever more sophisticated, with a growing number of sensors measuring everything from seawater salinity levels to atmospheric pressure. To further the scientific application of this data collection effort, it is important to make the data easily available to anyone who wants to use it. Making Earth science data readily available will allow the data to be used in new and potentially groundbreaking ways. The US National Science and Technology Council made this clear in its most recent National Strategy for Civil Earth Observations report, when it remarked that Earth observations 'are often found to be useful for additional purposes not foreseen during the development of the observation system'. On the road to this goal, the Open Geospatial Consortium (OGC) is defining uniform data formats and service interfaces to facilitate the discovery of and access to sensor data. This is being done through the Sensor Web Enablement (SWE) stack of standards, which includes the Sensor Observation Service (SOS), Sensor Model Language (SensorML), Observations & Measurements (O&M) and Catalog Service for the Web (CSW). End users do not have to use these standards directly, but can use smart tools that leverage and implement them. We have developed such a tool, named Proteus. Proteus is an open-source sensor data discovery client. The goal of Proteus is to be a general-purpose client that can be used by anyone for discovering and accessing sensor data via OGC-based services. Proteus is a desktop client and supports a straightforward workflow for finding sensor data. The workflow takes the user through the process of selecting appropriate services, bounding boxes, observed properties, time periods and other search facets. NASA World Wind is used to display the matching sensor offerings on a map. Data from any sensor offering can be previewed as a time series. The user can download data from a single sensor offering, or download data in bulk from all matching sensor offerings. Proteus leverages NASA World Wind's WMS capabilities and allows sensor offerings to be overlaid on top of any map. Specific search criteria (i.e. user discoveries) can be saved and later restored. Proteus supports two user types: 1) the researcher/scientist interested in discovering and downloading specific sensor data as input to research processes, and 2) the data manager who is responsible for maintaining sensor data services (e.g. SOSs) and wants to ensure proper data and metadata delivery, verify sensor data, and receive sensor data alerts. Proteus has a Web-based companion product named the Community Hub that is used to generate sensor data alerts. Alerts can be received via an RSS feed, viewed in a Web browser or displayed directly in Proteus via a Web-based API. To advance the vision of making Earth science data easily discoverable and accessible to end users, professional or lay, Proteus is available as open source on GitHub (https://github.com/intelligentautomation/proteus).
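
    The discovery workflow described above ultimately reduces to a handful of parameterized SOS requests. The sketch below shows the kind of KVP GetObservation call a client like Proteus might issue; the endpoint, offering and property names are placeholders, not values taken from the tool itself.

      # Sketch of a KVP GetObservation request such as an SOS client might build.
      # The endpoint and parameter values below are placeholders.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      def get_observation(endpoint, offering, prop, t_begin, t_end):
          params = {
              "service": "SOS", "version": "1.0.0",
              "request": "GetObservation",
              "offering": offering,
              "observedProperty": prop,
              "eventTime": f"{t_begin}/{t_end}",
              "responseFormat": 'text/xml;subtype="om/1.0.0"',
          }
          return urlopen(f"{endpoint}?{urlencode(params)}").read()

      # Example (placeholder URL):
      # xml = get_observation("http://example.org/sos", "SEA_WATER_TEMPERATURE",
      #                       "urn:ogc:def:phenomenon:temperature",
      #                       "2013-07-01T00:00:00Z", "2013-07-02T00:00:00Z")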

  15. An ontology for sensor networks

    NASA Astrophysics Data System (ADS)

    Compton, Michael; Neuhaus, Holger; Bermudez, Luis; Cox, Simon

    2010-05-01

    Sensors and networks of sensors are important means of monitoring and digitizing reality. As the number and size of sensor networks grow, so too does the amount of data collected. Users of such networks typically need to discover the sensors and data that fit their needs without necessarily understanding the complexities of the network itself. The burden on users is eased if the network and its data are expressed in terms of concepts familiar to the users and their job functions, rather than in terms of the network or how it was designed. Furthermore, the task of collecting and combining data from multiple sensor networks is made easier if metadata about the data and the networks are stored in formats and conceptual models that are amenable to machine reasoning and inference. While the OGC's (Open Geospatial Consortium) SWE (Sensor Web Enablement) standards provide for the description of and access to data and metadata for sensors, they do not provide facilities for abstraction, categorization, and reasoning consistent with standard technologies. Once sensors and networks are described using rich semantics (that is, by using logic to describe the sensors, the domain of interest, and the measurements), reasoning and classification can be used to analyse and categorise data, relate measurements with similar information content, and manage, query and task sensors. This will enable new types of automated processing and logical assurance built on OGC standards. The W3C SSN-XG (Semantic Sensor Networks Incubator Group) is producing a generic ontology to describe sensors, their environment and the measurements they make. The ontology provides definitions for the structure of sensors and observations, leaving the details of the observed domain unspecified. This allows abstract representations of real-world entities, which are not observed directly but through their observable qualities. Ontologies for domain semantics, units of measurement, time and time series, and location and mobility can easily be attached when instantiating the ontology for particular sensors in a domain. After a review of previous work on the specification of sensors, the group is developing the ontology in conjunction with use-case development. Part of the difficulty of such work is that relevant concepts from, for example, OGC standards and other ontologies must be identified, aligned, and placed in a consistent and logically correct way into the ontology. In terms of alignment with OGC's SWE, the ontology is intended to be able to model concepts from SensorML and O&M. Similar to SensorML and O&M, the ontology is based around the concepts of systems, processes, and observations. It supports the description of the physical and processing structure of sensors. Sensors are not constrained to physical sensing devices: rather, a sensor is anything that can estimate or calculate the value of a phenomenon, so a device, a computational process, or a combination of the two could play the role of a sensor. The representation of a sensor in the ontology links together what is measured (the domain phenomena), the sensor's physical and other properties, and its functions and processing. Parts of the ontology are well aligned with SensorML and O&M, but parts are not, and the group is working to understand how differences from (and alignment with) the OGC standards affect the application of the ontology.
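
    To make the idea of describing sensors with rich semantics concrete, the following sketch builds a tiny RDF graph in the spirit of the incubator group's ontology. The namespace URI and class names follow the SSN-XG drafts, but treat them and the example instance names as illustrative rather than normative; the rdflib package is assumed.

      # Illustrative sketch: describe a sensor and what it observes as RDF,
      # loosely following the SSN-XG style. Requires the rdflib package.
      from rdflib import Graph, Namespace, Literal, RDF, RDFS

      SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")
      EX = Namespace("http://example.org/sensors#")

      g = Graph()
      g.bind("ssn", SSN)
      g.add((EX.thermometer1, RDF.type, SSN.SensingDevice))
      g.add((EX.seaTemperature, RDF.type, SSN.Property))
      g.add((EX.thermometer1, SSN.observes, EX.seaTemperature))
      g.add((EX.thermometer1, RDFS.label, Literal("Hull-mounted thermometer")))

      print(g.serialize(format="turtle"))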

  16. Generic Sensor Data Fusion Services for Web-enabled Environmental Risk Management and Decision-Support Systems

    NASA Astrophysics Data System (ADS)

    Sabeur, Zoheir; Middleton, Stuart; Veres, Galina; Zlatev, Zlatko; Salvo, Nicola

    2010-05-01

    The advancement of smart sensor technology in the last few years has led to an increase in the deployment of affordable sensors for monitoring the environment around Europe. This is generating large amounts of sensor observation information, inevitably leading to the problems of how to manage large volumes of data and how to make sense of the data for decision-making. In addition, the various European directives (Water Framework Directive, Bathing Water Directive, Habitats Directive, etc.) which regulate human activities in the environment, together with the INSPIRE Directive on spatial information management, have implicitly led the designated environment agencies and authorities of European Member States to put in place new sensor monitoring infrastructure and to share information about the environmental regions under their statutory responsibility. They will need to work across borders and collectively reach environmental quality standards. They will also need to report regularly to the EC on the quality of the environments for which they are responsible and make such information accessible to members of the public. In recent years, early pioneering work on the design of service-oriented architectures using sensor networks has been achieved. Information web-service infrastructures using existing data catalogues and web GIS map services can now be enriched by deploying new sensor observation, data fusion and modelling services using OGC standards. Services that describe sensor observations and intelligent data processing using data fusion techniques can now be implemented to provide added-value information, with spatio-temporal uncertainties, to the next generation of decision-support systems. Such decision-support systems have become key to deploy across Europe in order to comply with EU environmental regulations and INSPIRE. In this paper, data fusion services using OGC standards with sensor observation data streams are described in the context of a geo-distributed service infrastructure specialising in multiple environmental risk management and decision support. The sensor data fusion services are deployed and validated in two use cases, concerned respectively with: 1) microbial risk forecasts in bathing waters; and 2) geohazards in urban zones during underground tunneling activities. This research was initiated in the SANY Integrated Project (www.sany-ip.org) and funded by the European Commission under the 6th Framework Programme.

  17. A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows

    NASA Astrophysics Data System (ADS)

    Babin, B. L.; Hu, L.

    2008-12-01

    The Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) environmental monitoring systems provide a unified coastal ocean observing system. The two systems are mirrored to maintain autonomy while offering an integrated data-sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL Servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The use of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available in common standard formats, "home grown" software has been developed; one example is software that generates XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook for implementing OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).

  18. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    PubMed

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity impedes the interconnection between IoT devices and limits the potential of the IoT. To address this issue, this research proposes an interoperable solution, called the tasking capability description, that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
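
    As a rough illustration of tasking through a uniform web interface, the sketch below POSTs a Task entity to a SensorThings-style endpoint, following the general shape of the SensorThings Tasking extension. The base URL, entity identifier and tasking parameters are hypothetical, not taken from the paper.

      # Hypothetical sketch: submit a tasking request to a SensorThings-style
      # endpoint. Entity layout mirrors the Tasking extension; URL is a placeholder.
      import json
      from urllib.request import Request, urlopen

      def create_task(base_url: str, capability_id: int, params: dict) -> bytes:
          task = {
              "taskingParameters": params,
              "TaskingCapability": {"@iot.id": capability_id},
          }
          req = Request(f"{base_url}/Tasks",
                        data=json.dumps(task).encode("utf-8"),
                        headers={"Content-Type": "application/json"},
                        method="POST")
          return urlopen(req).read()

      # e.g. create_task("http://example.org/v1.0", 1, {"switch": "on"})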

  19. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

    As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are growing rapidly. In order to use these data both in traditional operational modes and in innovative "Big Data" applications, the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement (SWE) [2]. The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve data management and data transfer from a process requiring significant manual intervention into an automated operational process enabling the rapid, standards-based ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of the ongoing implementation of this strategy. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.

  20. Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project

    NASA Technical Reports Server (NTRS)

    Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.

    2006-01-01

    NASA requires timely, on-demand data and analysis capabilities to realize the practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three-year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, the Sensor Planning Service - SPS, the Sensor Observation Service - SOS, the Sensor Alert Service - SAS, and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and autonomous sensor tasking for unique data collection.

  1. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, quality-controlled processed data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and handling of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards; following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. These assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of how to perform local tests. It will also provide information about how to participate in the open source development of TEAM Engine.
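
    To convey what a compliance assertion looks like in practice, here is a small self-contained stand-in for the kind of check TEAM Engine automates, written in Python rather than CTL/TestNG. The assertions and the capabilities snippet are invented for illustration and are not taken from the actual WFS 1.0.0 test suite.

      # Sketch of a compliance-style assertion (TEAM Engine uses CTL/TestNG;
      # this Python stand-in just shows the flavour). Test document is invented.
      import xml.etree.ElementTree as ET

      WFS = "http://www.opengis.net/wfs"

      def assert_capabilities(xml_text: str) -> list:
          """Return a list of failed-assertion messages (empty means pass)."""
          failures = []
          root = ET.fromstring(xml_text)
          if not root.tag.endswith("WFS_Capabilities"):
              failures.append(f"unexpected root element: {root.tag}")
          if root.get("version") != "1.0.0":
              failures.append("version attribute must be 1.0.0")
          if root.find(f"{{{WFS}}}FeatureTypeList") is None:
              failures.append("missing wfs:FeatureTypeList")
          return failures

      doc = f'<WFS_Capabilities xmlns="{WFS}" version="1.0.0"><FeatureTypeList/></WFS_Capabilities>'
      print(assert_capabilities(doc))  # [] -> all assertions pass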

  2. Availability of the OGC geoprocessing standard: March 2011 reality check

    NASA Astrophysics Data System (ADS)

    Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier

    2012-10-01

    This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification supports standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check on the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. They also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
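
    A focused crawler of the kind used in this study ultimately reduces to probing candidate URLs with a GetCapabilities request and inspecting the root element of the response. A minimal sketch follows; the endpoint is a placeholder and the namespace check is a simplification of what a real crawler would do.

      # Minimal probe: does a URL answer as a WPS? (placeholder endpoint)
      import xml.etree.ElementTree as ET
      from urllib.parse import urlencode
      from urllib.request import urlopen

      def is_wps(url: str) -> bool:
          query = urlencode({"service": "WPS", "request": "GetCapabilities"})
          try:
              root = ET.fromstring(urlopen(f"{url}?{query}", timeout=10).read())
          except Exception:
              return False
          # WPS 1.0.0 capabilities carry the wps namespace in the root element.
          return root.tag.endswith("Capabilities") and "wps" in root.tag.lower()

      # e.g. is_wps("http://example.org/wps")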

  3. Development of Integration Framework for Sensor Network and Satellite Image based on OGC Web Services

    NASA Astrophysics Data System (ADS)

    Ninsawat, Sarawut; Yamamoto, Hirokazu; Kamei, Akihide; Nakamura, Ryosuke; Tsuchida, Satoshi; Maeda, Takahisa

    2010-05-01

    With the availability of network-enabled sensing devices, the volume of information being collected by networked sensors has increased dramatically in recent years. Over 100 physical, chemical and biological properties can be sensed using in-situ or remote sensing technology. A collection of these sensor nodes forms a sensor network, which is easily deployable to provide a high degree of visibility into real-world physical processes as events unfold. Sensor observation networks allow diverse types of data to be gathered at greater spatial and temporal resolution through wired or wireless network infrastructure; real-time or near-real-time data from such networks thus allow researchers and decision-makers to respond speedily to events. In environmental monitoring, however, the capability to acquire in-situ data periodically is not sufficient on its own: the management and proper utilization of the data also need careful consideration. This requires database and IT solutions that are robust, scalable and able to interoperate among different, distributed stakeholders to provide lucid, timely and accurate updates to researchers, planners and citizens. The GEO (Global Earth Observation) Grid aims primarily at providing an e-Science infrastructure for the earth science community. The GEO Grid is designed to integrate various kinds of earth observation data using grid technology, which is developed for sharing data, storage, and the computational power of high-performance computing, and is accessible as a set of services. A comprehensive web-based system for integrating field sensor data and satellite imagery, based on various open standards of the OGC (Open Geospatial Consortium), has been developed. The Web Processing Service (WPS), which is most likely the future direction of Web GIS, performs computations on spatial data from distributed data sources and returns the outcome in a standard format. The interoperability capabilities and Service Oriented Architecture (SOA) of web services allow sensor network measurements available from a Sensor Observation Service (SOS) and satellite remote sensing data from a Web Map Service (WMS) to be incorporated as distributed data sources for the WPS. Various applications have been developed to demonstrate the efficacy of integrating heterogeneous data sources. One example is the validation of the MODIS aerosol products (MOD08_D3, the Level-3 MODIS Atmosphere Daily Global Product) against ground-based measurements from sunphotometers (skyradiometer, Prede POM-02) installed at Phenological Eyes Network (PEN) sites in Japan. Furthermore, a web-based framework for studying the relationship between vegetation indices calculated from MODIS surface reflectance (MOD09GA, the Surface Reflectance Daily L2G Global 1 km and 500 m Product) and Gross Primary Production (GPP) field measurements at flux tower sites in Thailand and Japan has also been developed. The success of both applications will help maximize data utilization and improve the accuracy of information by validating MODIS satellite products against accurate, temporally dense field measurements.
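
    For context, the vegetation index mentioned above is typically the NDVI, computed per pixel from the red and near-infrared surface reflectance bands as NDVI = (NIR - Red) / (NIR + Red). A small sketch of the standard formula (with invented sample values):

      # NDVI = (NIR - Red) / (NIR + Red), computed per pixel with numpy.
      import numpy as np

      def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
          nir = nir.astype(np.float64)
          red = red.astype(np.float64)
          denom = nir + red
          # Guard against division by zero on empty/masked pixels.
          return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

      nir = np.array([[0.45, 0.50], [0.40, 0.0]])
      red = np.array([[0.10, 0.12], [0.15, 0.0]])
      print(ndvi(nir, red))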

  4. Use of ebRIM-based CSW with sensor observation services for registry and discovery of remote-sensing observations

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya; Wei, Yaxing

    2009-02-01

    Recent advances in Sensor Web geospatial data capture, such as high-resolution satellite imagery and Web-ready data processing and modeling technologies, have led to the generation of large numbers of datasets from real-time or near-real-time observations and measurements. Finding which sensor or data complies with criteria such as specific times, locations, and scales has become a bottleneck for Sensor Web-based applications, especially for remote-sensing observations. In this paper, an architecture integrating the Sensor Observation Service (SOS) with the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is put forward. The architecture consists of a distributed geospatial sensor observation service, a geospatial catalogue service based on the ebXML Registry Information Model (ebRIM), SOS search and registry middleware, and a geospatial sensor portal. The SOS search and registry middleware finds potential SOSs, generates data granule information and inserts the records into the CSW. The contents and sequence of the services, the available observations, and the metadata of the observations registry are described. A prototype system was designed and implemented using the service middleware technology and a standard interface and protocol. The feasibility and the response times of observation registry and retrieval were evaluated using a realistic Earth Observing-1 (EO-1) SOS scenario. Extracting information from the SOS requires the same execution time as record generation for the CSW, and the average data retrieval response time in SOS+CSW mode is 17.6% of that in SOS-alone mode. The proposed architecture thus offers clear advantages in SOS search and observation data retrieval over existing Sensor Web enabled systems.

  5. A Walk through TRIDEC's intermediate Tsunami Early Warning System for the Turkish and Portuguese NEAMWave12 exercise tsunami scenarios

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Reißland, Sven; Schulz, Jana

    2013-04-01

    On November 27-28, 2012, the Kandilli Observatory and Earthquake Research Institute (KOERI) and the Portuguese Institute for the Sea and Atmosphere (IPMA) joined other countries in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region as participants in an international tsunami response exercise. The exercise, titled NEAMWave12, simulated widespread Tsunami Watch situations throughout the NEAM region. It was the first international exercise of its kind in this region, in which the UNESCO-IOC ICG/NEAMTWS tsunami warning chain was tested at full scale with different systems. One of these systems is being developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) and was validated in this exercise by, among others, KOERI and IPMA. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment in, e.g., the NEAM region. The TRIDEC system will be implemented in three phases, each with a demonstrator, with successive demonstrators addressing related challenges. The first- and second-phase system demonstrators, deployed in KOERI's crisis management room and at IPMA, have been designed and implemented, firstly, to support plausible scenarios for the Turkish and Portuguese NTWCs and to demonstrate the treatment of simulated tsunami threats with an essential subset of an NTWC. Secondly, they demonstrate the feasibility and potential of the implemented approach, covering ICG/NEAMTWS standard operations as well as tsunami detection and alerting functions beyond ICG/NEAMTWS requirements. The demonstrator presented addresses information management and decision-support processes for hypothetical tsunami-related crisis situations in the context of the ICG/NEAMTWS NEAMWave12 exercise for the Turkish and Portuguese exercise scenarios. Impressions gained with the standards-compliant TRIDEC system during the exercise will be reported. The system version presented is based on event-driven architecture (EDA) and service-oriented architecture (SOA) concepts and makes use of relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to decide quickly whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using the OGC Web Map Service (WMS) and Web Feature Service (WFS), spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP), together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE).
This demonstration is linked with the talk 'Experiences with TRIDEC's Crisis Management Demonstrator in the Turkish NEAMWave12 exercise tsunami scenario' (EGU2013-2833) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.6).
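
    To illustrate the final step of the warning chain, the sketch below assembles a minimal CAP 1.2 message of the kind a system like TRIDEC disseminates. The identifiers, sender and wording are invented for the example, and a real alert would carry additional mandatory and optional elements.

      # Minimal CAP 1.2 alert skeleton; identifiers and text are invented.
      import xml.etree.ElementTree as ET

      CAP = "urn:oasis:names:tc:emergency:cap:1.2"
      ET.register_namespace("", CAP)

      def build_cap_alert(identifier, sender, sent, headline):
          alert = ET.Element(f"{{{CAP}}}alert")
          for tag, text in [("identifier", identifier), ("sender", sender),
                            ("sent", sent), ("status", "Exercise"),
                            ("msgType", "Alert"), ("scope", "Public")]:
              ET.SubElement(alert, f"{{{CAP}}}{tag}").text = text
          info = ET.SubElement(alert, f"{{{CAP}}}info")
          for tag, text in [("category", "Geo"), ("event", "Tsunami"),
                            ("urgency", "Immediate"), ("severity", "Extreme"),
                            ("certainty", "Observed"), ("headline", headline)]:
              ET.SubElement(info, f"{{{CAP}}}{tag}").text = text
          return ET.tostring(alert, encoding="unicode")

      print(build_cap_alert("EX-2012-001", "exercise@example.org",
                            "2012-11-27T09:00:00+00:00",
                            "NEAMWave12 exercise tsunami watch"))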

  6. Sensing Models and Sensor Network Architectures for Transport Infrastructure Monitoring in Smart Cities

    NASA Astrophysics Data System (ADS)

    Simonis, Ingo

    2015-04-01

    Transport infrastructure monitoring and analysis is one of the focus areas in the context of smart cities. With the growing number of people moving into densely populated urban metro areas, precise tracking of moving people and goods is the basis for sound decision-making and future planning. To define optimal extensions and modifications to existing transport infrastructures, multi-modal transport has to be monitored and analysed. This is done on the basis of sensor networks that combine a variety of sensor models, types, and deployments within the area of interest. Multi-generation networks, consisting of a number of sensor types and versions, cause further challenges for the integration and processing of sensor observations. These challenges are not getting any smaller with the development of the Internet of Things, which brings promising opportunities but is currently stuck in a kind of protocol war between big industry players from both the hardware and network infrastructure domains. In this paper, we highlight how the OGC suite of standards, with the Sensor Web standards developed by the Sensor Web Enablement initiative together with the latest developments by the Sensor Web for Internet of Things community, can be applied to the monitoring and improvement of transport infrastructures. Sensor Web standards have in the past been applied to purely technical domains, but now need to be broadened to meet new challenges. Only cross-domain approaches will allow the development of satisfactory transport infrastructure solutions that take into account requirements coming from a variety of sectors such as tourism, administration, the transport industry, emergency services, and private individuals. The goal is the development of interoperable components that can be easily integrated within data infrastructures and that follow well-defined information models to allow robust processing.

  7. Low-energy, low-budget sensor web enablement of an amateur weather station

    NASA Astrophysics Data System (ADS)

    Schmidt, G.; Herrnkind, S.; Klump, J.

    2008-12-01

    Sensor Web Enablement (OGC SWE) has developed into a powerful concept with many potential applications in environmental monitoring and other fields. This has spurred the development of software for Sensor Observation Services (SOS), while the development of client applications still lags behind. Furthermore, the deployment of sensors in the field often places tight constraints on the energy and bandwidth available for data capture and transmission. As a proof of concept, we equipped an amateur weather station with low-budget, standard components to read the data from its base station and feed them into a sensor observation service through its standard web service interface. We chose the weather station as an example because of its simple measured phenomena and its low data volume. As the sensor observation service we chose the open source software package offered by the 52°North consortium. Power consumption can be problematic when deploying a sensor platform in the field. Instead of a common PC we used a Network Storage Link unit (NSLU2) with a Linux operating system, a configuration also known as a "Debian SLUG". The power consumption of a "SLUG" is on the order of 2 to 5 W, compared to about 40 W for a small PC. The "SLUG" provides one Ethernet and two USB ports, one of which is used by its external USB hard drive. This modular setup is open to modifications, for example the addition of a GSM modem for data transmission over a cellular telephone network. The simple setup, low price, low power consumption, and low technological entry level allow many potential uses of a "SLUG" in environmental sensor networks in research, education and citizen science. The use of mature sensor observation service software allows easy integration of monitoring networks with other web services.
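
    A minimal sketch of the feed-in step described above is shown below: a reading from the base station is wrapped and POSTed to the transactional interface of an SOS such as 52°North's. The endpoint and payload are simplified placeholders, not a schema-valid InsertObservation request.

      # Simplified sketch: push one reading to a transactional SOS endpoint.
      # Placeholder URL; a real InsertObservation body must be schema-valid O&M.
      from urllib.request import Request, urlopen

      def push_reading(sos_url: str, sensor: str, time_iso: str, value: float):
          body = f"""<InsertObservation xmlns="http://www.opengis.net/sos/1.0">
        <AssignedSensorId>{sensor}</AssignedSensorId>
        <Observation time="{time_iso}" value="{value}"/>
      </InsertObservation>"""
          req = Request(sos_url, data=body.encode("utf-8"),
                        headers={"Content-Type": "text/xml"})
          return urlopen(req).read()

      # e.g. push_reading("http://example.org/sos", "urn:example:ws2500",
      #                   "2008-11-01T12:00:00Z", 13.7)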

  8. Sensor Alerting Capability

    NASA Astrophysics Data System (ADS)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    A large amount of sensor data is generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, languages and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure the reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. Alerts can be configured at the service level and at the sensor data level; for example, the system can alert on irregular data delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures, or be alerted to an approaching phenomenon.
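
    The data-level alerting described above amounts to evaluating a statistic over the latest observations and notifying subscribers when a threshold is crossed. A small hypothetical sketch (the window size, threshold and series are invented):

      # Hypothetical sketch: alert when a rolling mean crosses a threshold.
      from statistics import mean

      def check_alert(values, window=6, threshold=30.0):
          """Return an alert message if the rolling mean of the most recent
          `window` values exceeds `threshold`, else None."""
          if len(values) < window:
              return None
          recent = mean(values[-window:])
          if recent > threshold:
              return f"ALERT: rolling mean {recent:.2f} exceeds {threshold}"
          return None

      series = [28.9, 29.4, 29.8, 30.2, 30.6, 31.0, 31.3]
      print(check_alert(series))  # ALERT: rolling mean 30.38 exceeds 30.0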

  9. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    PubMed Central

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity impedes the interconnection between IoT devices and limits the potential of the IoT. To address this issue, this research proposes an interoperable solution, called the tasking capability description, that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  10. SEnviro: a sensorized platform proposal using open hardware and open standards.

    PubMed

    Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín

    2015-03-06

    The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. New trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning sensors into smart objects (the Internet of Things and the Web of Things). Currently, WSNs provide data in different formats; this lack of communication protocol standardization leads to interoperability issues when connecting different sensor networks, or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study of energy concerns are presented.
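
    Since the entry closes with energy concerns, here is a quick back-of-the-envelope battery-life calculation of the kind such a study involves. All figures are invented for illustration and do not come from the paper.

      # Back-of-the-envelope node battery life; all figures invented.
      def battery_life_hours(capacity_mah, active_ma, sleep_ma, duty_cycle):
          """duty_cycle: fraction of time the node is active (0..1)."""
          avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
          return capacity_mah / avg_ma

      hours = battery_life_hours(capacity_mah=2000, active_ma=120,
                                 sleep_ma=0.5, duty_cycle=0.01)
      print(f"{hours:.0f} h (~{hours / 24:.1f} days)")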

  11. SEnviro: A Sensorized Platform Proposal Using Open Hardware and Open Standards

    PubMed Central

    Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín

    2015-01-01

    The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. New trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning sensors into smart objects (the Internet of Things and the Web of Things). Currently, WSNs provide data in different formats; this lack of communication protocol standardization leads to interoperability issues when connecting different sensor networks, or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study of energy concerns are presented. PMID:25756864

  12. THE TSUNAMI SERVICE BUS, AN INTEGRATION PLATFORM FOR HETEROGENEOUS SENSOR SYSTEMS

    NASA Astrophysics Data System (ADS)

    Fleischer, J.; Häner, R.; Herrnkind, S.; Kriegel, U.; Schwarting, H.; Wächter, J.

    2009-12-01

    The Tsunami Service Bus (TSB) is the sensor integration platform of the German Indonesian Tsunami Early Warning System (GITEWS) [1]. The primary goal of GITEWS is to deliver reliable tsunami warnings as fast as possible. This is achieved on the basis of various sensor systems such as seismometers, ocean instrumentation, and GPS stations, all providing fundamental data to support the prediction of tsunami wave propagation by the GITEWS warning center. However, all these sensors come with their own proprietary data formats and specific behavior, new sensor types might be added, and old sensors will be replaced. To keep GITEWS flexible, the TSB was developed to access and control sensors in a uniform way. To meet these requirements the TSB follows the architectural blueprint of a Service Oriented Architecture (SOA). The integration platform implements dedicated services communicating via a service infrastructure. The functionality required for early warning is provided by loosely coupled services replacing the "hard-wired" coupling at the data level, so that changes in a sensor's specification are confined to the data level without affecting the warning center. Great emphasis was laid on following the Sensor Web Enablement (SWE) standards [2] specified by the Open Geospatial Consortium (OGC) [3]. As a result, the full functionality needed in GITEWS could be achieved by implementing the four SWE services: the Sensor Observation Service for retrieving sensor measurements, the Sensor Alert Service for delivering sensor alerts, the Sensor Planning Service for tasking sensors, and the Web Notification Service for delivering messages to various media channels. Beyond these services, the TSB also follows the SWE Observations & Measurements (O&M) specification for data encoding and the Sensor Model Language (SensorML) for meta-information. Moreover, accessing sensors via the TSB is not restricted to GITEWS: multiple instances of the TSB can be composed to realize federated warning systems. Besides the already operating TSB at the BMKG warning center [4], two other organizations in Indonesia ([5], [6]) are considering using the TSB, making their data centers available to GITEWS. The presentation looks at the concepts and implementation and reflects on the usefulness of the mentioned standards. REFERENCES [1] GITEWS is a project of the German Federal Government to aid the reconstruction of the tsunami-prone region of the Indian Ocean, http://www.gitews.org/ [2] SWE, www.opengeospatial.org/projects/groups/sensorweb [3] OGC, www.opengeospatial.org [4] Meteorological and Geophysical Agency of Indonesia (BMKG), www.bmg.go.id [5] National Coordinating Agency for Surveys and Mapping (BAKOSURTANAL), www.bakosurtanal.go.id [6] Agency for the Assessment & Application of Technology (BPPT), www.bppt.go.id

  13. Borderless Geospatial Web (BOLEGWEB)

    NASA Astrophysics Data System (ADS)

    Cetl, V.; Kliment, T.; Kliment, M.

    2016-06-01

    Effective access to and use of geospatial information (GI) resources is of critical importance in a modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate the discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, global, user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context, bridges the geospatial web from the end-user perspective, and thus opens its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb in Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  14. Sensor Webs as Virtual Data Systems for Earth Science

    NASA Astrophysics Data System (ADS)

    Moe, K. L.; Sherwood, R.

    2008-05-01

    The NASA Earth Science Technology Office established a 3-year Advanced Information Systems Technology (AIST) development program in late 2006 to explore the technical challenges associated with integrating sensors, sensor networks, data assimilation and modeling components into virtual data systems called "sensor webs". The AIST sensor web program was initiated in response to a renewed emphasis on the sensor web concepts. In 2004, NASA proposed an Earth science vision for a more robust Earth observing system, coupled with remote sensing data analysis tools and advances in Earth system models. The AIST program is conducting the research and developing components to explore the technology infrastructure that will enable the visionary goals. A working statement for a NASA Earth science sensor web vision is the following: On-demand sensing of a broad array of environmental and ecological phenomena across a wide range of spatial and temporal scales, from a heterogeneous suite of sensors both in-situ and in orbit. Sensor webs will be dynamically organized to collect data, extract information from it, accept input from other sensor / forecast / tasking systems, interact with the environment based on what they detect or are tasked to perform, and communicate observations and results in real time. The focus on sensor webs is to develop the technology and prototypes to demonstrate the evolving sensor web capabilities. There are 35 AIST projects ranging from 1 to 3 years in duration addressing various aspects of sensor webs involving space sensors such as Earth Observing-1, in situ sensor networks such as the southern California earthquake network, and various modeling and forecasting systems. Some of these projects build on proof-of-concept demonstrations of sensor web capabilities like the EO-1 rapid fire response initially implemented in 2003. Other projects simulate future sensor web configurations to evaluate the effectiveness of sensor-model interactions for producing improved science predictions. Still other projects are maturing technology to support autonomous operations, communications and system interoperability. This paper will highlight lessons learned by various projects during the first half of the AIST program. Several sensor web demonstrations have been implemented and resulting experience with evolving standards, such as the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) among others, will be featured. The role of sensor webs in support of the intergovernmental Group on Earth Observations' Global Earth Observation System of Systems (GEOSS) will also be discussed. The GEOSS vision is a distributed system of systems that builds on international components to supply observing and processing systems that are, in the whole, comprehensive, coordinated and sustained. Sensor web prototypes are under development to demonstrate how remote sensing satellite data, in situ sensor networks and decision support systems collaborate in applications of interest to GEO, such as flood monitoring. Furthermore, the international Committee on Earth Observation Satellites (CEOS) has stepped up to the challenge to provide the space-based systems component for GEOSS. CEOS has proposed "virtual constellations" to address emerging data gaps in environmental monitoring, avoid overlap among observing systems, and make maximum use of existing space and ground assets. 
Exploratory applications that support the objectives of virtual constellations will also be discussed as a future role for sensor webs.

  15. Serving Satellite Remote Sensing Data to User Community through the OGC Interoperability Protocols

    NASA Astrophysics Data System (ADS)

    Di, L.; Yang, W.; Bai, Y.

    2005-12-01

    Remote sensing is one of the major methods for collecting geospatial data, and huge amounts of remote sensing data have been collected by space agencies and private companies around the world. For example, NASA's Earth Observing System (EOS) is generating more than 3 terabytes of remote sensing data per day. The data collected by EOS are processed, distributed, archived, and managed by the EOS Data and Information System (EOSDIS). Currently, EOSDIS is managing several petabytes of data. These data are valuable not only for global change research but also for local and regional applications and decision making. How to make the data easily accessible to and usable by the user community is one of the key issues for realizing the full potential of these valuable datasets. In the past several years, the Open Geospatial Consortium (OGC) has developed several interoperability protocols aimed at making geospatial data easily accessible to and usable by the user community through the Internet. The protocols particularly relevant to the discovery, access, and integration of multi-source satellite remote sensing data are the Catalog Service for the Web (CS/W) and Web Coverage Service (WCS) specifications. The OGC CS/W specifies the interfaces, HTTP protocol bindings, and a framework for defining application profiles required to publish and access digital catalogues of metadata for geographic data, services, and related resource information. The OGC WCS specification defines the interfaces between web-based clients and servers for accessing on-line multi-dimensional, multi-temporal geospatial coverages in an interoperable way. Following the definitions of OGC and ISO 19123, coverage data include all remote sensing images as well as gridded model outputs. The Laboratory for Advanced Information Technology and Standards (LAITS), George Mason University, has been working for many years on developing and implementing OGC specifications to better serve NASA Earth science data to the user community. We have developed the NWGISS software package, which implements multiple OGC specifications, including OGC WMS, WCS, CS/W, and WFS. As part of the NASA REASON GeoBrain project, the NWGISS WCS and CS/W servers have been extended to provide operational access to NASA EOS data in the data pools through OGC protocols and to make both services chainable in web-service chains. The extensions to the WCS server include the implementation of WCS 1.0.0 and WCS 1.0.2 and the development of a WSDL description of the WCS services. In order to find the on-line EOS data resources, the CS/W server is extended at the backend to search metadata in NASA ECHO. This presentation reports on those extensions and discusses lessons learned from the implementation. It also discusses the advantages, disadvantages, and future improvements of the OGC specifications, particularly the WCS.
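
    For reference, the sketch below shows the general shape of a WCS 1.0.0 KVP GetCoverage request of the kind such a server accepts. The endpoint, coverage name, bounding box and output format are placeholders, not values taken from NWGISS.

      # Placeholder sketch: build a WCS 1.0.0 KVP GetCoverage request URL.
      from urllib.parse import urlencode

      def getcoverage_url(endpoint, coverage, bbox, width, height, fmt="GeoTIFF"):
          params = {
              "service": "WCS", "version": "1.0.0", "request": "GetCoverage",
              "coverage": coverage, "crs": "EPSG:4326",
              "bbox": ",".join(str(v) for v in bbox),
              "width": width, "height": height, "format": fmt,
          }
          return f"{endpoint}?{urlencode(params)}"

      print(getcoverage_url("http://example.org/wcs", "MOD09GA",
                            (-125.0, 24.0, -66.0, 50.0), 1024, 512))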

  16. Proposal for a Web Encoding Service (WES) for Spatial Data Transaction

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    The use of web services in Spatial Data Infrastructures (SDIs) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDIs have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. its possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  17. Adapting the CUAHSI Hydrologic Information System to OGC standards

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Whitenack, T.; Zaslavsky, I.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) provides web and desktop client access to hydrologic observations via water data web services using an XML schema called “WaterML”. The WaterML 1.x specification and the corresponding Water Data Services have been the backbone of the HIS service-oriented architecture (SOA) and have been adopted for serving hydrologic data by several federal agencies and many academic groups. The central discovery service, HIS Central, is based on a metadata catalog that references 4.7 billion observations, organized as 23 million data series from 1.5 million sites from 51 organizations. Observation data are published using HydroServer nodes that have been deployed at 18 organizations. Usage of HIS increased eightfold from 2008 to 2010, doubling from 1,600 data series a day in 2009 to 3,600 data series a day in the first half of 2010. The HIS Central metadata catalog currently harvests information from 56 Water Data Services. We collaborate on the catalog updates with two federal partners, the USGS and the US EPA: their data series are periodically reloaded into the HIS metadata catalog. We are pursuing two main development directions in the HIS project: cloud-based computing, and further compliance with Open Geospatial Consortium (OGC) standards. The goal of moving to cloud computing is to provide a scalable, collaborative system with simpler deployment and less dependence on hardware maintenance and staff. This move requires re-architecting the information models underlying the metadata catalog and the Water Data Services to be independent of the underlying relational database model, allowing for implementation both on relational databases and on cloud-based processing systems. Cloud-based HIS Central resources can be managed collaboratively; partners share responsibility for their metadata by publishing data series information into the centralized catalog. Publishing data series will use REST-based service interfaces, like OData, as the basis for ingesting data series information into a cloud-hosted catalog. The future HIS services will provide information via OGC standards, allowing observational data access from commercial GIS applications. The use of standards will allow tools to access observational data from other projects that use them, such as the Ocean Observatories Initiative, and allow tools from such projects to be integrated into the HIS toolset. With international collaborators, we have been developing a water information exchange language called “WaterML 2.0”, which will be used to deliver observation data over OGC Sensor Observation Services (SOS). A stack of OGC standard services will provide access to HIS information. In addition to SOS, Web Map and Web Feature Services (WMS and WFS) will provide access to location information, and the Catalog Service for the Web (CSW) will provide a catalog for water information that is both centralized and distributed. We intend the OGC standards to supplement, rather than replace, the existing HIS service interfaces. The ultimate goal of this development is to expand access to hydrologic observation data and to create an environment where these data can be seamlessly integrated with standards-compliant data resources.

  18. Common Approach to Geoprocessing of UAV Data across Application Domains

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Reichardt, M.; Taylor, T.

    2015-08-01

    UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. However, the diversity of UAV platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them where necessary. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.

  19. DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim

    2010-05-01

    The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective to create a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Based on the upstream information flow, DEWS focuses on the improvement of downstream capacities of warning centres, especially by improving information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7], spatial data are utilized to depict the situation picture and to integrate a simulation system via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard, together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other hazard types are going to follow, e.g. volcanic eruptions or landslides; therefore multi-hazard functionality is also conceivable in the future. The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI in conjunction with details of information logistics. The DEWS Wide Area Centre, connecting national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
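
    To illustrate the message format, the sketch below builds a minimal CAP 1.1 alert document using only the Python standard library; the identifier, sender and headline are invented examples, not DEWS values.

      import xml.etree.ElementTree as ET
      from datetime import datetime, timezone

      CAP_NS = "urn:oasis:names:tc:emergency:cap:1.1"
      ET.register_namespace("", CAP_NS)

      def el(parent, name, text):
          # Helper creating a namespaced CAP element with text content.
          node = ET.SubElement(parent, "{%s}%s" % (CAP_NS, name))
          node.text = text
          return node

      alert = ET.Element("{%s}alert" % CAP_NS)
      el(alert, "identifier", "DEWS-EXERCISE-0001")        # invented identifier
      el(alert, "sender", "warning-centre@example.org")    # invented sender
      el(alert, "sent", datetime.now(timezone.utc).replace(microsecond=0).isoformat())
      el(alert, "status", "Exercise")
      el(alert, "msgType", "Alert")
      el(alert, "scope", "Public")

      info = ET.SubElement(alert, "{%s}info" % CAP_NS)
      el(info, "category", "Geo")
      el(info, "event", "Tsunami")
      el(info, "urgency", "Immediate")
      el(info, "severity", "Extreme")
      el(info, "certainty", "Observed")
      el(info, "headline", "Tsunami exercise warning for coastal region X")

      print(ET.tostring(alert, encoding="unicode"))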

  20. Developing an Online Framework for Publication of Uncertainty Information in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Etienne, E.; Piasecki, M.

    2012-12-01

    Inaccuracies in data collection and parameters estimation, and imperfection of models structures imply uncertain predictions of the hydrological models. Finding a way to communicate the uncertainty information in a model output is important in decision-making. This work aims to publish uncertainty information (computed by project partner at Penn State) associated with hydrological predictions on catchments. To this end we have developed a DB schema (derived from the CUAHSI ODM design) which is focused on storing uncertainty information and its associated metadata. The technologies used to build the system are: OGC's Sensor Observation Service (SOS) for publication, the uncertML markup language (also developed by the OGC) to describe uncertainty information, and use of the Interoperability and Automated Mapping (INTAMAP) Web Processing Service (WPS) that handles part of the statistics computations. We develop a service to provide users with the capability to exploit all the functionality of the system (based on DRUPAL). Users will be able to request and visualize uncertainty data, and also publish their data in the system.

  1. Sensor web enablement in a network of low-energy, low-budget amateur weather stations

    NASA Astrophysics Data System (ADS)

    Herrnkind, S.; Klump, J.; Schmidt, G.

    2009-04-01

    Sensor Web Enablement (OGC SWE) has developed into a powerful concept with many potential applications in environmental monitoring and in other fields. This has spurred the development of software applications for Sensor Observation Services (SOS), while the development of client applications still lags behind. Furthermore, the deployment of sensors in the field often places tight constraints on the energy and bandwidth available for data capture and transmission. As a "proof of concept" we equipped amateur weather stations with low-budget, standard components to read the data from their base stations and feed the weather observation data into a sensor observation service using its standard web-service interface. We chose amateur weather stations as an example because of the simplicity of the measured phenomena and the low data volume. As sensor observation service we chose the open source software package offered by the 52°North consortium. Furthermore, we investigated registry services for sensors and measured phenomena. When deploying a sensor platform in the field, power consumption can be an issue. Instead of common PCs we used Network Storage Link Units (NSLU2) with a Linux operating system, also known as "Debian SLUG". The power consumption of a "SLUG" is of the order of 1 W, compared to 40 W for a small PC. The "SLUG" provides one Ethernet and two USB ports, one of which is used by its external USB hard drive. This modular set-up is open to modifications, for example the addition of a GSM modem for data transmission over a cellular telephone network. The simple set-up, low price, low power consumption, and the low technological entry level allow many potential uses of a "SLUG" in environmental sensor networks in research, education and citizen science. The use of mature sensor observation service software allows an easy integration of monitoring networks with other web services.
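
    A rough sketch of the station-side feeding loop might look as follows; the endpoint URL, sensor URN and the heavily abbreviated InsertObservation payload are illustrative placeholders (a real 52°North SOS expects a complete O&M observation document).

      import time
      import requests

      SOS_ENDPOINT = "http://example.org/52n-sos/sos"  # placeholder transactional SOS

      def read_weather_record():
          """Placeholder for reading one record from the station's base unit
          (e.g. over USB/serial on the NSLU2); returns a parsed measurement."""
          return {"sensor": "urn:x-station:ws2300", "property": "AirTemperature",
                  "value": 21.4, "uom": "Cel",
                  "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())}

      def insert_observation(rec):
          # Heavily abbreviated InsertObservation payload, for illustration only.
          body = f"""<InsertObservation service="SOS" version="1.0.0">
            <AssignedSensorId>{rec['sensor']}</AssignedSensorId>
            <!-- om:Observation with time {rec['time']}, value {rec['value']} {rec['uom']} -->
          </InsertObservation>"""
          r = requests.post(SOS_ENDPOINT, data=body.encode("utf-8"),
                            headers={"Content-Type": "text/xml"}, timeout=30)
          r.raise_for_status()

      while True:
          insert_observation(read_weather_record())
          time.sleep(600)  # one observation every ten minutes keeps bandwidth low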

  2. Interoperable Data Access Services for NOAA IOOS

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2008-12-01

    The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
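
    For instance, the OpenDAP access pattern could be exercised from Python as in the following sketch; the endpoint URL and variable name are assumptions, and the netCDF4 library must be built with DAP support.

      from netCDF4 import Dataset

      # Hypothetical OPeNDAP endpoint for aggregated buoy data (placeholder URL).
      url = "http://example.org/opendap/ndbc/aggregated_buoys.nc"

      ds = Dataset(url)  # netCDF4 opens DAP URLs transparently when built with DAP
      sst = ds.variables["sea_surface_temperature"]  # assumed variable name
      print(sst.shape)

      # Subsetting happens server-side: only the requested slice is transferred.
      print(sst[0, :10])
      ds.close()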

  3. US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access

    NASA Astrophysics Data System (ADS)

    Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.

    2012-04-01

    The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
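
    A catalog query against such a CSW 2.0.2 interface could look like the following OWSLib sketch; the endpoint URL is a placeholder.

      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      # Placeholder CSW 2.0.2 endpoint (e.g. an ESRI Geoportal instance).
      csw = CatalogueServiceWeb("http://example.org/geoportal/csw")

      # Full-text style query against csw:AnyText for geothermal-related records.
      query = PropertyIsLike("csw:AnyText", "%geothermal%")
      csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

      for rec_id, rec in csw.records.items():
          print(rec_id, "-", rec.title)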

  4. Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.

    2006-12-01

    An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out the key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
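
    The subsetting experiments revolve around requests of the following kind; this sketch issues a hypothetical WCS 1.0 GetCoverage with a time slice, with the endpoint and coverage name invented for illustration.

      import requests

      WCS_URL = "http://example.org/ows/wcs"  # placeholder WCS endpoint

      params = {
          "service": "WCS",
          "version": "1.0.0",
          "request": "GetCoverage",
          "coverage": "GOCART_AOD",        # hypothetical coverage name
          "crs": "EPSG:4326",
          "bbox": "-180,-90,180,90",
          "time": "2006-09-01T00:00:00Z",  # slice the model output by time
          "width": "720",
          "height": "360",
          "format": "GeoTIFF",
      }

      resp = requests.get(WCS_URL, params=params, timeout=120)
      resp.raise_for_status()
      with open("gocart_aod.tif", "wb") as f:
          f.write(resp.content)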

  5. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys operational OGC WFS 1.1 interfaces at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds the WFS service so that it can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Web Feature Service is demonstrated to be more efficient in sharing vector data and to support direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
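
    A WFS client interaction of the kind established here might be sketched as follows; the endpoint, feature type name and bounding box are illustrative assumptions.

      from owslib.wfs import WebFeatureService

      # Placeholder WFS endpoint serving transportation framework themes.
      wfs = WebFeatureService("http://example.org/wfs", version="1.1.0")

      print(list(wfs.contents))  # advertised feature types, e.g. road centerlines

      # Request GML for a hypothetical road-segment layer within a bounding box.
      gml = wfs.getfeature(typename=["trans:RoadSeg"],
                           bbox=(-77.6, 38.6, -77.0, 39.1),
                           maxfeatures=50)

      data = gml.read()
      mode = "w" if isinstance(data, str) else "wb"  # OWSLib versions differ here
      with open("roadseg.gml", mode) as f:
          f.write(data)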

  6. Publishing Platform for Aerial Orthophoto Maps, the Complete Stack

    NASA Astrophysics Data System (ADS)

    Čepický, J.; Čapek, L.

    2016-06-01

    When creating a set of orthophoto maps from mosaic compositions using airborne systems, such as popular drones, we need to publish the results of the work to users. Several steps need to be performed in order to get large-scale raster data published. As a first step, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service). For some applications, OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application, so that they can be used and evaluated by non-technical users. In this talk, we present an automated pipeline of these steps, where the user puts in an orthophoto image and, as a result, OGC Open Web Services are published along with a web mapping application containing the data. The web mapping application can be used as a standard presentation platform for this type of big raster data for the generic user. The publishing platform - the Geosense online map information system - can also be used to combine data from various resources, to create unique map compositions, and as input for better interpretation of photographed phenomena. The whole process has been successfully tested with the eBee drone, at raster data resolutions of 1.5-4 cm/px, in many areas, and the result is also used for the creation of derived datasets, usually suited for property management - records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.
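
    Once published, the view service can be consumed programmatically, e.g. along the lines of this sketch; the endpoint, layer name and bounding box are placeholders.

      from owslib.wms import WebMapService

      # Placeholder WMS endpoint publishing the orthophoto mosaic.
      wms = WebMapService("http://example.org/ows/wms", version="1.1.1")

      # Fetch a PNG rendering of an assumed "orthophoto" layer.
      img = wms.getmap(layers=["orthophoto"],
                       srs="EPSG:4326",
                       bbox=(14.40, 50.05, 14.44, 50.08),
                       size=(1024, 768),
                       format="image/png")
      with open("orthophoto.png", "wb") as f:
          f.write(img.read())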

  7. Oceanids command and control (C2) data system - Marine autonomous systems data for vehicle piloting, scientific data users, operational data assimilation, and big data

    NASA Astrophysics Data System (ADS)

    Buck, J. J. H.; Phillips, A.; Lorenzo, A.; Kokkinaki, A.; Hearn, M.; Gardner, T.; Thorne, K.

    2017-12-01

    The National Oceanography Centre (NOC) operates a fleet of approximately 36 autonomous marine platforms including submarine gliders, autonomous underwater vehicles, and autonomous surface vehicles. Each platform effectively has the capability to observe the ocean and collect data akin to a small research vessel. This is creating growth in data volumes and complexity while the amount of resource available to manage data remains static. The Oceanids Command and Control (C2) project aims to solve these issues by fully automating data archival, processing and dissemination. The data architecture being implemented jointly by NOC and the Scottish Association for Marine Science (SAMS) includes a single Application Programming Interface (API) gateway to handle authentication, forwarding and delivery of both metadata and data. Technicians and principal investigators will enter expedition data prior to deployment of vehicles, enabling automated data processing when vehicles are deployed. The system will support automated metadata acquisition from platforms as this technology moves towards operational implementation. The metadata exposure to the web builds on a prototype developed by the European Commission-supported SenseOCEAN project and uses open standards including World Wide Web Consortium (W3C) RDF/XML, the Semantic Sensor Network ontology, and the Open Geospatial Consortium (OGC) SensorML standard. Data will be delivered in the marine-domain Everyone's Glider Observatory (EGO) format and as OGC Observations and Measurements. Additional formats will be served by implementation of endpoints such as the NOAA ERDDAP tool. This standardised data delivery via the API gateway enables timely near-real-time data to be served to Oceanids users, BODC users, operational users and big data systems. The use of open standards will also enable web interfaces to be rapidly built on the API gateway, and delivery to European research infrastructures that include aligned reference models for data infrastructure.
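
    An ERDDAP endpoint of the kind mentioned above serves constrained tabular subsets over plain HTTP; the sketch below assumes a hypothetical dataset identifier and variable names.

      import pandas as pd

      # Hypothetical ERDDAP tabledap dataset for a glider deployment (placeholder id).
      url = ("http://example.org/erddap/tabledap/glider_deployment_42.csv"
             "?time,latitude,longitude,temperature"
             "&time>=2017-06-01T00:00:00Z&time<=2017-06-02T00:00:00Z")

      # ERDDAP .csv responses carry two header rows: names, then units.
      df = pd.read_csv(url, skiprows=[1])
      print(df.head())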

  8. Usage of Wireless Sensor Networks in a Service-Based Spatial Data Infrastructure for Landslide Monitoring and Early Warning

    NASA Astrophysics Data System (ADS)

    Arnhardt, C.; Fernandez-Steeger, T. M.; Walter, K.; Kallash, A.; Niemeyer, F.; Azzam, R.; Bill, R.

    2007-12-01

    The joint project Sensor-based Landslide Early Warning System (SLEWS) aims at the systematic development of a prototype alarm and early warning system for the detection of mass movements by application of an ad hoc wireless sensor network (WSN). Next to the development of suitable sensor setups, sensor fusion and network fusion are applied to enhance data quality and reduce false alarm rates. Of special interest is data retrieval, processing and visualization in GI systems. Therefore a suitable service-based Spatial Data Infrastructure (SDI) will be developed with respect to existing and upcoming Open Geospatial Consortium (OGC) standards. The application of a WSN provides a cheap and easy-to-set-up solution for monitoring and data gathering in large areas. Measurement data from different low-cost transducers for deformation observation (acceleration, displacement, tilting) are collected by distributed sensor nodes (motes), which interact separately and connect to each other in a self-organizing manner. Data are collected and aggregated at the beacon (transmission station), where further operations like data pre-processing and compression can be performed. The WSN concept provides, next to energy efficiency, miniaturization, real-time monitoring and remote operation, also new monitoring strategies like sensor and network fusion. Since more than a single sensor can be integrated at each mote, either cross-validation or redundant sensor setups are possible to enhance data quality. The planned monitoring and information system will include a mobile infrastructure (information technologies and communication components) as well as methods and models to estimate surface deformation parameters (positioning systems). The measurements result in heterogeneous observation sets that have to be integrated in a common adjustment and filtering approach. Reliable real-time information will be obtained using a range of sensor inputs and algorithms, from which early warnings and prognoses may be derived. Implementation of sensor algorithms is an important task to form the business logic. This will be represented in self-contained web-based processing services (WPS). In the future, different types of sensor networks can communicate via an infrastructure of OGC services in an interoperable way, using standardized protocols such as the Sensor Model Language (SensorML) and the Observations & Measurements schema (O&M). Synchronous and asynchronous information services such as the Sensor Alert Service (SAS) and the Web Notification Service (WNS) will provide defined users and user groups with time-critical readings from the observation site. Techniques using services for visualizing mapping data (WMS), metadata (CSW), vector data (WFS) and raster data (WCS) will range from highly detailed expert-based output to fuzzy graphical warning elements. The expected result will be an advance over classical alarm and early warning systems, as WSNs are freely scalable, extensible and easy to install.
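
    As a toy illustration of how filtering can help reduce false alarm rates before alerting, the following sketch median-filters incoming tilt readings and only alerts on a sustained exceedance; the window size and threshold are invented, and a real deployment would publish alerts via SAS/WNS rather than print them.

      from collections import deque
      from statistics import median

      WINDOW = 5         # samples per smoothing window (invented)
      THRESHOLD = 2.0    # assumed tilt threshold in degrees (invented)

      window = deque(maxlen=WINDOW)

      def raise_alert(value):
          # Placeholder: a real system would notify users via SAS/WNS services.
          print(f"ALERT: sustained tilt {value:.2f} deg exceeds {THRESHOLD} deg")

      def on_reading(tilt_deg):
          """Median-filter incoming readings before raising an alert,
          a simple way to suppress single-sample outliers (false alarms)."""
          window.append(tilt_deg)
          if len(window) == WINDOW and median(window) > THRESHOLD:
              raise_alert(median(window))

      # The lone 5.0 outlier is suppressed; the sustained rise triggers an alert.
      for sample in [0.1, 0.2, 5.0, 0.2, 0.1, 2.5, 2.6, 2.8, 2.7, 2.9]:
          on_reading(sample)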

  9. Cross-Dataset Analysis and Visualization Driven by Expressive Web Services

    NASA Astrophysics Data System (ADS)

    Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad

    2015-04-01

    The deluge of data that is hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: data no longer need to be downloaded and stored beforehand; instead, services interact in real time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data and moving the processing closer to the data. In this context we have created a visualization application for analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources and having different spatial and time resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the potential of the OGC Web Coverage Service, the most exciting aspect being its processing extension, a.k.a. the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technology made the application versatile and portable. As the processing is done on the server side, we ensured that a minimal amount of data is transferred and that the processing is done on a fully capable server, leaving the client hardware resources to be used for rendering the visualization. The application offers a set of features to visualize and cross-compare the datasets. Users can select a region of interest in space and time on which an aerosol map layer is plotted. Hovmoeller time-latitude and time-longitude profiles can be displayed by selecting orthogonal cross-sections on the globe. Statistics about the selected dataset are also displayed in different text and plot formats. The datasets can also be cross-compared either by using the delta map tool or the merged map tool. For more advanced users, a WCPS query console is also offered, allowing users to process their data with ad hoc queries and then choose how to display the results. Overall, the user has a rich set of tools to visualize and cross-compare the aerosol datasets. With our application we have shown how the NASA WorldWind framework can be used to display results processed efficiently - and entirely - on the server side using the expressiveness of the OGC WCPS web service. The application serves not only as a proof of concept of a new paradigm in working with large geospatial data but also as a useful tool for environmental data analysts.
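
    A delta-map request of the kind offered by the application reduces, on the server, to a single WCPS query; the sketch below assumes a rasdaman-style endpoint carrying WCPS via the WCS ProcessCoverages extension, with hypothetical coverage names.

      import requests

      # Placeholder rasdaman/petascope endpoint; coverage names are hypothetical.
      ENDPOINT = "http://example.org/rasdaman/ows"

      # WCPS query computing an absolute-difference ("delta") map of two
      # aerosol optical thickness coverages defined over the same grid.
      query = 'for a in (AOT_A), b in (AOT_B) return encode(abs(a - b), "png")'

      resp = requests.post(ENDPOINT, data={
          "service": "WCS",
          "version": "2.0.1",
          "request": "ProcessCoverages",   # the WCS extension that carries WCPS
          "query": query,
      }, timeout=120)
      resp.raise_for_status()
      with open("delta_map.png", "wb") as f:
          f.write(resp.content)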

  10. Design and Implement an Interoperable Internet of Things Application Based on an Extended OGC SensorThings API Standard

    NASA Astrophysics Data System (ADS)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
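
    Read access through the standard follows OData-style query conventions, as in the following sketch against a hypothetical SensorThings service root.

      import requests

      STA_ROOT = "http://example.org/SensorThings/v1.0"  # placeholder service root

      # Find Things whose name mentions "lamp" and expand their Datastreams.
      resp = requests.get(
          STA_ROOT + "/Things",
          params={
              "$filter": "substringof('lamp', name)",
              "$expand": "Datastreams",
          },
          timeout=30,
      )
      resp.raise_for_status()

      for thing in resp.json()["value"]:
          print(thing["name"], "-", [d["name"] for d in thing["Datastreams"]])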

  11. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to both traditional research scientists and the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems, to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages inter-disciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS) at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.

  12. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known to be advantageous in databases: declarativeness (describe results rather than algorithms), safety in evaluation (no request can keep a server busy infinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its function name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it does not anticipate any particular protocol. One such protocol is given by the OGC Web Coverage Service (WCS) Processing Extension standard, which ties WCPS into WCS. Another protocol, which makes WCPS an OGC Web Processing Service (WPS) Profile, is under preparation. Thereby, WCPS bridges WCS and WPS. The conceptual model of WCPS relies on the coverage model of WCS, which in turn is based on ISO 19123. WCS currently addresses raster-type coverages, where a coverage is seen as a function mapping points from a spatio-temporal extent (its domain) into values of some cell type (its range). A retrievable coverage has an associated identifier, the CRSs supported and, for each range field (aka band or channel), the applicable interpolation methods. The WCPS language offers access to one or several such coverages via a functional, side-effect-free language. The following example, which derives the NDVI (Normalized Difference Vegetation Index) from given coverages C1, C2, and C3 within the regions identified by the binary mask R, illustrates the language concept: for c in ( C1, C2, C3 ), r in ( R ) return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" ) The result is a list of three HDF-EOS encoded images containing masked NDVI values. Note that the same request can operate on coverages of any dimensionality. The expressive power of WCPS includes statistics, image, and signal processing, stopping short of recursion so as to maintain safe evaluation.
As both the syntax and the semantics of any WCPS expression are well defined, the language is Semantic Web ready: clients can construct WCPS requests on the fly, and servers can optimize such requests (this has been investigated extensively with the rasdaman raster database system) and automatically distribute them for processing in a WCPS-enabled computing cloud. The WCPS Reference Implementation is being finalized now that the standard is stable; it will be released as open source once ready. Among the future tasks is to extend WCPS to general meshes, in synchronization with the WCS standard. In this talk WCPS is presented in the context of OGC standardization. The author is co-chair of OGC's WCS Working Group (WG) and Coverages WG.
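
    Carried over the WCS Processing Extension protocol, the NDVI example above could be submitted from a client roughly as follows; the endpoint URL is a placeholder and the query is reduced to a single coverage for brevity.

      import requests

      # Placeholder URL for a server implementing the WCS Processing Extension,
      # which carries WCPS queries; coverage names C1 and R are from the text.
      ENDPOINT = "http://example.org/wcps"

      query = ('for c in ( C1 ), r in ( R ) '
               'return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" )')

      resp = requests.get(ENDPOINT, params={"query": query}, timeout=120)
      resp.raise_for_status()
      with open("ndvi.hdf", "wb") as f:
          f.write(resp.content)  # one HDF-EOS encoded NDVI image per input coverage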

  13. Informatics and Decision Support in Galway Bay (SmartBay) using ERDDAP, OGC Technologies and Third Party Data Sources to Provide Services to the Marine Community.

    NASA Astrophysics Data System (ADS)

    Delaney, Conor; Gaughan, Paul; Smyth, Damian

    2013-04-01

    The global marine sector generates and consumes vast quantities of operational and forecast data on a daily basis. One of the key challenges facing the sector relates to the management and transformation of that data into knowledge. The Irish Marine Institute (MI) generates oceanographic and environmental data on a regular and frequent basis. This data comes from operational ocean models run on the MI's high performance computer (HPC) and from various environmental observation sensor systems. Some of the data published by the Marine Institute is brokered by the Environmental Research Division's Data Access Program (ERDDAP) data broker, a broker technology based on OPeNDAP and Open Geospatial Consortium (OGC) standards. The broker provides a consistent web service interface to the data services of the Marine Institute; these services include wave, tide and weather sensors and numerical model output. An ERDDAP server publishes data in a number of standard and developer-friendly ways, including some OGC formats. The data on the MI ERDDAP server (http://erddap.marine.ie) is published as OpenData. The marine work package of the FP7-funded ENVIROFI project (http://www.envirofi.eu/) has used the ERDDAP data broker as a core resource in the development of its Marine Asset management decision Support Tool (MAST) portal and phone app. Communication between MAST and ERDDAP is via a Uniform Resource Identifier (Linked Data). A key objective of the MAST prototype is to demonstrate the potential of next-generation dynamic web-based products and services and how they can be harnessed to facilitate growth of both the marine and IT sectors. The use case driving the project is the management of ocean energy assets in the marine environment, in particular the provision of information that aids the decision-making process surrounding maintenance at sea. This question is common to any offshore industry, and the solution proposed here is applicable to other users of Galway Bay, Ireland. The architecture of MAST is based on the concepts of Representational State Transfer (REST), Resource Oriented Architecture (ROA), Service Oriented Architecture (SOA), OpenData and mashups. In this paper we demonstrate the architecture of the MAST system and discuss the potential of ERDDAP technology to serve complex data in formats that are accessible to the general developer community. We also discuss the potential of next-generation web technologies and OpenData to encourage the use of valuable marine data resources.

  14. Web Services as Building Blocks for an Open Coastal Observing System

    NASA Astrophysics Data System (ADS)

    Breitbach, G.; Krasemann, H.

    2012-04-01

    Coastal observing systems need to integrate different observing methods, like remote sensing, in-situ measurements, and models, into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic Seas). Data from satellite and radar remote sensing and measurements from buoys, stations and FerryBoxes form the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This Web Feature Service provides not only the metadata necessary for finding data, but also the URLs of web services to view and download the data. To make the data from different resources comparable, a common vocabulary is needed. For COSYNA, the standard names from the CF conventions are stored within the metadata whenever possible. For the metadata an INSPIRE- and ISO 19115-compatible data format is used. The WFS is fed from the metadata system using database views. Actual data are stored in two different formats: in NetCDF files for gridded data and in an RDBMS for time-series-like data. The web service URLs are mostly standards-based, the standards being mainly OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms; in this case download of data is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download. The OGC CSW is used for accessing extended metadata. The concept of data management in COSYNA, which is independent of the specific services used, will be presented. This concept is parameter- and data-centric and might be useful for other observing systems.

  15. Design, Implementation and Applications of 3D Web Services in DB4GeO

    NASA Astrophysics Data System (ADS)

    Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.

    2013-09-01

    The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geosciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. Therefore it is natural to move away from a standalone solution and to open access to DB4GeO data via standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first steps in the implementation of OGC Web Feature Service (WFS) and Web Processing Service (WPS) interfaces, as new interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats like GML, GOCAD, or DB3D XML through a WFS, and can run operations like a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".

  16. Development of a Ground Water Data Portal for Interoperable Data Exchange within the U.S. National Ground Water Monitoring Network and Beyond

    NASA Astrophysics Data System (ADS)

    Booth, N. L.; Brodaric, B.; Lucido, J. M.; Kuo, I.; Boisvert, E.; Cunningham, W. L.

    2011-12-01

    The need for a national groundwater monitoring network within the United States is profound and has been recognized by organizations outside government as a major data gap for managing ground-water resources. Our country's communities, industries, agriculture, energy production and critical ecosystems rely on water being available in adequate quantity and suitable quality. To meet this need the Subcommittee on Ground Water, established by the Federal Advisory Committee on Water Information, created a National Ground Water Monitoring Network (NGWMN) envisioned as a voluntary, integrated system of data collection, management and reporting that will provide the data needed to address present and future ground-water management questions raised by Congress, Federal, State and Tribal agencies and the public. The NGWMN Data Portal is the means by which policy makers, academics and the public will be able to access ground water data through one seamless web-based application from disparate data sources. Data systems in the United States exist at many organizational and geographic levels and differing vocabulary and data structures have prevented data sharing and reuse. The data portal will facilitate the retrieval of and access to groundwater data on an as-needed basis from multiple, dispersed data repositories allowing the data to continue to be housed and managed by the data provider while being accessible for the purposes of the national monitoring network. This work leverages Open Geospatial Consortium (OGC) data exchange standards and information models. To advance these standards for supporting the exchange of ground water information, an OGC Interoperability Experiment was organized among international participants from government, academia and the private sector. The experiment focused on ground water data exchange across the U.S. / Canadian border. WaterML2.0, an evolving international standard for water observations, encodes ground water levels and is exchanged using the OGC Sensor Observation Service (SOS) standard. Ground Water Markup Language (GWML) encodes well log, lithology and construction information and is exchanged using the OGC Web Feature Service (WFS) standard. Within the NGWMN Data Portal, data exchange between distributed data provider repositories is achieved through the use of these web services and a central mediation hub, which performs both format (syntactic) and nomenclature (semantic) mediation, conforming heterogeneous inputs into common standards-based outputs. Through these common standards, interoperability between the U.S. NGWMN and Canada's Groundwater Information Network (GIN) is achieved, advancing a ground water virtual observatory across North America.

  17. An Interoperable Architecture for an Air Pollution Early Warning System Based on the Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems, from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The presented system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data, and to disseminate the air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure to form an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in sensor observations over time.
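
    The AQI computation itself is a linear interpolation between pollutant breakpoints; the sketch below applies the standard US EPA 8-hour CO breakpoint table (values should be verified against current EPA guidance).

      # US EPA breakpoints for 8-hour CO in ppm, mapped to AQI index ranges.
      CO_BREAKPOINTS = [
          (0.0,   4.4,   0,  50),
          (4.5,   9.4,  51, 100),
          (9.5,  12.4, 101, 150),
          (12.5, 15.4, 151, 200),
          (15.5, 30.4, 201, 300),
          (30.5, 40.4, 301, 400),
          (40.5, 50.4, 401, 500),
      ]

      def co_aqi(concentration_ppm):
          """Linear interpolation within the matching breakpoint interval:
          AQI = (I_hi - I_lo) / (BP_hi - BP_lo) * (C - BP_lo) + I_lo"""
          c = round(concentration_ppm, 1)
          for bp_lo, bp_hi, i_lo, i_hi in CO_BREAKPOINTS:
              if bp_lo <= c <= bp_hi:
                  return round((i_hi - i_lo) / (bp_hi - bp_lo) * (c - bp_lo) + i_lo)
          raise ValueError("concentration outside AQI scale")

      print(co_aqi(7.2))  # -> 78, i.e. in the Moderate band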

  18. Incorporation of Geosensor Networks into the Internet of Things for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Habibi, R.; Alesheikh, A. A.

    2015-12-01

    Thanks to recent advances in miniaturization and falling costs for sensors and communication technologies, especially the Internet, the number of Internet-connected things has grown tremendously. Moreover, geosensors, with their capability to generate data at high spatial and temporal resolution, to measure a vast diversity of environmental phenomena, and to operate automatically, are powerful tools for environmental monitoring tasks. Geosensor nodes are inherently heterogeneous in terms of hardware capabilities and communication protocols when taking part in Internet of Things scenarios; therefore, ensuring interoperability is an important step. In this respect, the focus of this paper is on the incorporation of geosensor networks into the Internet of Things through an architecture for monitoring real-time environmental data using OGC Sensor Web Enablement standards. This approach and its applicability are discussed in the context of an air pollution monitoring scenario.

  19. Sensor metadata blueprints and computer-aided editing for disciplined SensorML

    NASA Astrophysics Data System (ADS)

    Tagliolato, Paolo; Oggioni, Alessandro; Fugazza, Cristiano; Pepe, Monica; Carrara, Paola

    2016-04-01

    The need for continuous, accurate, and comprehensive environmental knowledge has led to an increase in sensor observation systems and networks. The Sensor Web Enablement (SWE) initiative has been promoted by the Open Geospatial Consortium (OGC) to foster interoperability among sensor systems. The provision of metadata according to the prescribed SensorML schema is a key component for achieving this; nevertheless, the availability of correct and exhaustive metadata cannot be taken for granted. On the one hand, it is awkward for users to provide sensor metadata because of the lack of user-oriented, dedicated tools. On the other, the specification of invariant information for a given sensor category or model (e.g., observed properties and units of measurement, manufacturer information, etc.) can be labor- and time-consuming. Moreover, the provision of these details is error-prone and subjective, i.e., it may differ greatly across distinct descriptions of the same system. We provide a user-friendly, template-driven metadata authoring tool composed of a backend web service and an HTML5/JavaScript client. This results in a form-based user interface that conceals the high complexity of the underlying format. The tool also allows for plugging in external data sources providing authoritative definitions for the aforementioned invariant information. Leveraging these functionalities, we compiled a set of SensorML profiles, that is, sensor metadata blueprints allowing end users to focus only on the metadata items that are related to their specific deployment. The natural extension of this scenario is the involvement of end users and sensor manufacturers in the crowd-sourced evolution of this collection of prototypes. We describe the components and workflow of our framework for computer-aided management of sensor metadata.

  20. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by Grid technology, such as providing secure interoperability between distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and interoperability between OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments, Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality together with the high computational power and security of the Grid, high spatial model resolution and wide geographical area coverage, and flexible combination and interoperability of geographical models. In accordance with the Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is visible from the enviroGRIDS Portal and consequently from end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the user's single entry point into the system, and the portal presents a uniform style of graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/

  1. A Security Architecture for Grid-enabling OGC Web Services

    NASA Astrophysics Data System (ADS)

    Angelini, Valerio; Petronzio, Luca

    2010-05-01

    In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity, in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing (mostly Web-based) security systems with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specification and on the OGC GeoRM architectural framework. This makes it possible to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS security system, access restrictions are applied making use of the enhanced geospatial capabilities specified by OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks whether the user is entitled to access a Grid infrastructure. In that case his/her identity is translated into a temporary Grid security token using the Short Lived Credential Services (IGTF standard).
In our case, for the specific gLite Grid infrastructure, some information (VOMS Attributes) is plugged into the Grid Security Token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and also by the various gLite middleware elements to verify the user's grants. Based on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other Web authentication services such as OpenID, Kerberos and WS-Federation.
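    To make the identity-translation chain concrete, the toy sketch below models the three security tiers as plain Python functions. It is a conceptual illustration only; all names, attributes and token structures are invented and do not reflect the actual G-OWS implementation.

      # Conceptual sketch of the three-tier identity translation (toy model,
      # not the G-OWS code); every name and structure here is illustrative.

      def authenticate_home_org(username: str, password: str) -> dict:
          # Tier 1: the user's organization verifies the credentials and
          # issues an organization-level token (e.g. a SAML assertion in a
          # Shibboleth deployment).
          return {"subject": username, "issuer": "home-org-idp"}

      def to_gows_identity(org_token: dict) -> dict:
          # Tier 2: a WS-Trust style exchange maps the organization token to
          # a G-OWS identity whose attributes GeoXACML policies can evaluate.
          return {"subject": org_token["subject"],
                  "attributes": {"role": "scientist", "geo_scope": "europe"}}

      def to_grid_token(gows_identity: dict) -> dict:
          # Tier 3: a Short Lived Credential Service issues a temporary
          # X.509-style proxy; VOMS attributes grant access to the user's
          # Virtual Organization resources.
          return {"dn": "/O=grid/CN=" + gows_identity["subject"],
                  "voms_attributes": ["/earth-science-vo/Role=member"],
                  "lifetime_hours": 12}

      grid_token = to_grid_token(to_gows_identity(
          authenticate_home_org("alice", "secret")))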

  2. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    NASA Astrophysics Data System (ADS)

    Simonis, I.; Vahed, A.

    2008-12-01

    Virtual observatories mature from their original domain and become common practice for earth observation research and policy building. The term Virtual Observatory originally came from the astronomical research community, where virtual observatories provide universal access to the available astronomical data archives of space and ground-based observatories. Furthermore, as those virtual observatories aim at integrating heterogeneous resources provided by a number of participating organizations, the virtual observatory acts as a coordinating entity that strives for common data analysis techniques and tools based on common standards. The Sensor Web is on its way to becoming one of the major virtual observatories outside of the astronomical research community. Like the original observatory, which consists of a number of telescopes, each observing a specific part of the wave spectrum with its own collection of astronomical instruments, the Sensor Web provides a multi-eyed perspective on the current, past, as well as future situation of our planet and its surrounding spheres. The current view of the Sensor Web is that of a single worldwide collaborative, coherent, consistent and consolidated sensor data collection, fusion and distribution system. The Sensor Web can perform as an extensive monitoring and sensing system that provides timely, comprehensive, continuous and multi-mode observations. This technology is key to monitoring and understanding our natural environment, including key areas such as climate change, biodiversity, or natural disasters on local, regional, and global scales. The Sensor Web concept has been well established through ongoing global research and deployment of Sensor Web middleware and standards, and represents the foundation layer of systems like the Global Earth Observation System of Systems (GEOSS). The Sensor Web consists of a huge variety of physical and virtual sensors as well as observational data, made available on the Internet at standardized interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically follows in exploration, discovery and, ultimately, transformation of raw data into publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing, or storage peculiarities.
The vision is one of automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks, and assembles all parts into workflows that may satisfy the query.

  3. Managing and Integrating Open Environmental Data - Technological Requirements and Challenges

    NASA Astrophysics Data System (ADS)

    Devaraju, Anusuriya; Kunkel, Ralf; Jirka, Simon

    2014-05-01

    Understanding environmental conditions and trends requires information. This information is usually generated from sensor observations. Today, several infrastructures (e.g., GEOSS, EarthScope, NEON, NETLAKE, OOI, TERENO, WASCAL, and PEER-EurAqua) have been deployed to promote full and open exchange of environmental data. Standards for interfaces as well as data models/formats (OGC, CUAHSI, INSPIRE, SEE Grid, ISO) and open source tools have been developed to support seamless data exchange between various domains and organizations. In spite of this growing interest, it remains a challenge to manage and integrate open environmental data on the fly due to the distributed and heterogeneous nature of the data. Intuitive tools and standardized interfaces are vital to hide the technical complexity of underlying data management infrastructures. Meaningful descriptions of raw sensor data are necessary to achieve interoperability among different sources. As raw sensor data sets usually go through several layers of summarization and aggregation, the metadata and quality measures associated with these layers should be captured. Further processing of sensor data sets requires that they be made compatible with existing environmental models. We need data policies and management plans on how to handle and publish open sensor data coming from different institutions. Clearly, better management and usability of open environmental data is crucial, not only to gather large amounts of data, but also to address various aspects such as data integration, privacy and trust, uncertainty, quality control, visualization, and data management policies. The proposed talk presents several key findings in terms of requirements, ongoing developments and technical challenges concerning these aspects from our recent work. This includes two workshops on open observation data and supporting tools, as well as long-term environmental monitoring initiatives such as TERENO and TERENO-MED.
    Workshops: Spin the Sensor Web: Sensor Web Workshop 2013, Muenster, 21st-22nd November 2013 (http://52north.org/news/spin-the-sensor-web-sensor-web-workshop-2013); Special Session on Management of Open Environmental Observation Data - MOEOD 2014, Lisbon, 8th January 2014 (http://www.sensornets.org/MOEOD.aspx?y=2014)
    Monitoring networks: TERENO: http://teodoor.icg.kfa-juelich.de/; TERENO-MED: http://www.tereno-med.net/

  4. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.
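    As an illustration of how a client might drive such an asynchronous WPS, the sketch below uses the OWSLib Python library; the endpoint URL, process identifier and input names are hypothetical placeholders rather than the actual UK Climate Projections service.

      # Minimal asynchronous-WPS client sketch using OWSLib; the URL,
      # process identifier and inputs are hypothetical placeholders.
      from owslib.wps import WebProcessingService, monitorExecution

      wps = WebProcessingService("http://example.org/ukcp/wps")  # hypothetical
      print([p.identifier for p in wps.processes])  # discover processes

      # Start a long-running job; the service returns a status document
      # that the client polls until the (possibly cluster-executed) job
      # completes.
      execution = wps.execute("ExtractTimeSeries",            # hypothetical
                              inputs=[("variable", "tas"),
                                      ("bbox", "-5,50,2,56")])
      monitorExecution(execution, sleepSecs=5)  # poll until done
      print(execution.status)
      print([o.reference for o in execution.processOutputs])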

  5. Central Asia Water (CAWa) - A visualization platform for hydro-meteorological sensor data

    NASA Astrophysics Data System (ADS)

    Stender, Vivien; Schroeder, Matthias; Wächter, Joachim

    2014-05-01

    Water is an indispensable necessity of life for people all over the world. In Central Asia, water is the key factor for economic development, but it is already a scarce resource in this region. Given climate change, managing water problems will be a major challenge in the future. The regional research network "Central Asia Water" (CAWa) aims at providing a scientific basis for transnational water resources management for the five Central Asian states Kyrgyzstan, Uzbekistan, Tajikistan, Turkmenistan and Kazakhstan. CAWa is part of the Central Asia Water Initiative (also known as the Berlin Process), which was launched by the Federal Foreign Office on 1 April 2008 at the "Water Unites" conference in Berlin. To produce future scenarios and strategies for sustainable water management, data on water reserves and the use of water in Central Asia must be collected consistently across the region. Hydro-meteorological stations equipped with sophisticated sensors are installed in Central Asia and send their data via real-time satellite communication to the operation centre of the monitoring network and to the participating National Hydro-meteorological Services.[1] The challenge for CAWa is to integrate all aspects of data management, data workflows, data modeling and visualization in a proper design of a monitoring infrastructure. The use of standardized interfaces to support data transfer and interoperability is essential in CAWa. A uniform treatment of sensor data can be realized through the OGC Sensor Web Enablement (SWE) framework, which makes a number of standards and interface definitions available: the Observations & Measurements (O&M) model for the description of observations and measurements, the Sensor Model Language (SensorML) for the description of sensor systems, the Sensor Observation Service (SOS) for obtaining sensor observations, the Sensor Planning Service (SPS) for tasking sensors, the Web Notification Service (WNS) for asynchronous dialogues, and the Sensor Alert Service (SAS) for sending alerts. An open-source web platform bundles the data provided by the SWE web services of the hydro-meteorological stations and provides tools for data visualization and data access. The visualization tool was implemented using open-source tools such as GeoExt/ExtJS and OpenLayers. Using the application, the user can query the relevant sensor data, select parameters and a time period, visualize the data, and finally download them. [1] http://www.cawa-project.net
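    The pull-based part of such a platform can be exercised with a few lines of client code. The sketch below uses the OWSLib Python library against a hypothetical CAWa-style SOS endpoint, with an assumed observed-property URN.

      # Sketch: fetching observations from a SOS with OWSLib; the endpoint
      # and the observed-property URN are hypothetical placeholders.
      from owslib.sos import SensorObservationService

      sos = SensorObservationService("http://example.org/cawa/sos",  # hypothetical
                                     version="1.0.0")
      offering = list(sos.contents)[0]  # first advertised offering

      response = sos.get_observation(
          offerings=[offering],
          observedProperties=["urn:ogc:def:property:OGC::AirTemperature"],
          responseFormat='text/xml;subtype="om/1.0.0"')
      print(response[:300])  # raw O&M XML, ready for parsing or plotting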

  6. 303(d) Listed Impaired Waters

    EPA Pesticide Factsheets

    Geospatial data for 303(d) Impaired Waters are available as prepackaged national downloads or as GIS web and data services. EPA provides the geospatial data in several formats: GIS-compatible shapefiles and geodatabases, and ESRI and OGC web mapping services.

  7. The deegree framework - Spatial Data Infrastructure solution for end-users and developers

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Poth, Andreas

    2010-05-01

    The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDI) and to adhere to standards as strictly as possible. Although it is open source software (GNU Lesser General Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees, while customization, consulting and tailoring are offered by specialized companies. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany and the Ministry of Housing, Spatial Planning and the Environment in the Netherlands) as well as for research and development projects as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service and Catalogue Service, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers. The added value comes from a sophisticated set of client software for desktop and web environments. One focus lies on client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions comprising multiple standards and enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard as defined by the German spatial planning authorities is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure should be usable by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. The challenges faced within customer projects underline the importance of easy-to-use software components. SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.

  8. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increase access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" comprises the geoscience researchers who work separately from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster et al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster et al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into Cloud Computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., the OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. Several recent examples show progress in applying SaaS to geosciences:
    - NASA's Earth Data Coherent Web has a goal to improve the science user experience using Web Services (e.g. W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW].
    - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS).
    - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web-service-based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS [UAF].
    Tools to access SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability using the web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV and LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in recent years toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g. defining data schemas that allow for data fusion, and addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web will address this large and currently underserved user community.

  9. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source on www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use, with databases individually holding dozens of terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman serves as the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences gained.
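    The sketch below shows what sub-setting such a datacube looks like from the client side via OGC WCS, using the OWSLib Python library; the service URL, coverage name and axis labels are hypothetical placeholders.

      # Sketch: sub-setting a coverage through WCS 2.0.1 with OWSLib; URL,
      # coverage identifier and axis labels are hypothetical placeholders.
      from owslib.wcs import WebCoverageService

      wcs = WebCoverageService("http://example.org/rasdaman/ows",  # hypothetical
                               version="2.0.1")
      print(list(wcs.contents))  # advertised coverages

      cov = wcs.getCoverage(identifier=["AvgTemperature"],         # hypothetical
                            subsets=[("Lat", 40, 50), ("Long", 0, 10)],
                            format="image/tiff")
      with open("subset.tif", "wb") as f:
          f.write(cov.read())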

  10. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Oriented Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching:
    - WMS requests are made for 256x256 tiles for fixed areas and zoom levels;
    - a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles;
    - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache;
    - the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service when it becomes an OGC standard.)
    The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or graphs, and combined with the WMS pre-rendered images and text, in a FLEX application, to provide sophisticated, user-impact-based views of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database, and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
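    The cache-friendliness of this design rests on every client asking for exactly the same tile URLs. The sketch below reproduces the arithmetic for fixed 256x256 tiles at discrete zoom levels (standard XYZ tile math) and builds the corresponding WMS GetMap URL; the endpoint and layer name are hypothetical placeholders.

      # Sketch of the fixed-tile WMS request pattern: identical, cacheable
      # URLs for every client. Endpoint and layer name are hypothetical.
      import math

      def tile_bbox(z: int, x: int, y: int):
          """Lon/lat bounding box of an XYZ tile (Web Mercator tiling)."""
          n = 2 ** z
          lon0 = x / n * 360.0 - 180.0
          lon1 = (x + 1) / n * 360.0 - 180.0
          lat0 = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * (y + 1) / n))))
          lat1 = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / n))))
          return lon0, lat0, lon1, lat1

      def tile_url(z, x, y, layer="precipitation"):  # hypothetical layer
          w, s, e, n = tile_bbox(z, x, y)
          return ("http://example.org/wms?SERVICE=WMS&VERSION=1.3.0"
                  "&REQUEST=GetMap&LAYERS=" + layer +
                  "&CRS=CRS:84&BBOX=%f,%f,%f,%f" % (w, s, e, n) +
                  "&WIDTH=256&HEIGHT=256&FORMAT=image/png&TRANSPARENT=TRUE")

      print(tile_url(5, 15, 10))  # same tile, same URL, cache hit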

  11. Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Cao, J.

    2016-09-01

    Current web-based GIS or remote sensing applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These disadvantages of traditional GISs need to be addressed for new applications on the Internet and the Web. Decentralized orchestration offers performance improvements in terms of increased throughput, better scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection is used to evaluate the proposed method, and the experimental results indicate that the method is effective in its ability to produce high-quality solutions at a low communication cost for the geospatial processing service composition problem.

  12. Dynamic reusable workflows for ocean science

    USGS Publications Warehouse

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic notebooks across the geoscience domains.
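    A condensed version of the catalog-driven pattern can be expressed in a few lines with the OWSLib Python library: query a CSW for records matching a variable of interest, then collect the SOS endpoints advertised in each record. The endpoint URL and search term below are placeholders, not a specific IOOS deployment.

      # Condensed catalog-driven discovery sketch with OWSLib; the CSW URL
      # and search text are placeholders.
      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      csw = CatalogueServiceWeb("https://example.org/csw")  # placeholder
      query = PropertyIsLike("apiso:AnyText", "%sea_water_temperature%")
      csw.getrecords2(constraints=[query], maxrecords=10)

      for rid, rec in csw.records.items():
          # each metadata record lists service endpoints; keep the SOS ones
          sos_urls = [ref["url"] for ref in rec.references
                      if "sos" in (ref.get("scheme") or "").lower()]
          print(rec.title, sos_urls)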

  13. Publication of sensor data in the long-term environmental monitoring infrastructure TERENO

    NASA Astrophysics Data System (ADS)

    Stender, V.; Schroeder, M.; Klump, J. F.

    2014-12-01

    Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centres of the Helmholtz Association. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, on data exchange between the partners involved, and on the availability of the captured data. Data discovery and dissemination are facilitated not only through data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data provided by the different web services of the single observatories and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes data available through standard web services. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also publishes time series of environmental sensor data through the online research data publication platform DataCite. The metadata required by DataCite are generated in an automated process that extracts information from the SWE SensorML documents to build ISO 19115 compliant metadata. The GFZ data management tool kit panMetaDocs is used to register Digital Object Identifiers (DOI) and preserve file-based datasets. In addition to DOIs, International Geo Sample Numbers (IGSN) are used to uniquely identify research specimens.
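    The metadata-extraction step can be pictured with a small sketch: read a few fields from a SensorML document and map them onto a minimal DataCite-style record. The element paths and the output mapping are simplified assumptions for illustration, not TEODOOR's actual implementation.

      # Illustrative sketch only: extracting fields from a SensorML document
      # to seed a DataCite-style record; paths and mapping are assumptions.
      import xml.etree.ElementTree as ET

      NS = {"gml": "http://www.opengis.net/gml"}

      def sensorml_to_datacite(sensorml_xml: str) -> dict:
          root = ET.fromstring(sensorml_xml)
          title = root.findtext(".//gml:name", default="unknown", namespaces=NS)
          desc = root.findtext(".//gml:description", default="", namespaces=NS)
          return {  # minimal DataCite-style metadata kernel
              "title": title,
              "description": desc,
              "resourceType": "Dataset",
              "publisher": "GFZ German Research Centre for Geosciences",
          }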

  14. Incorporating Quality Control Information in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Devaraju, Anusuriya; Kunkel, Ralf; Bogena, Heye

    2013-04-01

    The rapid development of sensing technologies has led to the creation of large amounts of heterogeneous environmental observations. The Sensor Web provides wider access to sensors and observations via common protocols and specifications. Observations typically go through several levels of quality control and aggregation before they are made available to end users. Raw data are usually inspected, and related quality flags are assigned. Data are gap-filled, and errors are removed. New data series may also be derived from one or more corrected data sets. Until now, it has been unclear how these kinds of information can be captured in the Sensor Web Enablement (SWE) framework. Apart from the quality measures (e.g., accuracy, precision, tolerance, or confidence), the levels of observational series, the changes applied, and the methods involved must be specified. It is important that this kind of quality control information is well described and communicated to end users to allow for better usage and interpretation of data products. In this paper, we describe how quality control information can be incorporated into the SWE framework. First, we introduce TERENO (TERrestrial ENvironmental Observatories), an initiative funded by the large research infrastructure programme of the Helmholtz Association in Germany. The main goal of the initiative is to facilitate the study of long-term effects of climate and land use changes. The TERENO Online Data RepOsitORry (TEODOOR) is a software infrastructure that supports acquisition, provision, and management of observations within TERENO via SWE specifications and several other OGC web services. Next, we specify the changes made to the existing observational data model to incorporate quality control information. Here, we describe the underlying TERENO data policy in terms of provision and maintenance issues. We present the data levels and their implementation within TEODOOR. The data levels are adapted from those used by similar systems such as CUAHSI, EarthScope and WMO. Finally, we outline recommendations for future work.
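    The idea of carrying the processing level, the applied method and per-value flags alongside the data can be sketched with a toy model; this is an illustration only, not the TERENO/TEODOOR schema.

      # Toy model of level- and flag-annotated observation series;
      # illustrative only, not the TERENO/TEODOOR data model.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class QualifiedValue:
          value: float
          flag: str = "ok"  # e.g. ok / suspect / gap_filled / removed

      @dataclass
      class ObservationSeries:
          observed_property: str
          level: int        # e.g. 0 = raw, 1 = flagged, 2 = gap-filled
          method: str       # the QC or aggregation procedure applied
          values: List[QualifiedValue] = field(default_factory=list)

      raw = ObservationSeries("soil_moisture", level=0, method="none",
                              values=[QualifiedValue(31.2),
                                      QualifiedValue(-999.0)])
      checked = ObservationSeries("soil_moisture", level=1, method="range check",
                                  values=[QualifiedValue(31.2),
                                          QualifiedValue(-999.0, flag="suspect")])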

  15. Current Efforts in European Projects to Facilitate the Sharing of Scientific Observation Data

    NASA Astrophysics Data System (ADS)

    Bredel, Henning; Rieke, Matthes; Maso, Joan; Jirka, Simon; Stasch, Christoph

    2017-04-01

    This presentation is intended to provide an overview of currently ongoing efforts in European projects to facilitate and promote the interoperable sharing of scientific observation data. This will be illustrated through two examples: a prototypical portal developed in the ConnectinGEO project for matching available (in-situ) data sources to the needs of users, and a joint activity of several research projects to harmonise the usage of the OGC Sensor Web Enablement standards for providing access to marine observation data. ENEON is an activity initiated by the European ConnectinGEO project to coordinate in-situ Earth observation networks with the aim of harmonising access to observations, improving discoverability, and identifying and closing gaps in European earth observation data resources. In this context, ENEON commons has been developed as a supporting Web portal for facilitating discovery, access, re-use and creation of knowledge about observations, networks, and related activities (e.g. projects). The portal is based on developments resulting from the European WaterInnEU project and has been extended to cover the requirements for handling knowledge about in-situ earth observation networks. A first prototype of the portal was completed in January 2017; it offers functionality for interactive discussion, information exchange and querying information about data delivered by different observation networks. Within this presentation, we will introduce the prototype and initiate a discussion about potential future work directions. The second example concerns the harmonisation of data exchange in the marine domain. There are many organisations that operate ocean observatories or data archives. In recent years, the application of OGC Sensor Web Enablement (SWE) technology has become more and more popular as a means to increase the interoperability between marine observation networks. However, as the SWE standards were intentionally designed in a domain-independent manner, there is still a significant degree of freedom in how the same information can be handled in the SWE framework. Thus, further domain-specific agreements are necessary to describe more precisely how SWE standards shall be applied in specific contexts. Within this presentation we will report the current status of the marine SWE profiles initiative, which has the aim of developing guidance and recommendations for the application of SWE standards to ocean observation data. This initiative, which is supported by projects such as NeXOS, FixO3, ODIP 2, BRIDGES and SeaDataCloud, has already led to first results, which will be introduced in the proposed presentation. In summary, we will introduce two building blocks for coordinating earth observation networks: intelligent portal solutions that ensure better discoverability, and dedicated domain profiles of Sensor Web standards that ensure a common, interoperable exchange of the collected data.

  16. WMS and WFS Standards Implementation of Weather Data

    NASA Astrophysics Data System (ADS)

    Armstrong, M.

    2005-12-01

    CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards. Currently, both a Web Map Service (WMS) and a Web Feature Service (WFS) are supported by CustomWeather. Supporting open geospatial standards has led to a number of positive changes to CustomWeather's internal processes, along with those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw model and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very new, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they utilize interoperability standards with existing data. Insight will be given into applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.

  17. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    PubMed

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step towards communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
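    Of the four protocols, the OGC SensorThings API is the most web-native: it exposes entities such as Things, Datastreams and Observations as JSON resources with OData-style query options. The sketch below reads recent observations from a hypothetical endpoint.

      # Sketch: reading the latest observations from an OGC SensorThings API
      # endpoint; the base URL and Datastream id are hypothetical.
      import requests

      base = "http://example.org/SensorThings/v1.0"  # hypothetical endpoint
      resp = requests.get(base + "/Datastreams(1)/Observations",
                          params={"$top": 5,
                                  "$orderby": "phenomenonTime desc"})
      for obs in resp.json()["value"]:
          print(obs["phenomenonTime"], obs["result"])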

  18. Rapid-response Sensor Networks Leveraging Open Standards and the Internet of Things

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Lieberman, J. E.; Lewis, L.; Botts, M.; Liang, S.

    2016-12-01

    New sensor technologies provide an unparalleled capability to collect large numbers of diverse observations about the world around us. Networks of such sensors are especially effective for capturing and analyzing unexpected, fast-moving events if they can be deployed with a minimum of time, effort, and cost. A rapid-response sensing and processing capability is extremely important in quickly unfolding events, not only to collect data for future research but also to support response efforts that may be needed by providing up-to-date knowledge of the situation. A recent pilot activity coordinated by the Open Geospatial Consortium combined Sensor Web Enablement (SWE) standards with Internet of Things (IoT) practices to better understand how to set up rapid-response sensor networks in comparable event situations involving accidents or disasters. The networks included weather and environmental sensors, georeferenced UAV and PTZ imagery collectors, and observations from "citizen sensors", as well as virtual observations generated by predictive models. A key feature of each "SWE-IoT" network was one or more Sensor Hubs that connected local, often proprietary sensor device protocols to a common set of standard SWE data types and standard Web interfaces on an IP-based internetwork. This IoT approach provided direct, common, interoperable access to all sensor readings from anywhere on the internetwork of sensors, Hubs, and applications. Sensor Hubs also supported an automated discovery protocol in which activated Hubs registered themselves with a canonical catalog service. As each sensor (wireless or wired) was activated within range of an authorized Hub, it registered itself with that Hub, which in turn registered the sensor and its capabilities with the catalog. Sensor Hub functions were implemented in a range of component types, from personal devices such as smartphones and Raspberry Pis to full cloud-based sensor service platforms. Connected into a network "constellation", the Hubs also enabled reliable exchange and persistence of sensor data in constrained communications environments. Pilot results are being documented in public OGC engineering reports and are feeding into improved standards to support SWE-IoT networks for a range of domains and applications.
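    The registration chain (a hub announces itself to the catalog; a sensor announces itself to its hub, which relays the sensor's capabilities onward) can be pictured with a small sketch; the catalog API shown here is invented for illustration and is not an OGC-specified interface.

      # Conceptual sketch of the self-registration chain; the catalog API
      # is invented for illustration, not an OGC-specified interface.
      import requests

      CATALOG = "http://example.org/catalog/register"  # hypothetical service

      def register_hub(hub_id: str, endpoint: str):
          # An activated hub announces itself to the canonical catalog.
          requests.post(CATALOG, json={"type": "hub", "id": hub_id,
                                       "endpoint": endpoint})

      def register_sensor(hub_id: str, sensor: dict):
          # A sensor in range registers with its hub, which relays the
          # sensor's capabilities to the catalog.
          requests.post(CATALOG, json={"type": "sensor", "hub": hub_id,
                                       **sensor})

      register_hub("hub-42", "http://10.0.0.5:8080/sos")
      register_sensor("hub-42", {"id": "anemometer-7",
                                 "observes": ["wind_speed", "wind_direction"]})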

  19. Next generation of weather generators on web service framework

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.; Ines, A. V. M.

    2016-12-01

    A weather generator is a statistical model that synthesizes possible realizations of long-term historical weather for the future. It generates several tens to hundreds of realizations stochastically, based on statistical analysis. Realizations are essential input to crop models for simulating crop growth and yield. Moreover, they can contribute to analyzing the effect of weather uncertainty on crop development stages and to decision support systems for, e.g., water and fertilizer management. Performing crop modeling requires multidisciplinary skills, which limits the use of a weather generator to the research group that developed it and creates a barrier for newcomers. To improve the procedures for running weather generators, and to provide a standard way of acquiring realizations, we implemented a framework that exposes weather generators as interoperable web services. Legacy weather generator programs were wrapped in the web service framework. The service interfaces were implemented based on an international standard, the Sensor Observation Service (SOS) defined by the Open Geospatial Consortium (OGC). Clients can request realizations generated by the model through the SOS web service. The hierarchical data preparation processes required by the weather generator are also implemented as web services and seamlessly wired together. Analysts and applications can easily invoke the services over a network. The services facilitate the development of agricultural applications, reduce the analysts' workload for iterative data preparation, and handle the legacy weather generator programs. This architectural design and implementation can serve as a prototype for constructing further services on top of an interoperable sensor network system. The framework opens an opportunity for other sectors, such as application developers and scientists in other fields, to utilize weather generators.

  20. The International Solid Earth Research Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Fox, G.; Pierce, M.; Rundle, J.; Donnellan, A.; Parker, J.; Granat, R.; Lyzenga, G.; McLeod, D.; Grant, L.

    2004-12-01

    We describe the architecture and initial implementation of the International Solid Earth Research Virtual Observatory (iSERVO). This has been prototyped within the USA as SERVOGrid, and expansion is planned to Australia, China, Japan and other countries. We base our design on a globally scalable, distributed "cyber-infrastructure" or Grid built around a Web Services-based approach consistent with the extended Web Service Interoperability approach. The Solid Earth Science Working Group of NASA has identified several challenges for Earth Science research. In order to investigate these, we need to couple numerical simulation codes and data mining tools to observational data sets. These observational data are now available on-line in internet-accessible forms, and the quantity of this data is expected to grow explosively over the next decade. We architect iSERVO as a loosely federated Grid of Grids, with each country involved supporting a national Solid Earth Research Grid. The national Grid operations, possibly with dedicated control centers, are linked together to support iSERVO, where an international Grid control center may eventually be necessary. We address the difficult multi-administrative-domain security and ownership issues by exposing capabilities as services for which the risk of abuse is minimized. We support large-scale simulations within a single domain using service-hosted tools (mesh generation, data repository and sensor access, GIS, visualization). Simulations typically involve sequential or parallel machines in a single domain supported by cross-continent services. We use Web Services to implement a Service Oriented Architecture (SOA), using WSDL for service description and SOAP for message formats. These are augmented by UDDI, WS-Security, WS-Notification/Eventing and WS-ReliableMessaging in the WS-I+ approach. Support for the latter two capabilities will be available over the next 6 months from the NaradaBrokering messaging system. We augment these specifications with the powerful portlet architecture using WSRP and JSR 168, supported by portal containers such as uPortal, WebSphere, and Apache Jetspeed2. The latter portal aggregates component user interfaces for each iSERVO service, allowing flexible customization of the user interface. We exploit the portlets produced by the NSF NMI (Middleware Initiative) OGCE activity. iSERVO also uses specifications from the Open Geospatial Consortium (OGC), which defines a number of standards for modeling earth surface feature data and services for interacting with this data. The data models are expressed in the XML-based Geography Markup Language (GML), and the OGC service framework is being adapted to use the Web Service model. The SERVO prototype includes a GIS Grid that currently comprises the core WMS and WFS (Map and Feature) services. We will follow best practice in the Grid and Web Service field and will adapt our technology as appropriate. For example, we expect to support services built on WS-RF when it is finalized and to make use of the OGSA-DAI database interfaces and their WS-I+ versions. Finally, we review advances in Web Service scripting (such as HPSearch) and workflow systems (such as GCF) and their applications to iSERVO.

  1. Sensor Nanny, data management services for marine observation operators

    NASA Astrophysics Data System (ADS)

    Loubrieu, Thomas; Détoc, Jérôme; Thorel, Arnaud; Azelmat, Hamza

    2016-04-01

    In marine sciences, the diversity of observed properties (from water physics to contaminants observed in biological individuals or sediment) and of observation methodologies (from manned sampling and analysis in labs to large automated networks of homogeneous platforms) requires different expertise and thus dedicated scientific programmes (ARGO, EMSO, GLOSS, GOSHIP, OceanSites, GOSUD, Geotrace, SOCAT, member state environment monitoring networks, experimental research…). However, all of them require similar IT services to support the maintenance of their networks (calibrations, deployment strategy, spare part management...) and their data management. In Europe, the National Oceanographic Data Centres coordinated by IOC/IODE and SeaDataNet provide reliable reference services (e.g. vocabularies, contact directories), standards and long-term data preservation. In addition, the regional operational oceanographic centres (ROOSes) coordinated by EuroGOOS and the Copernicus In-Situ Thematic Assembly Centre provide efficient data management for near-real-time or delayed-mode services focused on physics and bio-geo-chemistry in the water column. Other e-infrastructures, such as euroBIS for biodiversity, are focused on specific disciplines. Beyond the current scope of these well-established infrastructures, Sensor Nanny is a web application providing services for operators of observatories to manage their observations on the "cloud". The application builds on the reference services (vocabularies, organization directory) and standard profiles (OGC/Sensor Web Enablement) provided by SeaDataNet. The application provides an on-line editor to graphically describe, literally draw, an observatory (its acquisition and processing systems). The observatory description is composed by the user from a palette of hundreds of pre-defined sensors or hardware items linked together. In addition, data providers can upload their data in CSV and netCDF formats to a dropbox-like system. The latter enables local data resources to be synchronised and safeguarded on the cloud in real time, so users can share their data on-line with their partners. The native formats for the observatory and observation descriptions are SensorML and O&M from the OGC/Sensor Web Enablement suite. This provides the flexibility required by the diversity and complexity of the observation programmes. The observatory descriptions and observation data are indexed so that they can be fluently browsed, filtered and visualized in a portal, with spatio-temporal and keyword criteria, whatever the number of observations (currently 2.5 million observation points from French research vessels, ARGO profiling floats and the EMSO-Azores deep sea observatory). The key components used for the development are ownCloud for file synchronization and sharing and Elasticsearch for the scalable indexing of observatories and observations. The foreseen developments aim at interfacing the application with information systems used to manage instrument maintenance (calibration, spare parts), for example LabCollector. Downstream, the application could also be further integrated with marine data services (SeaDataNet, Copernicus, EMODNET, ...). This will help data providers streamline the publication of their data in these infrastructures. In return, the application will provide data providers with usage statistics dashboards; this last part is funded by the JERICO-NEXT project.
The application has also been demonstrated in the ODIP and SeaDataNet2 projects and is being developed further for data providers in AtlantOS.

  2. Big Geo Data Services: From More Bytes to More Barrels

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2016-04-01

    The data deluge is affecting the oil and gas industry just as much as many other industries. However, aside from the sheer volume there is the challenge of data variety, such as regular and irregular grids, multi-dimensional space/time grids, point clouds, and TINs and other meshes. A uniform conceptualization for modelling and serving them could save substantial effort, such as the proverbial "department of reformatting". The notion of a coverage can actually accomplish this. Its abstract model in ISO 19123, together with the concrete, interoperable OGC Coverage Implementation Schema (CIS), currently under adoption as ISO 19123-2, provides a common platform for representing any n-D grid type, point clouds, and general meshes. This is paired with the OGC Web Coverage Service (WCS) together with its datacube analytics language, the OGC Web Coverage Processing Service (WCPS). The OGC WCS Core Reference Implementation, rasdaman, relies on Array Database technology, i.e. a NewSQL/NoSQL approach. It supports the grid part of coverages, with installations of 100+ TB known and single queries parallelized across 1,000+ cloud nodes. Recent research attempts to address the point cloud and mesh part through a unified query model. The envisioned Holy Grail is that these approaches can be merged into a single service interface at some point. We present both the grid and the point cloud / mesh approaches and discuss status, implementation, standardization, and research perspectives, including a live demo.

  3. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

    Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is coverage data. ISO 19123 defines coverages, simplifying somewhat, as representations of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1-D in-situ sensor time series, 2-D EO imagery, 3-D x/y/t image time series and x/y/z geophysical data, and 4-D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are in place; however, this potential has not yet been fully unleashed by service architectures. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data, based on an integration of W3C XQuery for alphanumeric data and OGC WCPS for raster data. Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development which will allow developers to quickly compose bespoke interactive clients, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all Earth sciences, including satellite and airborne earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift the data providers' semantic level from data stewardship to service stewardship.
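    What such a declarative coverage query looks like in practice can be sketched as follows: a WCPS expression slices a 3-D x/y/t coverage at one date, converts Kelvin to Celsius on the server, and returns a TIFF. The endpoint, coverage name and KVP binding below are hypothetical placeholders modelled on common rasdaman deployments.

      # Sketch: submitting a WCPS query over HTTP; endpoint, coverage name
      # and binding details are hypothetical placeholders.
      import requests

      query = ('for $c in (AvgLandTemp) '       # hypothetical coverage
               'return encode($c[ansi("2014-07-01")] - 273.15, "image/tiff")')

      resp = requests.get("http://example.org/rasdaman/ows",  # hypothetical
                          params={"service": "WCS", "version": "2.0.1",
                                  "request": "ProcessCoverages",
                                  "query": query})
      with open("avg_temp.tif", "wb") as f:
          f.write(resp.content)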

  4. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS), and OGC Web Coverage Services (WCS), while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  5. OneGeology Web Services and Portal as a global geological SDI - latest standards and technology

    NASA Astrophysics Data System (ADS)

    Duffy, Tim; Tellez-Arenas, Agnes

    2014-05-01

    The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 by the 120 participating geological surveys will be reviewed and the issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-five-star service accreditation scheme utilising version 1.3 of the ISO/OGC Web Map Service standard, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ available IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simplelithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation queries against their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standards-based WMS (Web Map Service) from an available WWW server. This may be hosted within the geological survey itself, or by a neighbouring, regional or other institution that offers to serve the data for them, i.e. acts as a 'buddy' by providing the web-serving IT infrastructure. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together; it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf). The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services, and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
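
    A minimal sketch of fetching GeoSciML features from a OneGeology-style WFS 2.0 endpoint follows. The service URL is a hypothetical placeholder (real endpoints are discoverable through the OneGeology portal); gsml:MappedFeature is a standard GeoSciML feature type.

      import requests

      # WFS 2.0 KVP GetFeature request; endpoint URL is an assumption.
      params = {
          "service": "WFS",
          "version": "2.0.0",
          "request": "GetFeature",
          "typeNames": "gsml:MappedFeature",  # GeoSciML feature type
          "count": 10,                        # limit the response size
      }
      r = requests.get("https://survey.example.org/ows", params=params,
                       timeout=60)
      print(r.status_code, len(r.content), "bytes of GML")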

  6. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacities and computational power of Grid infrastructures for geospatial applications, while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces for mapping (Web Map Service), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and for spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize Grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring requirement of the EU Noise Directive. Required input data include the road network and traffic, terrain, buildings and noise protection screens, as well as population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on the local geometry. For each of the segments, propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation. This computationally intensive calculation needs to be performed for a major part of the European landscape. A Linux version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-GRID computer network. Results and performance indicators will be presented. The presentation is an extension to last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project", which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid computing and Spatial Data Infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.
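
    To indicate how a gridified process remains invocable through the unchanged OGC interface, here is a minimal OWSLib sketch of a WPS Execute call. The endpoint, process identifier and input names are illustrative assumptions, not the actual GDI-Grid service.

      from owslib.wps import WebProcessingService, monitorExecution

      # Endpoint and process/input identifiers are assumptions.
      wps = WebProcessingService("https://gdi-grid.example.de/wps",
                                 version="1.0.0")
      execution = wps.execute("noise_propagation",
                              inputs=[("receiverGridSpacing", "10"),
                                      ("maxSourceDistance", "2000")])
      monitorExecution(execution)  # poll until the asynchronous job finishes
      print(execution.status)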

  7. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and to assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and the associated wildfire resource catalog include a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'observables' captured by each instrument using multiple ontologies, including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog, described using the Web Application Description Language (WADL). We are creating web-based user interfaces and mobile apps that use the REST interface for dissemination to the wildfire modeling community and to project partners covering academic, private, and government laboratories, while generating value for emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and to stream it into dynamic data-driven wildfire models at scale.
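
    A hedged sketch of querying such a REST catalog is shown below. The base URL, resource path, parameters and response fields are all hypothetical stand-ins for the WADL-described service mentioned above.

      import requests

      # Hypothetical catalog endpoint and query parameters.
      r = requests.get("https://wifire.example.edu/catalog/observables",
                       params={"property": "wind_speed",
                               "network": "HPWREN"},
                       timeout=30)
      for res in r.json():
          # Field names are assumptions about the catalog's JSON schema.
          print(res.get("station"), res.get("units"), res.get("resolution"))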

  8. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available on that node. Each node also hosts geospatial data processing services (WPS) built on a modular computing backend that implements the statistical processing functionality, thus providing analysis of large datasets whose results can be visualized or exported to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in a prototype of the system to provide capabilities for working with raster and vector geospatial data through OGC web services. The distributed architecture presented allows new nodes, computing and data storage systems to be added easily, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.

  9. Implementing CUAHSI and SWE observation data models in the long-term monitoring infrastructure TERENO

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Stender, V.; Schroeder, M.

    2013-12-01

    Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centers of the Helmholtz Association. The contribution by the participating research centers is organized as regional observatories. The challenge for TERENO and its observatories is to integrate all aspects of data management, data workflows, data modeling and visualization into the design of a monitoring infrastructure. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved, and the availability of the captured data. Data discovery and dissemination are facilitated not only through the data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR. TEODOOR bundles the data provided by the web services of the individual observatories and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes the data available through standard web services. Data are stored following the CUAHSI observations data model in combination with the 52°North Sensor Observation Service data model. The data model was implemented using the PostgreSQL/PostGIS DBMS. Especially in a long-term project such as TERENO, care has to be taken in choosing the data model. We chose to adopt the CUAHSI observations data model because it is designed to store observations and descriptive information (metadata) about the data values in combination with information about the sensor systems. The CUAHSI model is also supported by a large and active international user community. The 52°North SOS data model can be modeled as a sub-set of the CUAHSI data model; in our implementation, the 52°North SWE data model is realized as database views of the CUAHSI model to avoid redundant data storage. An essential aspect of TERENO Northeast is the use of standard OGC web services to facilitate data exchange and interoperability. A uniform treatment of sensor data can be realized through OGC Sensor Web Enablement (SWE), which makes a number of standards and interface definitions available: the Observations and Measurements (O&M) model for the description of observations and measurements, the Sensor Model Language (SensorML) for the description of sensor systems, the Sensor Observation Service (SOS) for obtaining sensor observations, the Sensor Planning Service (SPS) for tasking sensors, the Web Notification Service (WNS) for asynchronous dialogues, and the Sensor Alert Service (SAS) for sending alerts.
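
    The view-based mapping described above can be sketched as follows: a read-only database view exposes CUAHSI-style observation tables in the shape a 52°North SOS expects, so both models share one physical store. All schema, table and column names here are illustrative assumptions, not the actual TERENO schema.

      import psycopg2

      # Sketch: a view over CUAHSI ODM tables, shaped like an SOS
      # observation table; names are assumptions for illustration.
      DDL = """
      CREATE OR REPLACE VIEW sos.observation AS
      SELECT dv.valueid        AS observation_id,
             dv.localdatetime  AS phenomenon_time,
             dv.datavalue      AS numeric_value,
             s.sitecode        AS feature_of_interest,
             v.variablecode    AS observed_property
      FROM odm.datavalues dv
      JOIN odm.sites      s ON s.siteid     = dv.siteid
      JOIN odm.variables  v ON v.variableid = dv.variableid;
      """

      with psycopg2.connect("dbname=tereno user=sos") as conn:
          with conn.cursor() as cur:
              cur.execute(DDL)  # the view adds no redundant storage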

  10. NeXOS, developing and evaluating a new generation of in-situ ocean observation systems.

    NASA Astrophysics Data System (ADS)

    Delory, Eric; del Rio, Joaquin; Golmen, Lars; Roar Hareide, Nils; Pearlman, Jay; Rolin, Jean-Francois; Waldmann, Christoph; Zielinski, Oliver

    2017-04-01

    Ocean biological, chemical and physical processes occur over widely varying scales in space and time: from micro- to kilometre scales, and from less than seconds to centuries. While space systems supply important data and information, in-situ data are necessary for comprehensive modeling and forecasting of ocean dynamics. Yet collecting in-situ observations on these scales is inherently challenging and remains generally difficult and costly in time and resources. This paper addresses the innovations and significant developments for a new generation of in-situ sensors in the FP7 European Union project "Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management", or "NeXOS" for short. Optical and acoustic sensors are the focus of NeXOS, but the project moves beyond individual sensors, as systems that simultaneously address multiple objectives and applications are becoming increasingly important. NeXOS therefore takes the perspective of both sensors and sensor systems, offering significant advantages over existing observing capabilities through innovations such as multi-platform integration, greater reliability through better antifouling management, and greater sensor and data interoperability through the use of OGC standards. This presentation will address the development and field-testing of the new NeXOS sensor systems, carried out on multiple platforms including profiling floats, gliders, ships, buoys and subsea stations. The implementation of a data system based on SWE and PUCK furthers interoperability across measurements and platforms. The presentation will review the sensor system capabilities, the status of field tests, and recommendations for long-term ocean monitoring.
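
    As a hedged sketch of the user-facing end of such an SWE data system, the snippet below issues a plain KVP GetObservation request against an SOS 2.0 endpoint. The service URL, offering and observed-property identifiers are illustrative assumptions.

      import requests

      # SOS 2.0 KVP GetObservation; identifiers below are placeholders.
      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "nexos_ocean_buoy_1",
          "observedProperty": "sea_water_temperature",
      }
      r = requests.get("https://sos.example.org/service", params=params,
                       timeout=30)
      print(r.headers.get("Content-Type"))  # O&M 2.0 XML on success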

  11. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    PubMed Central

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step towards communicating with the variety of IoT devices. In this research, we assess the feasibility of applying existing open standards to resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
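
    For orientation, a minimal sketch of querying an OGC SensorThings API endpoint follows; the entity model (Things, Datastreams, Observations) and the OData-style query options are defined by the standard, while the base URL and Datastream id are illustrative assumptions.

      import requests

      # Hypothetical SensorThings endpoint; entity paths per the standard.
      base = "https://iot.example.org/v1.0"
      r = requests.get(f"{base}/Datastreams(1)/Observations",
                       params={"$top": 5,
                               "$orderby": "phenomenonTime desc"},
                       timeout=30)
      for obs in r.json()["value"]:
          print(obs["phenomenonTime"], obs["result"])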

  12. SensorWeb Hub infrastructure for open access to scientific research data

    NASA Astrophysics Data System (ADS)

    de Filippis, Tiziana; Rocchi, Leandro; Rapisardi, Elena

    2015-04-01

    The sharing of research data is a new challenge for the scientific community, which stands to benefit from a large amount of information for addressing environmental issues and sustainability in agricultural and urban contexts. A prerequisite for meeting this challenge is the development of an infrastructure that ensures access, management and preservation of data, along with technical support for coordinated and harmonious data management that, in the framework of Open Data policies, encourages reuse and collaboration. The neogeography and citizens-as-sensors approaches highlight that new data sources need a new set of tools and practices to collect, validate, categorize, and use/access these "crowdsourced" data, which complement the data sets produced in the scientific field and thus feed the overall data available for analysis and research. If the scientific community is to embrace collaboration and sharing, access and re-use, and thereby adopt the open innovation approach, it should redesign and reshape its data management processes: the challenges of technological and cultural innovation, enabled by Web 2.0 technologies, lead to a scenario in which the sharing of structured and interoperable data constitutes the unavoidable building block of a new paradigm of scientific research. In this perspective, the Institute of Biometeorology, CNR, whose aims include contributing to the sharing and development of research data, has developed the "SensorWebHub" (SWH) infrastructure to support the scientific activities carried out in several research projects at national and international levels. It is designed to manage both mobile and fixed open source meteorological and environmental sensors, in order to integrate the existing agro-meteorological and urban monitoring networks. The proposed architecture uses open source tools to ensure sustainability in the development and deployment of web applications with geographic features and custom analyses, as requested by the different research projects. The SWH components are organized in a typical client-server architecture and interact from the sensing process through to the presentation of results to end users. The web application enables users to view and analyse the data stored in the GeoDB. The interface is designed to Internet browser specifications, allowing the visualization of collected data in different formats (tabular, chart and geographic map). The services for the dissemination of geo-referenced information adopt the OGC specifications. SWH is a bottom-up collaborative initiative to share real-time research data and to pave the way for an open innovation approach in scientific research. To date this framework has been used for several WebGIS applications and web apps for environmental monitoring at different temporal and spatial scales.

  13. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    NASA Astrophysics Data System (ADS)

    Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.

    2015-05-01

    To date, the most common way to deal with geographical information and processing still appears to be the consumption of local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (and among them research scientists, especially in the environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable spatial data architecture geOrchestra.

  14. We have "born digital" - now what about "born semantic"?

    NASA Astrophysics Data System (ADS)

    Leadbetter, Adam; Fredericks, Janet

    2014-05-01

    The phrase "born-digital" refers to those materials which originate in a digital form. In the Earth and space sciences, this is now very much the norm for data: analogue-to-digital converters sit on instrument boards and produce a digital record of the observed environment. While much effort has been put into creating and curating these digital data, there has been little work on using semantic mark-up of data from the point of collection, what we term 'born semantic'. In this presentation we report on two efforts to expand this area: Qartod-to-OGC (Q2O) and SenseOCEAN. These projects have taken a common approach to 'born semantic': (1) create or reuse appropriate controlled vocabularies, published to World Wide Web Consortium (W3C) standards; (2) use standards from the Open Geospatial Consortium's Sensor Web Enablement (SWE) initiative to describe instrument setup, deployment and/or outputs using terms from those controlled vocabularies; and (3) embed URLs from the controlled vocabularies within the SWE documents in a "Linked Data" conformant approach. Q2O developed best-practice examples of SensorML descriptions of Original Equipment Manufacturers' metadata (model characteristics, capabilities, manufacturer contact, etc.); set-up and deployment SensorML files; and data centre process-lineage using registered vocabularies to describe terms (including input, output, processes, parameters and quality control flags). One Q2O use case, the Martha's Vineyard Coastal Observatory ADCP Waves instance, uses SensorML and registered vocabularies to fully describe the process of computing wave parameters from sensed properties, including quality control tests and associated results. The European Commission Framework Programme 7 project SenseOCEAN draws together world-leading marine sensor developers to create a highly integrated, multifunctional and cost-effective in situ marine biogeochemical sensor system. This project will provide a quantum leap in the ability to measure crucial biogeochemical parameters. Innovations will be combined with state-of-the-art sensor technology to produce a modular sensor system that can be deployed on many platforms. The sensor descriptions are being profiled in SensorML, and the controlled vocabularies are being repurposed from those used within the European Commission SeaDataNet project and published on the community-standard NERC Vocabulary Server.
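
    The embedding pattern in step (3) can be sketched in a few lines: a SensorML-style output description whose "definition" attribute carries a resolvable vocabulary URI. The NERC Vocabulary Server collection URL shown is real, but the element placement is simplified and not a schema-complete SensorML document.

      import xml.etree.ElementTree as ET

      SML = "http://www.opengis.net/sensorml/2.0"
      SWE = "http://www.opengis.net/swe/2.0"

      # An output whose observed property is a resolvable vocabulary term.
      out = ET.Element(f"{{{SML}}}output", name="sea_water_temperature")
      q = ET.SubElement(out, f"{{{SWE}}}Quantity", {
          "definition": "http://vocab.nerc.ac.uk/collection/P02/current/TEMP/"
      })
      ET.SubElement(q, f"{{{SWE}}}uom", code="Cel")
      print(ET.tostring(out, encoding="unicode"))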

  15. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying requirements of the government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS) and OGC Web Coverage Services (WCS), while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating the restructuring of their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  16. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    Big Data in the Earth sciences, the tera- to exabyte archives, mostly consist of coverage data, defined by ISO and OGC as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as computing the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The transatlantic EarthServer initiative, running from 2011 through 2014, has united 11 partners to establish Big Earth Data Analytics. A key ingredient has been flexibility for users to ask whatever they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level, standards-based query languages which unify data and metadata search in a simple, yet powerful way. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform comprises rasdaman, the pioneering and leading Array DBMS for multi-dimensional raster data of any size, extended with support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level coverage query language. Reviewers have attested that "With no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.

  17. Publication of sensor data in the long-term environmental sub-observatory TERENO Northeast

    NASA Astrophysics Data System (ADS)

    Stender, Vivien; Ulbricht, Damian; Klump, Jens

    2017-04-01

    Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centers of the Helmholtz Association. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved, and the availability of the captured data. Data discovery and dissemination are facilitated not only through the data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data provided by the web services of the individual observatories and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes the data available through standard web services. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also publishes time series of environmental sensor data through the DOI registration service at GFZ Potsdam. This service uses the DataCite infrastructure to make research data citable and is able to store and disseminate metadata in formats popular in the geosciences [1]. The metadata required by DataCite are created in an automated process by extracting information from the SWE SensorML metadata. The GFZ data management toolkit panMetaDocs is used to manage and archive file-based datasets and to register Digital Object Identifiers (DOI) for published data. In this presentation we report on current advances in the publication of time series data from environmental sensor networks. [1] http://doidb.wdc-terra.org/oaip/oai?verb=ListRecords&metadataPrefix=iso19139&set=DOIDB.TERENO
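
    The automated extraction step can be sketched as follows: a few fields are harvested from a SensorML document and mapped onto a minimal DataCite-style record. The input file name, XPath locations and field mapping are illustrative assumptions, not the actual TERENO workflow.

      import xml.etree.ElementTree as ET

      NS = {"sml": "http://www.opengis.net/sensorml/2.0",
            "gml": "http://www.opengis.net/gml/3.2"}

      # Parse a SensorML description and pick out citation-relevant fields.
      root = ET.parse("sensor_description.xml").getroot()
      record = {
          "title": root.findtext(".//gml:name", namespaces=NS),
          "publisher": "GFZ German Research Centre for Geosciences",
          "resourceType": "Dataset",
      }
      print(record)  # would feed the DOI registration request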

  18. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications through standardized interfaces and protocols. The Open Geospatial Consortium (OGC) data services and specifications used include the Web Coverage Service (WCS) for data, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.

  19. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under investigation within the planetary geospatial community.

  20. Coherent visualization of spatial data adapted to roles, tasks, and hardware

    NASA Astrophysics Data System (ADS)

    Wagner, Boris; Peinsipp-Byma, Elisabeth

    2012-06-01

    Modern crisis management requires that users with different roles and computer environments deal with a high volume of varied data from different sources. For this purpose, Fraunhofer IOSB has developed a geographic information system (GIS) which supports the user depending on the available data and the task to be solved. The system provides merging and visualization of spatial data from various civilian and military sources. It supports the most common spatial data standards (OGC, STANAG) as well as some proprietary interfaces, regardless of whether these are file-based or database-based. Visualization rules are set using generic Styled Layer Descriptors (SLDs), an Open Geospatial Consortium (OGC) standard. SLDs allow specifying which data are shown, when, and how. The defined SLDs take into account the users' roles and task requirements. In addition, it is possible to use different displays, and the visualization adapts to the individual resolution of each display; excessively high or low information density is avoided. The system also enables users with different roles to work together simultaneously on the same database. Every user is provided with appropriate and coherent spatial data depending on his or her current task. The refined spatial data are served via the OGC services Web Map Service (WMS: server-side rendered raster maps) and Web Map Tile Service (WMTS: pre-rendered and cached raster maps).

  1. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

    For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates focused science applications both large and small, as well as public use. Here we provide an overview of the interoperability initiatives currently ongoing in the planetary research community, examples of their successful application, and the challenges that remain.

  2. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    NASA Astrophysics Data System (ADS)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years two main trends have been disrupting geospatial applications: mobility and context sharing. People now have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the Internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, that format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. It builds upon a database container, SQLite, that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is the sharing of client 'contexts'. When a user is looking at an article or a product on the web, they can easily share this information with colleagues or friends via an email that includes URLs (links to web resources) and attachments (inline data). In the case of geospatial information, a user would like to share a map created from different OGC sources, which may include, for example, WMS and WFS links, and GML and KML annotations. The emerging OGC file format for this is the OGC Web Services Context Document (OWS Context), which allows clients to reproduce a map previously created by someone else. Context sharing is important in a variety of domains, from emergency response, where fire, police and emergency medical personnel need to work off a common map, to multi-national military operations, where coalition forces need to share common data sources but have cartographic displays in different languages and symbology sets. OWS Context documents can be written in XML (building upon the Atom Syndication Format) or JSON. This presentation will provide an introduction to GeoPackage and OWS Context and how they can be used to advance the sharing of Earth and Space Science information.
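
    Because a GeoPackage is a single SQLite file with an OGC-defined layout, it can be inspected with nothing more than Python's standard library, as the sketch below shows; "example.gpkg" is a placeholder file name, while the gpkg_contents table and its columns are defined by the specification.

      import sqlite3

      # Open a GeoPackage and list the data sets it holds.
      conn = sqlite3.connect("example.gpkg")
      for table, dtype in conn.execute(
              "SELECT table_name, data_type FROM gpkg_contents"):
          print(f"{table}: {dtype}")  # e.g. 'roads: features', 'dem: tiles'
      conn.close()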

  3. Database Organisation in a Web-Enabled Free and Open-Source Software (FOSS) Environment for Spatio-Temporal Landslide Modelling

    NASA Astrophysics Data System (ADS)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the Earth's surface. Making a landslide database available online via the WWW (World Wide Web) helps landslide information spread and reach all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end, and PostgreSQL with the PostGIS extension serves as the backend for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.

  4. Communicating and visualizing data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Roberts, Charles; Blower, Jon; Maso, Joan; Diaz, Daniel; Griffiths, Guy; Lewis, Jane

    2014-05-01

    The sharing and visualization of environmental data through OGC Web Map Services are becoming increasingly common. However, information about the quality of data is rarely presented. (In this presentation we consider mostly data uncertainty as a measure of quality, although we acknowledge that many other quality measures are relevant to the geoscience community.) In the context of the GeoViQua project (http://www.geoviqua.org) we have developed conventions and tools for using WMS to deliver data quality information. The "WMS-Q" convention describes how the WMS specification can be used to publish quality information at the level of datasets, variables and individual pixels (samples). WMS-Q requires no extensions to the WMS 1.3.0 specification, being entirely backward-compatible. (An earlier version of WMS-Q was published as OGC Engineering Report 12-160.) To complement the WMS-Q convention, we have also developed extensions to the OGC Symbology Encoding (SE) specification, enabling uncertain geoscience data to be portrayed using a variety of visualization techniques. These include contours, stippling, blackening, whitening, opacity, bivariate colour maps, confidence interval triangles and glyphs. There may also be more extensive applications of these methods beyond the visual representation of uncertainty. In this presentation we will briefly describe the scope of the WMS-Q and "extended SE" specifications and then demonstrate the innovations using open-source software based upon ncWMS (http://ncwms.sf.net). We apply the tools to a variety of datasets, including Earth Observation data from the European Space Agency's Climate Change Initiative. The software allows uncertain raster data to be shared through Web Map Services, giving the user fine control over data visualization.

  5. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three-dimensional geological model is important information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer maps horizontal cross sections of the 3D geological model and a topographic map, while GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions is enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) WPS (Web Processing Service) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model; these images are delivered via the WMS (Web Map Service) and WPS OGC standards. Horizontal cross sections are overlaid on the topographic map, and a vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram, viewing them from various angles by mouse operation. WebGL, a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software, is utilized for the 3D visualization. The geological boundary surfaces can be downloaded so that the geological structure can be incorporated into CAD designs and into models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
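
    A minimal sketch of a PyWPS process in the spirit of the vertical cross-section function is given below. The identifiers, input names and handler body are placeholders; a real implementation would call out to GRASS to slice the 3D geological model and render the image.

      from pywps import Process, LiteralInput, ComplexOutput, Format

      class VerticalSection(Process):
          """Hypothetical WPS process producing a vertical cross section."""

          def __init__(self):
              super().__init__(
                  self._handler,
                  identifier="vertical_section",
                  title="Vertical cross section of a 3D geological model",
                  inputs=[LiteralInput("x1", "Start X", data_type="float"),
                          LiteralInput("y1", "Start Y", data_type="float"),
                          LiteralInput("x2", "End X", data_type="float"),
                          LiteralInput("y2", "End Y", data_type="float")],
                  outputs=[ComplexOutput(
                      "section", "Cross-section image",
                      supported_formats=[Format("image/png")])])

          def _handler(self, request, response):
              # Placeholder: a real process would slice the model here,
              # e.g. via GRASS, and write the resulting image to disk.
              response.outputs["section"].file = "/tmp/section.png"
              return response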

  6. Development of a spatial decision support system for flood risk management in Brazil that combines volunteered geographic information with wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Horita, Flávio E. A.; Albuquerque, João Porto de; Degrossi, Lívia C.; Mendiondo, Eduardo M.; Ueyama, Jó

    2015-07-01

    Effective flood risk management requires up-to-date information to ensure that the correct decisions can be made. This can be provided by Wireless Sensor Networks (WSN), which are a low-cost means of collecting current information about rivers. Another valuable resource is Volunteered Geographic Information (VGI), a comparatively new means of improving the coverage of monitored areas, since it can supply information complementary to the WSN and thus support decision-making in flood risk management. However, there still remains the problem of how to combine WSN data with VGI. In this paper we investigate AGORA-DS, a Spatial Decision Support System (SDSS) that makes flood risk management more effective by combining these data sources, i.e. WSN with VGI. The approach is built on a conceptual model that complies with the interoperable standards laid down by the Open Geospatial Consortium (OGC), e.g. the Sensor Observation Service (SOS) and Web Feature Service (WFS), and seeks to combine and present unified information in a web-based decision support tool. The approach was deployed in a real flood risk management scenario in the town of São Carlos in Brazil. The evidence obtained from this deployment confirms that interoperable standards can support the integration of data from distinct data sources. It also shows that VGI is able to provide information about areas of the river basin which lack data because no appropriate station exists there, and hence provides valuable support for the WSN data. It can thus be concluded that AGORA-DS is able to combine information provided by WSN and VGI, and to provide useful information for supporting flood risk management.

  7. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for Earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both the CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, and digital terrain and bathymetric data.

  8. The tsunami service bus, an integration platform for heterogeneous sensor systems

    NASA Astrophysics Data System (ADS)

    Haener, R.; Waechter, J.; Kriegel, U.; Fleischer, J.; Mueller, S.

    2009-04-01

    1. INTRODUCTION: Early warning systems are long-lived and evolving: new sensor systems and sensor types may be developed and deployed, sensors will be replaced or redeployed at other locations, and the functionality of analysis software will be improved. To ensure the continuous operability of such systems, their architecture must be evolution-enabled. From a computer science point of view, an evolution-enabled architecture must fulfil the following criteria: • Encapsulation of data, and of functionality on data, in standardized services, so that access to proprietary sensor data is only possible via these services. • Loose coupling of system constituents, which can easily be achieved by implementing standardized interfaces. • Location transparency of services, meaning that services can be provided anywhere. • Separation of concerns, i.e. breaking a system into distinct features which overlap in functionality as little as possible. A Service Oriented Architecture (SOA), as realized e.g. in the German Indonesian Tsunami Early Warning System (GITEWS), together with the advantages of functional integration on the basis of services described below, adopts these criteria best.
    2. SENSOR INTEGRATION: Integration of data from (distributed) data sources is a standard task in computer science. Of the few well-known solution patterns, only functional integration should be considered once the performance and security requirements of early warning systems are taken into account. A precondition for this is that systems are realized in compliance with SOA patterns. Functionality is realized in the form of dedicated components communicating via a service infrastructure. These components provide their functionality as services via standardized, published interfaces, which can be used to access the data maintained in, and the functionality provided by, dedicated components. Functional integration replaces tight coupling at the data level by a dependency on loosely coupled services. If the interfaces of the service-providing components remain unchanged, components can be maintained and evolved independently of each other, and service functionality as a whole can be reused. In GITEWS the functional integration pattern was adopted by applying the principles of an Enterprise Service Bus (ESB) as a backbone. Four services provided by the so-called Tsunami Service Bus (TSB), which are essential for early warning systems, are realized in compliance with the services specified by the Sensor Web Enablement (SWE) initiative of the Open Geospatial Consortium (OGC).
    3. ARCHITECTURE: The integration platform was developed to access proprietary, heterogeneous sensor data and to provide them in a uniform manner for further use. Its core, the TSB, provides both a messaging backbone and messaging interfaces on the basis of the Java Message Service (JMS). The logical architecture of GITEWS consists of four independent layers: • A resource layer, where physical or virtual sensors as well as data or model storages provide relevant measurement, event and analysis data. The TSB can utilize any kind of data; in addition to sensors, databases, model data and processing applications are adopted. SWE specifies encodings both to access and to describe these data in a comprehensive way: 1. Sensor Model Language (SensorML): standardized description of sensors and sensor data; 2. Observations and Measurements (O&M): model and encoding of sensor measurements. • A service layer, to collect and conduct data from heterogeneous and proprietary resources and provide them via standardized interfaces. The TSB enables interaction with sensors via the following services: 1. Sensor Observation Service (SOS): standardized access to sensor data; 2. Sensor Planning Service (SPS): controlling of sensors and sensor networks; 3. Sensor Alert Service (SAS): active sending of data if defined events occur; 4. Web Notification Service (WNS): conduction of asynchronous dialogues between services. • An orchestration layer, where atomic services are composed and arranged into high-level processes such as a decision support process. One of the outstanding features of service-oriented architectures is the possibility to compose new services from existing ones, either programmatically or via declaration (workflow or process design). This allows, for example, the definition of new warning processes which can easily be adapted to new requirements. • An access layer, which may contain graphical user interfaces for decision support, monitoring or visualization systems; to visualize time series, for example, graphical user interfaces request sensor data simply via the SOS.
    4. BENEFIT: The integration platform is realized on top of well-known and widely used open source software implementing industrial standards. New sensors can be added easily to the infrastructure. Client components do not need to be adjusted when new sensor types or individual sensors are added to the system, because they access the sensors via standardized services. With SWE implemented fully compatible with the OGC specifications, it is possible to establish the "detection" and integration of sensors via the Web. Thus realizing a system of systems that combines early warning system functionality at different levels of detail (distant early warning systems, monitoring systems and any sensor system) becomes feasible.

  9. Automatic publishing ISO 19115 metadata with PanMetaDocs using SensorML information

    NASA Astrophysics Data System (ADS)

    Stender, Vivien; Ulbricht, Damian; Schroeder, Matthias; Klump, Jens

    2014-05-01

    Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centers of the Helmholtz Association. The contribution by the participating research centers is organized as regional observatories. A challenge for TERENO and its observatories is to integrate all aspects of data management, data workflows, data modeling and visualization into the design of a monitoring infrastructure. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences (GFZ) in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved, and the availability of the captured data. Data discovery and dissemination are facilitated not only through the data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data provided by the web services of the individual observatories and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes the data available through standard web services. Geographic sensor information and services are described using the ISO 19115 metadata schema. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also publishes data through DataCite. The necessary metadata are created in an automated process that extracts information from SWE SensorML documents to produce ISO 19115 compliant metadata. The resulting metadata file is stored in the GFZ Potsdam data infrastructure. The publishing workflow for file-based research datasets at GFZ Potsdam is based on the eSciDoc infrastructure, using PanMetaDocs (PMD) as the graphical user interface. PMD is a collaborative, metadata-based data and information exchange platform [1]. Besides SWE, metadata are also syndicated by PMD through an OAI-PMH interface. In addition, metadata from other observatories, projects or sensors in TERENO can be accessed through the TERENO Northeast data portal. [1] http://meetingorganizer.copernicus.org/EGU2012/EGU2012-7058-2.pdf

  10. Enhancing the Value of Sensor-based Observations by Capturing the Knowledge of How An Observation Came to Be

    NASA Astrophysics Data System (ADS)

    Fredericks, J.; Rueda-Velasquez, C. A.

    2016-12-01

    As we move from keeping data on our disks to sharing it with the world, often in real time, we are obligated to also tell an unknown user how our observations were made. Data that are shared must have not only ownership metadata, unit descriptions and content formatting information; the provider must also share the information needed to assess the data as it relates to potential re-use. A user must be able to assess the limitations and capabilities of the sensor, as configured, to understand its value. For example, the way an instrument is configured typically affects the data accuracy and operational limits of the sensor. An operator may sacrifice data accuracy to achieve a broader operational range, and vice versa. When looking at newly discovered data, it is important to be able to find all of the information that relates to assessing the data quality for your particular application. Traditionally, metadata are captured by data managers who usually do not know how the data were collected. By the time data are distributed, this knowledge is often gone, buried within notebooks or hidden in documents that are not machine-harvestable and often not human-readable. In a recently funded NSF EarthCube Integrative Activity called X-DOMES (Cross-Domain Observational Metadata in EnviroSensing), mechanisms are being developed to enable the capture of sensor and deployment metadata by sensor manufacturers and field operators. The support has enabled the development of a community ontology repository (COR) within the Earth Science Information Partnership (ESIP) community, fostering easy creation of resolvable terms for the broader community. This tool enables non-experts to easily develop W3C standards-based content, promoting the implementation of Semantic Web technologies for enhanced discovery of content and interoperability in workflows. The X-DOMES project is also developing a SensorML Viewer/Editor to provide an easy interface for sensor manufacturers and field operators to fully describe sensor capabilities and configuration/deployment content, automatically generating machine-harvestable encodings that can be referenced by data managers and/or associated with the data through web services, such as the OGC SWE Sensor Observation Service.

  11. A Reference Implementation of the OGC CSW EO Standard for the ESA HMA-T project

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo; Boldrini, Enrico; Papeschi, Fabrizio; Vitale, Fabrizio

    2010-05-01

    This work was developed in the context of the ESA Heterogeneous Missions Accessibility (HMA) project, whose main objective is to involve the stakeholders, namely national space agencies, satellite or mission owners and operators, in a harmonization and standardization process for their ground segment services and related interfaces. Among the HMA objectives were the specification, conformance testing, and experimentation of two Extension Packages (EPs) of the ebRIM Application Profile (AP) of the OGC Catalog Service for the Web (CSW) specification: the Earth Observation Products (EO) EP (OGC 06-131) and the Cataloguing of ISO Metadata (CIM) EP (OGC 07-038). Our contributions have included the development and deployment of Reference Implementations (RIs) for both of the above specifications, and their integration with the ESA Service Support Environment (SSE). The RIs are based on the GI-cat framework, an implementation of a distributed catalog service able to query disparate Earth and Space Science data sources (e.g. OGC Web Services, Unidata THREDDS) and to expose several standard interfaces for data discovery (e.g. OGC CSW ISO AP). Following our initial planning, the GI-cat framework has been extended in order to expose the CSW.ebRIM-CIM and CSW.ebRIM-EO interfaces, and to distribute queries to CSW.ebRIM-CIM and CSW.ebRIM-EO data sources. We expected that a mapping strategy would suffice for accommodating CIM, but this proved to be impractical during implementation. Hence, a model extension strategy was eventually implemented for both the CIM and EO EPs, and the GI-cat federal model was enhanced in order to support the underlying ebRIM AP. This work has provided us with new insights into the different data models for geospatial data and the technologies for their implementation. The extension is used by suitable CIM and EO profilers (front-end mediator components) and accessors (back-end mediator components) that relate ISO 19115 concepts to EO and CIM ones. Moreover, a mapping to the GI-cat federal model was developed for each EP (quite limited for EO; complete for CIM) in order to enable the discovery of resources through any of the GI-cat profilers. The query manager was also improved. GI-cat-EO and -CIM installation packages were made available for distribution, and two RI instances were deployed on the Amazon EC2 facility (plus an ad-hoc instance returning incorrect control data). Integration activities of the EO RI with the ESA SSE Portal for Earth Observation Products were also successfully carried out. During our work, we have contributed feedback and comments to the CIM and EO EP specification working groups. Our contributions resulted in version 0.2.5 of the EO EP, recently approved as an OGC standard, and helped consolidate version 0.1.11 of the CIM EP (still under development).
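
    For readers who want to exercise such a catalog, a query against a CSW endpoint can be issued in a few lines with the OWSLib Python library; the endpoint URL below is a placeholder, and the snippet uses the generic CSW interface rather than the ebRIM EO/CIM extension packages themselves.

      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      # Placeholder endpoint; GI-cat exposes CSW interfaces at deployment-specific URLs.
      csw = CatalogueServiceWeb("http://example.org/gi-cat/services/cswiso")

      # Full-text constraint over the catalog's AnyText queryable.
      query = PropertyIsLike("csw:AnyText", "%Earth Observation%")
      csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

      for rec_id, rec in csw.records.items():
          print(rec_id, "-", rec.title)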

  12. Cloud Based Web 3d GIS Taiwan Platform

    NASA Astrophysics Data System (ADS)

    Tsai, W.-F.; Chang, J.-Y.; Yan, S. Y.; Chen, B.

    2011-09-01

    This article presents the status of the web 3D GIS platform developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for applications in disaster monitoring and assessment in Taiwan. For quick response with preliminary and detailed assessments after a natural disaster occurs, the web 3D GIS platform is useful for accessing, transferring, integrating, displaying and analyzing large multi-scale data following international OGC standards. The framework of the cloud service for data warehousing management and efficiency enhancement using VMware is also illustrated in this article.

  13. Interoperability And Value Added To Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as the Web Map Service (WMS) to request maps over the Internet, the Web Feature Service (WFS) to exchange vectors, and the Catalog Service for the Web (CSW) to search for geospatial data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disaster management.
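
    As a concrete illustration of the WMS interoperability the abstract refers to, the following sketch retrieves a map image with the OWSLib Python library; the service URL and layer name are hypothetical.

      from owslib.wms import WebMapService

      # Hypothetical WMS endpoint and layer; substitute a real service to run this.
      wms = WebMapService("http://example.org/wms", version="1.1.1")

      img = wms.getmap(
          layers=["flood_extent"],          # placeholder layer name
          srs="EPSG:4326",
          bbox=(-5.0, 40.0, 5.0, 50.0),     # lon/lat bounding box
          size=(600, 400),
          format="image/png",
          transparent=True,
      )

      with open("flood_extent.png", "wb") as out:
          out.write(img.read())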

  14. A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data

    NASA Astrophysics Data System (ADS)

    Meek, Sam; Jackson, Mike; Leibovici, Didier G.

    2016-02-01

    The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard is the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, frequently due to vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the standard Business Process Model and Notation (BPMN). A prototype system has been developed on top of an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is the qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
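
    To make the WPS chaining context concrete, here is a minimal client-side sketch that executes a single WPS process with OWSLib; the endpoint, process identifier and input names are illustrative assumptions, and the BPMN engine described in the paper would orchestrate many such calls.

      from owslib.wps import WebProcessingService, monitorExecution

      # Hypothetical WPS endpoint; real process identifiers come from GetCapabilities.
      wps = WebProcessingService("http://example.org/wps")

      execution = wps.execute(
          "qualify.positional_accuracy",          # placeholder process identifier
          inputs=[("observations", "http://example.org/data/obs.json")],
      )

      # Poll the service until the (possibly asynchronous) execution finishes.
      monitorExecution(execution)
      print(execution.status)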

  15. OpenSearch technology for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Papeschi, Fabrizio; Boldrini, Enrico; Mazzetti, Paolo

    2010-05-01

    In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around this same period OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation; it is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both the mass media and the scientific community. This explosive growth in popularity of Web 2.0 technologies like OpenSearch, and practical applications of Service Oriented Architecture (SOA), resulted in an increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA can be considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, compose and use them according to their current needs. The great degree of similarity between SOA and Web 2.0 may be leading to a convergence of the two paradigms. They also expose divergent elements, such as Web 2.0 support for human interaction as opposed to the typically machine-to-machine interaction of SOA. Against this background, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular taking advantage of the OpenSearch technology. A specific GI niche is represented by the OGC Catalog Service for the Web (CSW), part of the OGC Web Services (OWS) specification suite, which provides a set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed, providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture through a new profiling module called the "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called the "OpenSearch accessor" is added in order to access OpenSearch compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mapping are required: query mapping and response element mapping.
    Query mapping fits the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax. The GI-cat internal data model is based on the ISO 19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Response elements therefore need to be translated from the GI-cat internal data model to the above-mentioned syndication formats before they can be presented; the mapping process is bidirectional. When GI-cat is used to access OpenSearch compliant services, the CSW query must be mapped to the OpenSearch query, and the response elements must be translated into the GI-cat internal data model. As a result of these extensions, GI-cat provides a user-friendly facade to the complex CSW interface, enabling it to be queried, for example, from a browser toolbar.
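
    A sketch of what such an OpenSearch query can look like in practice; the endpoint and parameter names are placeholders, since in real use they are dictated by the URL template advertised in the service's OpenSearch description document (the geo and time extensions define the {geo:box}, {time:start} and {time:end} template parameters).

      import requests

      # Placeholder search endpoint; the real URL template comes from the
      # OpenSearch description document published by the catalog.
      ENDPOINT = "http://example.org/gi-cat/opensearch"

      params = {
          "q": "sea surface temperature",        # {searchTerms}
          "bbox": "-10.0,35.0,5.0,45.0",         # {geo:box}: west,south,east,north
          "ts": "2009-01-01",                    # {time:start} (parameter name assumed)
          "te": "2009-12-31",                    # {time:end} (parameter name assumed)
      }

      response = requests.get(ENDPOINT, params=params, timeout=30)
      response.raise_for_status()
      # Results arrive as an Atom or RSS feed that any feed reader can parse.
      print(response.text[:500])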

  16. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and on providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and the data quality elements of ISO 19115/19139 (geographic information metadata and encoding specifications), as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating these functions in the workflow management system. We describe two utility services that address conversion between uncertainty types and between the spatial/temporal supports of service inputs/outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about the input uncertainties of UncertWeb model components (e.g. parameters in meteorological models), allowing panels of experts to engage in the process and reach a consensus view on the current knowledge and beliefs about a parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs, not only at the end of the workflow but also at intermediate steps. At this point we have prototype implementations driven by the requirements of the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  17. GeoNetwork powered GI-cat: a geoportal hybrid solution

    NASA Astrophysics Data System (ADS)

    Baldini, Alessio; Boldrini, Enrico; Santoro, Mattia; Mazzetti, Paolo

    2010-05-01

    In setting up a Spatial Data Infrastructure (SDI), the creation of a system for metadata management and discovery plays a fundamental role. An effective solution is the use of a geoportal (e.g. the FAO/ESA geoportal), which has the important benefit of being accessible from a web browser. With this work we present a solution based on integrating two of the available frameworks: GeoNetwork and GI-cat. GeoNetwork is open-source software designed to improve the accessibility of a wide variety of data together with the associated ancillary information (metadata), at different scales and from multidisciplinary sources; data are organized and documented in a standard and consistent way. GeoNetwork implements both the Portal and Catalog components of a Spatial Data Infrastructure (SDI) as defined in the OGC Reference Architecture. It provides tools for managing and publishing metadata on spatial data and related services. GeoNetwork allows harvesting of various types of web data sources, e.g. OGC Web Services (CSW, WCS, WMS). GI-cat is a distributed catalog based on a service-oriented framework of modular components and can be customized and tailored to support different deployment scenarios. It can federate a multiplicity of catalog services, as well as inventory and access services, in order to discover and access heterogeneous ESS resources. The federated resources are exposed by GI-cat through several standard catalog interfaces (e.g. OGC CSW AP ISO, OpenSearch, etc.) and through the GI-cat extended interface. Specific components implement mediation services for interfacing heterogeneous service providers, each of which exposes a specific standard specification; such components are called accessors. These mediating components resolve the providers' data model multiplicity by mapping the provider models onto the GI-cat internal data model, which implements the ISO 19115 Core profile. Accessors also implement the query protocol mapping: they translate query requests expressed according to the interface protocols exposed by GI-cat into the multiple query dialects spoken by the resource service providers. Currently, a number of well-accepted catalog and inventory services are supported, including several OGC Web Services, THREDDS Data Server, SeaDataNet Common Data Index, GBIF and OpenSearch engines. A GeoNetwork powered GI-cat has been developed in order to exploit the best of the two frameworks. The new system uses a modified version of the GeoNetwork web interface that adds the capability of querying the specified GI-cat catalog and not only the GeoNetwork internal database. The resulting system consists of a geoportal in which GI-cat plays the role of the search engine. This new system allows the query to be distributed over the different types of data sources linked to a GI-cat. The metadata results of the query are then visualized by the GeoNetwork web interface. This configuration was tested in the framework of GIIDA, a project of the Italian National Research Council (CNR) focused on data accessibility and interoperability. A second advantage of this solution is achieved by setting up a GeoNetwork catalog amongst the accessors of the GI-cat instance. Such a configuration will in turn allow GI-cat to run the query against the internal GeoNetwork database. This makes both the harvesting and metadata editor functionalities provided by GeoNetwork and the distributed search functionality of GI-cat available in a consistent way through the same web interface.

  18. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    NASA Astrophysics Data System (ADS)

    Čepický, Jáchym; Moreira de Sousa, Luís

    2016-06-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes and client discovery of, and binding to, those processes within workflows. Data required by a WPS can be delivered across a network or can already be available on the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues for implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
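
    A minimal PyWPS-4 process, closely following the pattern used in the PyWPS documentation; the identifier, input and output names are illustrative, and the handler body is a toy stand-in for a real geospatial operation.

      from pywps import Process, LiteralInput, LiteralOutput


      class Buffer(Process):
          """Toy process: echoes the requested buffer distance back to the client."""

          def __init__(self):
              inputs = [LiteralInput("distance", "Buffer distance", data_type="float")]
              outputs = [LiteralOutput("result", "Confirmation", data_type="string")]
              super(Buffer, self).__init__(
                  self._handler,
                  identifier="buffer",           # illustrative identifier
                  title="Buffer process",
                  inputs=inputs,
                  outputs=outputs,
              )

          def _handler(self, request, response):
              distance = request.inputs["distance"][0].data
              response.outputs["result"].data = "buffered with distance %s" % distance
              return response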

  19. Environmental Monitoring Using Sensor Networks

    NASA Astrophysics Data System (ADS)

    Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.

    2008-12-01

    Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of nature. The commoditization and proliferation of coin-to-palm sized wireless sensors will allow environmental monitoring with unprecedentedly fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called the Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as antecedent soil moisture conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimates from meteorological radar data and enhancing hydrological forecasts. Sensor data are transmitted from each monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as the Remote Field Gateway (RFG) Server, and a WSN for distributed soil moisture monitoring. The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field, such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component to hide the heterogeneity of the different devices and to support the data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before they are inserted into the database, and periodically perform synchronization tasks. A SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through OGC web services. The web portal, TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and update software and system configuration, which greatly simplifies system debugging and maintenance. We also implement a Sensor Observation Service (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into existing Geographic Information Systems (GIS) web services and to exchange the data with other organizations.

  20. The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.

    2010-12-01

    Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loosely coupled modeling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library's common data model. Data transfer relies on methods provided by Unidata's Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC's Web Coverage Service for high-density static gridded data, and Unidata's CDM-remote for point time series. OGC WFS and the Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time-series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of "model-ready" formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) for commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
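
    The OPeNDAP access pattern mentioned above can be exercised from Python with the netCDF4 library, which accepts OPeNDAP URLs directly; the URL and variable name below are placeholders.

      from netCDF4 import Dataset

      # Placeholder OPeNDAP endpoint on a THREDDS Data Server (TDS).
      URL = "http://example.org/thredds/dodsC/gridded/precip.nc"

      ds = Dataset(URL)  # opens the remote dataset without downloading it whole
      precip = ds.variables["precipitation"]

      # Only this subset (first time step, small spatial window) crosses the network.
      subset = precip[0, 100:110, 200:210]
      print(subset.shape, subset.mean())
      ds.close()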

  1. Common Data Models and Efficient Reproducible Workflows for Distributed Ocean Model Skill Assessment

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Snowden, D. P.; Howlett, E.; Fernandes, F. A.

    2014-12-01

    Model skill assessment requires discovery, access, analysis, and visualization of information from both sensors and models, and traditionally has been possible only for a few experts. The US Integrated Ocean Observing System (US-IOOS) consists of 17 Federal Agencies and 11 Regional Associations that produce data from various sensors and numerical models; exactly the information required for model skill assessment. US-IOOS is seeking to develop documented skill assessment workflows that are standardized, efficient, and reproducible so that a much wider community can participate in the use and assessment of model results. Standardization requires common data models for observational and model data. US-IOOS relies on the CF Conventions for observations and structured grid data, and on the UGRID Conventions for unstructured (e.g. triangular) grid data. This allows applications to obtain only the data they require in a uniform and parsimonious way using web services: OPeNDAP for model output and the OGC Sensor Observation Service (SOS) for observed data. Reproducibility is enabled with IPython Notebooks shared on GitHub (http://github.com/ioos). These capture the entire skill assessment workflow, including user input, search, access, analysis, and visualization, ensuring that workflows are self-documenting and reproducible by anyone, using free software. The Python packages that implement the common data models (pyugrid and the British Met Office Iris package), together with the other packages required to run the workflows (e.g. pyoos), are available on GitHub and on Binstar.org so that users can run scenarios using the free Anaconda Python distribution. Hosted services such as Wakari enable anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. We are also experimenting with Wakari Enterprise, which allows multi-user access from a web browser to an IPython Server running where large quantities of model output reside, increasing efficiency. The open development and distribution of these workflows, and the software on which they depend, is an educational resource for those new to the field and a center of focus where practitioners can contribute new software and ideas.

  2. An integrative solution for managing, tracing and citing sensor-related information

    NASA Astrophysics Data System (ADS)

    Koppe, Roland; Gerchow, Peter; Macario, Ana; Schewe, Ingo; Rehmcke, Steven; Düde, Tobias

    2017-04-01

    In a data-driven scientific world, the need to capture information on the sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. We therefore developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems, which supports tracing changes to a system while hiding complexity. Each platform contains devices, and each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-location. In order to better understand and address real-world requirements, we have closely interacted with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists but also, for example, alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II), and in cooperation with 52north, we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it is available. In this presentation we will show our sensor management solution, which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.

  3. The evolution of the CUAHSI Water Markup Language (WaterML)

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Valentine, D.; Maidment, D.; Tarboton, D. G.; Whiteaker, T.; Hooper, R.; Kirschtel, D.; Rodriguez, M.

    2009-04-01

    The CUAHSI Hydrologic Information System (HIS, his.cuahsi.org) uses web services as the core data exchange mechanism, providing programmatic connections between many heterogeneous sources of hydrologic data and a variety of online and desktop client applications. The service message schema follows the CUAHSI Water Markup Language (WaterML) 1.x specification (see OGC Discussion Paper 07-041r1). Data sources that can be queried via WaterML-compliant water data services include national and international repositories such as USGS NWIS (National Water Information System), USEPA STORET (Storage & Retrieval), USDA SNOTEL (Snowpack Telemetry), NCDC ISH and ISD (Integrated Surface Hourly and Daily Data), MODIS (Moderate Resolution Imaging Spectroradiometer), and DAYMET (Daily Surface Weather Data and Climatological Summaries). Besides government data sources, CUAHSI HIS provides access to a growing number of academic hydrologic observation networks. These networks are registered by researchers associated with 11 hydrologic observatory testbeds around the US, and by other research, government and commercial groups wishing to join the emerging CUAHSI Water Data Federation. The Hydrologic Information Server (HIS Server) software stack, deployed at NSF-supported hydrologic observatory sites and other universities around the country, supports a hydrologic data publication workflow which includes the following steps: (1) observational data are loaded from static files or streamed from sensors into a local instance of an Observations Data Model (ODM) database; (2) a generic web service template is configured for the new ODM instance to expose the data as a WaterML-compliant water data service; and (3) the new water data service is registered at the HISCentral registry (hiscentral.cuahsi.org), where its metadata are harvested and semantically tagged using concepts from a hydrologic ontology. As a result, the new service is indexed in the CUAHSI central metadata catalog and becomes available for spatial and semantics-based queries. The main component of interoperability across hydrologic data repositories in CUAHSI HIS is the mapping of different repository schemas and semantics to a shared community information model for observations made at stationary points. This information model has been implemented as both a relational schema (ODM) and an XML schema (WaterML). Its main design drivers have been the data storage and data interchange needs of hydrology researchers, a series of community reviews of the ODM, and the practices of hydrologic data modeling and presentation adopted by federal agencies as observed in agency online data access applications, such as NWISWeb and USEPA STORET. The goal of the first version of WaterML was to encode the semantics of hydrologic observation discovery and retrieval and to implement water data services in a way that is generic across different data providers. In particular, this implied maintaining a single common representation for the key constructs returned by web service calls, related to observations, features of interest, observation procedures, observation series, etc. Another WaterML design consideration was to create (in version 1 of CUAHSI HIS in particular) a fairly rigid, compact, and simple XML schema that was easy to generate and parse, thus creating the lowest barrier for adoption by hydrologists.
    Each of the three main request methods in the water data web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SiteResponse, VariableResponse, and TimeSeriesResponse. The strictness and compactness of the first version of WaterML supported its community adoption. Over the last two years, several ODM and WaterML implementations for various platforms have emerged, and several water data service client applications have been created by outside groups in both industry and academia. In a significant development, the WaterML specification has been adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns a WaterML-compliant TimeSeriesResponse. NCDC is also prototyping WaterML for data delivery and has developed a REST-based service that generates WaterML-compliant output for its integrated station network. These agency-supported web services provide a much more efficient way to deliver agency data compared to the web site scraper services that the CUAHSI HIS project developed initially. Adoption of WaterML by the US Geological Survey is particularly significant because the USGS maintains by far the largest water data repository in the United States. For version 1.1, WaterML has evolved to reflect the deployment experience at hydrologic observatory testbeds, as well as feedback from hydrologic data repository managers at federal and state agencies. Further development of WaterML and enhancement of the underlying information model is the focus of the recently established OGC Hydrology Domain Working Group, whose mission is to profile OGC standards (GML, O&M, SOS, WCS, WFS) for the water resources domain and thus ensure WaterML's wider applicability and easier implementation. WaterML 2.0 is envisioned as an OGC-compliant application schema that supports OGC features, can express different types of observations and various groupings of observations, and allows researchers to define custom metadata elements. This presentation will discuss the information model underlying WaterML and describe the rationale, design drivers and evolution of WaterML and the water data services, illustrating their recent application in the context of CUAHSI HIS and the hydrologic observatory testbeds.
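
    To illustrate the GetValues/TimeSeriesResponse pattern described above, the following sketch calls a hypothetical REST endpoint that returns WaterML 1.1 and extracts the time-value pairs; the endpoint URL and the site/variable codes are placeholders.

      import requests
      import xml.etree.ElementTree as ET

      WML = "{http://www.cuahsi.org/waterML/1.1/}"

      # Placeholder WaterML 1.1 service; real deployments publish their own endpoints.
      resp = requests.get(
          "http://example.org/waterml/GetValues",
          params={"site": "NWIS:01646500", "variable": "NWIS:00060"},  # assumed codes
          timeout=30,
      )
      resp.raise_for_status()

      # A TimeSeriesResponse wraps a series of <value dateTime="...">...</value> elements.
      root = ET.fromstring(resp.content)
      for value in root.iter(f"{WML}value"):
          print(value.get("dateTime"), value.text)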

  4. Designing Crop Simulation Web Service with Service Oriented Architecture Principle

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.

    2015-12-01

    Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be input to the model. This limits the use of crop modeling to crop modelers. We aim to make running crop models convenient for a variety of users so that the utilization of crop models will expand, directly improving agricultural applications. As a first step, we developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on planting date, rice variety and soil characteristics using the DSSAT crop model. A user only needs to select a planting location in the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator for generating weather scenarios for running the crop model. In order to expand these services further, we are designing a web service framework consisting of layers of web services to support composition and execution of crop simulations. This framework allows a third-party application to call and cascade each service as needed for data preparation and for running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA). This agriculture web service platform demonstrates interoperability of weather data using the SOS interface, convenient connections between weather data sources and the weather generator, and the chaining of various services for running crop models for decision support.

  5. Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Riechert, Maik; Blower, Jon; Griffiths, Guy

    2016-04-01

    Coverage data, typically large in volume, assign values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Relevant factors here are processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways of working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, next to having simple rendered images available using standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. The development also focused on making it a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web, together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including client-side reclassification of a land cover dataset within the browser, with the ability for the user to influence the reclassification result, by making use of the above technologies.
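
    To give a flavour of the format, the following sketch builds a deliberately tiny CoverageJSON grid coverage in Python; it follows the structure of the published CoverageJSON specification, but the parameter name and values are invented for illustration.

      import json

      # A 2x3 grid with one parameter, following the CoverageJSON spec structure.
      coverage = {
          "type": "Coverage",
          "domain": {
              "type": "Domain",
              "domainType": "Grid",
              "axes": {
                  "x": {"values": [-10.0, -5.0, 0.0]},
                  "y": {"values": [40.0, 45.0]},
              },
              "referencing": [{
                  "coordinates": ["x", "y"],
                  "system": {
                      "type": "GeographicCRS",
                      "id": "http://www.opengis.net/def/crs/OGC/1.3/CRS84",
                  },
              }],
          },
          "parameters": {
              "TEMP": {  # invented parameter name
                  "type": "Parameter",
                  "observedProperty": {"label": {"en": "Air temperature"}},
                  "unit": {"symbol": "K"},
              },
          },
          "ranges": {
              "TEMP": {
                  "type": "NdArray",
                  "dataType": "float",
                  "axisNames": ["y", "x"],
                  "shape": [2, 3],
                  "values": [275.1, 275.3, 275.8, 274.9, 275.0, 275.2],
              },
          },
      }

      print(json.dumps(coverage, indent=2))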

  6. Rolling Deck to Repository (R2R): Big Data and Standard Services for the Fleet Community

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.

    2014-12-01

    The Rolling Deck to Repository (R2R; http://rvdata.us/) program curates underway environmental sensor data from the U.S. academic oceanographic research fleet, ensuring data sets are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. Currently 25 in-service vessels contribute 7 terabytes of data to R2R each year, acquired from a full suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. To accommodate this large volume and variety of data, R2R has developed highly efficient stewardship procedures. These include scripted "break out" of cruise data packages from each vessel based on standard filename and directory patterns; automated harvest of cruise metadata from the UNOLS Office via Web Services and from OpenXML-based forms submitted by vessel operators; scripted quality assessment routines that calculate statistical summaries and standard ratings for selected data types; adoption of community-standard controlled vocabularies for vessel codes, instrument types, etc., provided by the NERC Vocabulary Server, in lieu of maintaining custom local term lists; and a standard package structure based on the IETF BagIt format for delivering data to long-term archives. Documentation and standard post-field products, including quality-controlled shiptrack navigation data for every cruise, are published in multiple services and formats to satisfy a diverse range of clients. These include Catalog Service for the Web (CSW), GeoRSS, and OAI-PMH discovery services via a GeoNetwork portal; OGC Web Map and Feature Services for GIS clients; a citable Digital Object Identifier (DOI) for each dataset; ISO 19115-2 standard geospatial metadata records suitable for submission to long-term archives as well as the POGO global catalog; and Linked Open Data resources with a SPARQL query endpoint for Semantic Web clients. R2R participates in initiatives such as the Ocean Data Interoperability Platform (ODIP) and the NSF EarthCube OceanLink project to promote community-standard formats, vocabularies, and services among ocean data providers.

  7. A Story of a Crashed Plane in US-Mexican border

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry

    2013-04-01

    A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain, nearby mountains for the establishment of a communications tower, as well as ranches for setting up a local incident center. Events like this occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for the collaboration of developers and users of spatial data products and services, and for advancing the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as OGC Web Services Phase 9 (OWS-9), that encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Previously, users had to make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario were GNIS and GNS. GNIS, the Geographic Names Information System, is managed by the USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GEOnet Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA). GNS has been in service since 1994 and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were advanced: cascaded WFS-G servers (allowing multiple WFSs to be queried through a "parent" WFS), query name filters (e.g. fuzzy search, text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g. radial search and nearest neighbor) and semantically mediated feature types (e.g. mountain vs. hill). To enable semantic mediation, a series of semantic mappings was defined between the NGA GNS, USGS GNIS and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) to enable their use by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used the mappings to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single-point-of-entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
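
    As a small illustration of querying a gazetteer-style feature service, the sketch below issues a bounding-box WFS request with OWSLib; the endpoint and feature type name are placeholders, and a real WFS-G query would add name filters such as the fuzzy-search predicates discussed above.

      from owslib.wfs import WebFeatureService

      # Placeholder WFS endpoint and feature type for a gazetteer service.
      wfs = WebFeatureService("http://example.org/wfs-g", version="1.1.0")

      # Fetch named places inside a lon/lat window on the US-Mexican border.
      response = wfs.getfeature(
          typename="gaz:NamedPlace",            # assumed feature type
          bbox=(-115.0, 31.0, -108.0, 33.0),
          maxfeatures=20,
      )

      # The response is GML; print the first part for inspection.
      print(response.read()[:500])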

  8. The Namibia Early Flood Warning System, A CEOS Pilot Project

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert

    2012-01-01

    Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, the Univ. of Maryland, the Univ. of Colorado, the Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Centre (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia. However, the ultimate goal is to expand the functionality to provide early warning over the southern Africa region. The initial collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group for Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during their flood season, which typically runs January through April. In this pilot, a variety of moderate-resolution and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data which were used to monitor flood waves traveling down basins originating in Angola and eventually flooding villages in Namibia. The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that in order to make a system like this functional, there were many performance issues. Data sets were large and located in a variety of locations behind firewalls and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck for transferring map products to where they were needed. Finally, during disasters, many users and computer processes act in parallel, and thus it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a computation cloud located at the University of Illinois at Chicago and some manpower to administer this cloud. The Flood SensorWeb [3] system was interfaced to the cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the computation cloud that was integrated.
    A significant portion of the original system was ported to the cloud, and during the past year technical issues were resolved, including web access to the cloud, security over the open Internet, initial experiments on handling surge capacity by using the virtual machines in the cloud in parallel, tiling techniques to render large data sets as map layers, interfaces to allow users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion reached from this effort, and of this presentation, is that defining the interoperability standards is only a small fraction of the work. For example, once open web service standards were defined, many users could not make use of the standards due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render the system unusable, especially in the disaster domain.

  9. Remote Sensing Information Gateway: A free application and web service for fast, convenient, interoperable access to large repositories of atmospheric data

    NASA Astrophysics Data System (ADS)

    Plessel, T.; Szykman, J.; Freeman, M.

    2012-12-01

    EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high-performance, data integrity and security.

  10. Analysis of hydrological processes across the Northern Eurasia with recently re-developed online informational system

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A. I.; Proussevitch, A. A.; Gordov, E. P.; Okladnikov, I.; Titov, A. G.

    2016-12-01

    The volume of georeferenced datasets used for hydrology and climate research is growing immensely due to recent advances in modeling, high performance computing, and sensor networks, as well as the initiation of a set of large-scale, complex global and regional monitoring experiments. To facilitate the management and analysis of these extensive data pools we developed a Web-based data management, visualization, and analysis system, RIMS (Rapid Integrated Mapping and Analysis System; http://earthatlas.sr.unh.edu/), with a focus on hydrological applications. Recently, in collaboration with Russian colleagues from the Institute of Monitoring of Climatic and Ecological Systems SB RAS, Russia, we significantly re-designed RIMS to include the latest Web and GIS technologies in compliance with Open Geospatial Consortium (OGC) standards. The upgraded RIMS can be applied to multiple research problems using an extensive data archive and embedded tools for data computation, visualization and distribution. We will demonstrate the current capabilities of the system with several results of applied data analysis for the territory of Northern Eurasia. These results include the analysis of historical, contemporary and future changes in climate and hydrology based on station and gridded data, investigations of recent extreme hydrological events, their anomalies, causes and potential impacts, and the creation and analysis of new data sets through the integration of social and geophysical data.

  11. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness, stability, usability and user-friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps whose layers come from multiple different remote servers/sources. In this article we present one easy-to-implement Web GIS server solution based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
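
    The core operation behind such cascaded remote layers is a WMS GetMap request. The minimal sketch below fetches one map image over HTTP; the MapServer endpoint and layer name are placeholders, not values from the article.

      # Minimal sketch of a WMS 1.1.1 GetMap request against a MapServer CGI.
      # The endpoint and LAYERS value are hypothetical.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      WMS = "https://example.org/cgi-bin/mapserv?map=/maps/health.map"

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "disease_incidence",  # assumed layer name
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",
          "WIDTH": "800",
          "HEIGHT": "400",
          "FORMAT": "image/png",
      }

      # The base URL already carries a query string, so append with "&".
      with urlopen(WMS + "&" + urlencode(params)) as resp:
          png = resp.read()
      open("health_map.png", "wb").write(png)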

  12. Enabling Web-Based GIS Tools for Internet and Mobile Devices To Improve and Expand NASA Data Accessibility and Analysis Functionality for the Renewable Energy and Agricultural Applications

    NASA Astrophysics Data System (ADS)

    Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.

    2014-12-01

    The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry and private-sector individuals. The major objectives of this effort include: 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions to improve capability and analysis that produce "on-the-fly" data products, extending these beyond single locations to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays, optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and usage; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency, cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the services available to the wider public.

  13. Business logic for geoprocessing of distributed geodata

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian

    2006-12-01

    This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands on such architectures are identified with special regard to software engineering tasks. Methods are derived from the fields of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and to demonstrate how information can be generated from distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).
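
    To make the WPS idea concrete, the hedged sketch below invokes a WPS process via the KVP (GET) binding of the later WPS 1.0 standard. The process identifier and inputs echo the paper's groundwater-vulnerability geoservice only conceptually; the endpoint and all names are assumptions.

      # Minimal sketch of a WPS 1.0 Execute request using the KVP binding.
      # Endpoint, process identifier and inputs are hypothetical.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      WPS = "https://example.org/wps"  # hypothetical WPS endpoint

      params = {
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "GroundwaterVulnerability",       # assumed process id
          "datainputs": "region=demo_aquifer;method=PI",  # assumed inputs
      }

      with urlopen(WPS + "?" + urlencode(params)) as resp:
          print(resp.read()[:300])  # beginning of the ExecuteResponse XML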

  14. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data, without much effort, with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software package, comprising complete image processing, spatial analysis and digital mapping functionality, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins that convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on their machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of building plug-ins, and unfold our plans to implement other OGC standards, such as WCS and WPS, in the same context. Especially the latter can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable and on-demand cloud computing environments.

  15. Arctic Research Mapping Application (ARMAP): 2D Maps and 3D Globes Support Arctic Science

    NASA Astrophysics Data System (ADS)

    Johnson, G.; Gaylord, A. G.; Brady, J. J.; Cody, R. P.; Aguilar, J. A.; Dover, M.; Garcia-Lavigne, D.; Manley, W.; Score, R.; Tweedie, C. E.

    2007-12-01

    The Arctic Research Mapping Application (ARMAP) is a suite of online services that support Arctic science. These services include a text-based online search utility, a 2D Internet Map Server (IMS), 3D globes, and Open Geospatial Consortium (OGC) Web Map Services (WMS). With ARMAP's 2D maps and 3D globes, users can navigate to areas of interest, view a variety of map layers, and explore U.S. Federally funded research projects. Projects can be queried by location, year, funding program, discipline, and keyword. Links lead to specific information and other web sites associated with a particular research project. The Arctic Research Logistics Support Service (ARLSS) database is the foundation of ARMAP, covering US research funded by the National Science Foundation, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and the United States Geological Survey. Avoiding duplication of effort has been a primary objective of the ARMAP project, which incorporates best practices (e.g. Spatial Data Infrastructure, OGC standard web services and metadata) and off-the-shelf technologies where appropriate. The ARMAP suite provides tools for users of various levels of technical ability to interact with the data by importing the web services directly into their own GIS applications and virtual globes; performing advanced GIS queries; simply printing maps from a set of predefined images in the map gallery; browsing the layers in an IMS; or by choosing to "fly to" sites using a 3D globe. With special emphasis on the International Polar Year (IPY), ARMAP has targeted science planners, scientists, educators, and the general public. In sum, ARMAP goes beyond simple map display to enable analysis, synthesis, and coordination of Arctic research. ARMAP may be accessed via the gateway web site at http://www.armap.org.

  16. The new OGC Catalogue Services 3.0 specification - status of work

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo; Voges, Uwe

    2013-04-01

    We report on the work of the Open Geospatial Consortium Catalogue Services 3.0 Standards Working Group (OGC Cat 3.0 SWG for short), started in March 2008 with the purpose of processing change requests on the Catalogue Services 2.0.2 Implementation Specification (OGC 07-006r1) and producing a revised version thereof, comprising the related XML schemas and abstract test suite. The work was initially intended as a minor revision (version 2.1), but was later retargeted as a major update of the standard and rescheduled (the anticipated roadmap ended in 2008). The target audience of Catalogue Services 3.0 includes: • implementors of catalogue services solutions; • designers and developers of catalogue services profiles; • providers/users of catalogue services. The two main general areas of enhancement were: restructuring the specification document according to the OGC standard for modular specifications (OGC 08-131r3, also known as the Core and Extension model); and incorporating current mass-market technologies for discovery on the Web, namely OpenSearch. The document was initially split into four parts: the general model and the three protocol bindings HTTP, Z39.50, and CORBA. The CORBA binding, which was very rarely implemented, and the Z39.50 binding have since been dropped. Parts of the Z39.50 binding, namely Search/Retrieve via URL (SRU; same semantics as Z39.50, but stateless), have been provided as a discussion paper (OGC 12-082) for possibly developing a future SRU profile. The Catalogue Services 3.0 specification is structured as follows: • Part 1: General Model (Core) • Part 2: HTTP Protocol Binding (CSW). In CSW, the GET/KVP encoding is mandatory; the POST/XML encoding is optional. SOAP is supported as a special case of the POST/XML encoding. OpenSearch must always be supported, regardless of the implemented profiles, along with the OpenSearch Geospatial and Temporal Extensions (OGC 10-032r2). The latter specifies spatial (e.g. point-plus-radius, bounding box, polygons, in EPSG:4326/WGS84 coordinates) and temporal constraints (e.g. time start/end) for searching. The temporal extent (begin/end) is added as a core queryable and returnable property. Many other changes are incorporated, including improvements to query distribution, WSDL and schema documents, and the requirements and conformance classes. In conclusion, CS 3.0 is a long-awaited revision of the previous CS 2.0.2 Implementation Specification (approved in early 2007), whose requirements started to be collected as early as July 2007. Various profiles of CS 2.0.2 exist, with related dependencies, and will need to be harmonized with this specification.
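
    To illustrate the mass-market discovery path that CS 3.0 mandates, the sketch below issues an OpenSearch query constrained by the Geospatial and Temporal Extensions (OGC 10-032r2). The endpoint and parameter names are illustrative assumptions; a real client would first fetch the service's OpenSearch description document to learn its URL template.

      # Hedged sketch of an OpenSearch query with geo and time constraints.
      # Endpoint and parameter names are hypothetical.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      CSW_OPENSEARCH = "https://example.org/csw/opensearch"

      params = {
          "q": "sea surface temperature",   # free-text search terms
          "bbox": "-10,35,5,45",            # geo extension: west,south,east,north
          "time": "2012-01-01/2012-12-31",  # time extension: start/end
          "outputFormat": "application/atom+xml",
      }

      with urlopen(CSW_OPENSEARCH + "?" + urlencode(params)) as resp:
          atom_feed = resp.read()  # Atom feed of matching catalogue records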

  17. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.

  18. The Future of Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations, with more than 6,000 people registered at the OGC communication platform, drives the development of standards that are freely available for anyone to use and that improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, adapting to the particular needs of each community while ensuring interoperability through common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, crowd-sourced information, geosemantics, or containers for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP), wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards, or identify deficiencies in existing OGC standards that can then be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds, from which we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hatano, Yu; Department of Cardiovascular Medicine, Tokyo Medical and Dental University, 1-5-45, Yushima, Bunkyo-ku, Tokyo 113-8510; Nakahama, Ken-ichi, E-mail: nakacell@tmd.ac.jp

    Highlights: • M-CSF- and RANKL-expressing HeLa cells induced osteoclastogenesis in vitro. • We established an OGC-containing tumor model in vivo. • OGC-containing tumors became larger independently of M-CSF or RANKL effects. • VEGF-C secreted from OGCs was one of the candidate drivers of OGC-containing tumor growth. - Abstract: Tumors with osteoclast-like giant cells (OGCs) have been reported in a variety of organs and exert an invasive and prometastatic phenotype, but the functional role of OGCs in the tumor environment has not been fully clarified. We established tumors containing OGCs to clarify the role of OGCs in the tumor phenotype. A mixture of HeLa cells expressing macrophage colony-stimulating factor (M-CSF, HeLa-M) and receptor activator of nuclear factor-κB ligand (RANKL, HeLa-R) effectively supported the differentiation of osteoclast-like cells from bone marrow macrophages in vitro. Moreover, a xenograft study showed OGC formation in a tumor composed of HeLa-M and HeLa-R. Surprisingly, the tumors containing OGCs were significantly larger than the tumors without OGCs, although the growth rates were not different in vitro. Histological analysis showed that lymphangiogenesis and macrophage infiltration were accelerated in the tumors containing OGCs, but not in other tumors. According to quantitative PCR analysis, vascular endothelial growth factor (VEGF)-C mRNA expression increased with differentiation of osteoclast-like cells. To investigate whether VEGF-C expression is responsible for tumor growth and macrophage infiltration, HeLa cells overexpressing VEGF-C (HeLa-VC) were established and transplanted into mice. Tumors composed of HeLa-VC mimicked the phenotype of the tumors containing OGCs. Furthermore, the vascular permeability of tumor microvessels also increased in tumors containing OGCs and, to some extent, in VEGF-C-expressing tumors. These results suggest that macrophage infiltration and vascular permeability are possible mediators in these tumors. These findings revealed that OGCs in the tumor environment promoted tumor growth and lymphangiogenesis, at least in part, by secreting VEGF-C.

  20. Testing Conformance to Standards: Notes on the OGC CITE Initiative

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo; Vitale, Fabrizio

    2010-05-01

    In this work, we report on the issues and lessons learnt from our recent experience in assessing service compliance to OGC geospatial standards. Official conformance is certified through the OGC Compliance & Interoperability Testing & Evaluation (CITE) initiative, via a centrally managed repository of tests, typically developed in initiatives funded by external sponsors. In particular, we have been involved in the ESA-led Heterogeneous Missions Accessibility Testbed (HMA-T) project. HMA-T objectives included the definition of specifications (and related compliance tests) for Earth Observation (EO) product discovery and access. Our activities have focused on the EO and Catalogue of ISO Metadata (CIM) Extension Packages (EPs) of the ebRIM Application Profile (AP) of the Catalogue Service for the Web (CSW) OGC standard. Our main contributions have concerned the definition of Abstract Test Suites (ATSs) for the above specifications, as well as the development of Reference Implementations (RIs) and concrete Executable Test Suites (ETSs). In line with the state of the art, we have implemented the ETSs in Compliance Test Language (CTL), an OGC standard dialect of XML, and deployed the scripts onto the open-source Test Evaluation And Measurement (TEAM) Engine, the official OGC compliance test platform. A significant challenge was to accommodate legacy services that cannot support data publishing. Hence, we could not assume the presence of control test data, necessary for exhaustive assessment. To cope with this, we have proposed and experimented with tests for assessing the internal coherence of a target service instance. Another issue was to assess the overall behavior of a target service instance. Although quite obvious, this requirement proved hard (if not unviable) to implement, since the design of the OGC catalogue specification is multi-layered (i.e. comprised of EP, AP, binding and core functionalities) and, according to the current OGC rationale, the ATS/ETS at each layer are supposed to assess only layer-specific functionalities. We argue that a sequence of individually correct "steps" may not result in an overall correct "route". In general, our experience suggests that an ATS is conveniently modeled as a multi-dimensional structure, in that compliance with a (multi-layered) specification may be partitioned into several orthogonal axes (e.g. operation, binding, data model). Hence, the correspondence between Abstract and Executable Test Cases (ATCs and ETCs, respectively) is not simply one-to-one; in general an ATC maps to a number of ETCs (or to a single, "multi-modal" ETC) whose actual run-time behavior depends on the intended point of intersection of the ATS axes. This suggests possible benefits of an Aspect-Oriented extension of the ATC/ETC/CTL conceptual model. We identified several other open issues in the current CITE framework, notably the lack of support for the test development phase (the design of the current tools seems more oriented to the certification use case). Other results we obtained include: the definition of best practices for improved ETS/CTL documentation and the implementation of functionalities for its extraction and formatting; improvements to the readability of test logs and the implementation of appropriate log consolidation functionalities in the TEAM Engine; and comments and bug reports on the CSW 2.0.2 ETS. These results have been contributed to the relevant stakeholders. This work has also provided us with new insights into the general OGC specification framework, particularly into the rationale for modular specifications.
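
    The sketch below is not CTL, but a minimal Python analogue of one executable test case of the kind such suites contain: issue GetCapabilities to a target catalogue and assert that the response is a capabilities document rather than a service exception. The endpoint is a hypothetical service under test.

      # Minimal conformance smoke test: GetCapabilities must not return
      # an ExceptionReport. Endpoint is a hypothetical service under test.
      import xml.etree.ElementTree as ET
      from urllib.parse import urlencode
      from urllib.request import urlopen

      ENDPOINT = "https://example.org/csw"

      params = {"service": "CSW", "version": "2.0.2",
                "request": "GetCapabilities"}
      with urlopen(ENDPOINT + "?" + urlencode(params)) as resp:
          root = ET.fromstring(resp.read())

      # A service exception report signals failure of this test case.
      assert "ExceptionReport" not in root.tag, "service returned an exception"
      assert root.tag.endswith("Capabilities"), "unexpected root: %s" % root.tag
      print("GetCapabilities smoke test passed")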

  1. SCHeMA open and modular in situ sensing solution

    NASA Astrophysics Data System (ADS)

    Tercier-Waeber, Marie Louise; Novellino, Antonio

    2017-04-01

    Marine environments are highly vulnerable and influenced by a wide diversity of anthropogenic and natural substances and organisms that may have adverse effects on the ecosystem equilibrium, on living resources and, ultimately, on human health. Identification of relevant types of hazards at the appropriate temporal and spatial scale is crucial to detect their sources and origin, to understand the processes governing their magnitude and distribution, and ultimately to evaluate and manage their risks and consequences, preventing economic losses. This can be addressed only by the development of innovative, compact, rugged, automated sensor networks allowing long-term monitoring. Development of such tools is a challenging task, as it requires many analytical and technical innovations. The FP7-OCEAN 2013-SCHeMA project aims to help meet this challenge by providing an open and modular sensing solution for autonomous in situ high-resolution mapping of a range of anthropogenic and natural chemical compounds (trace metals, nutrients, anthropogenic organic compounds, toxic algae species and toxins, species relevant to the carbon cycle). To achieve this, SCHeMA activities focus on the development of: 1) an array of miniature sensor probes taking advantage of various innovative solutions, namely (polymer-based) gel-integrated sensors, solid-state ion-selective membrane sensors coupled to an on-line desalination module, mid-infrared optical sensors, optochemical multichannel devices, and EnOcean technology; 2) dedicated hardware, firmware and software components allowing their plug-and-play integration and localization, as well as wireless bidirectional communication via advanced OGC-SWE wired/wireless dedicated interfaces; 3) a web-based front-end system compatible with EU standard requirements and principles (INSPIRE, GEO/GEOSS) and configured to ensure easy interoperability with national, regional and local marine observation systems. This lecture will present examples of innovative approaches and devices that have been successfully developed and are currently being explored. The potential of the SCHeMA individual probes and integrated system to provide new types of high-resolution environmental data will be illustrated with examples of field applications in selected coastal areas. www.schema-ocean.eu

  2. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and architectures of workflow code that support asynchronous behavior. A sample geospatial processing workflow, from the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) testbed, is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
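
    One widely used form of the asynchrony pattern discussed here is the WPS 1.0 stored-response mechanism: the client requests a persisted ExecuteResponse and then polls its status location instead of blocking on the initial call. The hedged sketch below illustrates that pattern; the endpoint, process identifier and inputs are placeholders, not artifacts of the OWS-6 workflow.

      # Hedged sketch of the WPS 1.0 asynchronous Execute pattern:
      # request a stored response, then poll its statusLocation.
      import time
      import xml.etree.ElementTree as ET
      from urllib.parse import urlencode
      from urllib.request import urlopen

      WPS = "https://example.org/wps"  # hypothetical endpoint

      params = {
          "service": "WPS", "version": "1.0.0", "request": "Execute",
          "identifier": "LongRunningProcess",  # assumed process id
          "datainputs": "scene=12345",         # assumed input
          "storeexecuteresponse": "true",      # persist the response document
          "status": "true",                    # publish asynchronous status
      }

      with urlopen(WPS + "?" + urlencode(params)) as resp:
          root = ET.fromstring(resp.read())
      status_url = root.attrib["statusLocation"]  # where progress is published

      while True:  # poll until the process succeeds or fails
          with urlopen(status_url) as resp:
              doc = resp.read().decode()
          if "ProcessSucceeded" in doc or "ProcessFailed" in doc:
              break
          time.sleep(10)  # the client is free to do other work between polls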

  3. Story Maps as an Effective Social Medium for Data Synthesis, Communication, and Dissemination

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Verrill, A.; Artz, M.; Deming, R.

    2014-12-01

    The story map is a new medium for sharing not only data, but also photos, videos, sounds, and maps, as a way to tell a specific and compelling story through that content. It is emerging as a popular and effective social medium. The user may employ fairly sophisticated cartographic functionality without advanced training in cartography or GIS. Story maps are essentially web map applications built from web maps, which in turn are built from web-accessible data (including OGC WMS, WFS). This paper will emphasize the approaches and technologies of web-based GIS to tell "stories" about important connections among scientists, resource managers, and policy makers focused on oceans and coasts within the US, and how combining the new medium of "intelligent Web maps" with text, multimedia content, and intuitive user experiences has great potential to synthesize the data and its primary interpretive message in order to inform, educate, and inspire about a wide variety of ocean science and policy issues.

  4. The Water SWITCH-ON Spatial Information Platform (SIP)

    NASA Astrophysics Data System (ADS)

    Sala Calero, J., Sr.; Boot, G., Sr.; Dihé, P., Sr.; Arheimer, B.

    2017-12-01

    The amount of hydrological open data is continually growing, providing opportunities to the scientific community. Although the existing data portals (GEOSS Portal, INSPIRE community geoportal and others) enable access to open data, many users still find browsing through them difficult. Moreover, the time spent on gathering and preparing data is usually more significant than the time spent on the experiment itself. Thus, any improvement in searching, understanding, accessing or using open data is greatly beneficial. The Spatial Information Platform (SIP) has been developed to tackle these issues within the SWITCH-ON European Commission funded FP7 project. The SIP has been designed as a set of tools, based on open standards, that provides users with all the functionality described by the Publish-Find-Bind (PFB) pattern. In other words, the SIP helps users to locate relevant and suitable data for their experiments and analyses, and to access and transform those data (filtering, extraction, selection, conversion, aggregation). Moreover, the SIP can be used to provide descriptive information about the data and to publish it so others can find and use it. The SIP is based on existing open data protocols such as OGC/CSW, OGC/WMS and OpenDAP, and on open-source components like PostgreSQL/PostGIS, GeoServer and pyCSW. The SIP is divided into three main user interfaces: the BYOD (Browse Your Open Dataset) web interface, the Expert GUI tool and the Upload Data and Metadata web interface. The BYOD HTML5 client is the main entry point for users who want to browse through open data in the SIP. The BYOD has a map interface based on the Leaflet JavaScript libraries so that users can search more efficiently. The web-based Open Data Registration Tool is a user-friendly upload and metadata description interface (geographical extent, license, DOI generation). The Expert GUI is a desktop application that provides full metadata editing capabilities for the metadata moderators of the project. In conclusion, the Spatial Information Platform (SIP) provides its community with a set of tools that make hydrological open data easier to understand and use. Moreover, the SIP is based on well-known OGC standards that allow connection to, and data harvesting from, popular open data portals such as the GEOSS system of systems.

  5. Model application for rapid detection of the exact location when calling an ambulance using OGC Open GeoSMS Standards

    NASA Astrophysics Data System (ADS)

    Sukic, Enes; Stoimenov, Leonid

    2016-02-01

    The web has penetrated just about every sphere of human interest, and using information from the web has become ubiquitous among different categories of users. Medicine has long benefited from modern technologies and can no longer function without them. This paper proposes the combined use of several modern technologies to facilitate locating persons in need of emergency medical assistance and their communication with the emergency dispatch centre, i.e., the ambulance service. The main advantages of the proposed model are the technical feasibility of implementing and using these technologies in developing countries and the low implementation cost.
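
    As a rough illustration of the Open GeoSMS idea, the sketch below extracts coordinates from a GeoSMS-style message body. The message grammar used here is a simplifying assumption (a URL carrying "q=<lat>,<lon>" plus the GeoSMS tag, as in the standard's common examples); production code should follow the published OGC Open GeoSMS specification rather than this sketch.

      # Hedged sketch: pull (lat, lon) out of an Open GeoSMS-style message.
      # The assumed format is a location URL with a "q=lat,lon" query
      # parameter and the "GeoSMS" tag somewhere in the body.
      import re

      def parse_geosms(body: str):
          """Return (lat, lon) from a GeoSMS-style message, or None."""
          if "GeoSMS" not in body:  # tag identifying a GeoSMS payload
              return None
          match = re.search(r"[?&]q=(-?\d+\.?\d*),(-?\d+\.?\d*)", body)
          if not match:
              return None
          return float(match.group(1)), float(match.group(2))

      # Example: a caller's phone sends its position to the dispatch centre.
      msg = "http://maps.example.com/maps?q=43.16,20.52&GeoSMS Need ambulance"
      print(parse_geosms(msg))  # (43.16, 20.52)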

  6. Global Imagery Browse Services (GIBS) - Rapidly Serving NASA Imagery for Applications and Science Users

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.

    2012-12-01

    Expedited processing of imagery from NASA satellites for near-real-time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real-time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format that would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data is geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts it into image tiles, which are stored in a highly optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made in the non-standard, but widely known, TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization/browse clients for a diverse variety of specific audiences. Currently, EOSDIS is providing an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other clients can also be developed using tools such as Google Earth, the Google Earth browser plugin, Esri's Adobe Flash/Flex client library, NASA World Wind, the Perceptive Pixel client, Esri's iOS client library, and OpenLayers for Mobile. The imagery browse capabilities of GIBS can be combined with other EOSDIS services (e.g. ECHO OpenSearch) via a client that ties them together to provide an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science-quality data from the entire data record of these EOS instruments.
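
    Because GIBS serves fixed-resolution tile pyramids, a single tile can be fetched with one HTTP GET. The sketch below uses the WMTS REST URL layout GIBS documents; the layer name, date, tile matrix set and tile indices are illustrative values, so treat the exact URL as an assumption to verify against the service.

      # Sketch of fetching one GIBS tile via the WMTS REST pattern:
      # .../{layer}/default/{time}/{tileMatrixSet}/{z}/{row}/{col}.jpg
      from urllib.request import urlopen

      tile_url = (
          "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
          "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
          "2012-12-01/250m/2/1/2.jpg"  # illustrative date and tile indices
      )

      with urlopen(tile_url) as resp:
          open("gibs_tile.jpg", "wb").write(resp.read())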

  7. EarthServer - 3D Visualization on the Web

    NASA Astrophysics Data System (ADS)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open Geospatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or addons. Additionally, we are able to run the earth data visualization client on a wide range of different platforms with very different soft- and hardware requirements, such as smart phones (e.g. iOS, Android), different desktop systems, etc. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client, and on top of HTML5, WebGL and JavaScript, we have developed the X3DOM framework (www.x3dom.org), which makes it possible to embed declarative X3D scenegraphs (an ISO standard XML-based file format for representing 3D computer graphics) directly within HTML, thus enabling developers to rapidly design 3D content that blends seamlessly into HTML interfaces using JavaScript. This approach (commonly referred to as a polyfill layer) is used to mimic native web browser support for declarative 3D content and is an important component in our web client architecture.

  8. Implementation of data node in spatial information grid based on WS resource framework and WS notification

    NASA Astrophysics Data System (ADS)

    Zhang, Dengrong; Yu, Le

    2006-10-01

    An approach to constructing a data node in a spatial information grid (SIG) based on the Web Service Resource Framework (WSRF) and Web Service Notification (WSN) is described in this paper. Attention is paid to constructing and implementing SIG's resource layer, which is its most important part. A study of this layer finds that persistent interaction with the clients of the services is impossible in the common SIG architecture, because it inherits the "stateless" and "not persistent" limitations of Web Services. A WSRF/WSN-based data node is designed to overcome these shortcomings. Three different access modes are employed to test the availability of this node. Experimental results demonstrate that this service node can successfully respond to standard OGC requests and return specific spatial data in different network environments, and that it is stateful, dynamic and persistent.

  9. Autonomous Sensorweb Operations for Integrated Space, In-Situ Monitoring of Volcanic Activity

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Doubleday, Joshua; Kedar, Sharon; Davies, Ashley G.; Lahusen, Richard; Song, Wenzhan; Shirazi, Behrooz; Mandl, Daniel; Frye, Stuart

    2010-01-01

    We have deployed and demonstrated operations of an integrated space and in-situ sensorweb for monitoring volcanic activity. This sensorweb includes a network of ground sensors deployed to the Mount Saint Helens volcano as well as the Earth Observing One spacecraft. The ground operations and space operations are interlinked in that ground-based intelligent event detections can cause the space segment to acquire additional data via observation requests, and space-based data acquisitions (thermal imagery) can trigger reconfigurations of the ground network to allocate increased bandwidth to areas of the network best situated to observe the activity. The space-based operations are enabled by an automated mission planning and tasking capability which utilizes several Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) standards that enable acquiring data, alerts, and tasking using web services. The ground-based segment also supports similar protocols to enable seamless tasking and data delivery. The space-based segment additionally supports onboard generation of data products (thermal summary images indicating areas of activity, quicklook context images, and thermal activity alerts). These onboard-generated products have reduced data volume (compared to the complete images), which enables them to be transmitted to the ground more rapidly in engineering channels.

  10. Smart sensing surveillance system

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen

    2010-04-01

    An effective public safety sensor system for heavily-populated applications requires sophisticated and geographically-distributed infrastructure, centralized supervision, and deployment of large-scale security and surveillance networks. Artificial intelligence in sensor systems is critical to raising awareness levels, improving the performance of the system, and adapting to changing scenarios and environments. In this paper, a highly-distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in crowded environments or restricted areas. Technically, the S4 consists of a number of distributed sensor nodes integrated with specific passive sensors to rapidly collect, process, and disseminate heterogeneous sensor data from near omni-directions. These distributed sensor nodes can work cooperatively to send immediate security information when new objects appear. When new objects are detected, the S4 will smartly select the available node with a Pan-Tilt-Zoom (PTZ) Electro-Optical/Infrared (EO/IR) camera to track the objects and capture associated imagery. The S4 provides applicable advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, configurable alert triggers, etc. Other imaging processes can be updated to meet specific requirements and operations. In the S4, all the sensor nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detection) wireless mesh network using Ultra-wideband (UWB) RF technology. This UWB RF technology can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages. The Service Oriented Architecture of the S4 enables remote applications to interact with the S4 network and use specific presentation methods. In addition, the S4 is compliant with Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards to efficiently discover, access, use, and control heterogeneous sensors and their metadata. These S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments. The S4 system is directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as to applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.

  11. New Generation Sensor Web Enablement

    PubMed Central

    Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob

    2011-01-01

    Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
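
    As a concrete illustration of the SWE service interfaces analyzed here, the hedged sketch below pulls observations from an OGC Sensor Observation Service using the SOS 2.0 KVP binding. The endpoint, offering and observed property are placeholder values, not ones defined by the article.

      # Hedged sketch of an SOS 2.0 GetObservation request (KVP binding).
      # Endpoint, offering and observedProperty are hypothetical.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      SOS = "https://example.org/sos"

      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "air_temperature_network",  # assumed offering id
          "observedProperty": "AirTemperature",   # assumed property
          "temporalFilter": "om:phenomenonTime,2011-01-01/2011-01-02",
      }

      with urlopen(SOS + "?" + urlencode(params)) as resp:
          om_xml = resp.read()  # O&M-encoded observations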

  12. HELI-DEM portal for geo-processing services

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia

    2014-05-01

    HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013 whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy; Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frame, resolutions and accuracies: DHM at 25 m resolution from Swisstopo, DTM at 20 m resolution from Lombardy Region, DTM at 5 m resolution from Piedmont Region and DTM LiDAR PST-A at about 1 m resolution, that covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unique Italian Swiss geoid with an accuracy of few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, prototype of a transnational positioning service; the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some capabilities of analysis and processing through the Internet. With this talk, the authors want to present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM outputted from the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined for providing access to geospatial functionalities to wide non GIS expert public. In fact, the system is entirely developed using only Open Standards and Free and Open Source Software (FOSS) both on the server side (services) and on the client side (interface). In addition to self developed code the system relies mainly on teh software GRASS 7 [1], ZOO-project [2], Geoserver [3] and OpenLayers [4] and the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features like profiling, contour extraction, watershed delineation and analysis, derivatives calculation, data extraction, coordinate conversion but it is evolving and it is planned to extend to a series of environmental modeling that the IST developed in the past like dam break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed. 406 pp, Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open Wps Platform. Proceeding of 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS). Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 agosto 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA. Reggia di Colorno, 15-18 novembre 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010b. 
[7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.

  13. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    PubMed

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  14. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    PubMed Central

    Liang, Steve H.L.; Huang, Chih-Yuan

    2013-01-01

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision. PMID:24152921

  15. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are becoming increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To help users handle and visualise spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open source tools such as OpenLayers. The technology to store searchable geometric data already exists within PostgreSQL databases in the form of the PostGIS extension. GeoServer is an existing open source map server; an instance deployed for the new PSA uses the OGC standards to allow, among other things, the sharing, processing and editing of spatial data through the Web Feature Service (WFS) standard, as well as serving georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary mission datasets. This can be facilitated through the GIS framework, offering interfaces (both a web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
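
    The sketch below shows the kind of WFS GetFeature request GeoServer exposes over a PostGIS-backed layer, filtered by bounding box. The endpoint, feature type name and output format are placeholder assumptions, not names from the PSA deployment.

      # Hedged sketch of a WFS 2.0 GetFeature request with a bbox filter.
      # Endpoint and typeNames are hypothetical.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      WFS = "https://example.org/geoserver/wfs"

      params = {
          "service": "WFS",
          "version": "2.0.0",
          "request": "GetFeature",
          "typeNames": "psa:observation_footprints",  # assumed feature type
          "bbox": "-5,35,5,45,EPSG:4326",             # spatial filter
          "outputFormat": "application/json",         # GeoJSON, if supported
      }

      with urlopen(WFS + "?" + urlencode(params)) as resp:
          footprints = resp.read()  # features intersecting the bbox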

  16. The GEOSS Clearinghouse based on the GeoNetwork opensource

    NASA Astrophysics Data System (ADS)

    Liu, K.; Yang, C.; Wu, H.; Huang, Q.

    2010-12-01

    The Global Earth Observation System of Systems (GEOSS) is established to support the study of the Earth system by a global community. It provides services for social management, quick response, academic research, and education. The purpose of GEOSS is to achieve comprehensive, coordinated and sustained observations of the Earth system, improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. In 2009, GEO called for a competition for an official GEOSS clearinghouse to be selected as a means of consolidating catalogs of Earth observations. The Joint Center for Intelligent Spatial Computing at George Mason University worked with USGS to submit a solution based on the open-source platform GeoNetwork. In the spring of 2010, the solution was selected as the product for the GEOSS clearinghouse. The GEOSS Clearinghouse is a common search facility for the intergovernmental Group on Earth Observations (GEO). By providing a list of harvesting functions in its business logic, the GEOSS clearinghouse can collect metadata from distributed catalogs including other GeoNetwork native nodes, webDAV/sitemap/WAF, Catalog Services for the Web (CSW) 2.0, the GEOSS Component and Service Registry (http://geossregistries.info/), OGC Web Services (WCS, WFS, WMS and WPS), OAI Protocol for Metadata Harvesting 2.0, ArcSDE Server and the local file system. Metadata in the GEOSS clearinghouse are managed in a database (MySQL, PostgreSQL, Oracle, or MckoiDB) and an index of the metadata is maintained through the Lucene engine. Thus, EO data, services, and related resources can be discovered and accessed. It supports a variety of geospatial standards, including CSW and SRU for search, FGDC and ISO metadata, and WMS-related OGC standards for data access and visualization, as linked from the metadata.

  17. Ubiquitous healthcare computing with SEnsor Grid Enhancement with Data Management System (SEGEDMA).

    PubMed

    Preve, Nikolaos

    2011-12-01

    A Wireless Sensor Network (WSN) can be deployed to monitor the health of patients suffering from critical diseases. A wireless network consisting of biomedical sensors can also be implanted into the patient's body to monitor the patient's condition. These sensor devices, apart from having an enormous capability to collect data from their physical surroundings, are resource-constrained in nature, with limited processing and communication ability. We therefore have to integrate them with Grid technology in order to process and store the data collected by the sensor nodes. In this paper, we propose the SEnsor Grid Enhancement Data Management system, called SEGEDMA, ensuring the integration of different network technologies and continuous data access for system users. The main contribution of this work is to achieve the interoperability of both technologies through a novel network architecture, ensuring also the interoperability of the Open Geospatial Consortium (OGC) and HL7 standards. According to the results, SEGEDMA can be applied successfully in a decentralized healthcare environment.

  18. First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Gonzalez, J.

    2014-04-01

    We present a first prototype of a web map interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for the Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of planetary GIS users and define key areas of improvement for the future web PSA User Interface. Its web map interface shall provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology. A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, shall enhance this user experience. In addition, users should be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised and analysed directly from GIS software, along with data from other sources. The following figure illustrates how the PSA web map interface and services shall fit into a typical planetary GIS user working environment.

  19. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    NASA Astrophysics Data System (ADS)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users that need the same information and use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application, the MiraMon Map Browser, is able to write a context document and read it again to recover the context of the previous view, or to load a context generated by another application. The possibility of storing direct links to files in OWS Context is particularly interesting for GIS desktop solutions. This communication also presents the development made in the MiraMon desktop GIS solution to include OWS Context. MiraMon software is able to deal with local files, web services and database connections. As in any other GIS solution, the MiraMon team designed its own file format (MiraMon Map, MMM) for storing and sharing the status of a GIS session. The new OWS Context format is now adopted as an interoperable substitute for the MMM. The extensibility of the format makes it possible to map concepts in the MMM to current OWS Context elements (such as titles, data links, extent, etc.) and to generate new elements that include all extra metadata not currently covered by OWS Context. These developments were done in the ninth edition of the OpenGIS Web Services Interoperability Experiment (OWS-9) and are demonstrated in this communication.
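
    The scale computation described above can be written down directly; in the snippet below, the metres-per-degree factor and the OGC reference pixel size of 0.28 mm are common conventions assumed for illustration, not values taken from the paper.

    ```python
    # Approximate map scale denominator from a client window (pixels) and a
    # view extent in geographic coordinates, as described above. Assumes the
    # OGC reference pixel (0.28 mm) and ~111,320 m per degree at the equator.
    METERS_PER_DEGREE = 111_320.0
    METERS_PER_PIXEL = 0.00028  # OGC reference pixel size

    def scale_denominator(width_px: int, lon_min: float, lon_max: float) -> float:
        """Ratio of ground width to the physical width of the on-screen map."""
        ground_width_m = (lon_max - lon_min) * METERS_PER_DEGREE
        screen_width_m = width_px * METERS_PER_PIXEL
        return ground_width_m / screen_width_m

    # A 1024-pixel-wide window showing 10 degrees of longitude:
    print(f"1:{scale_denominator(1024, 0.0, 10.0):,.0f}")
    ```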

  20. A Tsunami-Focused Tide Station Data Sharing Framework

    NASA Astrophysics Data System (ADS)

    Kari, U. S.; Marra, J. J.; Weinstein, S. A.

    2006-12-01

    The Indian Ocean Tsunami of 26 December 2004 made it clear that information about tide stations that could be used to support detection and warning (such as location, collection and transmission capabilities, and operator identification) was insufficiently known or not readily accessible. Parties interested in addressing this problem united under the Pacific Region Integrated Data Enterprise (PRIDE), and in 2005 began a multiyear effort to develop a distributed metadata system describing tide stations, starting with pilot activities in a regional framework and focusing on tsunami detection and warning systems being developed by various agencies. First, a plain semantic description of the tsunami-focused tide station metadata was developed. This semantic metadata description was, in turn, developed into a formal metadata schema championed by the International Tsunami Information Centre (ITIC) as part of a larger effort to develop a prototype web service under the PRIDE program in 2005. Under the 2006 PRIDE program the formal metadata schema was then expanded to capture input parameters for the TideTool application used by the Pacific Tsunami Warning Center (PTWC) to drill down into wave activity at a tide station located using a web service developed on this metadata schema. This effort contributed to the formalization of web service dissemination of PTWC watch and warning tsunami bulletins. During this time, the data content and sharing issues embodied in this schema have been discussed at various forums. The result is that the various stakeholders have different data provider and user perspectives (semantic content) and also different exchange formats (not limited to just XML). The challenge, then, is not only to capture all data requirements, but also to have a formal representation that is easily transformed into any specified format. The latest revision of the tide gauge schema (Version 0.3) begins to address this challenge. It encompasses a broader range of provider and user perspectives, such as station operators, warning system managers, disaster managers, other marine hazard warning systems (such as for storm surges), and sea level change monitoring and research. In the next revision(s), we hope to take into account various relevant standards, including specifically the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) framework, so as to serve all prospective stakeholders in the most useful (extensible, scalable) manner. SensorML has already addressed many of the challenges we face, through very useful fundamental modeling considerations and data types that are particular to sensors in general, with perhaps some extension needed for tide gauges. As a result of developing this schema, and associated client application architectures, we hope to have a much more distributed network of data providers, who are able to contribute to a global tide station metadata repository from the comfort of their own Information Technology (IT) departments.

  1. Autonomy and Sensor Webs: The Evolution of Mission Operations

    NASA Technical Reports Server (NTRS)

    Sherwood, Rob

    2008-01-01

    Demonstration of these sensor web capabilities will enable fast-responding science campaigns that combine spaceborne, airborne, and ground assets. Sensor webs will also require new operations paradigms: they will be operated directly by scientists, using science goals to control their instruments. We will explore these new operations architectures through a study of existing sensor web prototypes.

  2. A Prototype Land Information Sensor Web: Design, Implementation and Implication for the SMAP Mission

    NASA Astrophysics Data System (ADS)

    Su, H.; Houser, P.; Tian, Y.; Geiger, J. K.; Kumar, S. V.; Gates, L.

    2009-12-01

    Land Surface Model (LSM) predictions are regular in time and space, but these predictions are influenced by errors in model structure, input variables, and parameters, and by inadequate treatment of sub-grid scale spatial variability. Consequently, LSM predictions are significantly improved through observation constraints applied in a data assimilation framework. Several multi-sensor satellites are currently operating which provide multiple global observations of the land surface and its related near-surface atmospheric properties. However, these observations are not optimal for addressing current and future land surface environmental problems. To meet future earth system science challenges, NASA will develop constellations of smart satellites in sensor web configurations which provide timely, on-demand data and analysis to users, and which can be reconfigured based on the changing needs of science and available technology. A sensor web is more than a collection of satellite sensors: it is a system composed of multiple platforms interconnected by a communication network for the purpose of performing specific observations and processing the data required to support specific science goals. Sensor webs can eclipse the value of disparate sensor components by reducing response time and increasing scientific value, especially when two-way interaction between the model and the sensor web is enabled. The prototype Land Information Sensor Web (LISW) study, sponsored by NASA, integrates the Land Information System (LIS) in a sensor web framework that allows an optimal two-way information flow, which enhances land surface modeling using sensor web observations and, in turn, allows sensor web reconfiguration to minimize overall system uncertainty. The prototype is based on a simulated interactive sensor web, which is then used to exercise and optimize the sensor web modeling interfaces. The Land Information Sensor Web Service-Oriented Architecture (LISW-SOA) has been developed; it is the first sensor web framework developed especially for land surface studies. Synthetic experiments based on the LISW-SOA and the virtual sensor web provide a controlled environment in which to examine the end-to-end performance of the prototype, the impact of various sensor web design trade-offs, and the eventual value of sensor webs for a particular prediction or decision support task. In this paper, the design and implementation of the LISW-SOA and its implications for the Soil Moisture Active and Passive (SMAP) mission are presented. Particular attention is focused on examining the relationship between the economic investment in a sensor web (space- and airborne, ground-based) and the accuracy of the model-predicted soil moisture that can be achieved by using such sensor observations. The virtual LISW study is expected to provide necessary a priori knowledge for designing and deploying the next-generation Global Earth Observing System of Systems (GEOSS).

  3. Interoperability In The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA) by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC) is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
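
    A KVP GetFeature request of the kind served by this architecture is sketched below, asking for features in one of the output formats listed above; the endpoint and layer name are hypothetical.

    ```python
    # Sketch of a WFS 2.0 KVP GetFeature request returning GeoJSON, one of
    # the output formats listed above; endpoint and layer are placeholders.
    import requests

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "psa:observation_footprints",  # hypothetical layer
        "outputFormat": "application/json",
        "count": 10,
    }
    r = requests.get("https://example.esa.int/geoserver/wfs", params=params)
    r.raise_for_status()
    for feature in r.json()["features"]:
        print(feature["id"], feature["geometry"]["type"])
    ```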

  4. An interoperability experiment for sharing hydrological rating tables

    NASA Astrophysics Data System (ADS)

    Lemon, D.; Taylor, P.; Sheahan, P.

    2013-12-01

    The increasing demand on freshwater resources is requiring authorities to produce more accurate and timely estimates of their available water. Calculation of continuous time series of river discharge and storage volumes generally requires rating tables. These approximate the relationship between two phenomena, such as river level and discharge, and allow us to produce continuous estimates of a phenomenon that may be impractical or impossible to measure directly. Standardised information models and access mechanisms for rating tables are required to support the sharing and exchange of water flow data. An Interoperability Experiment (IE) is underway to test an information model that describes rating tables, the observations made to build these ratings, and river cross-section data. The IE is an initiative of the joint World Meteorological Organisation/Open Geospatial Consortium Hydrology Domain Working Group (HydroDWG), and the model will be published as WaterML2.0 part 2. Interoperability Experiments are low-overhead, multi-member projects run under the OGC's interoperability program to test existing and emerging standards. The HydroDWG has previously run IEs to test early versions of OGC WaterML2.0 part 1 (time series). This IE focuses on two key exchange scenarios. The first is sharing rating tables and gauging observations between water agencies: through the use of standard OGC web services, rating tables and associated data will be made available from water agencies, and the (Australian) Bureau of Meteorology will retrieve rating tables on demand from water authorities, allowing the Bureau to run conversions of data within its own systems. The second is exposing rating tables and gaugings for online analysis and educational purposes: a web client will be developed to enable exploration and visualization of rating tables, gaugings and related metadata for monitoring points. The client gives a quick view into available rating tables, their periods of applicability and the standard deviation of observations against the relationship. An example of this client running can be seen at the link provided. The result of the IE will form the basis for the standardisation of WaterML2.0 part 2. The use of the standard will lead to increased transparency and accessibility of rating tables, while also improving general understanding of this important hydrological concept.
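
    At its simplest, a rating table is a set of paired stage and discharge values through which a continuous discharge series is interpolated; the sketch below illustrates the conversion with invented numbers (real ratings are often segmented power-law fits with periods of applicability).

    ```python
    # Converting a river-level (stage) series to discharge via a rating
    # table, the core relationship exchanged in this IE. Values are invented.
    import numpy as np

    # Rating table: paired stage (m) and discharge (m^3/s) from gaugings.
    stages = np.array([0.5, 1.0, 1.5, 2.0, 3.0])
    discharges = np.array([2.0, 10.0, 35.0, 80.0, 250.0])

    def stage_to_discharge(stage_series):
        """Piecewise-linear rating conversion (illustrative only)."""
        return np.interp(stage_series, stages, discharges)

    observed_stage = np.array([0.8, 1.2, 2.4])
    print(stage_to_discharge(observed_stage))  # [  6.8  20.  148.]
    ```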

  5. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via the Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and Grid computing standards (WS-* and Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (a tree of operators). The SciFlo client and server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible through OpenGIS Consortium (OGC) Web Mapping Servers and Web Coverage Servers (WMS/WCS), and through Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and for automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from FTP sites; invoke remote analysis operators available as SOAP services (with interfaces described by WSDL documents); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDF-EOS, and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty of agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
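
    The "tree of operators" idea can be pictured with local callables standing in for the remote SOAP/REST services SciFlo would invoke; the toy sketch below is illustrative only and uses none of SciFlo's actual interfaces.

    ```python
    # Toy bottom-up evaluation of a dataflow tree; plain functions stand in
    # for the remote Web Services and native executables SciFlo assembles.
    from typing import Any, Callable

    OPERATORS: dict[str, Callable[..., Any]] = {
        "fetch": lambda url: f"<granule from {url}>",   # data-access stand-in
        "subset": lambda data, bbox: f"{data}|{bbox}",  # spatial subsetting
        "merge": lambda *parts: list(parts),            # fusion container
    }

    def evaluate(node):
        """Recursively evaluate (operator, *children); leaves are literals."""
        if not isinstance(node, tuple):
            return node
        op, *args = node
        return OPERATORS[op](*(evaluate(a) for a in args))

    flow = ("merge",
            ("subset", ("fetch", "ftp://a/granule1.hdf"), "bbox=-10,30,10,50"),
            ("subset", ("fetch", "ftp://b/granule2.hdf"), "bbox=-10,30,10,50"))
    print(evaluate(flow))
    ```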

  6. Gazetteer Brokering through Semantic Mediation

    NASA Astrophysics Data System (ADS)

    Hobona, G.; Bermudez, L. E.; Brackin, R.

    2013-12-01

    A gazetteer is a geographical directory containing information about places. It provides names, locations and other attributes for places, which may include points of interest (e.g. buildings, oilfields and boreholes) and other features. These features can be published via web services conforming to the Gazetteer Application Profile of the Web Feature Service (WFS) standard of the Open Geospatial Consortium (OGC). Against the backdrop of advances in geophysical surveys, there has been a significant increase in the amount of data referenced to locations. Gazetteer services have played a significant role in facilitating access to such data, including through the provision of specialized queries such as text, spatial and fuzzy search. Recent developments in the OGC have led to advances in gazetteers such as support for multilingualism, diacritics, and querying via advanced spatial constraints (e.g. radial search and nearest-neighbor search). A remaining challenge, however, is that gazetteers produced by different organizations have typically been modeled differently. Inconsistencies between gazetteers produced by different organizations may include naming the same feature in different ways, naming the attributes differently, locating the feature in a different location, and providing fewer or more attributes than the other services. The Gazetteer Application Profile of the WFS is a starting point for addressing such inconsistencies by providing a standardized interface based on rules specified in ISO 19112, the international standard for spatial referencing by geographic identifiers. The profile, however, does not provide rules to deal with semantic inconsistencies. The USGS and NGA commissioned research into the potential for a Single Point of Entry Global Gazetteer (SPEGG). The research was conducted by the Cross Community Interoperability thread of the OGC testbed referenced as OWS-9. The testbed prototyped approaches for brokering gazetteers through the use of semantic web technologies, including ontologies and a semantic mediator. The semantically enhanced SPEGG allowed a client to submit a single query (e.g. 'hills') and to retrieve data from two separate gazetteers with different vocabularies (e.g. where one refers to 'summits' and another refers to 'hills'). Supporting the SPEGG was a SPARQL server that held the ontologies and processed queries on them. Earth Science surveys and forecasts always refer to a place on Earth. Being able to share information about a place, and to resolve inconsistencies about that place across different sources, will enable geoscientists to do their research better. With the advent of mobile geo-computing and location-based services (LBS), brokering gazetteers will provide geoscientists with access to gazetteer services rich with information and functionality beyond that offered by current generic gazetteers.

  7. A SMART groundwater portal: An OGC web services orchestration framework for hydrology to improve data access and visualisation in New Zealand

    NASA Astrophysics Data System (ADS)

    Klug, Hermann; Kmoch, Alexander

    2014-08-01

    Transboundary and cross-catchment access to hydrological data is the key to designing successful environmental policies and activities. Electronic maps based on distributed databases are fundamental for planning and decision making in all regions and at all spatial and temporal scales. Freshwater is an essential asset in New Zealand (and globally), and the availability as well as the accessibility of hydrological information held by or for public authorities and businesses is becoming a crucial management factor. Access to, and visual representation of, environmental information for the public is essential for attracting greater awareness of water quality and quantity matters. Detailed interdisciplinary knowledge about the environment is required to ensure that the environmental policy-making community of New Zealand considers regional and local differences in hydrological status while assessing the overall national situation. However, cross-regional and inter-agency sharing of environmental spatial data is complex and challenging. In this article, we first provide an overview of state-of-the-art, standards-compliant techniques and methodologies for the practical implementation of simple, measurable, achievable, repeatable, and time-based (SMART) hydrological data management principles. Second, we contrast international state-of-the-art data management developments with the present status of groundwater information in New Zealand. Finally, for the topics of (i) data access and harmonisation, (ii) sensor web enablement and (iii) metadata, we summarise our findings, provide recommendations on future developments and highlight the specific advantages resulting from a seamless view, discovery, access, and analysis of interoperable hydrological information and metadata for decision making.

  8. The IRIS Data Management Center: Enabling Access to Observational Time Series Spanning Decades

    NASA Astrophysics Data System (ADS)

    Ahern, T.; Benson, R.; Trabant, C.

    2009-04-01

    The Incorporated Research Institutions for Seismology (IRIS) is funded by the National Science Foundation (NSF) to operate the facilities that generate, archive, and distribute seismological data to research communities in the United States and internationally. The IRIS Data Management System (DMS) is responsible for the ingestion, archiving, curation and distribution of these data. The IRIS Data Management Center (DMC) manages data from more than 100 permanent seismic networks and hundreds of temporary seismic deployments, as well as data from other geophysical observing networks such as magnetotelluric sensors, ocean bottom sensors, superconducting gravimeters, strainmeters, surface meteorological measurements, and in-situ atmospheric pressure measurements. The IRIS DMC has data from more than 20 different types of sensors and manages approximately 100 terabytes of primary observational data. These data are archived in multiple distributed storage systems that ensure data availability independent of any single catastrophic failure. Storage systems include RAID systems of greater than 100 terabytes as well as robotic tape libraries of petabyte capacity. IRIS performs routine transcription of the data to new media and storage systems to ensure the long-term viability of the scientific data, and adheres to the OAIS data preservation model in most cases. The IRIS data model requires the availability of metadata describing the characteristics and geographic location of sensors before data can be fully archived. IRIS works with the International Federation of Digital Seismograph Networks (FDSN) on the definition and evolution of the metadata. The metadata ensure that the data remain useful to both current and future generations of earth scientists. Curation of the metadata and time series is one of the most important activities at the IRIS DMC. Data analysts and an automated quality assurance system monitor the quality of the incoming data, ensuring the data are of acceptably high quality. The formats and data structures used by the seismological community are esoteric. IRIS and its FDSN partners are developing web services that can transform the data holdings into structures that are more easily used by broader scientific communities. For instance, atmospheric scientists are interested in using global observations of microbarograph data, but that community does not understand the methods of applying instrument corrections to the observations. Web processing services under development at IRIS will transform these data in a manner that allows direct use within analysis tools such as MATLAB, already in use by that community. By continuing to develop web-service-based methods of data discovery and access, IRIS is enabling broader access to its data holdings. We currently support data discovery using many of the Open Geospatial Consortium (OGC) web mapping services. We are involved in portal technologies to support data discovery and distribution for all data from the EarthScope project. We are working with computer scientists at several universities, including the University of Washington, as part of a DataNet proposal, and we intend to enhance metadata, further develop ontologies, develop a Registry Service to aid in the discovery of data sets and services, and in general improve the semantic interoperability of the data managed at the IRIS DMC. Finally, IRIS has been identified as one of four scientific organizations that the External Research Division of Microsoft wants to work with in the development of web services, and specifically in the development of a scientific workflow engine. More specific details of current and future developments at the IRIS DMC will be included in this presentation.
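
    Today the DMC's holdings are reachable through FDSN web services; the sketch below uses ObsPy, a client library that postdates this abstract, to pull an hour of broadband data together with the station metadata the text describes as central to the archive.

    ```python
    # Fetching waveforms and station metadata from the IRIS DMC via its FDSN
    # web services, using ObsPy (a client that postdates this abstract).
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    t0 = UTCDateTime("2009-04-01T00:00:00")

    # Network, station, location and channel codes follow SEED conventions.
    stream = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)
    print(stream)

    # The curated metadata needed to correct and interpret the time series.
    inventory = client.get_stations(network="IU", station="ANMO",
                                    level="channel")
    print(inventory)
    ```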

  9. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proven suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth sciences, disaster management, environmental sciences, etc. On the other hand, in several application areas there is the need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enabling advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, allowing such an integration to be built on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely used in Europe and beyond, proving its high scalability, and is one of the middlewares chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization there are some obstacles to overcome. Firstly, a semantics mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and with no embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm).
    In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs. To mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European and international projects investigated different aspects of this integration, developing demonstrators and proofs of concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: (i) to contribute to the OGC-OGF initiative; (ii) to release a reference implementation as standard gLite APIs (under the gLite software license); (iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; (iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives G-OWS bases its activities on two main guiding principles: (a) the adoption of a service-oriented architecture based on the information modelling approach, and (b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, and OGF). In its first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Proofs of concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans and the means for the ESSI community to get involved. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010.

  10. Semantically-Enabled Sensor Plug & Play for the Sensor Web

    PubMed Central

    Bröring, Arne; Maúe, Patrick; Janowicz, Krzysztof; Nüst, Daniel; Malewski, Christian

    2011-01-01

    Environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent over the past years. As a consequence of these technological advancements, sensors are increasingly deployed to monitor our environment. The large variety of available sensor types, with often incompatible protocols, complicates the integration of sensors into observing systems. The standardized Web service interfaces and data encodings defined within OGC's Sensor Web Enablement (SWE) framework make sensors available over the Web and hide the heterogeneous sensor protocols from applications. So far, the SWE framework does not describe how to integrate sensors on the fly with minimal human intervention. The driver software which enables access to sensors has to be implemented, and the measured sensor data have to be manually mapped to the SWE models. In this article we introduce a Sensor Plug & Play infrastructure for the Sensor Web by combining (1) semantic matchmaking functionality, (2) a publish/subscribe mechanism underlying the Sensor Web, and (3) a model for the declarative description of sensor interfaces which serves as a generic driver mechanism. We implement and evaluate our approach by applying it to an oil spill scenario. The matchmaking is realized using existing ontologies and reasoning engines and provides a strong case for the semantic integration capabilities provided by Semantic Web research. PMID:22164033
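
    Stripped of ontologies and reasoners, the matchmaking step reduces to deciding whether an offered observed property satisfies a required one; the sketch below caricatures this with a hand-written equivalence table, where all URIs are invented for illustration.

    ```python
    # Highly simplified stand-in for semantic matchmaking: a reasoner would
    # derive these equivalences from ontologies; a hand-written table is used
    # here instead, and all property URIs are invented.
    EQUIVALENT = {
        "http://example.org/prop/sea_surface_temperature": {
            "http://example.org/prop/water_temperature",
        },
    }

    def matches(offered: str, required: str) -> bool:
        """True if the offered observed property satisfies the required one."""
        if offered == required:
            return True
        return (required in EQUIVALENT.get(offered, set())
                or offered in EQUIVALENT.get(required, set()))

    sensor_offer = "http://example.org/prop/water_temperature"
    service_need = "http://example.org/prop/sea_surface_temperature"
    print(matches(sensor_offer, service_need))  # True -> plug the sensor in
    ```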

  11. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the tera- to exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor time series, 2-D remote sensing imagery, 3-D x/y/t image time series and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask for what they want, neither impeded nor complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built on rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level raster query language. We present the EarthServer project with its vision and approaches, relate it to the current state of standardization, and demonstrate it by way of large-scale data centers and their services using rasdaman.
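
    A WCPS request can be issued as a plain HTTP call to a rasdaman/petascope endpoint; in the sketch below the endpoint and coverage name are placeholders, and the query subsets a hypothetical monthly temperature coverage at one point.

    ```python
    # Sketch of a WCPS query against a rasdaman/petascope endpoint; the
    # endpoint URL and coverage/axis names are hypothetical placeholders.
    import requests

    wcps = (
        'for c in (AvgLandTemp) '
        'return encode(c[Lat(53.08), Long(8.80), '
        'ansi("2014-01":"2014-12")], "csv")'
    )
    r = requests.get(
        "https://example.org/rasdaman/ows",  # placeholder endpoint
        params={
            "service": "WCS",
            "version": "2.0.1",
            "request": "ProcessCoverages",
            "query": wcps,
        },
    )
    r.raise_for_status()
    print(r.text)  # twelve monthly values at a single point
    ```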

  12. Cytologic Features of Malignant Melanoma with Osteoclast-Like Giant Cells.

    PubMed

    Jiménez-Heffernan, José A; Adrados, Magdalena; Muñoz-Hernández, Patricia; Fernández-Rico, Paloma; Ballesteros-García, Ana I; Fraga, Javier

    2018-01-01

    Malignant melanoma showing numerous osteoclast-like giant cells (OGCs) is an uncommon morphologic phenomenon, rarely mentioned in the cytologic literature. The few reported cases seem to have an aggressive clinical behavior. Although most findings support monocyte/macrophage differentiation, the exact nature of OGCs is not clear. A 57-year-old woman presented with an inguinal lymphadenopathy. Sixteen years earlier, a cutaneous malignant melanoma of the lower limb had been excised. Needle aspiration revealed abundant neoplastic single cells as well as numerous multinucleated OGCs. Occasional neoplastic giant cells were also present. Nuclei of OGCs were monomorphic with an oval morphology and were smaller than those of the melanoma cells. The immunophenotype of the OGCs (S100-, HMB45-, Melan-A-, SOX10-, Ki67-, CD163-, BRAF-, CD68+, MiTF+, p16+) was that expected for reactive OGCs of monocyte/macrophage origin. The tumor has shown an aggressive behavior, with further metastases to the axillary lymph nodes and oral cavity. Numerous OGCs are a rare and relevant finding in malignant melanoma. Their presence should not cause confusion with other tumors rich in osteoclastic cells. Since a relevant number of OGCs in melanoma may indicate more aggressive behavior, and patients may benefit from specific treatments, their presence should be mentioned in the pathology report.

  13. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (geospatial infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. Server-side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface that allows requests for geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves georeferenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geodata, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is solely based on Free and Open Source Software and relies on the JavaScript API WebGL, which allows the interactive rendering of 2D and 3D graphics by means of GPU-accelerated physics and image processing as part of the web page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will alleviate spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. The project will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in decision processes (e-governance). Yet the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).

  14. Smart sensing surveillance system

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen

    2010-04-01

    Unattended ground sensor (UGS) networks have been widely used in remote battlefield and other tactical applications over the last few decades, owing to advances in digital signal processing. UGS networks can be applied in a variety of areas including border surveillance, special force operations, perimeter and building protection, target acquisition, situational awareness, and force protection. In this paper, a highly distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in a situation management environment. The S4 is composed of a number of distributed nodes to collect, process, and disseminate heterogeneous sensor data. Nearly all S4 nodes have passive sensors to provide rapid omnidirectional detection. In addition, Pan-Tilt-Zoom (PTZ) electro-optical/infrared (EO/IR) cameras are integrated into selected nodes to track objects and capture associated imagery. These camera-equipped S4 nodes provide advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, configurable alert triggers, etc. In the S4, all the nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detection) wireless mesh network using ultra-wideband (UWB) RF technology, which can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages. The S4 utilizes a Service Oriented Architecture such that remote applications can interact with the S4 network and use specific presentation methods. The S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments and near perimeters and borders. The S4 is compliant with Open Geospatial Consortium Sensor Web Enablement (OGC-SWE) standards. It is directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as to applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.

  15. OGC Dashboard

    EPA Pesticide Factsheets

    The Office of General Counsel (OGC) has an ongoing business process engineering and business process automation initiative which has helped the office reduce administrative labor costs while increasing employee effectiveness. Supporting this effort is a system of automated routines accessible through a 'portal' interface called the OGC Dashboard. The dashboard helps OGC track work progress, legal case load, written work products such as legal briefs and advice, and scheduling processes such as employee leave plans (via calendar) and travel compensatory time off.

  16. Sensor system for web inspection

    DOEpatents

    Sleefe, Gerard E.; Rudnick, Thomas J.; Novak, James L.

    2002-01-01

    A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced, electrically conductive transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.

  17. Use of the Earth Observing One (EO-1) Satellite for the Namibia SensorWeb Flood Early Warning Pilot

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Handy, Matthew; Policelli, Fritz; Katjizeu, McCloud; Van Langenhove, Guido; Aube, Guy; Saulnier, Jean-Francois; Sohlberg, Rob

    2012-01-01

    The Earth Observing One (EO-1) satellite was launched in November 2000 as a one-year technology demonstration mission for a variety of space technologies. After the first year, it was used as a pathfinder for the creation of SensorWebs. A SensorWeb is the integration of a variety of space, airborne and ground sensors into a loosely coupled, collaborative sensor system that automatically provides useful data products. Typically, a SensorWeb is comprised of heterogeneous sensors tied together with a messaging architecture and web services. Disasters are the perfect arena in which to use SensorWebs. One SensorWeb pilot project that has been active since 2009 is the Namibia Early Flood Warning SensorWeb pilot project. The pilot project was established under the auspices of the Namibian Ministry of Agriculture, Water and Forestry (MAWF)/Department of Water Affairs and the Committee on Earth Observation Satellites (CEOS)/Working Group on Information Systems and Services (WGISS), and moderated by the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER). The effort began by identifying and prototyping technologies which enabled the rapid gathering and dissemination of both space-based and ground sensor data and data products for the purpose of flood disaster management and water-borne disease management. This was followed by an international collaboration to build small portions of the identified system, which were prototyped during the flood seasons of February through May 2010 and 2011, with further prototyping to occur in 2012. The SensorWeb system features EO-1 data along with other data sets from such satellites as Radarsat, Terra and Aqua. Finally, the SensorWeb team also began to examine the socioeconomic component, to determine the impact of the SensorWeb technology and how best to assist in the infusion of this technology in less affluent areas with low levels of basic infrastructure. This paper provides an overview of these efforts, highlighting the EO-1 usage in this SensorWeb.

  18. Analyzing CRISM hyperspectral imagery using PlanetServer.

    NASA Astrophysics Data System (ADS)

    Figuera, Ramiro Marco; Pham Huu, Bang; Minin, Mikhail; Flahaut, Jessica; Halder, Anik; Rossi, Angelo Pio

    2017-04-01

    Mineral characterization of planetary surfaces is of great importance for space exploration, and orbital hyperspectral imagery is widely used to perform it. In our research we use Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) [1] TRDR L observations with a spectral range of 1 to 4 µm. PlanetServer comprises a server, a web client and a Python client/API. The server side uses the Array Database Management System (DBMS) Raster Data Manager (rasdaman) Community Edition [2]. OGC standards such as the Web Coverage Processing Service (WCPS) [3], an SQL-like language capable of querying information along the image cube, are implemented in the PetaScope component [4]. The client side uses NASA's Web World Wind [5], allowing the user to access the data in an intuitive way. The client consists of a globe where all cubes are deployed, a main menu where projections, base maps and RGB combinations are provided, and a plot dock where the spectral information is shown. The RGB combinator tool allows band combinations such as the CRISM products [6] using WCPS. The spectral information is retrieved using WCPS and shown in the plot dock/widget. The USGS splib06a library [7] is available for comparing CRISM against laboratory spectra. The Python API provides an environment to create RGB combinations that can be embedded into existing pipelines. All employed libraries and tools are open source and can be easily adapted to other datasets. PlanetServer stands as a promising tool for spectral analysis on planetary bodies. M3/Moon and OMEGA datasets will soon be available. [1] S. Murchie et al., "Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on Mars Reconnaissance Orbiter (MRO)," J. Geophys. Res. E Planets, 2007. [2] P. Baumann, A. Dehmel, P. Furtado, R. Ritsch, and N. Widmann, "The multidimensional database system RasDaMan," ACM SIGMOD Rec., vol. 27, no. 2, pp. 575-577, Jun. 1998. [3] P. Baumann, "The OGC web coverage processing service (WCPS) standard," Geoinformatica, vol. 14, no. 4, Jul. 2010. [4] A. Aiordǎchioaie and P. Baumann, "PetaScope: An open-source implementation of the OGC WCS Geo service standards suite," Lect. Notes Comput. Sci., vol. 6187 LNCS, pp. 160-168, Jun. 2010. [5] P. Hogan, C. Maxwell, R. Kim, and T. Gaskins, "World Wind 3D Earth Viewing," Apr. 2007. [6] C. E. Viviano-Beck et al., "Revised CRISM spectral parameters and summary products based on the currently detected mineral diversity on Mars," J. Geophys. Res. E Planets, vol. 119, no. 6, pp. 1403-1431, Jun. 2014. [7] R. N. Clark et al., "USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231," 2007. [Online]. Available: http://speclab.cr.usgs.gov/spectral.lib06.
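
    A point-spectrum extraction of the kind shown in the plot dock can be phrased as a WCPS query; in the sketch below, the endpoint, coverage identifier and axis labels are guesses for illustration, not actual PlanetServer identifiers.

    ```python
    # Sketch of extracting a spectrum at one pixel of a CRISM cube via WCPS;
    # endpoint, coverage name and axis labels are illustrative guesses.
    import requests

    wcps = (
        'for c in (frt00003e12_07_if166l_trr3) '   # hypothetical cube id
        'return encode(c[E(200), N(150)], "csv")'  # all bands at one pixel
    )
    r = requests.get(
        "https://example.org/rasdaman/ows",
        params={"service": "WCS", "version": "2.0.1",
                "request": "ProcessCoverages", "query": wcps},
    )
    r.raise_for_status()
    values = [float(v) for v in r.text.strip().strip("{}").split(",")]
    print(len(values), "spectral samples")
    ```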

  19. Development of a Dynamic Web Mapping Service for Vegetation Productivity Using Earth Observation and in situ Sensors in a Sensor Web Based Approach

    PubMed Central

    Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources. PMID:22574019
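
    For illustration, the minimal Python sketch below shows the kind of SOS request through which such meteorological inputs can be pulled; the service endpoint, offering and observed property are hypothetical placeholders:

      import requests

      # Hypothetical SOS 2.0 endpoint and identifiers -- adjust to a real service.
      SOS_URL = "https://sos.example.org/service"

      resp = requests.get(SOS_URL, params={
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "meteorology",
          "observedProperty": "air_temperature",
          "temporalFilter": "om:phenomenonTime,2009-01-01/2009-01-02",
      })
      resp.raise_for_status()
      print(resp.text[:500])  # O&M-encoded observations (XML)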

  20. Exploring NASA Satellite Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Shen, S.; Zhao, P.; Gerasimov, I. V.; Vollmer, B.; Vicente, G. A.; Pham, L.

    2013-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example by using satellite data as model inputs or by interpreting extreme events (such as volcanic eruptions or dust storms) from satellite observations. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by providing satellite data as 'Images' with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We will present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from the Ozone Monitoring Instrument (OMI), or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting. The portal interface will connect to the backend services with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources.
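
    The back-end calls mentioned above follow the standard WMS key-value-pair pattern. A minimal sketch of a GetMap request in Python; the endpoint and layer name are illustrative placeholders, not actual GES DISC identifiers:

      import requests

      # Hypothetical WMS endpoint and layer name.
      WMS_URL = "https://disc-wms.example.nasa.gov/wms"

      resp = requests.get(WMS_URL, params={
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "OMI_NO2_ColumnAmount",   # illustrative layer identifier
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",
          "WIDTH": "1024",
          "HEIGHT": "512",
          "FORMAT": "image/png",
      })
      resp.raise_for_status()
      with open("no2_map.png", "wb") as f:
          f.write(resp.content)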

  1. Standardized acquisition, storing and provision of 3D enabled spatial data

    NASA Astrophysics Data System (ADS)

    Wagner, B.; Maier, S.; Peinsipp-Byma, E.

    2017-05-01

    In the area of working with spatial data, in addition to the classic two-dimensional geometrical data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems, and generate a large, costly workload. However, it is noticeable that these expenses and costs can generally be significantly reduced using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that relies on existing standards wherever possible. In this research, military image analysts are the primary user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces and standards under discussion have been analyzed. Since the Fraunhofer IOSB GIS used here already supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, particular attention was paid to those standards. The most promising standard is the OGC standard 3DPS (3D Portrayal Service) with its variants W3DS (Web 3D Service) and WVS (Web View Service). A demonstration system was created using a standardized workflow covering data acquisition, storage and provision, showing the benefit of our approach.

  2. Development of ITSASGIS-5D: seeking interoperability between Marine GIS layers and scientific multidimensional data using open source tools and OGC services for multidisciplinary research.

    NASA Astrophysics Data System (ADS)

    Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.

    2012-04-01

    Since 2000, an intense effort has been conducted in AZTI's Marine Research Division to set up a data management system which could gather all the marine datasets produced by different in-house research projects. For that purpose, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalog & search application and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, different species habitat maps, and human pressure and activity maps, were successfully gathered in this system. Very soon, it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or Lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (netCDF using CF conventions) and tools such as the THREDDS catalog (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned before. This system includes, on the server side, PostgreSQL/PostGIS databases and GeoServer for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an on-line client is being developed to allow joint access, user configuration, data visualisation & query and data distribution. This client uses the MapFish, ExtJS/GeoExt and OpenLayers libraries. Through this presentation the elements of the first released version of this system will be described and shown, together with the new topics to be developed in future versions, which include, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, and the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive or the Habitat Framework Directive.
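
    For the FES side of such a system, gridded data published through THREDDS can be consumed directly over OPeNDAP, transferring only the requested slice. A minimal sketch, assuming a hypothetical endpoint and variable name (and a netCDF4 build with DAP support):

      from netCDF4 import Dataset

      # Hypothetical OPeNDAP endpoint on a THREDDS server -- adjust to a real catalog entry.
      url = "https://thredds.example.org/thredds/dodsC/sst_analysis.nc"

      ds = Dataset(url)                      # opens the remote dataset lazily
      sst = ds.variables["sst"]              # hypothetical variable name
      subset = sst[0, 100:200, 100:200]      # only this slice is transferred
      print(subset.mean())
      ds.close()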

  3. Lessons Learned from a Collaborative Sensor Web Prototype

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Case, Lynne; Krahe, Chris; Hess, Melissa; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This paper describes the Sensor Web Application Prototype (SWAP) system that was developed for the Earth Science Technology Office (ESTO). The SWAP is aimed at providing an initial engineering proof-of-concept prototype highlighting sensor collaboration, dynamic cause-effect relationship between sensors, dynamic reconfiguration, and remote monitoring of sensor webs.

  4. Data near processing support for climate data analysis

    NASA Astrophysics Data System (ADS)

    Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils

    2016-04-01

    Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach: a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized in an open source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages, as well as Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts, on the one hand, a multi-petabyte climate archive which is integrated e.g. into the European ENES and worldwide ESGF data infrastructures and, on the other hand, an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community, as well as the climate impact community, are highlighted. Aspects supporting future WPS-based cross-community usage scenarios, such as data reuse and data provenance, are also reflected upon.
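
    A bird providing, say, a climate index calculation is exercised like any other WPS 1.0.0 service. Below is a minimal sketch of a key-value-pair Execute request in Python; the endpoint, process identifier and input name are hypothetical placeholders, not actual bird-house identifiers:

      import requests

      # Hypothetical bird-house-style WPS endpoint.
      WPS_URL = "https://wps.example.dkrz.de/wps"

      resp = requests.get(WPS_URL, params={
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "climate_index_su",   # illustrative "summer days" index process
          "DataInputs": "netcdf_url=https://data.example.org/tasmax.nc",
      })
      resp.raise_for_status()
      print(resp.text[:500])  # ExecuteResponse XML with status / result references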

  5. Integrated web system of geospatial data services for climate research

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm, is presented. According to this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  6. A SOA-based approach to geographical data sharing

    NASA Astrophysics Data System (ADS)

    Li, Zonghua; Peng, Mingjun; Fan, Wei

    2009-10-01

    In the last few years, large volumes of spatial data have become available in different government departments in China, but these data are mainly used within those departments. With the e-government project initiated, spatial data sharing has become more and more necessary. Currently, the Web is used not only for document searching but also for the provision and use of services, known as Web services, which are published in a directory and may be automatically discovered by software agents. Particularly in the spatial domain, the possibility of accessing these large spatial datasets via Web services has motivated research into the new field of Spatial Data Infrastructure (SDI) implemented using service-oriented architecture. In this paper a Service-Oriented Architecture (SOA) based Geographical Information System (GIS) is proposed, and a prototype system is deployed based on Open Geospatial Consortium (OGC) standards in Wuhan, China, so that all authorized departments can access the spatial data within the government intranet, and these spatial data can easily be integrated into various kinds of applications.

  7. A novel highly selective and sensitive detection of serotonin based on Ag/polypyrrole/Cu2O nanocomposite modified glassy carbon electrode.

    PubMed

    Selvarajan, S; Suganthi, A; Rajarajan, M

    2018-06-01

    A silver/polypyrrole/copper oxide (Ag/PPy/Cu2O) ternary nanocomposite was prepared by a simple sonochemical and oxidative polymerization route, in which Cu2O was decorated with Ag nanoparticles and covered by a polypyrrole (PPy) layer. The as-prepared materials were characterized by UV-vis spectroscopy (UV-vis), FT-IR, X-ray diffraction (XRD), thermo-gravimetric analysis (TGA), scanning electron microscopy (SEM) with EDX, high-resolution transmission electron microscopy (HR-TEM) and X-ray photoelectron spectroscopy (XPS). The electrocatalytic sensing of serotonin (5-HT) was evaluated using a polypyrrole/glassy carbon electrode (PPy/GCE), a polypyrrole/copper oxide/glassy carbon electrode (PPy/Cu2O/GCE) and a silver/polypyrrole/copper oxide/glassy carbon electrode (Ag/PPy/Cu2O/GCE). The Ag/PPy/Cu2O/GCE was electrochemically investigated in 0.1 M PBS solution through cyclic voltammetry (CV) and differential pulse voltammetry (DPV). The peak current response increases linearly with 5-HT concentration from 0.01 to 250 µmol L⁻¹ and the detection limit was found to be 0.0124 µmol L⁻¹. The electrode exhibits high electrocatalytic activity, satisfactory repeatability, stability, fast response and good selectivity against potentially interfering species, which suggests its potential in the development of a sensitive, selective, easy-to-operate and low-cost serotonin sensor for practical routine analyses. The proposed method has the potential to expand the applicable range of the nanocomposite material for the detection of various electroactive substances of concern. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Exploiting OAuth 2.0: from User Delegation for OGC Services to a Generic Federation-as-a-Service Solution for Federated Identity Management

    NASA Astrophysics Data System (ADS)

    Kershaw, Philip; Jensen, Jens; Stephens, Ag; van Engen, Willem

    2013-04-01

    We explore an application of OAuth to enable user delegation for OGC-based services, and the evolution of this solution to form part of a wider Federation-as-a-Service offering for federated identity management. OAuth has established itself in the commercial sector as a means for users to delegate access to secured resources under their control to third parties. It has also found its way into the academic and research domains as a solution for user delegation. Notable examples include the CILogon project for TeraGrid in the US and, closer to the Earth sciences, part of the OGC Web Services, Phase 6 Testbed. Both are examples of OAuth 1.0 implementations. Version 2.0 made significant changes to the original specification; these have not been without controversy, but they arguably provide a greater degree of flexibility in how the protocol can be applied and in the use cases it can address. At CEDA (Centre for Environmental Data Archival, STFC), a Python implementation of OAuth 2.0 was made to explore these capabilities, with a focus on providing a solution for user delegation for data access, processing and visualisation services in the Earth Observation and climate sciences domains. The initial goal was to provide a means of delegating short-lived user credentials to trusted services, along the same lines as the established approach of proxy certificates widely used in Grid computing. For the OGC and other HTTP-based services employed by CEDA, OAuth makes a natural fit for this role, integrating with minimal impact on existing interfaces. Working implementations have been made for CEDA's COWS Web Processing Service and Web Map Service. Packaging the software and making it available in open source repositories, together with the generic nature of the solution, have made it readily exploitable in other application domains. At the Max Planck Institute for Psycholinguistics (Nijmegen, The Netherlands), the software will be used to integrate some tools in the CLARIN infrastructure*. Enhancements have been fed back to the package through this activity. Collaboration with STFC's Scientific Computing department has also seen this solution expand and evolve to support a more demanding set of use cases required to meet the needs of Contrail, an EU Framework 7 project. The goal of Contrail is to develop an open source solution for federating resources from multiple Cloud providers. Bringing the solution developed with OAuth together with technologies such as SAML and OpenID, it has been possible to develop a generic suite of services to support federated access and identity management: a Federation-as-a-Service package. This is showing promise in trials with the EUDAT project. A deployment of the Contrail software is also planned for CEMS (the facility for Climate and Environmental Monitoring from Space), a new joint academic-industry led facility based at the STFC Harwell site providing access to large-volume Earth Observation and climate datasets through a Cloud-based service model. * This work is part of the programme of BiG Grid, the Dutch e-Science Grid, which is financially supported by the Netherlands Organisation for Scientific Research, NWO.
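
    The end state of such a flow is a bearer token presented to the protected OGC endpoint. The sketch below shows only that generic OAuth 2.0 token-use pattern (here with the client credentials grant) in Python; all endpoints and credentials are hypothetical placeholders, and this is not the CEDA implementation itself:

      import requests

      # Hypothetical token endpoint and client registration.
      TOKEN_URL = "https://auth.example.ceda.ac.uk/oauth/token"

      token_resp = requests.post(TOKEN_URL, data={
          "grant_type": "client_credentials",
          "client_id": "my-portal",
          "client_secret": "s3cret",
          "scope": "wps",
      })
      access_token = token_resp.json()["access_token"]

      # Present the bearer token to a protected OGC endpoint (WPS GetCapabilities here).
      resp = requests.get(
          "https://wps.example.ceda.ac.uk/wps",
          params={"service": "WPS", "request": "GetCapabilities"},
          headers={"Authorization": f"Bearer {access_token}"},
      )
      print(resp.status_code)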

  9. Using OPeNDAP's Data-Services Framework to Lift Mash-Ups above Blind Dates

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Fulker, D. W.

    2015-12-01

    OPeNDAP's data-as-service framework (Hyrax) matches diverse sources with many end-user tools and contexts. Keys to its flexibility include:
    • A data model embracing tabular data alongside n-dimensional arrays and other structures useful in geoinformatics.
    • A REST-like protocol that supports—via suffix notation—a growing set of output forms (netCDF, XML, etc.) plus a query syntax for subsetting. Subsetting applies to tabular data (via constraints on column values) or to array-style data (via constraints on indices or coordinates).
    • A handler-style architecture that admits a growing set of input types. Community members may contribute handlers, making Hyrax effective as middleware, where N sources are mapped to M outputs with order N+M effort (not NxM).
    • Virtual aggregations of source data, enabling granularity aimed at users, not data collectors.
    • OPeNDAP-access libraries in multiple languages, including Python, Java, and C++.
    Recent enhancements are increasing this framework's interoperability (i.e., its mash-up) potential. Extensions implemented as servlets—running adjacent to Hyrax—are enriching the forms of aggregation and enabling new protocols:
    • User-specified aggregations, namely, applying a query to (huge) lists of source granules and receiving one (large) table or zipped netCDF file.
    • OGC (Open Geospatial Consortium) protocols, WMS and WCS.
    • A Webification (W10n) protocol that returns JavaScript Object Notation (JSON).
    Extensions to OPeNDAP's query language are reducing transfer volumes and enabling new forms of inspection. Advances underway include:
    • Functions that, for triangular-mesh sources, return sub-meshes specified via geospatial bounding boxes.
    • Functions that, for data from multiple satellite-borne sensors (with differing orbits), select observations based on coincidence.
    • Calculations of means, histograms, etc. that greatly reduce output volumes.
    • Paths for communities to contribute new server functions (in Python, e.g.) that data providers may incorporate into Hyrax via installation parameters.
    One could say Hyrax itself is a mash-up, but we suggest it as an instrument for a mash-up artist's toolbox. This instrument can support mash-ups built on netCDF files, OGC protocols, JavaScript Web pages, and/or programs written in Python, Java, C or C++.
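
    The suffix notation and constraint syntax mentioned above can be illustrated with two plain HTTP requests; the granule URL and variable name below are hypothetical placeholders:

      import requests

      # Hypothetical Hyrax-served granule.
      base = "https://hyrax.example.org/opendap/sst/granule_2015.nc"

      # Suffix notation selects the response form: .dds (structure), .das (attributes),
      # .ascii (data as text), .nc (repackaged as netCDF), etc.
      structure = requests.get(base + ".dds").text

      # A constraint expression subsets an array variable by [start:stride:stop] index ranges.
      subset = requests.get(base + ".ascii?sst[0:1:0][100:1:110][200:1:210]").text

      print(structure)
      print(subset[:300])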

  10. A data delivery system for IMOS, the Australian Integrated Marine Observing System

    NASA Astrophysics Data System (ADS)

    Proctor, R.; Roberts, K.; Ward, B. J.

    2010-09-01

    The Integrated Marine Observing System (IMOS, www.imos.org.au), an AUD 150 m 7-year project (2007-2013), is a distributed set of equipment and data-information services which, among many applications, collectively contribute to meeting the needs of marine climate research in Australia. The observing system provides data in the open oceans around Australia out to a few thousand kilometres as well as the coastal oceans through 11 facilities which effectively observe and measure the 4-dimensional ocean variability, and the physical and biological response of coastal and shelf seas around Australia. Through a national science rationale IMOS is organized as five regional nodes (Western Australia - WAIMOS, South Australia - SAIMOS, Tasmania - TASIMOS, New South Wales - NSWIMOS and Queensland - QIMOS) surrounded by an oceanic node (Blue Water and Climate). Operationally IMOS is organized as 11 facilities (Argo Australia, Ships of Opportunity, Southern Ocean Automated Time Series Observations, Australian National Facility for Ocean Gliders, Autonomous Underwater Vehicle Facility, Australian National Mooring Network, Australian Coastal Ocean Radar Network, Australian Acoustic Tagging and Monitoring System, Facility for Automated Intelligent Monitoring of Marine Systems, eMarine Information Infrastructure and Satellite Remote Sensing) delivering data. IMOS data is freely available to the public. The data, a combination of near real-time and delayed mode, are made available to researchers through the electronic Marine Information Infrastructure (eMII). eMII utilises the Australian Academic Research Network (AARNET) to support a distributed database on OPeNDAP/THREDDS servers hosted by regional computing centres. IMOS instruments are described through the OGC specification SensorML and wherever possible data is in CF-compliant netCDF format. Metadata, conforming to standard ISO 19115, is automatically harvested from the netCDF files and the metadata records catalogued in the OGC GeoNetwork Metadata Entry and Search Tool (MEST). Data discovery, access and download occur via web services through the IMOS Ocean Portal (http://imos.aodn.org.au) and tools for the display and integration of near real-time data are in development.

  11. Flow Webs: Mechanism and Architecture for the Implementation of Sensor Webs

    NASA Astrophysics Data System (ADS)

    Gorlick, M. M.; Peng, G. S.; Gasster, S. D.; McAtee, M. D.

    2006-12-01

    The sensor web is a distributed, federated infrastructure much like its predecessors, the internet and the world wide web. It will be a federation of many sensor webs, large and small, under many distinct spans of control, that loosely cooperates and shares information for many purposes. Realistically, it will grow piecemeal as distinct, individual systems are developed and deployed, some expressly built for a sensor web while many others were created for other purposes. Therefore, the architecture of the sensor web is of fundamental import, and architectural strictures that inhibit innovation, experimentation, sharing or scaling may prove fatal. Drawing upon the architectural lessons of the world wide web, we offer a novel system architecture, the flow web, that elevates flows—sequences of messages over a domain of interest, constrained in both time and space—to a position of primacy as a dynamic, real-time medium of information exchange for computational services. The flow web captures, in a single, uniform architectural style, the conflicting demands of the sensor web, including dynamic adaptation to changing conditions, ease of experimentation, rapid recovery from the failures of sensors and models, automated command and control, incremental development and deployment, and integration at multiple levels—in many cases, at different times. Our conception of sensor webs—dynamic amalgamations of sensor webs each constructed within a flow web infrastructure—holds substantial promise for earth science missions in general, and for weather, air quality, and disaster management applications in particular. Flow webs are, by philosophy, design and implementation, a dynamic infrastructure that permits massive adaptation in real time. Flows may be attached to and detached from services at will, even while information is in transit through the flow. This concept, flow mobility, permits dynamic integration of earth science products and modeling resources in response to real-time demands. Flows are the connective tissue of flow webs—massive computational engines organized as directed graphs whose nodes are semi-autonomous components and whose edges are flows. The individual components of a flow web may themselves be encapsulated flow webs. In other words, a flow web subgraph may be presented to a yet larger flow web as a single, seamless component. Flow webs, at all levels, may be edited and modified while still executing. Within a flow web, individual components may be added, removed, started, paused, halted, reparameterized, or inspected. The topology of a flow web may be changed at will. Thus, flow webs exhibit an extraordinary degree of adaptivity and robustness, as they are explicitly designed to be modified on the fly, an attribute well suited for dynamic model interactions in sensor webs. We describe our concept for a sensor web, implemented as a flow web, in the context of a wildfire disaster management system for the southern California region. Comprehensive wildfire management requires cooperation among multiple agencies. Flow webs allow agencies to share resources in exactly the manner they choose. We will explain how to employ flow webs and agents to integrate satellite remote sensing data, models, in-situ sensors, UAVs and other resources into a sensor web that interconnects organizations and their disaster management tools in a manner that simultaneously preserves their independence and builds upon the individual strengths of agency-specific models and data sources.
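
    To make the graph model concrete, the toy Python sketch below mimics a flow web as a directed graph whose topology can be edited while messages flow. It is a conceptual illustration only, not the authors' implementation:

      # A toy flow-web graph: components are nodes, flows are edges that can be
      # attached and detached while the web is running.
      from collections import defaultdict

      class Component:
          def __init__(self, name, handler):
              self.name, self.handler = name, handler

          def receive(self, message, emit):
              self.handler(message, emit)

      class FlowWeb:
          def __init__(self):
              self.flows = defaultdict(set)    # source name -> set of sink Components

          def attach(self, source, sink):      # edit the topology at will,
              self.flows[source].add(sink)     # even while messages are in transit

          def detach(self, source, sink):
              self.flows[source].discard(sink)

          def publish(self, source, message):
              for sink in list(self.flows.get(source, ())):
                  # each sink may emit onward, forming a directed graph of flows
                  sink.receive(message, lambda m, s=sink.name: self.publish(s, m))

      web = FlowWeb()
      alarm = Component("alarm", lambda m, emit: print("ALERT:", m))
      web.attach("fire-sensor", alarm)
      web.publish("fire-sensor", {"temp_c": 410, "lat": 34.1, "lon": -117.3})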

  12. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, big dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive, is presented. There are three basic tiers of the GIS web client:
    1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format.
    2. A tier of JavaScript objects implementing methods handling: NetCDF metadata; a task XML object for configuring user calculations and input and output formats; and OGC WMS/WFS cartographical services.
    3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic.
    The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object, and WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on remote servers, as well as working with WMS/WFS cartographical services, including: obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of such popular desktop GIS applications as uDig and QuantumGIS. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
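
    As an illustration of the middleware tier's interaction with a cartographical service, the sketch below lists the layers advertised by a WMS endpoint using the OWSLib library; the service URL is a hypothetical placeholder:

      from owslib.wms import WebMapService

      # Hypothetical WMS endpoint -- replace with a real service URL.
      wms = WebMapService("https://geoportal.example.org/wms", version="1.1.1")

      # Enumerate the layers advertised by the service, as a client tier would
      # when populating its layer tree.
      for name in wms.contents:
          layer = wms[name]
          print(name, "|", layer.title, "|", layer.boundingBoxWGS84)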

  13. Web-Based Interface for Command and Control of Network Sensors

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Doubleday, Joshua R.; Shams, Khawaja S.

    2010-01-01

    This software allows for the visualization and control of a network of sensors through a Web browser interface. It is currently being deployed for a network of sensors monitoring Mt. Saint Helens volcano; however, this innovation is generic enough that it can be deployed for any type of sensor Web. From this interface, the user is able to fully control and monitor the sensor Web. This includes, but is not limited to, sending "test" commands to individual sensors in the network, monitoring for real-world events, and reacting to those events.

  14. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple, yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of OGC WMS 1.1.1 as a fastCGI application, using the Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are executed on a back server. The server has explicit support for a colocated tiled WMS, including rapid response to black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back end allows great flexibility in data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use and, depending on the storage format used, it has better performance than other available implementations. WMS Server 2.0 is a high-performance WMS implementation due to its fastCGI architecture, and the use of the GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. The server provides scaling and cropping, as well as blending of multiple layers based on layer transparency.

  15. Complex virtual urban environment modeling from CityGML data and OGC web services: application to the SIMFOR project

    NASA Astrophysics Data System (ADS)

    Chambelland, Jean-Christophe; Gesquière, Gilles

    2012-03-01

    Due to advances in computer graphics and network speed, it is possible to navigate in 3D virtual worlds in real time. This technology, found for example in computer games, has been adapted for training systems. In this context, a collaborative serious game for urban crisis management called SIMFOR was born in France. This project has been designed for intensive realistic training and consequently must allow the players to create new urban operational theatres. To this end, importing, structuring, processing and exchanging 3D urban data remains an important underlying problem. This communication focuses on the design of the 3D Environment Editor (EE) and the related data processes needed to prepare the data flow to be exploitable by the runtime environment of SIMFOR. We use solutions proposed by the Open Geospatial Consortium (OGC) to aggregate and share data. A presentation of the proposed architecture is given. The overall design of the EE and some strategies for efficiently analyzing, displaying and exporting large amounts of urban CityGML information are presented, along with an example illustrating the potential of the EE and the reliability of the proposed data processing.

  16. Providing Internet Access to High-Resolution Lunar Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG) format, 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  17. Providing Internet Access to High-Resolution Mars Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  18. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  19. Importance of the spatial data and the sensor web in the ubiquitous computing area

    NASA Astrophysics Data System (ADS)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, more than three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web: a discoverable collection of geographically related web standards and key features constituting a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to gather spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of a real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and many have more than one mobile device. The considerable number of sensors and the different types of data positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily, for numerous purposes, with some level of credibility. The principal goal is to connect mobile and non-mobile devices together in the sensor web platform to serve and collect information from people.

  20. Earth Observation oriented teaching materials development based on OGC Web services and Bashyt generated reports

    NASA Astrophysics Data System (ADS)

    Stefanut, T.; Gorgan, D.; Giuliani, G.; Cau, P.

    2012-04-01

    Creating e-Learning materials in the Earth Observation domain is a difficult task, especially for non-technical specialists who have to deal with distributed repositories, large amounts of information and intensive processing requirements. Furthermore, due to the lack of specialized applications for developing teaching resources, technical knowledge is also required for defining data presentation structures or for the development and customization of user interaction techniques for better teaching results. As a response to these issues, during the GiSHEO FP7 project [1] and later in the EnviroGRIDS FP7 project [2], we have developed the eGLE e-Learning Platform [3], a tool-based application that provides dedicated functionalities to Earth Observation specialists for developing teaching materials. The proposed architecture is built around a client-server design that provides the core functionalities (e.g. user management, tool integration, teaching material settings, etc.) and has been extended with a distributed component implemented through the tools that are integrated into the platform, as described further. Our approach to dealing with multiple transfer protocol types, heterogeneous data formats and various user interaction techniques involves the development and integration of very specialized elements (tools) that can be customized by the trainers in a visual manner through simple user interfaces. In our concept each tool is dedicated to a specific data type, implementing optimized mechanisms for searching, retrieving, visualizing and interacting with it. At the same time, any number of tools can be integrated into each learning resource through drag-and-drop interaction, allowing the teacher to retrieve pieces of data of various types (e.g. images, charts, tables, text, videos, etc.) from different sources (e.g. OGC web services, charts created through the Bashyt application, etc.) through different protocols (e.g. WMS, the BASHYT API, FTP, HTTP, etc.) and to display them all together in a unitary manner using the same visual structure [4]. Addressing the high-performance computing requirements that are met while processing environmental data, our platform can easily be extended through tools that connect to GRID infrastructures, WCS web services, the Bashyt API (for creating specialized hydrological reports) or any other specialized services (e.g. graphics cluster visualization) that can be reached over the Internet. At run time, each tool is launched on the trainee's computer in an asynchronous running mode and connects to the data source that has been established by the teacher, retrieving and displaying the information to the user. The data transfer is accomplished directly between the trainee's computer and the corresponding services (e.g. OGC, the Bashyt API, etc.) without passing through the core server platform. In this manner, the eGLE application can provide better and more responsive connections to a large number of users.

  1. A Ubiquitous Sensor Network Platform for Integrating Smart Devices into the Semantic Sensor Web

    PubMed Central

    de Vera, David Díaz Pardo; Izquierdo, Álvaro Sigüenza; Vercher, Jesús Bernat; Gómez, Luis Alfonso Hernández

    2014-01-01

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs. PMID:24945678

  2. A ubiquitous sensor network platform for integrating smart devices into the semantic sensor web.

    PubMed

    de Vera, David Díaz Pardo; Izquierdo, Alvaro Sigüenza; Vercher, Jesús Bernat; Hernández Gómez, Luis Alfonso

    2014-06-18

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs.

  3. 77 FR 41186 - Proposed Consent Decree, Clean Air Act Citizen Suit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-12

    ... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OGC-2012-0554; FRL-9699-6] Proposed Consent Decree, Clean... August 13, 2012. ADDRESSES: Submit your comments, identified by Docket ID number EPA-HQ- OGC-2012-0554... (identified by Docket ID No. EPA-HQ-OGC-2012-0554) contains a copy of the proposed consent decree. The...

  4. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this method often lacked real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform to realize environmental data management under the Geospatial Service Web framework are proposed in this study. The real-time GIS data model manages real-time data. The Sensor Web service platform is applied to support the realization of the real-time GIS data model based on Sensor Web technologies. To support the realization of the proposed real-time GIS data model, a Sensor Web service platform is implemented. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total times for the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  5. The OGC Publish/Subscribe specification in the context of sensor-based applications

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo

    2014-05-01

    The Open Geospatial Consortium Publish/Subscribe Standards Working Group (in short, OGC PubSub SWG) was chartered in 2010 to specify a mechanism to support publish/subscribe requirements across OGC service interfaces and data types (coverage, feature, etc.). The Publish/Subscribe Interface Standard 1.0 - Core (13-131) defines an abstract description of the basic mandatory functionality, along with several optional, extended capabilities. The Core is independent of the underlying binding, for which two extensions are currently considered in the PubSub SWG scope: a SOAP binding and a RESTful binding. Two primary parties characterize the publish/subscribe model: a Publisher, which is publishing information, and a Subscriber, which expresses an interest in all or part of the published information. In many cases, the Subscriber and the entity to which data is to be delivered (the Receiver) are one and the same. However, they are distinguished in PubSub to allow for these roles to be segregated. This is useful, for example, in event-based systems, where system entities primarily react to incoming information and may emit new information to other interested entities. The publish/subscribe model is distinguished from the typical request/response model, where a client makes a request and the server responds with either the requested information or a failure. Request/response provides relatively immediate feedback, but can be insufficient in cases where the client is waiting for a specific event (such as data arrival, server changes, or data updates). In fact, while waiting for an event, a client must repeatedly request the desired information (polling). This has undesirable side effects: if a client polls frequently, it can increase server load and network traffic; if a client polls infrequently, it may not receive a message when it is needed. These issues are accentuated when event occurrences are unpredictable, or when the delay between event occurrence and client notification must be small. Instead, the publish/subscribe model is characterized by the ability for a Subscriber to specify an ongoing (persistent) expression of interest in some messages, and by the asynchronous delivery of such messages. Hence, the publish/subscribe model can be useful to reduce the latency between event occurrence and event notification, as it is the Publisher's responsibility to publish a message when the event occurs, rather than relying on clients to anticipate the occurrence. The following cross-service requirements have been identified for PubSub 1.0:
    • Provide notification capabilities as a module to existing OGC services, with no impact on existing service semantics and by reusing service-specific filtering semantics.
    • Be usable as a way to push actual data (not only references to data) from a data access service (i.e. WCS, WFS, SOS) to the client.
    • Be usable as a way to push notification messages (i.e. lightweight messages carrying references to data rather than the data itself) to the client.
    • Be usable as a way to provide notifications of service and dataset updates, in order to simplify/optimize harvesting by catalogs.
    The use cases identified for PubSub 1.0 include:
    • Service-filtered data push.
    • Service-filtered notification.
    • Notification of threshold crossings.
    • FAA SAA Dissemination Pilot.
    • Emergency / safety-critical applications.
    The above suggests that the OGC Publish/Subscribe specification could be successfully applied to sensor-based monitoring. This work elaborates on this technology and its possible applications in this context.
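
    The contrast with polling can be made concrete with a toy sketch of the publish/subscribe pattern: a persistent, filtered subscription with asynchronous (push) delivery. This is conceptual Python only; the actual PubSub 1.0 message models and bindings are defined in the OGC documents:

      import queue

      # A Subscriber registers a persistent filter; the Publisher pushes matching
      # messages instead of being polled.
      class Publisher:
          def __init__(self):
              self.subscriptions = []          # (predicate, deliver) pairs

          def subscribe(self, predicate, deliver):
              self.subscriptions.append((predicate, deliver))

          def publish(self, message):
              for predicate, deliver in self.subscriptions:
                  if predicate(message):
                      deliver(message)         # push delivery, no client polling

      inbox = queue.Queue()                    # the Receiver's delivery endpoint
      pub = Publisher()
      pub.subscribe(lambda m: m["water_level_m"] > 4.0, inbox.put)

      pub.publish({"station": "A1", "water_level_m": 4.2})   # delivered
      pub.publish({"station": "A1", "water_level_m": 3.1})   # filtered out
      print(inbox.get_nowait())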

  6. Optimized Autonomous Space In-situ Sensor-Web for volcano monitoring

    USGS Publications Warehouse

    Song, W.-Z.; Shirazi, B.; Kedar, S.; Chien, S.; Webb, F.; Tran, D.; Davis, A.; Pieri, D.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.

    2008-01-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) is developing a prototype dynamic and scalable hazard monitoring sensor-web and applying it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) will have two-way communication capability between ground and space assets, use both space and ground data for optimal allocation of limited power and bandwidth resources on the ground, and use smart management of competing demands for limited space assets. It will also enable scalability and seamless infusion of future space and in-situ assets into the sensor-web. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been active since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) efficient self-organization algorithms for sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of autonomously tasking the other. Sensor-web data acquisition and dissemination will be accomplished through the use of the Open Geospatial Consortium Sensorweb Enablement protocols. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform. ©2008 IEEE.

  7. 77 FR 68140 - Privacy Act of 1974; New System of Records, Office of General Counsel E-Discovery Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-15

    ... System of Records, Office of General Counsel E-Discovery Management System: Republication of System... Counsel (OGC) E- Discovery Management System (EDMS). EDMS was first announced and described in a July 17... for OGC's E-Discovery Management System (OGC-EDMS), a system expected to significantly improve the...

  8. Architecture of the local spatial data infrastructure for regional climate change research

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny

    2013-04-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in modeling and analysis of climate change for various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which might constitute up to tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing the search of geoinformation resources (datasets and services) using metadata catalogs, producing geospatial data selections by their parameters (data access functionality), as well as managing services and applications for cartographical visualization. It should be noted that, due to objective reasons such as big dataset volumes, the complexity of the data models used, and syntactic and semantic differences between various datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. Those circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented includes:
    1. A model of storing big sets of regional georeferenced data that is effective in terms of search, access, retrieval and subsequent statistical processing, allowing in particular the storage of frequently used values (like monthly and annual climate change indices, etc.), thus providing different temporal views of the datasets.
    2. A general architecture of the corresponding software components handling geospatial datasets within the storage model.
    3. A metadata catalog describing in detail, using the ISO 19115 and CF-convention standards, the datasets used in climate research, as a basic element of the spatial data infrastructure, as well as its publication according to the OGC CSW (Catalog Service for the Web) specification.
    4. Computational and mapping web services to work with geospatial datasets, based on the OWS (OGC Web Services) standards: WMS, WFS, WPS.
    5. A geoportal as a key element of the thematic regional spatial data infrastructure, also providing a software framework for dedicated web application development.
    To realize the web mapping services, GeoServer software is used, since it provides a native WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork opensource (http://geonetwork-opensource.org) product is planned to be used, as it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software components have been selected:
    1. The OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser.
    2. The GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services.
    The web interface developed will be similar to the interfaces of such popular desktop GIS applications as uDig and QuantumGIS. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
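
    As an illustration of item 3, a CSW catalog such as GeoNetwork can be queried programmatically. The sketch below uses the OWSLib library with a hypothetical endpoint and a simple full-text constraint:

      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      # Hypothetical GeoNetwork CSW endpoint -- replace with a real catalog URL.
      csw = CatalogueServiceWeb("https://geoportal.example.org/geonetwork/srv/eng/csw")

      # Full-text search for climate-related dataset records.
      query = PropertyIsLike("csw:AnyText", "%climate%")
      csw.getrecords2(constraints=[query], maxrecords=10)

      for rec_id, rec in csw.records.items():
          print(rec_id, "|", rec.title)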

  9. SensorWeb Evolution Using the Earth Observing One (EO-1) Satellite as a Test Platform

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Ly, Vuong; Handy, Matthew; Chien, Steve; Grossman, Robert; Tran, Daniel

    2012-01-01

    The Earth Observing One (EO-1) satellite was launched in November 2000 as a one-year technology demonstration mission for a variety of space technologies. After the first year, in addition to collecting science data from its instruments, the EO-1 mission has been used as a testbed for a variety of technologies which provide automation capabilities and which have served as a pathfinder for the creation of SensorWebs. A SensorWeb is the integration of a variety of space, airborne and ground sensors into a loosely coupled, collaborative sensor system that automatically provides useful data products. Typically, a SensorWeb is comprised of heterogeneous sensors tied together with a messaging architecture and web services. This paper provides an overview of the various technologies that were tested and eventually folded into normal operations. As these technologies were folded in, the nature of operations transformed. The SensorWeb software enables easy connectivity for collaboration with sensors, and a side benefit is that it improved EO-1's operational efficiency. This paper presents the various phases of EO-1 operation over the past 12 years and presents metrics demonstrating the operational efficiency gains.

  10. Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies

    NASA Technical Reports Server (NTRS)

    Talabac, Stephen J.

    2004-01-01

    Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.

  11. Communicating data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin

    2013-04-01

    The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.
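    A minimal sketch of how a client could request such a quality-aware map layer over standard WMS, using OWSLib; the server URL, layer name, and style are illustrative assumptions (a real WMS-Q server would advertise its mean and uncertainty layers in its GetCapabilities document):

        # Fetch a rendered map of an (assumed) mean field from an ncWMS-style server.
        from owslib.wms import WebMapService

        wms = WebMapService('http://example.org/ncWMS/wms', version='1.1.1')  # hypothetical endpoint
        img = wms.getmap(layers=['sst_mean'],          # assumed layer for the mean field
                         styles=['default'],           # a WMS-Q server might offer uncertainty styles here
                         srs='EPSG:4326',
                         bbox=(-10.0, 45.0, 5.0, 60.0),
                         size=(600, 400),
                         format='image/png')
        with open('sst_mean.png', 'wb') as f:
            f.write(img.read())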

  12. Challenges in Visualizing Satellite Level 2 Atmospheric Data with GIS approach

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Zhao, P.; Pham, L.; Meyer, D. J.

    2017-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'Images', including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files, i.e., data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolating to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subsetting and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.
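    A minimal sketch of how an interoperable client could pull a subset from such a WCS backend, using OWSLib; the endpoint, coverage identifier, and output format are illustrative assumptions:

        # Subset an (assumed) Level 2-derived coverage over the continental US.
        from owslib.wcs import WebCoverageService

        wcs = WebCoverageService('http://example.org/wcs', version='1.0.0')  # hypothetical endpoint
        cov = wcs.getCoverage(identifier='NO2_column',                      # assumed coverage name
                              bbox=(-125.0, 25.0, -65.0, 50.0),
                              crs='EPSG:4326',
                              format='GeoTIFF',
                              width=512, height=256)
        with open('no2_subset.tif', 'wb') as f:
            f.write(cov.read())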

  13. Challenges in Obtaining and Visualizing Satellite Level 2 Data in GIS

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Zhao, Peisheng; Pham, Long; Meyer, David J.

    2017-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'Images', including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files, i.e., data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolating to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subsetting and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.

  14. Exploiting Aura OMI Level 2 Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.

    2014-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for interpreting extreme events (such as volcanic eruptions or dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'Images', including accurate pixel-level (Level 2) information, pixel coverage area delineation, and science-team-recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) always strives to best support the user community for NASA Earth science data, in this case through Software-as-a-Service (SaaS). Here we present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This new visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls in the backend infrastructure. The service allows users to select data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), define areas of interest and temporal extents, and zoom, pan, overlay, slide, subset and reformat the data. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources (such as the Global Imagery Browse Services (GIBS)).

  15. GMZ: A GML Compression Model for WebGIS

    NASA Astrophysics Data System (ADS)

    Khandelwal, A.; Rajan, K. S.

    2017-09-01

    Geography Markup Language (GML) is an XML specification for expressing geographical features. Defined by the Open Geospatial Consortium (OGC), it is widely used for storage and transmission of maps over the Internet. XML schemas make it convenient to define custom feature profiles in GML for specific needs, as seen in the widely popular CityGML, the simple features profile, coverages, etc. The simple features profile (SFP) is a simpler subset of GML with support for point, line and polygon geometries, constructed to cover the most commonly used GML geometries. The Web Feature Service (WFS) serves query results in SFP by default. But SFP falls short of being an ideal choice due to its high verbosity and size-heavy nature, which leaves immense scope for compression. GMZ is a lossless compression model developed to work on SFP-compliant GML files. Our experiments indicate GMZ achieves reasonably good compression ratios and can be useful in WebGIS-based applications.
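    GMZ itself is the authors' custom model, but the redundancy it exploits is easy to see with a generic-compression baseline; a minimal sketch in Python, where the GML fragment is an illustrative stand-in for an SFP feature collection:

        # Compare raw vs. gzip-compressed size of a repetitive SFP-style GML fragment.
        import gzip

        gml = b"""<gml:Point xmlns:gml="http://www.opengis.net/gml" srsName="EPSG:4326">
          <gml:pos>17.385 78.486</gml:pos>
        </gml:Point>
        """ * 1000   # repeated to mimic a large feature collection

        packed = gzip.compress(gml)
        print(len(gml), len(packed), round(len(gml) / len(packed), 1))  # size before, after, ratio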

  16. Distributed data collection and supervision based on web sensor

    NASA Astrophysics Data System (ADS)

    He, Pengju; Dai, Guanzhong; Fu, Lei; Li, Xiangjun

    2006-11-01

    As a node in the Internet/Intranet, the web sensor has been promoted in recent years and widely applied in remote manufacturing, workshop measurement, and control fields. However, because of the limited resources of the microprocessor in the sensor, the conventional scheme supports only the HTTP protocol, and remote users supervise and control the collected data, published via the web, in a standard browser; moreover, only one data-acquisition node can be supervised and controlled at any instant, so the requirement for centralized remote supervision, control, and data processing cannot be satisfied in some fields. In this paper, centralized remote supervision, control, and data processing with web sensors are proposed and implemented on the principle of a device driver program. The useless information in each collected web page embedded in a sensor is filtered out, the useful data are transmitted to a real-time database on the workstation, and different filter algorithms are designed for different sensors possessing independent web pages. Every sensor node has its own web filter program, called the "web data collection driver program"; the collection details are shielded, and the supervision, control, and configuration software can be implemented by calling the web data collection driver program, just like using an I/O driver program. The proposed technology can be applied to data acquisition where relatively low real-time performance is required.
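    A minimal sketch of such a "web data collection driver": fetch a sensor's embedded web page over HTTP and filter out everything except the measurement. The URL and the page format are illustrative assumptions, and each sensor model would need its own filter expression:

        # Scrape one sensor's embedded web page and extract the measured value.
        import re
        import urllib.request

        def read_sensor(url):
            html = urllib.request.urlopen(url, timeout=5).read().decode('utf-8', 'replace')
            match = re.search(r'Temperature:\s*([-+]?\d+(?:\.\d+)?)', html)  # assumed page layout
            return float(match.group(1)) if match else None

        # value = read_sensor('http://192.168.0.50/')  # hypothetical sensor address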

  17. Experimenting with an Evolving Ground/Space-based Software Architecture to Enable Sensor Webs

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart

    2005-01-01

    A series of ongoing experiments are being conducted at the NASA Goddard Space Flight Center to explore integrated ground and space-based software architectures enabling sensor webs. A sensor web, as defined by Steve Talabac at NASA Goddard Space Flight Center (GSFC), is a coherent set of distributed nodes, interconnected by a communications fabric, that collectively behave as a single, dynamically adaptive, observing system. The nodes can be comprised of satellites, ground instruments, computing nodes, etc. Sensor web capability requires autonomous management of constellation resources. This becomes progressively more important as more and more satellites share resources, such as communication channels and ground stations, while automatically coordinating their activities. There have been five ongoing activities, including an effort to standardize a set of middleware. This paper describes one set of activities using the Earth Observing 1 satellite, which used a variety of ground and flight software along with other satellites and ground sensors to prototype a sensor web. This activity allowed us to explore the difficulties that occur in the assembly of sensor webs given today's technology. We present an overview of the software system architecture, some key experiments, and lessons learned to facilitate better sensor webs in the future.

  18. Sensor Web for Spatio-Temporal Monitoring of a Hydrological Environment

    NASA Technical Reports Server (NTRS)

    Delin, K. A.; Jackson, S. P.; Johnson, D. W.; Burleigh, S. C.; Woodrow, R. R.; McAuley, M.; Britton, J. T.; Dohm, J. M.; Ferre, T. P. A.; Ip, Felipe

    2004-01-01

    The Sensor Web is a macroinstrument concept that allows for the spatio-temporal understanding of an environment through coordinated efforts between multiple numbers and types of sensing platforms, including, in its most general form, both orbital and terrestrial and both fixed and mobile platforms. Each of these platforms, or pods, communicates within its local neighborhood and thus distributes information to the instrument as a whole. The sharing and continual processing of this information among all the Sensor Web elements results in an information flow and a global perception of, and reactive capability toward, the environment. As illustrated, the Sensor Web concept also allows for the recursive notion of a web of webs, with individual distributed instruments possibly playing the role of a single node point on a larger Sensor Web instrument. In particular, the fusion of inexpensive, yet sophisticated, commercial technology from both the computation and telecommunication revolutions has enabled the development of practical, fielded, embedded in situ systems that have been the focus of the NASA/JPL Sensor Webs Project (http://sensorwebs.jpl.nasa.gov/). These Sensor Webs are complete systems consisting of not only the pod elements that wirelessly communicate among themselves, but also interfacing and archiving software that allows for easy use by the end user. Previous successful deployments have included environments as diverse as coastal regions, Antarctica, and desert areas. The Sensor Web has broad implications for Earth and planetary science and will revolutionize the way experiments and missions are conceived and performed. As part of our current efforts to develop a macrointelligence within the system, we have deployed a Sensor Web at the Central Avra Valley Storage and Recovery Project (CAVSARP) facility located west of Tucson, AZ. This particular site was selected because it is ideal for studying spatio-temporal phenomena and for providing a test site for more sophisticated hydrological studies in the future.

  19. Sensor web

    NASA Technical Reports Server (NTRS)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each sensor pod includes a clock which is synchronized with a master clock so that all of the sensor pods in the web have synchronized clocks. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization to finely sync all the pods on the web. After the synchronization, the pods ping their neighbors to determine which pods are listening and respond, and then listen only during the time slots corresponding to those pods which responded.
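    A minimal sketch of the two-stage synchronization and slot-based listening described in the abstract, reduced to arithmetic; the rounding granularity, slot length, and pod-id-to-slot mapping are illustrative assumptions, not the patented design:

        # Coarse sync cheaply snaps the clock; fine sync removes the residual offset;
        # a pod then listens only in the slots of neighbors that answered its ping.
        def coarse_sync(local_clock):
            return round(local_clock)           # low-power, low-precision alignment

        def fine_sync(local_clock, master_offset):
            return local_clock - master_offset  # precise correction against the master

        def listening_slots(responding_pods, slot_ms=100):
            # assumed mapping: one fixed slot per responding neighbor, keyed by pod id
            return {pod: (pod * slot_ms, (pod + 1) * slot_ms) for pod in responding_pods}

        clock = fine_sync(coarse_sync(1042.731), master_offset=0.004)
        print(clock, listening_slots({2, 5, 7}))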

  20. Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2007-01-01

    This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both among terrestrial assets and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.

  1. SAMuS: Service-Oriented Architecture for Multisensor Surveillance in Smart Homes

    PubMed Central

    Van de Walle, Rik

    2014-01-01

    The design of a service-oriented architecture for multisensor surveillance in smart homes is presented as an integrated solution enabling automatic deployment, dynamic selection, and composition of sensors. Sensors are implemented as Web-connected devices with a uniform Web API. RESTdesc is used to describe the sensors, and a novel solution is presented to automatically compose Web APIs that can be applied with existing Semantic Web reasoners. We evaluated the solution by building a smart Kinect sensor that is able to dynamically switch between IR and RGB and to optimize person detection by incorporating feedback from pressure sensors, thus demonstrating collaboration among sensors to enhance detection of complex events. The performance results show that the platform scales to many Web APIs, as composition time remains limited to a few hundred milliseconds in almost all cases. PMID:24778579
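    A minimal sketch of the kind of uniform Web API such an architecture assumes, with every sensor exposing the same resource shape; the routes, sensor names, and fields are illustrative assumptions, not the SAMuS specification:

        # A toy sensor host: every device answers the same two read-only routes.
        from flask import Flask, jsonify

        app = Flask(__name__)

        SENSORS = {'kinect1': {'type': 'kinect', 'mode': 'rgb'},
                   'floor3':  {'type': 'pressure', 'mode': 'binary'}}

        @app.route('/sensors')
        def list_sensors():
            return jsonify(sorted(SENSORS))

        @app.route('/sensors/<sid>/reading')
        def reading(sid):
            # a real device would sample its hardware here
            return jsonify({'sensor': sid, 'value': 0.0, 'meta': SENSORS.get(sid)})

        if __name__ == '__main__':
            app.run(port=8080)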

  2. The Mitochondrial 2-Oxoglutarate Carrier Is Part of a Metabolic Pathway That Mediates Glucose- and Glutamine-stimulated Insulin Secretion*

    PubMed Central

    Odegaard, Matthew L.; Joseph, Jamie W.; Jensen, Mette V.; Lu, Danhong; Ilkayeva, Olga; Ronnebaum, Sarah M.; Becker, Thomas C.; Newgard, Christopher B.

    2010-01-01

    Glucose-stimulated insulin secretion from pancreatic islet β-cells is dependent in part on pyruvate cycling through the pyruvate/isocitrate pathway, which generates cytosolic α-ketoglutarate, also known as 2-oxoglutarate (2OG). Here, we have investigated if mitochondrial transport of 2OG through the 2-oxoglutarate carrier (OGC) participates in control of nutrient-stimulated insulin secretion. Suppression of OGC in clonal pancreatic β-cells (832/13 cells) and isolated rat islets by adenovirus-mediated delivery of small interfering RNA significantly decreased glucose-stimulated insulin secretion. OGC suppression also reduced insulin secretion in response to glutamine plus the glutamate dehydrogenase activator 2-amino-2-norbornane carboxylic acid. Nutrient-stimulated increases in glucose usage, glucose oxidation, glutamine oxidation, or ATP:ADP ratio were not affected by OGC knockdown, whereas suppression of OGC resulted in a significant decrease in the NADPH:NADP+ ratio during stimulation with glucose but not glutamine + 2-amino-2-norbornane carboxylic acid. Finally, OGC suppression reduced insulin secretion in response to a membrane-permeant 2OG analog, dimethyl-2OG. These data reveal that the OGC is part of a mechanism of fuel-stimulated insulin secretion that is common to glucose, amino acid, and organic acid secretagogues, involving flux through the pyruvate/isocitrate cycling pathway. Although the components of this pathway must remain intact for appropriate stimulus-secretion coupling, production of NADPH does not appear to be the universal second messenger signal generated by these reactions. PMID:20356834

  3. Implementation of Sensor Twitter Feed Web Service Server and Client

    DTIC Science & Technology

    2016-12-01

    ARL-TN-0807 ● DEC 2016 ● US Army Research Laboratory. Implementation of Sensor Twitter Feed Web Service Server and Client, by Bhagyashree V Kulkarni (University of Maryland) and Michael H Lee (Computational...

  4. WebTag: Web browsing into sensor tags over NFC.

    PubMed

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols, and the maximization of life-cycle autonomy. This work seeks to improve the communication link for data transmission in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (such as ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser; it reads the sensor data, configures the sampling rate, and implements IP-based security policies. It is, decidedly, a new step towards the evolution of the Internet of Things paradigm.

  5. WebTag: Web Browsing into Sensor Tags over NFC

    PubMed Central

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Álvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols, and the maximization of life-cycle autonomy. This work seeks to improve the communication link for data transmission in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (such as ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser; it reads the sensor data, configures the sampling rate, and implements IP-based security policies. It is, decidedly, a new step towards the evolution of the Internet of Things paradigm. PMID:23012511

  6. A Smart Sensor Web for Ocean Observation: Integrated Acoustics, Satellite Networking, and Predictive Modeling

    NASA Astrophysics Data System (ADS)

    Arabshahi, P.; Chao, Y.; Chien, S.; Gray, A.; Howe, B. M.; Roy, S.

    2008-12-01

    In many areas of Earth science, including climate change research, there is a need for near-real-time integration of data from heterogeneous and spatially distributed sensors, in particular in-situ and space-based sensors. The data integration, as provided by a smart sensor web, enables numerous improvements, namely, 1) adaptive sampling for more efficient use of expensive space-based sensing assets, 2) higher-fidelity information gathering from data sources through integration of complementary data sets, and 3) improved sensor calibration. The specific purpose of the smart sensor web development presented here is to provide for adaptive sampling and calibration of space-based data via in-situ data. The ocean-observing smart sensor web presented herein is composed of both mobile and fixed underwater in-situ ocean sensing assets and Earth Observing System (EOS) satellite sensors providing larger-scale sensing. An acoustic communications network forms a critical link in the web between the in-situ and space-based sensors and facilitates adaptive sampling and calibration. After an overview of the primary design challenges, we report on the development of various elements of the smart sensor web. These include (a) a cable-connected mooring system with a profiler under real-time control with inductive battery charging; (b) a glider with integrated acoustic communications and broadband receiving capability; (c) satellite sensor elements; (d) an integrated acoustic navigation and communication network; and (e) a predictive model via the Regional Ocean Modeling System (ROMS). Results from field experiments, including an upcoming one in Monterey Bay (October 2008) using live data from NASA's EO-1 mission in a semi-closed-loop system, together with ocean models from ROMS, are described. Plans for future adaptive sampling demonstrations using the smart sensor web are also presented.

  7. Supporting NEESPI with Data Services - The SIB-ESS-C e-Infrastructure

    NASA Astrophysics Data System (ADS)

    Gerlach, R.; Schmullius, C.; Frotscher, K.

    2009-04-01

    Data discovery and retrieval are commonly among the first steps performed in any Earth science study. The way scientific data are searched and accessed has changed significantly over the past two decades. In particular, the development of the World Wide Web and the technologies that evolved alongside it shortened the data discovery and data exchange process. On the other hand, the amount of data collected and distributed by Earth scientists has increased exponentially, requiring new concepts for data management and sharing. One such concept to meet the demand is to build Spatial Data Infrastructures (SDI) or e-Infrastructures. These infrastructures usually contain components for data discovery allowing users (or other systems) to query a catalogue or registry and retrieve metadata on available data holdings and services. Data access is typically granted using FTP/HTTP protocols or, more advanced, through Web Services. A Service Oriented Architecture (SOA) approach based on standardized services enables users to benefit from interoperability among different systems and to integrate distributed services into their applications. The Siberian Earth System Science Cluster (SIB-ESS-C) being established at the University of Jena (Germany) is such a spatial data infrastructure, following these principles and implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). The prime objective is to provide researchers with a focus on Siberia with the technical means for data discovery, data access, data publication and data analysis. The region of interest covers the entire Asian part of the Russian Federation from the Urals to the Pacific Ocean, including the Ob-, Lena- and Yenissey river catchments. The aim of SIB-ESS-C is to provide a comprehensive set of data products for Earth system science in this region. Although SIB-ESS-C will be equipped with processing capabilities for in-house data generation (mainly from Earth observation), the current data holdings of SIB-ESS-C have been created in collaboration with a number of partners in previous and ongoing research projects (e.g. SIBERIA-II, SibFORD, IRIS). At the current development stage, the SIB-ESS-C system comprises a federated metadata catalogue accessible through the SIB-ESS-C Web Portal or from any OGC-CSW-compliant client. Due to full interoperability with other metadata catalogues, users of the SIB-ESS-C Web Portal are able to search external metadata repositories. The Web Portal also contains a simple visualization component, which will be extended to a comprehensive visualization and analysis tool in the near future. All data products are already accessible via a Web Map Service and will soon be made available as Web Feature and Web Coverage Services, allowing users to directly incorporate the data into their applications. The SIB-ESS-C infrastructure will be further developed as one node in a network of similar systems (e.g. NASA GIOVANNI) in the NEESPI region.

  8. SMART characterisation of New Zealand's aquifers using fast and passive methods

    NASA Astrophysics Data System (ADS)

    Klug, H.; Daughney, C.; Verhagen, F.; Westerhoff, R.; Ward, N. Dudley

    2012-04-01

    Groundwater resources account for about half of New Zealand's abstractive water needs and supply about eighty per cent of all water used in the agricultural sector. Despite the importance of New Zealand's groundwater resources, we still lack essential information on their basic properties, such as volume, hydraulic properties, interaction with surface water, and water age. These measures are required to ensure sustainable management, to avoid overexploitation of water resources, and to circumvent water scarcity situations in which people and the economy would be stressed by insufficient water supply. A newly established research collaboration between New Zealand and Europe aims to provide a methodological framework to characterise New Zealand's groundwater aquifers. The SMART project (www.smart-project.info) will rely on existing data sources of regional councils and research institutes and will develop novel measurement techniques that can be applied to large areas with little effort, little acquisition time, and minimal cost. The project aims to synthesise in situ measurements from sensor observation services, ambient noise seismic tomography, real-time fibre optic temperature sensing, novel age tracers, airborne geophysical surveying and satellite remote sensing techniques. Validation of direct and indirect groundwater information will be achieved through the use of multiple methods in case study areas and by "ground-truthing" the new methods against existing data obtained from traditional methods (e.g. drilling, aquifer pump testing, river gauging). An important overarching part of the project is the quantification of the uncertainty associated with all techniques to be employed. An online Sensor WebGIS prototype will provide the project results and other case study observations (e.g. temperature, precipitation, soil moisture) in as near real-time as possible. These datasets serve as a validation source for the satellite monitoring results and present an up-to-date view of the status of the environment. The web portal will not only visualise near real-time (station-based) point measurements but also process these datasets into spatially distributed maps of climatological parameters. The OGC-compliant, open-source-based portal will be developed towards a 3D groundwater interface and inventory. This inventory will be tailored to stakeholder needs (e.g. open access, ease of use, and interoperability with existing systems), which have already been identified through stakeholder consultation processes. The portal prototype runs in a platform-independent web browser, ensuring access and visibility for all stakeholders and decision makers at regional and national level.

  9. Method for simultaneously making a plurality of acoustic signal sensor elements

    NASA Technical Reports Server (NTRS)

    Bryant, Timothy D. (Inventor); Wynkoop, Mark W. (Inventor); Holloway, Nancy M. H. (Inventor); Zuckerwar, Allan J. (Inventor)

    2005-01-01

    A fetal heart monitoring system preferably comprising a backing plate having a generally concave front surface and a generally convex back surface, and at least one sensor element attached to the concave front surface for acquiring acoustic fetal heart signals produced by a fetus within a body. The sensor element has a shape that conforms to the generally concave back surface of the backing plate. In one embodiment, the at least one sensor element comprises an inner sensor and a plurality of outer sensors surrounding the inner sensor. The fetal heart monitoring system can further comprise a web belt, and a web belt guide movably attached to the web belt, the web belt guide being attached to the convex back surface of the backing plate.

  10. Method for Simultaneously Making a Plurality of Acoustic Signal Sensor Elements

    NASA Technical Reports Server (NTRS)

    Bryant, Timothy D.; Wynkoop, Mark W.; Holloway, Nancy M. H.; Zuckerwar, Allan J.

    2005-01-01

    A fetal heart monitoring system preferably comprising a backing plate having a generally concave front surface and a generally convex back surface, and at least one sensor element attached to the concave front surface for acquiring acoustic fetal heart signals produced by a fetus within a body. The sensor element has a shape that conforms to the generally concave back surface of the backing plate. In one embodiment, the at least one sensor element comprises an inner sensor and a plurality of outer sensors surrounding the inner sensor. The fetal heart monitoring system can further comprise a web belt, and a web belt guide movably attached to the web belt, the web belt guide being attached to the convex back surface of the backing plate.

  11. Can EO afford big data - an assessment of the temporal and monetary costs of existing and emerging big data workflows

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2014-05-01

    The cost of working with extremely large data sets is an increasingly important issue within the Earth Observation community. From global-coverage data at any resolution to small-coverage data at extremely high resolution, the community has always produced big data, and this will only increase as new sensors are deployed and their data made available. Over time, standard workflows have emerged, facilitated by the production and adoption of standard technologies. Groups such as the International Organisation for Standardisation (ISO) and the Open Geospatial Consortium (OGC) have been a driving force in this area for many years. The production of standard protocols and interfaces such as OPeNDAP, Web Coverage Service (WCS), Web Processing Service (WPS) and newer emerging standards such as Web Coverage Processing Service (WCPS) has helped to galvanise these workflows. As an example of a traditional workflow, assume a researcher wants to assess the temporal trend in chlorophyll concentration. This would involve a discovery phase, an acquisition phase, a processing phase and finally a derived-product or analysis phase. Each element of this workflow has an associated temporal and monetary cost. Firstly, the researcher would require a high-bandwidth connection, or the acquisition phase would take too long. Secondly, the researcher must have their own expensive equipment for the processing phase. Both of these elements cost money and time, which can make the whole process prohibitive for scientists from the developing world or "citizen scientists" who do not have the necessary processing infrastructure. The use of emerging technologies can help reduce both the monetary and time costs associated with these existing workflows. By utilising a WPS that is hosted at the same location as the data, a user is able to apply processing to the data without needing their own processing infrastructure. This, however, limits the user to the predefined processes made available by the data provider. The emerging OGC WCPS standard combined with big-data analytics engines may provide a mechanism to improve this situation. The technology allows users to create their own queries using an SQL-like query language and apply them over an available large data archive, once again at the data provider's end. This not only removes the processing cost whilst still allowing user-defined processes, it also reduces the bandwidth required, as only the final analysis or derived product needs to be downloaded. The maturity of the new technologies is at a stage where their use should be justified by a quantitative assessment rather than simply by the fact that they are new developments. We will present a study of the time and cost requirements for a selection of existing workflows and then show how new and emerging standards and technologies can help both to reduce the cost to the user by shifting processing to the data and to reduce the bandwidth required for analysing large datasets, making analysis of big-data archives possible for a greater and more diverse audience.
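    A minimal sketch of the shift-processing-to-the-data idea using a WCPS query, so that only the aggregate crosses the network; the endpoint, coverage name, time-axis labels and the exact KVP binding are illustrative assumptions (servers such as rasdaman accept WCPS through a WCS ProcessCoverages request):

        # Ask the server for a decade-average chlorophyll value instead of the archive.
        import requests

        query = ('for c in (CHL_monthly) '
                 'return encode(avg(c[ansi("2002-07":"2012-07")]), "csv")')

        resp = requests.get('http://example.org/rasdaman/ows',      # hypothetical endpoint
                            params={'service': 'WCS', 'version': '2.0.1',
                                    'request': 'ProcessCoverages', 'query': query})
        print(resp.text)   # the aggregate result, not gigabytes of source data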

  12. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers and data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as model, satellite and radar data. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, the NOAA FTP server, or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be compliant OGC specifications) also provide access.
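    A minimal sketch of the THREDDS subset-before-download pattern described above, using the xarray Python library over OPeNDAP; the server URL, dataset path, variable name and axis names are illustrative assumptions:

        # Open a THREDDS-served dataset remotely and download only a regional subset.
        import xarray as xr

        ds = xr.open_dataset('http://example.org/thredds/dodsC/radar/mosaic.nc')  # hypothetical endpoint
        subset = ds['reflectivity'].sel(lat=slice(30, 40), lon=slice(-90, -80))   # assumed variable/axes
        subset.to_netcdf('radar_subset.nc')   # gridded NetCDF, readable by ArcGIS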

  13. Design & implementation of distributed spatial computing node based on WPS

    NASA Astrophysics Data System (ADS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. In order to implement the sharing of, and cooperation among, spatial computing resources in a grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
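    A minimal sketch of how a client could invoke a process on such a node through OGC WPS, using OWSLib; the endpoint, process identifier, and input are illustrative assumptions (a real node lists its processes in its capabilities document):

        # Execute an (assumed) process on a remote WPS node and wait for completion.
        from owslib.wps import WebProcessingService, monitorExecution

        wps = WebProcessingService('http://example.org/wps')      # hypothetical endpoint
        execution = wps.execute('gs:Reproject',                   # assumed process identifier
                                inputs=[('targetCRS', 'EPSG:3857')])
        monitorExecution(execution)                               # poll until the job finishes
        print(execution.status)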

  14. MyOcean Central Information System - Achievements and Perspectives

    NASA Astrophysics Data System (ADS)

    Claverie, Vincent; Loubrieu, Thomas; Jolibois, Tony; de Dianous, Rémi; Blower, Jon; Romero, Laia; Griffiths, Guy

    2013-04-01

    Since 2009, MyOcean (http://www.myocean.eu) has provided an operational service for forecasts, analysis and expertise on ocean currents, temperature, salinity, sea level, primary ecosystems and ice coverage. The production of observation and forecasting data is done by 42 Production Units (PU). Product download and visualisation are hosted by 25 Dissemination Units (DU). All these products and associated services are gathered in a single catalogue hiding the intricate distributed organization of PUs and DUs. Besides applying the INSPIRE directive and OGC recommendations, MyOcean has had to make technical choices and overcome challenges. This presentation focuses on three specific issues met by MyOcean that are relevant for many Spatial Data Infrastructures: users' transaction accounting, large-volume download, and streamlining catalogue maintenance.
    Transaction accounting: set up powerful means to get detailed knowledge of system usage in order to subsequently improve the products (ocean observation, analysis and forecast datasets) and services (view, download) on offer. This issue drives the following ones: central authentication management for the distributed web service implementations (an add-on to the THREDDS Data Server for WMS and the NetCDF subsetting service, and a specific FTP), and shared user management with co-funding projects. In addition to MyOcean, other projects also need consolidated information about the use of the co-funded products. A central facility is therefore provided for user management; it supplies users' rights to the geographically distributed services and gathers the transaction accounting history from these distributed services.
    Large-volume download: propose a user-friendly web interface to download large volumes of data (several gigabytes) that is as robust as basic FTP but intuitive and independent of file and directory layout. This should rely on a web service following the draft INSPIRE specification and OGC recommendations for download, taking into account that an FTP server is not friendly enough (users need to know filenames and directories) and a web page does not allow downloading several files at once.
    Streamlining the maintenance of the central catalogue: the major update for MyOcean v3 (April 2013) is the use of GeoNetwork for catalogue management. This improves the system at different levels. The editing interface is more user-friendly, and catalogue updates are managed in a workflow. This workflow allows higher flexibility for minor updates without giving up the high-level qualification requirements for the catalogue content. The distributed web services (download, view) are automatically harvested from the THREDDS Data Server. Thus manual editing of the catalogue is reduced, the associated typos are avoided, and the quality of the information is ultimately improved.

  15. The SOOS Data Portal, providing access to Southern Oceans data

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar

    2013-04-01

    The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Oceans that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools, and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web service protocols are used for serving data via the internet. These include Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for the Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data:
    - a Search link to the metadata catalogue enables search and discovery by simple text search, geographic area, temporal extent, keyword, parameter, organisation, or any combination of these, allowing users to gain access to further information and/or the data for download. Searches can also be restricted to items which have data to download, attached map layers, or both
    - a Map interface for discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, and view animated time-series datastreams
    - data can be easily accessed and downloaded, including directly from OPeNDAP/THREDDS servers
    The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process. The initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems - with the ability to integrate layers between themes. The portal is in its infancy (pilot launched January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.

  16. Evolving EO-1 Sensor Web Testbed Capabilities in Pursuit of GEOSS

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Ly, Vuong; Frye, Stuart; Younis, Mohamed

    2006-01-01

    A viewgraph presentation on evolving sensor web capabilities in support of the Global Earth Observing System of Systems (GEOSS) is shown. The topics include: 1) Vision to Enable Sensor Webs with "Hot Spots"; 2) Vision Extended for Communication/Control Architecture for Missions to Mars; 3) Key Capabilities Implemented to Enable EO-1 Sensor Webs; 4) One of Three Experiments Conducted by UMBC Undergraduate Class 12-14-05 (1 - 3); 5) Closer Look at our Mini-Rovers and Simulated Mars Landscape at GSFC; 6) Beginning to Implement Experiments with Standards - Vision for an Integrated Sensor Web Environment; 7) Goddard Mission Services Evolution Center (GMSEC); 8) GMSEC Component Catalog; 9) Core Flight System (CFS) and its Extension for GMSEC for Flight Software; 10) Sensor Modeling Language; 11) Seamless Ground-to-Space Integrated Message Bus Demonstration (completed December 2005); 12) Other Experiments in Queue; 13) Acknowledgements; and 14) References.

  17. Passive Fetal Heart Monitoring System

    NASA Technical Reports Server (NTRS)

    Bryant, Timothy D. (Inventor); Wynkoop, Mark W. (Inventor); Holloway, Nancy M. H. (Inventor); Zuckerwar, Allan J. (Inventor)

    2004-01-01

    A fetal heart monitoring system preferably comprising a backing plate having a generally concave front surface and a generally convex back surface, and at least one sensor element attached to the concave front surface for acquiring acoustic fetal heart signals produced by a fetus within a body. The sensor element has a shape that conforms to the generally concave back surface of the backing plate. In one embodiment, the at least one sensor element comprises an inner sensor and a plurality of outer sensors surrounding the inner sensor. The fetal heart monitoring system can further comprise a web belt, and a web belt guide movably attached to the web belt, the web belt guide being attached to the convex back surface of the backing plate.

  18. The Challenge of Handling Big Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon

    2016-04-01

    More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.

  19. GITEWS, an extensible and open integration platform for manifold sensor systems and processing components based on Sensor Web Enablement and the principles of Service Oriented Architectures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Fleischer, Jens; Herrnkind, Stefan; Schwarting, Herrmann

    2010-05-01

    The German Indonesian Tsunami Early Warning System (GITEWS) is a multifaceted system consisting of various sensor types, such as seismometers, sea level sensors or GPS stations, and processing components, each with its own system behavior and proprietary data structure. To operate a warning chain, from measurements up to warning products, all components have to interact correctly, both syntactically and semantically. In designing the system, great emphasis was laid on conformity to the Sensor Web Enablement (SWE) specification of the Open Geospatial Consortium (OGC). The technical infrastructure, the so-called Tsunami Service Bus (TSB), follows the blueprint of Service Oriented Architectures (SOA). The TSB is an integration concept (SWE) in which functionality (observe, task, notify, alert, and process) is grouped around business processes (monitoring, decision support, sensor management) and packaged as interoperable services (SAS, SOS, SPS, WNS). The benefits of using a flexible architecture together with SWE lead to an open integration platform that:
    • accesses and controls heterogeneous sensors in a uniform way (functional integration)
    • assigns functionality to distinct services (separation of concerns)
    • allows resilient relationships between systems (loose coupling)
    • integrates services so that they can be accessed from everywhere (location transparency)
    • enables infrastructures which integrate heterogeneous applications (encapsulation)
    • allows the combination of services (orchestration) and data exchange within business processes
    Warning systems will evolve over time: new sensor types might be added, old sensors will be replaced, and processing components will be improved. From a collection of a few basic services it shall be possible to compose the more complex functionality essential for specific warning systems. Given these requirements, a flexible infrastructure is a prerequisite for sustainable systems, and their architecture must be tailored for evolution. The use of well-known techniques and widely used open source software implementing industrial standards reduces the impact of service modifications, allowing the evolution of a system as a whole. GITEWS implemented a solution to feed raw sensor data from any (remote) system into the infrastructure. Specific dispatchers enable plugging in sensor-type-specific processing without changing the architecture. Client components do not need to be adjusted if new sensor types or individual sensors are added to the system, because they are accessed via standardized services. One of the outstanding features of service-oriented architectures is the possibility to compose new services from existing ones. This so-called orchestration allows the definition of new warning processes which can be adapted easily to new requirements. This approach has the following advantages:
    • By implementing SWE it is possible to establish the "detection" and integration of sensors via the internet. Thus a system of systems combining early warning functionality at different levels of detail is feasible.
    • Any institution can add both its own components and components from third parties if they are developed in conformance with SOA principles. In a federation, an institution keeps the ownership of its data and decides which data are provided by a service and when.
    • A system can be deployed at minor cost as a core for an institution's own development, enabling autonomous early warning or monitoring systems.
    The presentation covers both the design and various instantiations (live demonstration) of the GITEWS architecture. Experiences concerning the design and complexity of SWE will be addressed in detail. Substantial attention is paid to the techniques and methods of extending the architecture, adapting proprietary components to SWE services and encodings, and orchestrating them in high-level workflows and processes. Furthermore, the potential of the architecture concerning adaptive behavior, collaboration across boundaries and semantic interoperability will be addressed.
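    A minimal sketch of how a client could pull observations from one of the SWE services named above (an SOS), using OWSLib; the endpoint, offering, observed property and response format are illustrative assumptions, not GITEWS identifiers:

        # Request sea-level observations from an (assumed) SOS 1.0 endpoint.
        from owslib.sos import SensorObservationService

        sos = SensorObservationService('http://example.org/sos', version='1.0.0')  # hypothetical endpoint
        xml = sos.get_observation(offerings=['SEA_LEVEL'],
                                  observedProperties=['urn:ogc:def:property:sea_level'],
                                  responseFormat='text/xml;subtype="om/1.0.0"')
        print(xml[:200])   # raw O&M response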

  20. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry - the LMMP Portal - by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT) - an open source data catalog, archive, file management, and data grid framework; OpenSSO - an open source access management and federation platform; Solr - an open source enterprise search platform; Redmine - an open source project collaboration and management framework; GDAL - an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
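
    As an illustration of the standards-based access such a portal exposes, the snippet below assembles an OGC WMS 1.1.1 GetMap request with Python's requests library. The endpoint and layer name are illustrative placeholders rather than actual LMMP identifiers.

        import requests

        # Illustrative WMS endpoint and layer; the actual LMMP service names differ.
        WMS_URL = "https://example.org/lmmp/wms"

        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "LRO_WAC_Mosaic",  # placeholder layer name
            "STYLES": "",
            "SRS": "EPSG:4326",          # simple lat/lon grid for this sketch
            "BBOX": "-10,-10,10,10",     # minx,miny,maxx,maxy in degrees
            "WIDTH": "512",
            "HEIGHT": "512",
            "FORMAT": "image/png",
        }

        resp = requests.get(WMS_URL, params=params, timeout=30)
        resp.raise_for_status()
        with open("lunar_tile.png", "wb") as f:
            f.write(resp.content)  # rendered map tile, usable in any client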

  1. Best Practices for Making Scientific Data Discoverable and Accessible through Integrated, Standards-Based Data Portals

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.

    2013-12-01

    Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly available data to address broadly scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar-indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing, and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance, and relating data sets to one another. Metadata also enable thematic, geospatial, and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, and chemical and biological constituents, as these allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating, on the fly, native data exposed via web services into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), a system that provides users with data access, subsetting, and geospatial processing of large and complex climate and land use data, exemplifies the application of International Organization for Standardization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal, and Water Quality Portal are three examples of best practices for implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
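
    One concrete pattern behind such portals is that the aggregated output is exposed through standard OGC endpoints that any client can query directly. As a hedged sketch (the endpoint and feature type name below are hypothetical, not the NGWMN's actual service addresses), a WFS GetFeature request for monitoring sites in a bounding box might look like this:

        import requests

        # Hypothetical WFS endpoint and feature type; not the actual NGWMN service.
        WFS_URL = "https://example.org/ngwmn/wfs"

        params = {
            "service": "WFS",
            "version": "1.1.0",
            "request": "GetFeature",
            "typeName": "gw:MonitoringSite",            # placeholder feature type
            "bbox": "-91.0,42.0,-87.0,47.0,EPSG:4326",  # minx,miny,maxx,maxy,crs
            "maxFeatures": "50",
        }

        resp = requests.get(WFS_URL, params=params, timeout=30)
        resp.raise_for_status()
        print(resp.text[:500])  # GML feature collection, same for every provider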

  2. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data is critically important for a wide scope of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods - remote sensing observations, field surveys, model simulations, etc. - and stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a substantial barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, and the OGC Web Coverage Service (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access alone are not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI), which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard, non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
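
    To illustrate what CF-compliant packaging looks like in practice, here is a minimal sketch using the netCDF4 Python library; the variable names and values are invented for the example, not taken from the ORNL DAAC archive.

        import numpy as np
        from netCDF4 import Dataset

        # Write a minimal CF-convention NetCDF file with one gridded variable.
        with Dataset("lai_example.nc", "w", format="NETCDF4") as nc:
            nc.Conventions = "CF-1.6"
            nc.title = "Example leaf area index grid (synthetic data)"

            nc.createDimension("lat", 10)
            nc.createDimension("lon", 20)
            nc.createDimension("time", None)  # unlimited record dimension

            lat = nc.createVariable("lat", "f4", ("lat",))
            lat.units = "degrees_north"
            lat.standard_name = "latitude"
            lat[:] = np.linspace(35.0, 44.0, 10)

            lon = nc.createVariable("lon", "f4", ("lon",))
            lon.units = "degrees_east"
            lon.standard_name = "longitude"
            lon[:] = np.linspace(-100.0, -81.0, 20)

            time = nc.createVariable("time", "f8", ("time",))
            time.units = "days since 2000-01-01 00:00:00"
            time.calendar = "standard"
            time[0] = 0.0

            lai = nc.createVariable("lai", "f4", ("time", "lat", "lon"),
                                    fill_value=-9999.0)
            lai.units = "1"  # dimensionless, expressed per CF conventions
            lai.long_name = "leaf area index"
            lai[0, :, :] = np.random.rand(10, 20) * 6.0

    Because coordinates, units, and standard names are self-describing, a file like this can be served through WCS or OPeNDAP and interpreted correctly by modeling tools without side-channel documentation.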

  3. A Semantic Sensor Web for Environmental Decision Support Applications

    PubMed Central

    Gray, Alasdair J. G.; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A. A.; Paton, Norman W.; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción

    2011-01-01

    Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England. PMID:22164110
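
    A core mechanism in such an architecture is querying sensor metadata expressed against an ontology. The sketch below uses the rdflib Python library with a toy graph and an SSN-style vocabulary; the namespace and property names are simplified placeholders, not the project's actual ontology.

        from rdflib import Graph, Literal, Namespace, RDF

        # Toy namespace standing in for the project's actual sensor ontology.
        EX = Namespace("http://example.org/ssn-like#")

        g = Graph()
        g.bind("ex", EX)

        # Describe two hypothetical coastal sensors.
        g.add((EX.sensor1, RDF.type, EX.WaveSensor))
        g.add((EX.sensor1, EX.observes, EX.SignificantWaveHeight))
        g.add((EX.sensor1, EX.locatedIn, Literal("English Channel")))
        g.add((EX.sensor2, RDF.type, EX.TideGauge))
        g.add((EX.sensor2, EX.observes, EX.SeaLevel))
        g.add((EX.sensor2, EX.locatedIn, Literal("Solent")))

        # Discover every sensor that observes significant wave height.
        results = g.query("""
            PREFIX ex: <http://example.org/ssn-like#>
            SELECT ?sensor ?place WHERE {
                ?sensor ex:observes ex:SignificantWaveHeight ;
                        ex:locatedIn ?place .
            }
        """)
        for sensor, place in results:
            print(sensor, place)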

  4. Improving Groundwater Data Interoperability: Results of the Second OGC Groundwater Interoperability Experiment

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.; Booth, N.

    2014-12-01

    Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model against commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, the authoritative standard for water time series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested. This submission will present the GWML2 information model and the results of the interoperability experiment, with particular emphasis on the scientific use case.
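
    To give a flavour of consuming such GML-based exchange documents, the following sketch parses a GWML2-like fragment with lxml. The gwml2 namespace URI and element names here are illustrative stand-ins and should be checked against the published GWML2 schemas; only the GML namespace is the real one.

        from lxml import etree

        # Illustrative GWML2-like fragment; gwml2 names are placeholders.
        doc = b"""
        <gwml2:GW_Well xmlns:gwml2="http://www.example.org/gwml2"
                       xmlns:gml="http://www.opengis.net/gml/3.2"
                       gml:id="well.1">
          <gwml2:wellDepth uom="m">45.7</gwml2:wellDepth>
          <gwml2:wellStatus>active</gwml2:wellStatus>
        </gwml2:GW_Well>
        """

        ns = {
            "gwml2": "http://www.example.org/gwml2",
            "gml": "http://www.opengis.net/gml/3.2",
        }

        root = etree.fromstring(doc)
        depth = root.findtext("gwml2:wellDepth", namespaces=ns)
        status = root.findtext("gwml2:wellStatus", namespaces=ns)
        print(f"well depth: {depth} m, status: {status}")

    A flow-model preprocessor would walk features like this to build its input decks, which is exactly the hand-off the scientific use case exercises.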

  5. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) hyperspectral data. It is part of the EarthServer project, which builds infrastructure for online access and analysis of huge Earth science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The WebGIS client, built mainly on the OpenLayers JavaScript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the case study section. The current PlanetServer functionality is described step by step and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies are further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
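
    To make the query style concrete, here is a hedged example of sending a WCPS band-ratio expression to a rasdaman/petascope-style endpoint over HTTP. The endpoint URL, coverage name, band identifiers, and axis names are placeholders (real CRISM identifiers differ), and the exact request parameter can vary between server versions.

        import requests

        # Placeholder endpoint and coverage/band/axis names.
        WCPS_URL = "https://example.org/rasdaman/ows"

        # Band ratio over a spatial subset, encoded server-side as PNG.
        query = """
        for c in (crism_frt_demo)
        return encode(
          (float) c.band_233[E(0:200), N(0:200)] / c.band_78[E(0:200), N(0:200)],
          "png")
        """

        resp = requests.post(WCPS_URL, data={"query": query}, timeout=60)
        resp.raise_for_status()
        with open("band_ratio.png", "wb") as f:
            f.write(resp.content)

    The point of the design is that only the small derived product crosses the network; the full hyperspectral cube never leaves the server.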

  6. Design and performance of an integrated ground and space sensor web for monitoring active volcanoes.

    NASA Astrophysics Data System (ADS)

    Lahusen, Richard; Song, Wenzhan; Kedar, Sharon; Shirazi, Behrooz; Chien, Steve; Doubleday, Joshua; Davies, Ashley; Webb, Frank; Dzurisin, Dan; Pallister, John

    2010-05-01

    An interdisciplinary team of computer, Earth, and space scientists collaborated to develop a sensor web system for rapid deployment at active volcanoes. The primary goals of this Optimized Autonomous Space In situ Sensorweb (OASIS) are to: 1) integrate complementary space and in situ (ground-based) elements into an interactive, autonomous sensor web; 2) advance sensor web power and communication resource management technology; and 3) enable scalability for the seamless addition of sensors and other satellites into the sensor web. This three-year project began with a rigorous multidisciplinary interchange that resulted in the definition of system requirements to guide the design of the OASIS network and to achieve the stated project goals. Based on those guidelines, we have developed fully self-contained in situ nodes that integrate GPS, seismic, infrasonic, and lightning (ash) detection sensors. The nodes in the wireless sensor network are linked to the ground control center through a mesh network that is highly optimized for remote geophysical monitoring. OASIS also features autonomous bidirectional interaction between ground nodes and instruments on the EO-1 space platform through continuous analysis and messaging capabilities at the command and control center. Data from both the in situ sensors and satellite-borne hyperspectral imaging sensors stream into a common database for real-time visualization and analysis by Earth scientists. We have successfully completed a field deployment of 15 nodes within the crater and on the flanks of Mount St. Helens, Washington. The deployment demonstrated that sensor web technology facilitates rapid network deployments and that real-time continuous data acquisition can be achieved. We are now optimizing component performance and improving user interaction for additional deployments at erupting volcanoes in 2010.

  7. E-Learning Applications for Urban Modelling and Ogc Standards Using HTML5 Capabilities

    NASA Astrophysics Data System (ADS)

    Kaden, R.; König, G.; Malchow, C.; Kolbe, T. H.

    2012-07-01

    This article reports on the development of HTML5-based web content related to urban modelling, with special focus on GML and CityGML, allowing participants to access it regardless of the device platform. An essential part of the learning modules are short video lectures, supplemented by exercises and tests during the lecture to improve students' individual progress and success. The evaluation of the tests is used to guide students through the course content, depending on individual knowledge. With this approach, we provide learning applications on a wide range of devices, either mobile or desktop, fulfil the needs of just-in-time knowledge, and increase the emphasis on lifelong learning.

  8. Integrating sea floor observatory data: the EMSO data infrastructure

    NASA Astrophysics Data System (ADS)

    Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph

    2013-04-01

    The European research infrastructure EMSO is a European network of fixed-point, deep-seafloor and water column observatories deployed at key sites along the European continental margin and in the Arctic. It aims to provide the technological and scientific framework for the investigation of the environmental processes related to the interaction between the geosphere, biosphere, and hydrosphere, and for sustainable management through long-term monitoring, including real-time data transmission. EMSO has been on the ESFRI (European Strategy Forum on Research Infrastructures) roadmap since 2006 and entered its construction phase in 2012. Within this framework, EMSO is contributing to large infrastructure integration projects such as ENVRI and COOPEUS. The EMSO infrastructure is geographically distributed across key sites in European waters, spanning from the Arctic through the Atlantic and Mediterranean Sea to the Black Sea. It presently consists of thirteen sites, identified by the scientific community according to their importance with respect to marine ecosystems, climate change, and marine geohazards. The data infrastructure for EMSO is being designed as a distributed system. Presently, EMSO data collected during experiments at each EMSO site are locally stored and organized in catalogues or relational databases run by the responsible regional EMSO nodes. Three major institutions and their data centers currently offer access to EMSO data: PANGAEA, INGV, and IFREMER. Continuing the IT activities performed during EMSO's twin project ESONET, EMSO is now implementing the ESONET data architecture within an operational EMSO data infrastructure. EMSO aims to be compliant with relevant marine initiatives such as MyOceans, EUROSITES, EuroARGO, SEADATANET, and EMODNET, as well as to meet the requirements of international and interdisciplinary projects such as COOPEUS, ENVRI, EUDAT, and iCORDI. A major focus is therefore set on standardization and interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch or OAI-PMH, EMSO has chosen to implement core Open Geospatial Consortium (OGC) standards such as the Catalogue Service for the Web (CSW) and the Sensor Web Enablement (SWE) suite, including the Sensor Observation Service (SOS) and Observations and Measurements (O&M). Furthermore, strong integration efforts are currently being undertaken to harmonize data formats (e.g., NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.

  9. Citizen Observatories: A Standards Based Architecture

    NASA Astrophysics Data System (ADS)

    Simonis, Ingo

    2015-04-01

    A number of large-scale research projects are currently under way exploring the various components of citizen observatories, e.g., CITI-SENSE (http://www.citi-sense.eu), Citclops (http://citclops.eu), COBWEB (http://cobwebproject.eu), OMNISCIENTIS (http://www.omniscientis.eu), and WeSenseIt (http://www.wesenseit.eu). Common to all projects is the motivation to develop a platform enabling effective participation by citizens in environmental projects, while considering important aspects such as security, privacy, long-term storage and availability, accessibility of raw and processed data, and its proper integration into catalogues and international exchange and collaboration systems such as GEOSS or INSPIRE. This paper describes the software architecture implemented for setting up crowdsourcing campaigns using standardized components, interfaces, security features, and distribution capabilities. It illustrates the Citizen Observatory Toolkit, a software suite that allows users to define crowdsourcing campaigns, invite registered and unregistered participants, and analyze, process, and visualize raw and quality-enhanced crowdsourcing data and derived products. The Citizen Observatory Toolkit is not a single software product. Instead, it is a framework of components that are built using internationally adopted standards wherever possible (e.g., OGC standards from Sensor Web Enablement, GeoPackage, and Web Mapping and Processing Services, as well as security and metadata/cataloguing standards), defines profiles of those standards where necessary (e.g., SWE O&M profile, SensorML profile), and implements design decisions based on the motivation to maximize interoperability and reusability of all components. The toolkit contains tools to set up, manage, and maintain crowdsourcing campaigns, allows building on-demand apps optimized for a specific sampling focus, supports offline and online sampling modes using modern cell phones with built-in sensing technologies, automates the upload of raw data, and handles conflation services to match quality requirements and analysis challenges. The Citizen Observatory Toolkit is currently being developed as part of the COBWEB research project. COBWEB is partially funded by the European Programme FP7/2007-2013 under grant agreement n° 308513, part of the topic ENV.2012.6.5-1 "Developing community based environmental monitoring and information systems using innovative and novel earth observation applications."

  10. Results from the Autonomous Triggering of in situ Sensors on Kilauea Volcano, HI, from Eruption Detection by Spacecraft

    NASA Astrophysics Data System (ADS)

    Doubleday, J.; Behar, A.; Davies, A.; Mora-Vargas, A.; Tran, D.; Abtahi, A.; Pieri, D. C.; Boudreau, K.; Cecava, J.

    2008-12-01

    Response time in acquiring sensor data in volcanic emergencies can be greatly improved through use of autonomous systems. For instance, ground-based observations and data processing applications of the JPL Volcano Sensor Web have promptly triggered spacecraft observations [e.g., 1]. The reverse command and information flow path can also be useful, using autonomous analysis of spacecraft data to trigger in situ sensors. In this demonstration project, SO2 sensors were incorporated into expendable "Volcano Monitor" capsules and placed downwind of the Pu'u 'O'o vent of Kilauea volcano, Hawai'i. In nominal (low) power conservation mode, data from these sensors were collected and transmitted every hour to the Volcano Sensor Web through the Iridium Satellite Network. When SO2 readings exceeded a predetermined threshold, the modem within the Volcano Monitor sent an alert to the Sensor Web, and triggered a request for prompt Earth Observing-1 (EO-1) spacecraft data acquisition. The Volcano Monitors were also triggered by the Sensor Web in response to an eruption detection by the MODIS instrument on Terra. During these pre-defined "critical events" the Sensor Web ordered the SO2 sensors within the Volcano Monitor to increase their sampling frequency to every 5 minutes (high power "burst mode"). Autonomous control of the sensors' sampling frequency enabled the Sensor Web to monitor and respond to rapidly evolving conditions, and allowed rapid compilation and dissemination of these data to the scientific community. Reference: [1] Davies et al., (2006) Eos, 87, (1), 1 and 5. This work was performed at the Jet Propulsion Laboratory-California Institute of Technology, under contract to NASA. Support was provided by the NASA AIST program, the Idaho Space Grant Consortium, and the New Mexico Space Grant Program. We also especially thank the personnel of the USGS Hawaiian Volcano Observatory for their invaluable scientific guidance and logistical assistance.
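
    The adaptive sampling logic described here is straightforward to sketch. The threshold value below is illustrative (not the mission's actual value), the two intervals mirror the hourly and 5-minute modes in the text, and the sensor-reading function is a hypothetical stand-in.

        import random
        import time

        SO2_THRESHOLD_PPM = 2.0    # illustrative threshold, not the mission value
        NOMINAL_INTERVAL_S = 3600  # hourly reporting in low-power mode
        BURST_INTERVAL_S = 300     # 5-minute reporting in burst mode

        def read_so2_ppm() -> float:
            """Hypothetical stand-in for the Volcano Monitor's SO2 sensor."""
            return random.uniform(0.0, 3.0)

        def run_monitor(cycles: int) -> None:
            interval = NOMINAL_INTERVAL_S
            for _ in range(cycles):
                reading = read_so2_ppm()
                if reading > SO2_THRESHOLD_PPM and interval != BURST_INTERVAL_S:
                    interval = BURST_INTERVAL_S
                    print(f"alert: SO2 {reading:.2f} ppm -> burst mode")
                    # In the real system an Iridium alert would also trigger
                    # an EO-1 observation request.
                elif reading <= SO2_THRESHOLD_PPM and interval != NOMINAL_INTERVAL_S:
                    interval = NOMINAL_INTERVAL_S
                    print(f"SO2 {reading:.2f} ppm below threshold -> nominal mode")
                print(f"reading={reading:.2f} ppm, next sample in {interval} s")
                time.sleep(0)  # replace 0 with `interval` on real hardware

        run_monitor(cycles=5)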

  11. 48 CFR 303.203 - Reporting suspected violations of the Gratuities clause.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... AND HUMAN SERVICES GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Contractor... turn report the matter to the OGC Ethics Division for disposition. The OGC Ethics Division shall...

  12. 48 CFR 303.203 - Reporting suspected violations of the Gratuities clause.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... AND HUMAN SERVICES GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Contractor... turn report the matter to the OGC Ethics Division for disposition. The OGC Ethics Division shall...

  13. 48 CFR 303.203 - Reporting suspected violations of the Gratuities clause.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... AND HUMAN SERVICES GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Contractor... turn report the matter to the OGC Ethics Division for disposition. The OGC Ethics Division shall...

  14. 48 CFR 303.203 - Reporting suspected violations of the Gratuities clause.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... AND HUMAN SERVICES GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Contractor... turn report the matter to the OGC Ethics Division for disposition. The OGC Ethics Division shall...

  15. 48 CFR 303.203 - Reporting suspected violations of the Gratuities clause.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AND HUMAN SERVICES GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Contractor... turn report the matter to the OGC Ethics Division for disposition. The OGC Ethics Division shall...

  16. Cryosphere Sensor Webs With The Autonomous Sciencecraft Experiment

    NASA Astrophysics Data System (ADS)

    Scharenbroich, L.; Doggett, T.; Kratz, T.; Castano, R.; Chien, S.; Davies, A. G.; Tran, D.; Mazzoni, D.

    2006-12-01

    Autonomous sensor webs are being deployed as part of the Autonomous Sciencecraft Experiment [1], whereby observations using the Hyperion instrument [2] on board Earth Observing-1 (EO-1) are triggered either by ground sensors or by near-real-time analysis of data from other space-based sensors. In the realm of cryosphere monitoring, one sensor web has been set up pairing EO-1 with a sensor buoy [3] deployed in Sparkling Lake, one of several lakes in northern Wisconsin monitored by the University of Wisconsin's Trout Lake Station. A Support Vector Machine (SVM) classifier was trained on historical thermistor chain data with manually recorded ice-in and ice-out times, and used to trigger Hyperion observations of the Trout Lake area during the spring thaw and winter freeze in 2005. A second sensor web is being developed using near-real-time sea ice data products, based on Department of Defense meteorological satellites, available from the National Snow and Ice Data Center (NSIDC) [4]. Once operational, this sensor web will trigger Hyperion observations of pre-defined targets in the Arctic and Antarctic where regional-resolution data show sea ice formation or break-up. [1] Chien et al. (2005), An autonomous earth-observing sensor-web, IEEE Intelligent Systems. [2] Pearlman et al. (2003), Hyperion, a space-based imaging spectrometer, IEEE Trans. Geosci. Rem. Sens., 41(6). [3] Kratz, T. et al. (in press), Toward a Global Lake Ecological Observatory Network, Proceedings of the Karelian Institute. [4] Cavalieri et al. (1999), Near real-time DMSP SSM/I daily polar gridded sea ice concentrations, National Snow and Ice Data Center, Digital Media.
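
    A minimal sketch of the classifier idea, assuming thermistor-chain temperature profiles as features and binary ice-on/ice-off labels. The data below are synthetic; the original feature engineering is not described in the abstract.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic stand-in for historical thermistor-chain profiles:
        # 8 depths per sample; ice-covered lakes show cold near-surface water.
        n = 400
        ice = rng.normal(loc=1.0, scale=0.8, size=(n // 2, 8))       # ~0-2 degC
        open_water = rng.normal(loc=8.0, scale=3.0, size=(n // 2, 8))
        X = np.vstack([ice, open_water])
        y = np.array([1] * (n // 2) + [0] * (n // 2))                # 1 = ice cover

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_test, y_test))

        # A state change between successive predictions (ice -> open or back)
        # is the kind of event that would trigger a Hyperion observation.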

  17. An Optimized Autonomous Space In-situ Sensorweb (OASIS) for Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Song, W.; Shirazi, B.; Lahusen, R.; Chien, S.; Kedar, S.; Webb, F.

    2006-12-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, we are developing a prototype real-time Optimized Autonomous Space In-situ Sensorweb. The prototype is focused on volcano hazard monitoring at Mount St. Helens, which has been in continuous eruption since October 2004, but the system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., the Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms, in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real time (see the sketch below); and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of triggering the other. Sensor-web data acquisition and dissemination will be accomplished through the use of SensorML language standards for geospatial information. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform.
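
    The bandwidth-allocation item above can be sketched as a node ranking queued packets by mission priority and draining only as many bytes as the current bandwidth budget allows. Everything below (packet fields, budget numbers) is an invented illustration, not the OASIS protocol itself.

        import heapq

        # Invented send queue; lower priority number = more urgent.
        packets = [
            (0, 1200, "event-triggered seismic burst"),
            (2, 4000, "routine GPS solution"),
            (1, 800,  "infrasound summary"),
            (2, 4000, "housekeeping telemetry"),
        ]

        def drain(queue, budget_bytes):
            """Send highest-priority packets first until the budget is spent.

            Strict priority order: stop at the first packet that does not fit,
            so urgent data is never starved by smaller low-priority packets.
            """
            heap = list(queue)
            heapq.heapify(heap)
            sent = []
            while heap and heap[0][1] <= budget_bytes:
                _prio, size, desc = heapq.heappop(heap)
                budget_bytes -= size
                sent.append(desc)
            return sent, [p[2] for p in heap]

        sent, deferred = drain(packets, budget_bytes=3000)
        print("sent:", sent)        # seismic burst and infrasound summary
        print("deferred:", deferred)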

  18. Optimized autonomous space in-situ sensor web for volcano monitoring

    USGS Publications Warehouse

    Song, W.-Z.; Shirazi, B.; Huang, R.; Xu, M.; Peterson, N.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.; Kedar, S.; Chien, S.; Webb, F.; Kiely, A.; Doubleday, J.; Davies, A.; Pieri, D.

    2010-01-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team of sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) has developed a prototype of a dynamic and scalable hazard monitoring sensor-web and applied it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) has two-way communication capability between ground and space assets, uses both space and ground data for optimal allocation of limited bandwidth resources on the ground, and uses smart management of competing demands for limited space assets. It also enables scalability and seamless infusion of future space and in-situ assets into the sensor-web. The space and in-situ control components of the system are integrated such that each element is capable of autonomously tasking the other. The in-situ ground network was deployed into the craters and around the flanks of Mount St. Helens in July 2009 and linked to the command and control of the Earth Observing One (EO-1) satellite. © 2010 IEEE.

  19. Knowledge base and sensor bus messaging service architecture for critical tsunami warning and decision-support

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.

    2012-04-01

    The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure, supported by an agile knowledge base for critical decision support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich, OGC SWE-compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscriber requirements as a tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.

  20. Ontology Alignment Architecture for Semantic Sensor Web Integration

    PubMed Central

    Fernandez, Susel; Marsa-Maestre, Ivan; Velasco, Juan R.; Alarcos, Bernardo

    2013-01-01

    Sensor networks are a concept that has become very popular in data acquisition and processing for multiple applications in different fields such as industrial, medicine, home automation, environmental detection, etc. Today, with the proliferation of small communication devices with sensors that collect environmental data, semantic Web technologies are becoming closely related with sensor networks. The linking of elements from Semantic Web technologies with sensor networks has been called Semantic Sensor Web and has among its main features the use of ontologies. One of the key challenges of using ontologies in sensor networks is to provide mechanisms to integrate and exchange knowledge from heterogeneous sources (that is, dealing with semantic heterogeneity). Ontology alignment is the process of bringing ontologies into mutual agreement by the automatic discovery of mappings between related concepts. This paper presents a system for ontology alignment in the Semantic Sensor Web which uses fuzzy logic techniques to combine similarity measures between entities of different ontologies. The proposed approach focuses on two key elements: the terminological similarity, which takes into account the linguistic and semantic information of the context of the entity's names, and the structural similarity, based on both the internal and relational structure of the concepts. This work has been validated using sensor network ontologies and the Ontology Alignment Evaluation Initiative (OAEI) tests. The results show that the proposed techniques outperform previous approaches in terms of precision and recall. PMID:24051523
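
    A toy sketch of the combination step, assuming precomputed terminological and structural similarity scores in [0, 1]. The weighting rules here are a generic fuzzy-style aggregation for illustration, not the paper's actual membership functions or rule base.

        def fuzzy_combine(term_sim: float, struct_sim: float) -> float:
            """Aggregate two similarity scores with simple fuzzy-style rules."""
            # Rule 1: both evidences high -> be confident (take the max).
            if min(term_sim, struct_sim) >= 0.7:
                return max(term_sim, struct_sim)
            # Rule 2: strong disagreement -> be cautious (take the min).
            if abs(term_sim - struct_sim) >= 0.5:
                return min(term_sim, struct_sim)
            # Rule 3: otherwise blend, trusting terminology slightly more.
            return 0.6 * term_sim + 0.4 * struct_sim

        # Candidate concept pairs with (terminological, structural) scores.
        candidates = {
            ("Thermometer", "TemperatureSensor"): (0.82, 0.75),
            ("Thermometer", "WindSensor"): (0.35, 0.88),
        }
        for pair, (t, s) in candidates.items():
            score = fuzzy_combine(t, s)
            print(pair, "->", f"{score:.2f}", "(match)" if score >= 0.7 else "")

    The first pair is accepted because both kinds of evidence agree; the second is rejected despite high structural similarity, which is the behavior the two-measure design is meant to produce.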

  1. Ontology alignment architecture for semantic sensor Web integration.

    PubMed

    Fernandez, Susel; Marsa-Maestre, Ivan; Velasco, Juan R; Alarcos, Bernardo

    2013-09-18

    Sensor networks are a concept that has become very popular in data acquisition and processing for multiple applications in different fields such as industrial, medicine, home automation, environmental detection, etc. Today, with the proliferation of small communication devices with sensors that collect environmental data, semantic Web technologies are becoming closely related with sensor networks. The linking of elements from Semantic Web technologies with sensor networks has been called Semantic Sensor Web and has among its main features the use of ontologies. One of the key challenges of using ontologies in sensor networks is to provide mechanisms to integrate and exchange knowledge from heterogeneous sources (that is, dealing with semantic heterogeneity). Ontology alignment is the process of bringing ontologies into mutual agreement by the automatic discovery of mappings between related concepts. This paper presents a system for ontology alignment in the Semantic Sensor Web which uses fuzzy logic techniques to combine similarity measures between entities of different ontologies. The proposed approach focuses on two key elements: the terminological similarity, which takes into account the linguistic and semantic information of the context of the entity's names, and the structural similarity, based on both the internal and relational structure of the concepts. This work has been validated using sensor network ontologies and the Ontology Alignment Evaluation Initiative (OAEI) tests. The results show that the proposed techniques outperform previous approaches in terms of precision and recall.

  2. Matsu: An Elastic Cloud Connected to a SensorWeb for Disaster Response

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel

    2011-01-01

    This slide presentation reviews the use of cloud computing combined with the SensorWeb to aid disaster response planning. Included are an overview of the SensorWeb architecture, an overview of phase 1 of the EO-1 system, and the steps to transform it into an on-demand product cloud as part of the Open Cloud Consortium (OCC). The effectiveness of this system was demonstrated by the SensorWeb during the 2010 Namibia flood: by blending information from MODIS, TRMM, river gauge data, and a Google Earth view of Namibia, the system enabled river surge predictions and could support planning for future disaster responses.

  3. Investigating Geosparql Requirements for Participatory Urban Planning

    NASA Astrophysics Data System (ADS)

    Mohammadi, E.; Hunter, A. J. S.

    2015-06-01

    We propose that participatory GIS (PGIS) activities, including participatory urban planning, can be made more efficient and effective if spatial reasoning rules are integrated with PGIS tools to simplify engagement for public contributors. Spatial reasoning is used to describe relationships between spatial entities. These relationships can be evaluated quantitatively or qualitatively using geometrical algorithms, ontological relations, and topological methods. Semantic web services provide tools and methods that can facilitate spatial reasoning. GeoSPARQL, introduced by the OGC, is a spatial reasoning standard used to make declarations about entities (graphical contributions) that take the form of a subject-predicate-object triple or statement. GeoSPARQL uses three basic methods to infer topological relationships between spatial entities: OGC's simple feature topology, RCC8, and the DE-9IM model. While these methods are comprehensive in their ability to define topological relationships between spatial entities, they are often inadequate for defining the complex relationships that exist in the spatial realm, particularly relationships between urban entities, such as those between a bus route, its collection of associated bus stops, and their overall surroundings as an urban planning pattern. In this paper we investigate common qualitative spatial reasoning methods as a preliminary step towards enhancing the capabilities of GeoSPARQL in an online participatory GIS framework in which reasoning is used to validate plans against standard patterns that can be found in an efficient and effective urban environment.
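
    For concreteness, a GeoSPARQL query asking which contributed bus stops lie within a planning area might look like the following. The geo: and geof: prefixes are the standard GeoSPARQL vocabulary; the ex: entities are hypothetical, and the query assumes a GeoSPARQL-enabled triple store.

        # Hypothetical GeoSPARQL query; it would be sent to a GeoSPARQL-enabled
        # triple store via its SPARQL HTTP endpoint.
        query = """
        PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
        PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
        PREFIX ex:   <http://example.org/plan#>

        SELECT ?stop
        WHERE {
          ex:planningArea geo:hasGeometry ?areaGeom .
          ?areaGeom geo:asWKT ?areaWkt .
          ?stop a ex:BusStop ;
                geo:hasGeometry ?stopGeom .
          ?stopGeom geo:asWKT ?stopWkt .
          FILTER(geof:sfWithin(?stopWkt, ?areaWkt))
        }
        """
        print(query)

    Richer planning patterns (a route plus all of its stops plus their surroundings) do not reduce to a single such predicate, which is the gap the paper targets.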

  4. NASA's Earth Imagery Service as Open Source Software

    NASA Astrophysics Data System (ADS)

    De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.

    2016-12-01

    The NASA Global Imagery Browse Services (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data and is accessible via standards such as the Open Geospatial Consortium (OGC) Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including Apache HTTPD, GDAL, MapServer, Grails, ZooKeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released as open source and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth, and the MRF source code has recently been incorporated into GDAL, a core library in many widely used GIS packages such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and will also showcase GIBS as a platform on which scientists and the general public can build their own applications.
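
    As a usage sketch, GIBS tiles can be fetched over the WMTS key-value-pair interface with an ordinary HTTP client. The endpoint and layer/tile parameters below follow GIBS's documented patterns but should be verified against the current GIBS documentation before use.

        import requests

        # GIBS WMTS KVP endpoint (verify against current GIBS documentation).
        GIBS = "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/wmts.cgi"

        params = {
            "SERVICE": "WMTS",
            "REQUEST": "GetTile",
            "VERSION": "1.0.0",
            "LAYER": "MODIS_Terra_CorrectedReflectance_TrueColor",
            "STYLE": "default",
            "TILEMATRIXSET": "250m",
            "TILEMATRIX": "2",   # zoom level
            "TILEROW": "1",
            "TILECOL": "2",
            "FORMAT": "image/jpeg",
            "TIME": "2016-08-01",  # GIBS layers are time-dimensioned
        }

        resp = requests.get(GIBS, params=params, timeout=30)
        resp.raise_for_status()
        with open("gibs_tile.jpg", "wb") as f:
            f.write(resp.content)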

  5. Optimizability of OGC Standards Implementations - a Case Study

    NASA Astrophysics Data System (ADS)

    Misev, D.; Baumann, P.

    2012-04-01

    Why do we shop at Amazon? Because they have a unique offering that is available nowhere else? Certainly not. Rather, Amazon offers (i) simple yet effective search; (ii) very simple payment; and (iii) extremely rapid delivery. This is how scientific services will be distinguished in the future: not by their data holdings (there will be manifold choice), but by their service quality. We are facing the transition from data stewardship to service stewardship. One OGC standard which particularly enables flexible retrieval is the Web Coverage Processing Service (WCPS). It defines a high-level query language on large, multi-dimensional raster data, such as 1D time series, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, and 4D x/y/z/t climate and ocean data. We have implemented WCPS on top of an Array Database Management System, rasdaman, which is available as open source. In this demonstration, we study WCPS queries on 2D, 3D, and 4D data sets. Particular emphasis is placed on the computational load queries generate in such on-demand processing and filtering. We look at different techniques and their impact on performance, such as adaptive storage partitioning, query rewriting, and just-in-time compilation. Results show that there is significant potential for effective server-side optimization once a query language is sufficiently high-level and declarative.
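
    To illustrate why declarative queries give the server room to optimize, consider an aggregation that reduces a 4-D coverage to a single number server-side, so only a scalar travels over the wire. The coverage and axis names below are invented; the query would be submitted to a WCPS endpoint in the same way as the PlanetServer sketch earlier in this document.

        # Invented coverage/axis names; the whole reduction runs server-side.
        query = """
        for c in (ocean_temperature_4d)
        return avg(
          c[Lat(40:60), Long(-10:10), depth(0:100), ansi("2010-01":"2010-12")]
        )
        """
        # Because the request is declarative, the server is free to rewrite the
        # query, exploit its storage partitioning, and compile the expression
        # just-in-time -- exactly the optimizations studied in this work.
        print(query)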

  6. Experimenting with Sensor Webs Using Earth Observing 1

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2004-01-01

    The New Millennium Program (NMP) Earth Observing 1 (EO-1) satellite was launched November 21, 2000 as a one-year technology validation mission. After an almost flawless first year of operations, EO-1 continued to operate in a test bed mode to validate additional technologies and concepts applicable to future sensor webs. A sensor web is a group of sensors, whether space-based, ground-based, or airplane-based, which act in a collaborative, autonomous manner to produce more value than would otherwise result from the individual observations.

  7. The Purpose of the Sensor Web

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark R.

    2004-01-01

    The Sensor Web concept emerged as the number of Earth science satellites began to increase in recent years. The idea, part of a vision for the future of Earth science, was that the sensor systems would be linked in an active way to provide improved forecast capability. This means that a nearly autonomous system would need to be developed to allow the satellites to re-target and deploy assets for particular phenomena or to provide on-board processing for real-time data. This talk will describe several elements of the sensor web.

  8. Supporting EarthScope Cyber-Infrastructure with a Modern GPS Science Data System

    NASA Astrophysics Data System (ADS)

    Webb, F. H.; Bock, Y.; Kedar, S.; Jamason, P.; Fang, P.; Dong, D.; Owen, S. E.; Prawirodirjo, L.; Squibb, M.

    2008-12-01

    Building on NASA's investment in the measurement of crustal deformation from continuous GPS, we are developing and implementing a Science Data System (SDS) that will provide mature, long-term Earth Science Data Records (ESDRs). This effort supports NASA's Earth Surface and Interiors (ESI) focus area and provides NASA's component of the EarthScope Plate Boundary Observatory (PBO). This multi-year development is sponsored by NASA's Making Earth System data records for Use in Research Environments (MEaSUREs) program. The SDS integrates the generation of ESDRs with data analysis and exploration, product generation, and modeling tools based on daily GPS data that include GPS networks in western North America and a component of NASA's Global GPS Network (GGN) for terrestrial reference frame definition. The system is expandable to multiple regional and global networks. The SDS builds upon mature data production, exploration, and analysis algorithms developed under NASA's REASoN, ACCESS, and SENH programs. It provides access to positions, time series, velocity fields, and strain measurements derived from continuous GPS data obtained at tracking stations in both the Plate Boundary Observatory and other regional western North America GPS networks, dating back to 1995. The SDS leverages the IT and Web Services developments carried out under the SCIGN/REASoN and ACCESS projects, which have streamlined access to data products for researchers and modelers, and which have created a prototype of an on-the-fly interactive research environment through a modern data portal, GPS Explorer. This IT system has been designed using modern IT tools and principles in order to be extensible to any geographic location, scale, natural hazard, and combination of geophysical sensor and related data. We have built upon open GIS standards, particularly those of the OGC, and have used the principles of Web Service-based Service Oriented Architectures to provide scalability and extensibility to new services and capabilities.

  9. Standardized Access and Processing of Multi-Source Earth Observation Time-Series Data within a Regional Data Middleware

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Schmullius, C.

    2017-12-01

    Increasing archives of global satellite data present a new challenge: handling multi-source satellite data in a user-friendly way. Every user is confronted with different data formats and data access services, and the handling of time-series data is complex, as automated processing and execution of data processing steps are needed to supply the user with the desired product for a specific area of interest. In order to simplify access to the data archives of various satellite missions and to facilitate subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized, web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis, uniform data formats and data access services are provided. Interfaces to the data archives of the MODIS sensor (NASA) as well as the Landsat (USGS) and Sentinel (ESA) satellites have been integrated into the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints in time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data, and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by the OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands, focusing on automated discovery and access of Landsat and Sentinel data for local areas.
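
    As a minimal sketch of the kind of server-side analysis mentioned (trend and breakpoint estimation on a harmonized time series), assuming the middleware has already delivered a uniform monthly NDVI series as a NumPy array; the data below are synthetic.

        import numpy as np

        # Synthetic stand-in for a harmonized monthly NDVI series (ten years).
        rng = np.random.default_rng(1)
        months = np.arange(120)
        ndvi = 0.5 + 0.001 * months + 0.05 * np.sin(2 * np.pi * months / 12)
        ndvi += rng.normal(0.0, 0.02, size=months.size)  # observation noise

        # Ordinary least-squares trend: slope in NDVI units per month.
        slope, intercept = np.polyfit(months, ndvi, deg=1)
        print(f"trend: {slope * 12:.4f} NDVI/year")

        # Crude breakpoint screen: largest jump between consecutive annual means.
        annual = ndvi.reshape(10, 12).mean(axis=1)
        jump_year = int(np.argmax(np.abs(np.diff(annual)))) + 1
        print(f"largest year-to-year change after year {jump_year}")

    In the middleware this kind of routine runs next to the data (e.g., inside a server-side Jupyter session), so only the derived indicators need to reach the user.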

  10. On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg

    2007-01-01

    Sensor Web-based adaptation and sharing of space flight mission resources, including those of the space-ground and control-user communication segment, could greatly benefit from the utilization of heritage Internet protocols and devices applied for spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the utilization of those investments and their acceptance in years to come. As with SpaceIP, commercial real-time, instrument-colocated computational resources, data compression, and storage can be enabled on board a spacecraft and, in turn, support a powerful application of Sensor Web-based design to a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will put Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices to the use for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications, yet they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable to spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).

  11. Realtime Data to Enable Earth-Observing Sensor Web Capabilities

    NASA Astrophysics Data System (ADS)

    Seablom, M. S.

    2015-12-01

    Over the past decade NASA's Earth Science Technology Office (ESTO) has invested in new technologies for information systems to enhance the Earth-observing capabilities of satellites, aircraft, and ground-based in situ observations. One focus area has been to create a common infrastructure for coordinated measurements from multiple vantage points which could be commanded either manually or through autonomous means, such as from a numerical model. This paradigm became known as the sensor web, formally defined to be "a coherent set of heterogeneous, loosely-coupled, distributed observing nodes interconnected by a communications fabric that can collectively behave as a single dynamically adaptive and reconfigurable observing system". This would allow for adaptive targeting of rapidly evolving, transient, or variable meteorological features to improve our ability to monitor, understand, and predict their evolution. It would also enable measurements targeted at critical regions of the atmosphere that are highly sensitive to data analysis errors, thus offering the potential for significant improvements in the predictive skill of numerical weather forecasts. ESTO's investment strategy was twofold. Recognizing that implementation of an operational sensor web would not only involve technical cost and risk but also would require changes to the culture of how flight missions were designed and operated, ESTO funded the development of a mission-planning simulator that would quantitatively assess the added value of coordinated observations. The simulator was designed to provide the capability to perform low-cost engineering and design trade studies using synthetic data generated by observing system simulation experiments (OSSEs). The second part of the investment strategy was to invest in prototype applications that implemented key features of a sensor web, with the dual goals of developing a sensor web reference architecture and supporting useful science activities that would produce immediate benefit. We briefly discuss three of ESTO's sensor web projects that resulted from solicitations released in 2008 and 2011: the Earth System Sensor Web Simulator, the Earth Phenomena Observing System, and the Sensor Web 3G Namibia Flood Pilot.

  12. Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

    2009-12-01

    Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment, and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can be queried more successfully to optimize planning and decision-making: if a dataset is missing in one atlas, it may be immediately located in another, and similar datasets in two atlases may be combined to enhance study in either region. But how best to achieve semantic interoperability to mitigate vague data queries, concepts, or natural language semantics when retrieving and integrating data and information? We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype achieves semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps and the OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called a local ontology. Human or machine users formulate their requests using a common ontology of metadata terms, called a global ontology. A CSW mediator rewrites the user's request into CSW requests over local CSWs using their own (local) ontologies, collects the results, and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS, error management, and exception handling, enabling smart searches, and writing full documentation. This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science into coastal decision-making.
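
    The mediation pattern bottoms out in ordinary CSW requests. Below is a hedged sketch of the kind of key-value-pair GetRecords query the mediator might forward to a local catalogue after rewriting terms from the global to the local ontology; the endpoint and search term are invented.

        import requests

        # Invented CSW endpoint; the mediator substitutes each atlas's local CSW.
        CSW_URL = "https://example.org/atlas/csw"

        params = {
            "service": "CSW",
            "version": "2.0.2",
            "request": "GetRecords",
            "typeNames": "csw:Record",
            "elementSetName": "brief",
            "resultType": "results",
            "constraintLanguage": "CQL_TEXT",
            "constraint_language_version": "1.1.0",
            # Search term already rewritten into the local vocabulary.
            "constraint": "AnyText LIKE '%seagrass%'",
            "maxRecords": "10",
        }

        resp = requests.get(CSW_URL, params=params, timeout=30)
        resp.raise_for_status()
        print(resp.text[:500])  # brief ISO-19139/Dublin Core records (XML)

    The mediator then merges the result sets from all local CSWs and returns them to the user as if they came from a single catalogue.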

  13. Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services

    NASA Astrophysics Data System (ADS)

    Neiers, M.; Dwyer, J.; Neiers, S.

    2008-12-01

    The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.
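    As a concrete illustration of the WMS-based browse delivery described above, here is a minimal sketch using the OWSLib client library; the endpoint and layer name are hypothetical placeholders, not the actual LDCM/UPE services:

    ```python
    # Retrieve a reduced-resolution browse image from a (hypothetical) WMS.
    from owslib.wms import WebMapService

    wms = WebMapService("https://landsat.example.gov/wms", version="1.1.1")

    img = wms.getmap(
        layers=["landsat_browse"],          # illustrative layer name
        srs="EPSG:4326",
        bbox=(-123.0, 44.0, -122.0, 45.0),  # lon/lat area of interest
        size=(512, 512),
        format="image/png",
        transparent=True,
    )
    with open("browse.png", "wb") as f:
        f.write(img.read())
    ```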

  14. Integrated Web-Based Access to and use of Satellite Remote Sensing Data for Improved Decision Making in Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Teng, W.; Chiu, L.; Kempler, S.; Liu, Z.; Nadeau, D.; Rui, H.

    2006-12-01

    Using NASA satellite remote sensing data from multiple sources for hydrologic applications can be a daunting task and requires a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. In order to facilitate such investigations, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure or "Giovanni," which supports a family of Web interfaces (instances) that allow users to perform interactive visualization and analysis online without downloading any data. Two such Giovanni instances are particularly relevant to hydrologic applications: the Tropical Rainfall Measuring Mission (TRMM) Online Visualization and Analysis System (TOVAS) and the Agricultural Online Visualization and Analysis System (AOVAS), both highly popular and widely used for a variety of applications, including those related to several NASA Applications of National Priority, such as Agricultural Efficiency, Disaster Management, Ecological Forecasting, Homeland Security, and Public Health. Dynamic, context-sensitive Web services provided by TOVAS and AOVAS enable users to seamlessly access NASA data from within, and deeply integrate the data into, their local client environments. One example is between TOVAS and Florida International University's TerraFly, a Web-enabled system that serves a broad segment of the research and applications community by facilitating access to various textual, remotely sensed, and vector data. Another example is between AOVAS and the U.S. Department of Agriculture Foreign Agricultural Service (USDA FAS)'s Crop Explorer, the primary decision support tool used by FAS to monitor the production, supply, and demand of agricultural commodities worldwide. AOVAS is also part of GES DISC's Agricultural Information System (AIS), which can operationally provide satellite remote sensing data products (e.g., near-real-time rainfall) and analysis services to agricultural users. AIS enables remote, interoperable access to distributed data by using the GrADS Data Server (GDS) and the Open Geospatial Consortium (OGC)-compliant MapServer. The latter allows access to AIS data from any OGC-compliant client, such as the Earth-Sun System Gateway (ESG) or Google Earth. The Giovanni system is evolving towards a Service-Oriented Architecture and is highly customizable (e.g., adding new products or services), thus availing the hydrologic applications user community of Giovanni's simple-to-use and powerful capabilities to improve decision-making.

  15. A Proxy Design to Leverage the Interconnection of CoAP Wireless Sensor Networks with Web Applications

    PubMed Central

    Ludovici, Alessandro; Calveras, Anna

    2015-01-01

    In this paper, we present the design of a Constrained Application Protocol (CoAP) proxy able to interconnect Web applications based on Hypertext Transfer Protocol (HTTP) and WebSocket with CoAP-based Wireless Sensor Networks. Sensor networks are commonly used to monitor and control physical objects or environments. Smart Cities represent applications of such a nature. Wireless Sensor Networks gather data from their surroundings and send them to a remote application. This data flow may be short- or long-lived. The traditional HTTP long-polling used by Web applications may not be adequate for long-term communications. To overcome this problem, we include the WebSocket protocol in the design of the CoAP proxy. We evaluate the performance of the CoAP proxy in terms of latency and memory consumption. The tests consider long- and short-lived communications. In both cases, we evaluate the performance obtained by the CoAP proxy when using WebSocket and when using HTTP long-polling. PMID:25585107
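    To make the proxying idea concrete, here is a minimal sketch (not the authors' implementation) of a CoAP-to-WebSocket bridge: browser clients connect over WebSocket, and the proxy observes a CoAP resource and pushes every notification to them, avoiding HTTP long-polling. It assumes the aiocoap and websockets Python libraries; the CoAP resource URI is hypothetical.

    ```python
    import asyncio
    import aiocoap
    import websockets  # requires websockets >= 10 (single-argument handler)

    COAP_RESOURCE = "coap://sensor-node.example/temperature"  # hypothetical node

    async def relay(websocket):
        """Observe the CoAP resource and forward notifications over WebSocket."""
        context = await aiocoap.Context.create_client_context()
        request = aiocoap.Message(code=aiocoap.GET, uri=COAP_RESOURCE)
        request.opt.observe = 0  # register as a CoAP observer (RFC 7641)
        pr = context.request(request)

        first = await pr.response                    # initial representation
        await websocket.send(first.payload.decode())
        async for notification in pr.observation:    # server-pushed updates
            await websocket.send(notification.payload.decode())

    async def main():
        # Each connected Web client gets its own observation relay.
        async with websockets.serve(relay, "0.0.0.0", 8765):
            await asyncio.Future()  # run forever

    if __name__ == "__main__":
        asyncio.run(main())
    ```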

  16. Sensor Web in Antarctica: Developing an Intelligent, Autonomous Platform for Locating Biological Flourishes in Cryogenic Environments

    NASA Technical Reports Server (NTRS)

    Delin, K. A.; Harvey, R. P.; Chabot, N. A.; Jackson, S. P.; Adams, Mike; Johnson, D. W.; Britton, J. T.

    2003-01-01

    The most rigorous tests of the ability to detect extant life will occur where biotic activity is limited by severe environmental conditions. Cryogenic environments are among the most severe: the energy and nutrients needed for biological activity are in short supply, while the climate itself is actively destructive to biological mechanisms. In such settings biological activity is often limited to brief flourishes, occurring only when and where conditions are at their most favorable. The closer that typical regional conditions approach conditions that are actively hostile, the more widely distributed biological blooms will be in both time and space. On a spatial dimension of a few meters or a time dimension of a few days, biological activity becomes much more difficult to detect. One way to overcome this difficulty is to establish a Sensor Web that can monitor microclimates over appropriate scales of time and distance, allowing a continuous virtual presence for instant recognition of favorable conditions. A more sophisticated Sensor Web, incorporating metabolic sensors, can effectively meet the challenge of being at "the right place at the right time". This is of particular value in planetary surface missions, where limited mobility and mission timelines require extremely efficient sample and data acquisition. Sensor Webs can be an effective way to fill the gap between broad-scale orbital data collection and fine-scale surface lander science. We are in the process of developing an intelligent, distributed and autonomous Sensor Web that will allow us to monitor microclimate under severe cryogenic conditions, approaching those extant on the surface of Mars. Ultimately this Sensor Web will include the ability to detect and/or establish limits on extant microbiological activity through incorporation of novel metabolic gas sensors. Here we report the results of our first deployment of a Sensor Web prototype in a previously unexplored high-altitude East Antarctic Plateau "micro-oasis" at the MacAlpine Hills, Law Glacier, Antarctica.

  17. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools that deal with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself. [1] Leibovici, D.G., Hobona, G., Stock, K., Jackson, M. (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5. [2] Leibovici, D.G., Pourabdollah, A. (2010) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France. [3] OGC (2011) www.opengeospatial.org. [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008. [5] Leibovici, D.G., Pourabdollah, A., Jackson, M. (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria. [6] Pourabdollah, A., Leibovici, D.G., Jackson, M. (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK.

  18. GeoSciML v3.0 - a significant upgrade of the CGI-IUGS geoscience data model

    NASA Astrophysics Data System (ADS)

    Raymond, O.; Duclaux, G.; Boisvert, E.; Cipolloni, C.; Cox, S.; Laxton, J.; Letourneau, F.; Richard, S.; Ritchie, A.; Sen, M.; Serrano, J.-J.; Simons, B.; Vuollo, J.

    2012-04-01

    GeoSciML version 3.0 (http://www.geosciml.org), released in late 2011, is the latest version of the CGI-IUGS* Interoperability Working Group geoscience data interchange standard. The new version is a significant upgrade and refactoring of GeoSciML v2, which was released in 2008. GeoSciML v3 has already been adopted by several major international interoperability initiatives, including OneGeology, the EU INSPIRE program, and the US Geoscience Information Network, as their standard data exchange format for geoscience data. GeoSciML v3 makes use of recently upgraded versions of several Open Geospatial Consortium (OGC) and ISO data transfer standards, including GML v3.2, SWE Common v2.0, and Observations and Measurements v2 (ISO 19156). The GeoSciML v3 data model has been refactored from a single large application schema with many packages into a number of smaller, but related, application schema modules with individual namespaces. This refactoring allows the use and future development of modules of GeoSciML (e.g., GeologicUnit, GeologicStructure, GeologicAge, Borehole) in smaller, more manageable units. As a result of this refactoring and the integration with new OGC and ISO standards, GeoSciML v3 is not backward compatible with previous GeoSciML versions. The scope of GeoSciML has been extended in version 3.0 to include new models for geomorphological data (a Geomorphology application schema), and for geological specimens, geochronological interpretations, and metadata for geochemical and geochronological analyses (a LaboratoryAnalysis-Specimen application schema). In addition, there is better support for borehole data, and the PhysicalProperties model now supports a wider range of petrophysical measurements. The previously used CGI_Value data type has been superseded by externally governed data types provided by OGC's SWE Common v2 and GML v3.2 data standards. The GeoSciML v3 release includes worked examples of best practice in delivering geochemical analytical data using the Observations and Measurements (ISO 19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern to reference controlled vocabulary concepts using HTTP URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML to be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service to a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability. * International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)

  19. IrLaW an OGC compliant infrared thermography measurement system developed on mini PC with real time computing capabilities for long term monitoring of transport infrastructures

    NASA Astrophysics Data System (ADS)

    Dumoulin, J.; Averty, R.

    2012-04-01

    One of the objectives of the ISTIMES project is to evaluate the potential offered by integrating different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, the uncooled infrared camera is a promising technique, with strong dissemination potential owing to its relatively low cost on the market. Infrared thermography, when used in quantitative mode (outside laboratory conditions) rather than in qualitative mode (vision applied to survey), requires real-time thermal radiative corrections of the raw acquired data to take into account the influence of the evolving natural environment. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was thus studied and developed around low-cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors to feed simplified radiative models running in real time on the GPU of a small PC. The system uses a fast Ethernet camera, a FLIR A320 [1], coupled with a VAISALA WXT520 weather station [2] and a light GPS unit [3] for positioning and dating. It can accommodate other Ethernet cameras (e.g. visible ones) but requires access to the measured data at raw level; in the present study, this was made possible by a specific agreement signed with the FLIR Company. The prototype system is implemented on a low-cost small computer that integrates a GPU card to allow real-time parallel computing [4] of a simplified radiometric heat balance [5] using information measured by the weather station. An HMI was developed under Linux using open-source software and complementary pieces of software developed at IFSTTAR. This new HMI, called "IrLaW", has various functionalities that make it suitable for long-term monitoring on real sites. It can be remotely controlled over wired or wireless communication links, depending on the measurement context and the degree of accessibility of the system when running on a real site. Finally, thanks to the development of a high-level library and the deployment of a daemon, the measurement system was tuned to be compatible with OGC standards. Complementary functionalities were also developed to allow the system to declare itself to a 52North server; for that purpose, a specific plugin was developed for prior insertion at the 52North level. Data are also accessible by tasking the system when required, for instance through the web portal developed in the ISTIMES framework. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.

  20. A Web of Things-Based Emerging Sensor Network Architecture for Smart Control Systems.

    PubMed

    Khan, Murad; Silva, Bhagya Nathali; Han, Kijun

    2017-02-09

    The Web of Things (WoT) plays an important role in representing objects connected to the Internet of Things in a more transparent and effective way. Thus, it enables seamless and ubiquitous web communication between users and smart things. Considering the importance of WoT, we propose a WoT-based emerging sensor network (WoT-ESN), which collects data from sensors, routes sensor data to the web, and integrates smart things into the web employing a representational state transfer (REST) architecture. A smart home scenario is introduced to evaluate the proposed WoT-ESN architecture. The smart home scenario is tested through computer simulation of the energy consumption of various household appliances, device discovery, and response time performance. The simulation results show that the proposed scheme significantly optimizes the energy consumption of the household appliances and the response time of the appliances.

  1. A Web of Things-Based Emerging Sensor Network Architecture for Smart Control Systems

    PubMed Central

    Khan, Murad; Silva, Bhagya Nathali; Han, Kijun

    2017-01-01

    The Web of Things (WoT) plays an important role in representing objects connected to the Internet of Things in a more transparent and effective way. Thus, it enables seamless and ubiquitous web communication between users and smart things. Considering the importance of WoT, we propose a WoT-based emerging sensor network (WoT-ESN), which collects data from sensors, routes sensor data to the web, and integrates smart things into the web employing a representational state transfer (REST) architecture. A smart home scenario is introduced to evaluate the proposed WoT-ESN architecture. The smart home scenario is tested through computer simulation of the energy consumption of various household appliances, device discovery, and response time performance. The simulation results show that the proposed scheme significantly optimizes the energy consumption of the household appliances and the response time of the appliances. PMID:28208787
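    A minimal sketch of the REST style described in this abstract: each smart thing is exposed as a web resource that clients can read (GET) and actuate (PUT). The resource names and the use of Flask are illustrative assumptions, not the authors' implementation.

    ```python
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # In-memory stand-ins for household appliances in a smart-home scenario.
    things = {
        "lamp": {"state": "off", "power_w": 0},
        "heater": {"state": "off", "power_w": 0},
    }

    @app.get("/things/<name>")
    def read_thing(name):
        """Read a thing's current state (sensing)."""
        return jsonify(things[name])

    @app.put("/things/<name>")
    def actuate_thing(name):
        """Update a thing's state (control), e.g. with body {"state": "on"}."""
        things[name].update(request.get_json())
        return jsonify(things[name])

    if __name__ == "__main__":
        app.run(port=8080)
    ```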

  2. Programs and Projects of the Office of General Counsel (OGC)

    EPA Pesticide Factsheets

    The Office of General Counsel (OGC) attorneys work with EPA headquarters and regional offices to provide legal support for the issuance of permits, the approval of state environmental programs, and the initiation and litigation of enforcement actions.

  3. 78 FR 9721 - Privacy Act of 1974; New System of Records, Office of General Counsel E-Discovery Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... massive emails, word processing documents, PDF files, spreadsheets, presentations, database entries, and....pdf . PURPOSES: OGC-EDMS provides OGC with a method to initiate, track, and manage the collection...

  4. Intensive time series data exploitation: the Multi-sensor Evolution Analysis (MEA) platform

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Folegani, Marco; Scremin, Alessandro

    2014-05-01

    The temporal evolution of natural phenomena must be monitored in order to ensure their correct description and to allow improvements in modelling and forecast capabilities. This assumption, obvious for ground-based measurements, has not always held for data collected from space-based platforms. Geostationary satellites and sensors provide very effective monitoring of phenomena at geometric scales from regional to global, but smaller phenomena (with characteristic dimensions below a few kilometres) have been monitored with instruments that could collect data only at intervals of several days; for years, bi-temporal techniques were the most widely used for characterising temporal changes and identifying specific phenomena. As the number of flying sensors has grown and their performance improved, so has their capability to monitor natural phenomena at smaller geographic scales: we can now count on tens of years of remotely sensed data, collected by hundreds of sensors and accessible to a wide user community, and data processing techniques have to adapt to move toward data-intensive exploitation. Starting in 2008, the European Space Agency initiated the development of the Multi-sensor Evolution Analysis (MEA) platform (https://mea.eo.esa.int), whose first aim was to permit the access and exploitation of long-term remotely sensed satellite data from different platforms: 15 years of global (A)ATSR data together with 5 years of regional AVNIR-2 data were loaded into the system and used, through a web-based graphical user interface, for land cover change analysis. MEA data availability has grown over the years, integrating multi-disciplinary data with spatial and temporal dimensions: so far, tens of terabytes of data in the land and atmosphere domains are available and can be visualized and exploited, with the time dimension as the most relevant one (https://mea.eo.esa.int/data_availability.html). MEA is also used as a Climate Data gateway in the framework of the FP7 EarthServer project. In the present work, the principles of the MEA platform are presented, emphasizing the general concept and the methods implemented for data access (including OGC standard data access) and exploitation. To show its effectiveness, use cases focused on multi-field and multi-temporal data analysis are presented.
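    As an illustration of the OGC-standard data access mentioned above, the following sketch requests a spatio-temporal subset of a coverage via WCS with the OWSLib client library; the endpoint, coverage name and parameters are hypothetical, not the actual MEA services.

    ```python
    from owslib.wcs import WebCoverageService

    wcs = WebCoverageService("https://mea.example.int/wcs", version="1.0.0")

    # Request a spatial subset of a (hypothetical) land-cover coverage for one date.
    response = wcs.getCoverage(
        identifier="aatsr_land_cover",     # illustrative coverage name
        bbox=(8.0, 44.0, 12.0, 47.0),      # lon/lat area of interest
        time=["2005-07-01"],
        format="GeoTIFF",
        crs="EPSG:4326",
        width=512,
        height=512,
    )
    with open("subset.tif", "wb") as f:
        f.write(response.read())
    ```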

  5. Developing a Web-based system by integrating VGI and SDI for real estate management and marketing

    NASA Astrophysics Data System (ADS)

    Salajegheh, J.; Hakimpour, F.; Esmaeily, A.

    2014-10-01

    The importance of property in various respects, especially its impact on different sectors of the economy and on national macroeconomics, is clear. Because of the real, multi-dimensional and heterogeneous nature of housing as a commodity, the lack of an integrated system containing comprehensive property information, the lack of awareness among some actors in this field of such comprehensive information, and the lack of clear and comprehensive rules and regulations for trading and pricing, several problems arise for the people involved in this field. This research pursues the implementation of a crowd-sourced Web-based real estate support system. Creating a Spatial Data Infrastructure (SDI) within this system for collecting, updating and integrating all official data about property is also intended in this study. The system uses a Web 2.0 broker and technologies such as Web services and service composition. This work aims to provide comprehensive and diverse information about property from different sources. For this purpose, a five-level real estate support system architecture is used. The PostgreSQL DBMS is used to implement the system; GeoServer serves as the map server and reference implementation of OGC (Open Geospatial Consortium) standards; and an Apache server runs the web pages and user interfaces. Integrating the introduced methods and technologies provides a proper environment for various users to use the system and share their information. This goal can only be achieved through cooperation among all organizations involved in real estate, implementing their required infrastructures as interoperable Web services.

  6. GeoViQua: quality-aware geospatial data discovery and evaluation

    NASA Astrophysics Data System (ADS)

    Bigagli, L.; Papeschi, F.; Mazzetti, P.; Nativi, S.

    2012-04-01

    GeoViQua (QUAlity aware VIsualization for the Global Earth Observation System of Systems) is a recently started FP7 project aiming to complement the Global Earth Observation System of Systems (GEOSS) with rigorous data quality specifications and quality-aware capabilities, in order to improve reliability in scientific studies and policy decision-making. GeoViQua's main scientific and technical objective is to enhance the GEOSS Common Infrastructure (GCI) by providing the user community with innovative quality-aware search and evaluation tools, which will be integrated in the GEO-Portal as well as made available to other end-user interfaces. To this end, GeoViQua will promote the extension of the current standard metadata for geographic information with accurate and expressive quality indicators, also contributing to the definition of a quality label (GEOLabel). The solutions proposed by GeoViQua will be assessed in several pilot case studies covering the whole Earth Observation chain, from remote sensing acquisition to data processing, to applications in the main GEOSS Societal Benefit Areas. This work presents the preliminary results of GeoViQua Work Package 4 "Enhanced geo-search tools" (WP4), started in January 2012. Its major anticipated technical innovations are search and evaluation tools that communicate and exploit data quality information from the GCI. In particular, GeoViQua will investigate a graphical search interface featuring a coherent and meaningful aggregation of statistics and metadata summaries (e.g. in the form of tables and charts), thus enabling end users to leverage quality constraints for data discovery and evaluation. Preparatory work on WP4 requirements indicated that users need the "best" data for their purpose, implying a high degree of subjectivity in judgment. This suggests that the GeoViQua system should exploit a combination of provider-generated metadata (objective indicators such as summary statistics), system-generated metadata (contextual/tracking information such as provenance of data and metadata), and user-generated metadata (informal user comments, usage information, rating, etc.). Moreover, metadata should include sufficiently complete access information to allow rich data visualization and propagation. The following main enabling components are currently identified within WP4: - Quality-aware access services, e.g. a quality-aware extension of the OGC Sensor Observation Service (SOS-Q) specification, to support quality constraints for sensor data publishing and access; - Quality-aware discovery services, namely a quality-aware extension of the OGC Catalogue Service for the Web (CSW-Q), to cope with quality-constrained search; - A quality-augmentation broker (GeoViQua Broker), to support the linking and combination of the existing GCI metadata with the GeoViQua- and user-generated metadata required to support users in selecting the "best" data for their intended use. We are currently developing prototypes of the above quality-enabled geo-search components, which will be assessed in a sensor-based pilot case study in the coming months. In particular, the GeoViQua Broker will be integrated with the EuroGEOSS Broker, to implement CSW-Q and federate (either via distribution or harvesting schemes) quality-aware data sources. GeoViQua will constitute a valuable test-bed for advancing the current best practices and standards in geospatial quality representation and exploitation.
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 265178.

  7. An automated and integrated framework for dust storm detection based on OGC web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advancing scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm that occurred over East Asia from 26 to 28 April 2012 is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this newly automated and integrated framework can be used to give advance near-real-time warning of dust storms, for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
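    To illustrate how a client would drive such a WPS-based chain, here is a minimal sketch using the OWSLib client library; the server URL, process identifier and input names are hypothetical placeholders, not the actual services of this framework.

    ```python
    from owslib.wps import WebProcessingService, monitorExecution

    wps = WebProcessingService("https://example.org/wps")  # hypothetical server

    # Inputs are passed as (identifier, value) pairs; names are illustrative only.
    inputs = [
        ("bbox", "70,10,150,60"),   # East Asia bounding box (lon/lat)
        ("start", "2012-04-26"),
        ("end", "2012-04-28"),
    ]
    execution = wps.execute("DustStormDetection", inputs, output="aot_kml")
    monitorExecution(execution)  # poll the asynchronous execution until it completes

    for output in execution.processOutputs:
        print(output.identifier, output.reference)  # e.g. URL of the KML result
    ```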

  8. International Virtual Observatory System for Water Resources Information

    NASA Astrophysics Data System (ADS)

    Leinenweber, Lewis; Bermudez, Luis

    2013-04-01

    Sharing, accessing, and integrating hydrologic and climatic data have been identified as a critical need for some time. The current state of data portals, standards, technologies, activities, and expertise can be leveraged to develop an initial operational capability for a virtual observatory system. Such a system will make it possible to link observation data with stream networks and models, and to resolve semantic inconsistencies among communities. Prototyping a virtual observatory system is an inter-disciplinary, inter-agency and international endeavor. The Open Geospatial Consortium (OGC), within the OGC Interoperability Program, provides the process and expertise to run such a collaborative effort. The OGC serves as a global forum for the collaboration of developers and users of spatial data products and services, and for advancing the development of international standards for geospatial interoperability. The project coordinated by OGC that is advancing an international virtual observatory system for water resources information is called the Climatology-Hydrology Information Sharing Pilot, Phase 1 (CHISP-1). It includes observations and forecasts in the U.S. and Canada, leveraging current networks and capabilities. It is designed to support the following use cases: 1) Hydrologic modeling for historical and near-future stream flow and groundwater conditions. This requires the integration of trans-boundary stream flow and groundwater well data, as well as national river networks (US NHD and Canada NHN), from multiple agencies. Emphasis will be on time series data and real-time flood monitoring. 2) Modeling and assessment of nutrient load into the lakes. This requires accessing water-quality data from multiple agencies and integrating it with stream flow information for calculating loads. Emphasis is on discrete sampled water quality observations, linking those to specific NHD stream reaches and catchments, and additional metadata for sampled data. The key objectives of these use cases are: 1) To link observation data to the stream network, enabling queries of conditions upstream from a given location to return all relevant gages and well locations; this is currently not practical with the available data sources. 2) To bridge differences in semantics across the information models and processes used by the various data producers, to improve hydrologic and water quality modeling capabilities. Other expected benefits from this project include: - Leveraging a large body of existing data holdings and related activities of multiple agencies in the US and Canada. - Influencing data and metadata standards used internationally for web-based information sharing, through multi-agency cooperation and the OGC standards-setting process. - Reduction of procurement risk through partnership-based development of an initial operating capability, versus the cost of building a fully operational system using a traditional "waterfall" approach. - Identification and clarification of what is possible, and of the key technical and non-technical barriers to continued progress in sharing and integrating hydrologic and climatic information. - Promoting understanding and strengthening ties within the hydro-climatic community. This is anticipated to be the first phase of a multi-phase project, with future work on forecasting the hydrologic consequences of extreme weather events and enabling more sophisticated water quality modeling.

  9. 48 CFR 801.602-75 - Review requirements-OGC.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Review requirements-OGC. 801.602-75 Section 801.602-75 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL DEPARTMENT OF VETERANS AFFAIRS ACQUISITION REGULATION SYSTEM Career Development, Contracting...

  10. 48 CFR 801.602-75 - Review requirements-OGC.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Review requirements-OGC. 801.602-75 Section 801.602-75 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL DEPARTMENT OF VETERANS AFFAIRS ACQUISITION REGULATION SYSTEM Career Development, Contracting...

  11. 48 CFR 801.602-75 - Review requirements-OGC.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Review requirements-OGC. 801.602-75 Section 801.602-75 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL DEPARTMENT OF VETERANS AFFAIRS ACQUISITION REGULATION SYSTEM Career Development, Contracting...

  12. 48 CFR 801.602-75 - Review requirements-OGC.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Review requirements-OGC. 801.602-75 Section 801.602-75 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL DEPARTMENT OF VETERANS AFFAIRS ACQUISITION REGULATION SYSTEM Career Development, Contracting...

  13. 48 CFR 801.602-75 - Review requirements-OGC.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Review requirements-OGC. 801.602-75 Section 801.602-75 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL DEPARTMENT OF VETERANS AFFAIRS ACQUISITION REGULATION SYSTEM Career Development, Contracting...

  14. Sensor Webs to Constellations

    NASA Astrophysics Data System (ADS)

    Cole, M.

    2017-12-01

    Advanced technology plays a key role in enabling the future Earth-observing missions needed for global monitoring and climate research. Rapid progress over the past decade, which is anticipated to continue in the coming decades, has diminished the size of some satellites while increasing the amount of data and the required pace of integration and analysis. Sensor web developments have direct parallels in constellations of smallsats. Reviewing current advances in sensor webs alongside requirements for constellations will improve planning, operations, and data management for future architectures of multiple satellites with a common mission goal.

  15. Sensor Web Technology Challenges and Advancements for the Earth Science Decadal Survey Era

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Moe, Karen

    2011-01-01

    This paper examines the Earth science decadal survey era and the contribution that ESTO-developed sensor web technologies can make to the scientific observations. This includes hardware and software technology advances for in-situ and in-space measurements. Also discussed are emerging areas of importance, such as the potential of small satellites for sensor-web-based observations, advances in data fusion critical to the science and societal benefits of future missions, and the challenges ahead.

  16. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standard descriptions for containerized applications to discover processes on the cloud, including the use of linked data, a WPS extension for hybrid clouds, and links to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute-Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including compression and attribute options reusing patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g. RESTful APIs). The Call for Participation will be issued in December, and responses are due in mid-January 2018.

  18. Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resseguie, David R

    There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks; Sensorpedia (an ad-hoc Internet-scale sensor network); our framework for integrating robots with Sensorpedia; two applications which illustrate our framework's ability to support general web-based robotic control; and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.

  19. Autonomous Triggering of in situ Sensors on Kilauea Volcano, HI, from Eruption Detection by the EO-1 Spacecraft: Design and Operational Scenario.

    NASA Astrophysics Data System (ADS)

    Boudreau, K.; Cecava, J. R.; Behar, A.; Davies, A. G.; Tran, D. Q.; Abtahi, A. A.; Pieri, D. C.; Jpl Volcano Sensor Web Team, A

    2007-12-01

    Response time in acquiring sensor data in volcanic emergencies can be greatly improved through the use of autonomous systems. For instance, ground-based observations and data processing applications of the JPL Volcano Sensor Web have promptly triggered spacecraft observations [e.g., 1]. The reverse command and information flow path can also be useful, using autonomous analysis of spacecraft data to trigger in situ sensors. In this demonstration project, SO2 sensors have been incorporated into expendable "Volcano Monitor" capsules to be placed downwind of the Pu'u 'O'o vent of Kilauea volcano, Hawai'i. In nominal (low) power conservation mode, data from these sensors are collected and transmitted every hour to the Volcano Sensor Web through the Iridium satellite network. If SO2 readings exceed a predetermined threshold, the modem within the Volcano Monitor sends an alert to the Sensor Web, triggering a request for prompt Earth Observing-1 (EO-1) spacecraft data acquisition. During pre-defined "critical events" as perceived by multiple sensors (which could include both in situ and spaceborne devices), however, the Sensor Web can order the SO2 sensors within the Volcano Monitor to increase their sampling frequency to once per minute (high-power "burst mode"). Autonomous control of the sensors' sampling frequency enables the Sensor Web to monitor and respond to rapidly evolving conditions before and during an eruption, and allows near-real-time compilation and dissemination of these data to the scientific community. Reference: [1] Davies et al., (2006) Eos, 87, (1), 1&5. This work was performed at the Jet Propulsion Laboratory-California Institute of Technology, under contract to NASA. Support was provided by the NASA AIST program, the Idaho Space Grant Consortium, and the New Mexico Space Grant Program. We thank the personnel of the USGS Hawaiian Volcano Observatory for their invaluable assistance.
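    The trigger logic described above is essentially a threshold-driven state machine. The following minimal sketch (all names, values and interfaces are hypothetical, not the deployed Volcano Monitor code) illustrates the hourly-versus-burst-mode behaviour:

    ```python
    import time

    SO2_THRESHOLD_PPM = 5.0   # hypothetical trigger level
    NOMINAL_PERIOD_S = 3600   # hourly reporting in low-power mode
    BURST_PERIOD_S = 60       # once per minute during a critical event

    def read_so2_ppm():
        """Placeholder for the actual SO2 sensor driver."""
        raise NotImplementedError

    def send_to_sensor_web(reading_ppm, alert=False):
        """Placeholder for the Iridium uplink to the Volcano Sensor Web."""
        prefix = "ALERT " if alert else ""
        print(f"{prefix}SO2={reading_ppm} ppm")

    def run():
        while True:
            reading = read_so2_ppm()
            if reading > SO2_THRESHOLD_PPM:
                # Alert the Sensor Web, which may task an EO-1 acquisition
                # and command burst-mode sampling in response.
                send_to_sensor_web(reading, alert=True)
                time.sleep(BURST_PERIOD_S)
            else:
                send_to_sensor_web(reading)
                time.sleep(NOMINAL_PERIOD_S)
    ```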

  20. Cyber-physical geographical information service-enabled control of diverse in-situ sensors.

    PubMed

    Chen, Nengcheng; Xiao, Changjiang; Pu, Fangling; Wang, Xiaolei; Wang, Chao; Wang, Zhili; Gong, Jianya

    2015-01-23

    Realization of open online control of diverse in-situ sensors is a challenge. This paper proposes a Cyber-Physical Geographical Information Service-enabled method for controlling diverse in-situ sensors, based on location-based instant sensing with closed-loop feedback. The method adopts the concepts and technologies of newly developed cyber-physical systems (CPSs) to combine control with sensing, communication, and computation; takes advantage of geographical information services, such as those provided by Tianditu (a basic geographic information service platform in China), and Sensor Web services to establish geo-sensor applications; and builds well-designed human-machine interfaces (HMIs) to support online and open interactions between human beings and physical sensors through cyberspace. The method was tested with experiments carried out in two geographically distributed scientific experimental fields, the Baoxie Sensor Web Experimental Field in Wuhan city and the Yemaomian Landslide Monitoring Station in the Three Gorges, with three typical sensors chosen as representatives, using the prototype system Geospatial Sensor Web Common Service Platform. The results show that the proposed method is an open, online, closed-loop means of control.

  1. Cyber-Physical Geographical Information Service-Enabled Control of Diverse In-Situ Sensors

    PubMed Central

    Chen, Nengcheng; Xiao, Changjiang; Pu, Fangling; Wang, Xiaolei; Wang, Chao; Wang, Zhili; Gong, Jianya

    2015-01-01

    Realization of open online control of diverse in-situ sensors is a challenge. This paper proposes a Cyber-Physical Geographical Information Service-enabled method for controlling diverse in-situ sensors, based on location-based instant sensing with closed-loop feedback. The method adopts the concepts and technologies of newly developed cyber-physical systems (CPSs) to combine control with sensing, communication, and computation; takes advantage of geographical information services, such as those provided by Tianditu (a basic geographic information service platform in China), and Sensor Web services to establish geo-sensor applications; and builds well-designed human-machine interfaces (HMIs) to support online and open interactions between human beings and physical sensors through cyberspace. The method was tested with experiments carried out in two geographically distributed scientific experimental fields, the Baoxie Sensor Web Experimental Field in Wuhan city and the Yemaomian Landslide Monitoring Station in the Three Gorges, with three typical sensors chosen as representatives, using the prototype system Geospatial Sensor Web Common Service Platform. The results show that the proposed method is an open, online, closed-loop means of control. PMID:25625906

  2. Modern Technology Aspects for Oceanographic Data Management and Dissemination: The HNODC Implementation

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.

    2009-04-01

    The development of new technologies aimed at enhancing Web applications with dynamic data access was also the starting point for the development of geospatial Web applications. By means of these technologies, Web applications gained the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as Web services enables Web applications to interoperate, i.e. to process requests from each other via a network. In particular, throughout the oceanographic community, modern geographical information systems based on geospatial Web services are now being developed or will be developed in the near future, with capabilities of managing the information itself entirely through Web-based geographical interfaces. The exploitation of the HNODC database through a Web-based application enhanced with Web services and built with open-source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centers (IOC/IODE), holds a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing over 300,000 station data records of physical, chemical and biological oceanographic information. A modern Web application now allows end users worldwide to explore and navigate HNODC data via an interface capable of presenting geographical representations of the geo-information. The application is built from state-of-the-art software components and tools such as: • geospatial and non-spatial Web service mechanisms; • geospatial open-source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the great advantage of interoperability between Web service systems. Roughly, the architecture can be described as follows: • At the back end, the open-source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema owing to the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the Web service middle layers. • Apache Axis, with its embedded implementation of SOAP (the "Simple Object Access Protocol"), acts as the non-spatial Web services mechanism.
    Some modules of the platform are still under development, but their implementation will be completed in the near future. • A new Web user interface for the end user, based on an enhanced and customized version of the Mapbender GUI, a powerful Web services client. For HNODC, the interoperability of Web services is the great advantage of the developed platform, since it can act in the future as both a provider and a consumer of Web services: • either as a data-product provider for external SOA platforms, • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A good example of data management integration and dissemination via such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with Web services. Furthermore, once Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to support any kind of GIS functionality for consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
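    As an illustration of the kind of feature access such a platform exposes, the following minimal sketch queries station features through an OGC WFS using the OWSLib client library; the endpoint and feature type name are hypothetical, not the actual HNODC services.

    ```python
    from owslib.wfs import WebFeatureService

    wfs = WebFeatureService("https://hnodc.example.gr/wfs", version="1.1.0")

    # Fetch station features inside an Aegean bounding box as GML.
    response = wfs.getfeature(
        typename="hnodc:stations",      # illustrative feature type
        bbox=(23.0, 35.0, 27.0, 39.0),  # lon/lat area of interest
        maxfeatures=50,
    )
    with open("stations.gml", "wb") as f:
        f.write(response.read())
    ```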

  3. An international standard for observation data

    NASA Astrophysics Data System (ADS)

    Cox, Simon

    2010-05-01

    A generic information model for observations and related features supports data exchange both within and between different scientific and technical communities. Observations and Measurements (O&M) formalizes a neutral terminology for observation data and metadata. It was based on a model developed for medical observations, and draws on experience from geology and mineral exploration, in-situ monitoring, remote sensing, intelligence, biodiversity studies, ocean observations and climate simulations. Hundreds of current deployments of Sensor Observation Services (SOS), covering multiple disciplines, provide validation of the O&M model. A W3C Incubator group on 'Semantic Sensor Networks' is now using O&M as one of the bases for development of a formal ontology for sensor networks. O&M defines the information describing observation acts and their results, including the following key terms: observation, result, observed-property, feature-of-interest, procedure, phenomenon-time, and result-time. The model separates the (meta-)data associated with the observation procedure, the observed feature, and the observation event itself. Observation results may take various forms, including scalar quantities, categories, vectors, grids, or any data structure required to represent the value of some property of some observed feature. O&M follows the ISO/TC 211 General Feature Model, so non-geometric properties must be associated with typed feature instances. This requires formalization of information that may be trivial when working within some earth-science sub-disciplines (e.g., that temperature, pressure, etc. are associated with the atmosphere or ocean, and not just with a location) but is critical to cross-disciplinary applications. It also allows the same structure and terminology to be used for in-situ, ex-situ and remote sensing observations, as well as for simulations. For example: a stream-level observation is an in-situ monitoring application where the feature-of-interest is a reach, the observed property is water-level, and the result is a time series of heights; stream quality is usually determined by ex-situ observation where the feature-of-interest is a specimen that is recovered from the stream, the observed property is water-quality, and the result is a set of measures of various parameters, or an assessment derived from these; on the other hand, the distribution of surface temperature of a water body is typically determined through remote sensing, where at observation time the procedure is located at a distance from the feature-of-interest, and the result is an image or grid. Observations usually involve sampling of an ultimate feature-of-interest. In the environmental sciences common sampling strategies are used. Spatial sampling is classified primarily by topological dimension (point, curve, surface, volume) and is supported by standard processing and visualisation tools. Specimens are used for ex-situ processing in most disciplines. Sampling features are often part of complexes (e.g. specimens are sub-divided; specimens are retrieved from points along a transect; sections are taken across tracts), so relationships between instances must be recorded, and observational campaigns involve collections of sampling features. The sampling feature model is a core part of O&M, and application experience has shown that describing the relationships between sampling features and observations is generally critical to successful use of the model.
O&M was developed through the Open Geospatial Consortium (OGC) as part of the Sensor Web Enablement (SWE) initiative. Other SWE standards include SensorML, SOS, and the Sensor Planning Service (SPS). The OGC O&M standard (Version 1) had two parts: part 1 describes observation events, and part 2 provides a schema for sampling features. A revised version of O&M (Version 2) is to be published in a single document as ISO 19156. O&M Version 1 included an XML encoding for data exchange, which is used as the payload for SOS responses. The new version will provide a UML model only. Since an XML encoding may be generated following a rule, such as that presented in ISO 19136 (GML 3.2), it is not included in the standard directly. O&M Version 2 thus supports multiple physical implementations and versions.
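
    To make the key O&M terms concrete, here is a minimal sketch of the observation pattern as a Python data structure; the class and field names mirror the terms listed in the abstract but are illustrative, not a normative encoding.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any

@dataclass
class Observation:
    """Illustrative mapping of the core O&M terms (not a normative encoding)."""
    feature_of_interest: str   # e.g. a stream reach, a specimen, a water body
    observed_property: str     # e.g. "water-level", "water-quality"
    procedure: str             # sensor, analytical method, or simulation
    phenomenon_time: datetime  # when the property value applies
    result_time: datetime      # when the result became available
    result: Any                # scalar, category, vector, grid, time series, ...

# The stream-level example from the abstract: an in-situ observation whose
# result is a time series of heights for a reach (identifier is hypothetical).
obs = Observation(
    feature_of_interest="stream reach R42",
    observed_property="water-level",
    procedure="pressure gauge",
    phenomenon_time=datetime(2010, 5, 1, 12, 0),
    result_time=datetime(2010, 5, 1, 12, 5),
    result=[(datetime(2010, 5, 1, 12, 0), 1.31)],  # (time, height in m)
)
```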

  4. Utilizing Novel Non-traditional Sensor Tasking Approaches to Enhance the Space Situational Awareness Picture Maintained by the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Herz, A.; Herz, E.; Center, K.; George, P.; Axelrad, P.; Mutschler, S.; Jones, B.

    2016-09-01

    The Space Surveillance Network (SSN) is tasked with the increasingly difficult mission of detecting, tracking, cataloging and identifying artificial objects orbiting the Earth, including active and inactive satellites, spent rocket bodies, and fragmented debris. Much of the architecture and operations of the SSN are limited and outdated. Efforts are underway to modernize some elements of the systems. Even so, the ability to maintain the best current Space Situational Awareness (SSA) picture and identify emerging events in a timely fashion could be significantly improved by leveraging non-traditional sensor sites. Orbit Logic, the University of Colorado and the University of Texas at Austin are developing an innovative architecture and operations concept to coordinate the tasking and observation information processing of non-traditional assets based on information-theoretic approaches. These confirmed tasking schedules and the resulting data can then be used to "inform" the SSN tasking process. The 'Heimdall Web' system comprises core tasking optimization components and accompanying Web interfaces within a secure, split architecture that will for the first time allow non-traditional sensors to support SSA and improve SSN tasking. Heimdall Web application components appropriately score/prioritize space catalog objects based on covariance, priority, observability, expected information gain, and probability of detection - then coordinate an efficient sensor observation schedule for non-SSN sensors contributing to the overall SSA picture maintained by the Joint Space Operations Center (JSpOC). The Heimdall Web Ops concept supports sensor participation levels of "Scheduled", "Tasked" and "Contributing". Scheduled and Tasked sensors are provided optimized observation schedules or object tracking lists from central algorithms, while Contributing sensors review and select from a list of "desired track objects". All sensors are "Web Enabled" for tasking and feedback, supplying observation schedules, confirmed observations and related data back to Heimdall Web to complete the feedback loop for the next scheduling iteration.
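
    The abstract names the factors used to score and prioritize catalog objects. A toy sketch of such a multi-factor score follows; the weights, field names, and linear combination are assumptions for illustration, not the Heimdall Web algorithm, which is information-theoretic.

```python
from dataclasses import dataclass

@dataclass
class CatalogObject:
    name: str
    covariance_trace: float    # larger = more positional uncertainty
    priority: float            # mission-assigned importance, 0..1
    observability: float       # fraction of the tasking window object is visible
    expected_info_gain: float  # e.g. expected reduction in state uncertainty
    p_detect: float            # probability of detection for this sensor, 0..1

# Hypothetical weights; a real scheduler would derive these rigorously.
WEIGHTS = (0.3, 0.2, 0.1, 0.3, 0.1)

def score(obj: CatalogObject) -> float:
    """Weighted combination of the five factors named in the abstract."""
    factors = (obj.covariance_trace, obj.priority, obj.observability,
               obj.expected_info_gain, obj.p_detect)
    return sum(w * f for w, f in zip(WEIGHTS, factors))

# Task the contributing sensor with the highest-scoring objects first.
objects = [CatalogObject("debris-1", 2.0, 0.5, 0.8, 1.2, 0.9),
           CatalogObject("sat-7", 0.3, 0.9, 0.4, 0.2, 0.7)]
schedule = sorted(objects, key=score, reverse=True)
```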

  5. A GeoNode-Based Multiscale Platform For Management, Visualization And Integration Of DInSAR Data With Different Geospatial Information Sources

    NASA Astrophysics Data System (ADS)

    Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo

    2017-04-01

    This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation both in space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that can handle temporal and spatial variability with high efficiency. For this aim we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange, using standard protocols [3]. SDI tools, however, typically provide access to static datasets and handle only spatial variability. In this paper we use the open source project GeoNode as a framework that extends SDI functionalities to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to realize a comprehensive and distributed infrastructure, following the standards of the Open Geospatial Consortium (OGC), for remote sensing data management, analysis and integration [4,5]. We explain the methodology used to manage the data complexity and data integration with GeoNode. The solution presented in this work for the ingestion of DInSAR products is a promising starting point for future development of an OGC-compliant, semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A new Algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40, 11, pp. 2375-2383. [2] Lanari, R., F. Casu, M. Manzo, G. Zeni, P. Berardino, M. Manunta and A. Pepe (2007), An overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis, P. Appl. Geophys., 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D.D. (ed). 2000. Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (www.geonode.org) [5] Kolodziej, K. (ed). 2004. OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, 1.0.2 edition.

  6. CEOS WGISS Common Framework for WGISS Connected Data Assets

    NASA Astrophysics Data System (ADS)

    Enloe, Y.; Mitchell, A. E.; Albani, M.; Yapur, M.

    2016-12-01

    The Committee on Earth Observation Satellites (CEOS), established in 1984 to coordinate civil space-borne observations of the Earth, has been building, through its Working Group on Information Systems and Services (WGISS), a common data framework to identify and connect data assets at member agencies. Some of these data assets are federated systems such as the CEOS WGISS Integrated Catalog (CWIC), the European Space Agency's FedEO (Federated Earth Observations Missions Access) system, and the International Directory Network (IDN), an international effort developed by NASA to assist researchers in locating information on available data sets. A system-level team provides coordination and oversight to make this loosely coupled federated system function and evolve. WGISS has identified two search standards, the Open Geospatial Consortium (OGC) Catalog Services for the Web (CSW) and the CEOS OpenSearch Best Practices (which reference the OGC OpenSearch Geo and Time Extensions and the OGC OpenSearch Extension for Earth Observation), as well as an interoperable metadata standard (ISO 19115), for use within the WGISS Connected Assets. Data partners must register their data collections in the IDN using the Global Change Master Directory (GCMD) Keywords. Data partners need to support one of the two search standards and be able to map their internal metadata to the ISO 19115 metadata elements. All searchable data must have a data access path. Clients can offer search and access to all or a subset of the satellite data available through the WGISS Connected Data Assets. Clients can offer support for a two-step search: (1) discovery through collection search using platform, instrument, science keywords, etc. at the IDN, and (2) granule metadata search at data partners through CWIC or FedEO. More than a dozen international agencies offer their data through the WGISS Federation or are working on developing their connections. This list includes the European Space Agency, NASA, NOAA, USGS, the National Institute for Space Research (Brazil), the Canadian Center for Mapping and Earth Observations (CCMEO), the Academy for Opto-Electronics (China), the Indian Space Research Organization (ISRO), EUMETSAT, the Russian Federal Space Agency (ROSCOSMOS) and several agencies within Australia.
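
    The two-step search maps naturally onto two OpenSearch requests. A hedged sketch with Python's requests library follows; the endpoint URLs are placeholders, and the parameter names echo the OGC OpenSearch Geo and Time extensions, whereas a real client would read the exact names from each service's OpenSearch description document.

```python
import requests

# Step 1: collection discovery at the IDN (placeholder endpoint).
IDN_OS = "https://example.org/idn/opensearch"
collections = requests.get(IDN_OS, params={
    "keyword": "precipitation",   # GCMD science keyword
    "instrument": "TRMM",         # platform/instrument filter
}, timeout=30)

# Step 2: granule search at a data partner (CWIC or FedEO), with spatial and
# temporal filters in the style of the Geo and Time extensions.
GRANULE_OS = "https://example.org/cwic/opensearch/granules"
granules = requests.get(GRANULE_OS, params={
    "datasetId": "C123-EXAMPLE",            # collection id found in step 1
    "geo:box": "-10,40,0,50",               # west,south,east,north
    "time:start": "2016-01-01T00:00:00Z",
    "time:end": "2016-01-31T23:59:59Z",
}, timeout=30)
print(granules.status_code)
```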

  7. Exosomes derived from human umbilical cord mesenchymal stem cells protect against cisplatin-induced ovarian granulosa cell stress and apoptosis in vitro.

    PubMed

    Sun, Liping; Li, Dong; Song, Kun; Wei, Jianlu; Yao, Shu; Li, Zhao; Su, Xuantao; Ju, Xiuli; Chao, Lan; Deng, Xiaohui; Kong, Beihua; Li, Li

    2017-05-31

    Human umbilical cord mesenchymal stem cells (huMSCs) can treat primary ovarian insufficiency (POI) related to ovarian granulosa cell (OGC) apoptosis caused by cisplatin chemotherapy. Exosomes are a class of membranous vesicles with diameters of 30-200 nm that are constitutively released by eukaryotic cells. Exosomes mediate local cell-to-cell communication by transferring microRNAs and proteins. In the present study, we demonstrated the effects of exosomes derived from huMSCs (huMSC-EXOs) on a cisplatin-induced OGC model in vitro and discussed the preliminary mechanisms involved in these effects. We successfully extracted huMSC-EXOs from huMSC culture supernatant and observed the effective uptake of exosomes by cells with fluorescent staining. Using flow cytometry (with annexin-V/PI labelling), we found that huMSC-EXOs increased the number of living cells. Western blotting showed that the expression of Bcl-2 and caspase-3 was upregulated, whilst the expression of Bax, cleaved caspase-3 and cleaved PARP was downregulated, protecting OGCs. These results suggest that huMSC-EXOs can be used to prevent and treat chemotherapy-induced OGC apoptosis in vitro. Therefore, this work provides insight and further evidence of stem cell function and indicates that huMSC-EXOs protect OGCs from cisplatin-induced injury in vitro.

  8. OneGeology-Europe: architecture, portal and web services to provide a European geological map

    NASA Astrophysics Data System (ADS)

    Tellez-Arenas, Agnès; Serrano, Jean-Jacques; Tertre, François; Laxton, John

    2010-05-01

    OneGeology-Europe is a large, ambitious project to make geological spatial data better known and more accessible. The OneGeology-Europe project develops an integrated system of data to create and make accessible for the first time, through the internet, the geological map of the whole of Europe. The architecture implemented by the project is web-service oriented, based on the OGC standards: the geological map is not a centralized database but is composed of several web services, each of them hosted by a European country involved in the project. Since geological data are elaborated differently from country to country, they are difficult to share. OneGeology-Europe, while providing more detailed and complete information, will foster an easier exchange of data within Europe and globally, even beyond the geological community. This implies substantial work on the harmonization of the data, covering both the model and the content. OneGeology-Europe is characterised by the high technological capacity of the EU Member States, and has the final goal of achieving the harmonisation of European geological survey data according to common standards. As a direct consequence Europe will make a further step in terms of innovation and information dissemination, continuing to play a world-leading role in the development of geosciences information. The scope of the common harmonized data model was defined primarily by the requirements of the geological map of Europe, but in addition users were consulted and the requirements of both INSPIRE and 'high-resolution' geological maps were considered. The data model is based on GeoSciML, developed since 2006 by a group of Geological Surveys. The data providers involved in the project implemented a new component that allows the web services to deliver the geological map expressed in GeoSciML. In order to capture the information describing the geological units of the map of Europe, the scope of the data model needs to include lithology, age, genesis and metamorphic character. For high-resolution maps, physical properties, bedding characteristics and weathering also need to be added. Furthermore, geological data held by national geological surveys are generally described in the national language of the country. The project has to deal with the multilingual issue, an important requirement of the INSPIRE directive. The project provides a list of harmonized vocabularies, a set of web services to deal with them, and a web site that helps geoscientists map the terms used in national datasets to these vocabularies. The web services provided by each data provider, with the particular component that allows them to deliver the harmonised data model and to handle multilingualism, are the first part of the architecture. The project also implements a web portal that provides several functionalities. Thanks to the common data model implemented by each web service delivering a part of the geological map, and using the OGC SLD standard, the client offers the following option: a user can request a sub-selection of the map, for instance by searching on a particular attribute such as "age is Quaternary", and display only the parts of the map that match the filter. Using the web services built on the common vocabularies, the displayed data are translated. The project started in September 2008 and runs for two years, with 29 partners from 20 countries (20 partners are Geological Surveys). The budget is 3.25 M€, with a European Commission contribution of 2.6 M€.
The paper describes the technical solutions used to implement the OneGeology-Europe components: the profile of the common data model for exchanging geological data, the web services to view and access geological data, and a geoportal that provides the user with a user-friendly way to discover, view and access geological data.
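
    The attribute-filtered map option described above corresponds to a WFS GetFeature request with an OGC Filter. A sketch is given below; the endpoint URL and the GeoSciML property name are hypothetical placeholders, while the request structure follows standard WFS 1.1 key-value-pair encoding.

```python
import requests

# Hypothetical endpoint of one national geological survey's web service.
WFS_URL = "https://example.org/geology/wfs"

# OGC Filter selecting geological units whose age is "Quaternary";
# the property name is illustrative, not the actual GeoSciML path.
age_filter = """
<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">
  <ogc:PropertyIsEqualTo>
    <ogc:PropertyName>gsml:geologicUnitAge</ogc:PropertyName>
    <ogc:Literal>Quaternary</ogc:Literal>
  </ogc:PropertyIsEqualTo>
</ogc:Filter>
"""

response = requests.get(WFS_URL, params={
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "gsml:MappedFeature",  # GeoSciML feature type
    "filter": age_filter,              # server returns only matching units
}, timeout=60)
print(response.headers.get("Content-Type"))
```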

  9. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    NASA Astrophysics Data System (ADS)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances of the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web Services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
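
    A minimal sketch of the dispatcher-plus-handlers pattern the paper describes is shown below; the resource names and the CSW payloads are assumptions for illustration, not the paper's actual interface, which defines six handlers over an ebRIM-extended catalogue.

```python
# Minimal sketch of a REST request dispatcher in front of a legacy CSW.

def handle_provenance(resource_id: str) -> str:
    # Would translate to a CSW GetRecordById request against ebRIM extensions.
    return f"<csw:GetRecordById id='{resource_id}'/>"

def handle_datasets(resource_id: str) -> str:
    return f"<csw:GetRecords constraint='dataset={resource_id}'/>"

HANDLERS = {
    "provenance": handle_provenance,
    "datasets": handle_datasets,
    # ... the actual REST Converter defines six resource handlers
}

def dispatch(path: str) -> str:
    """Map a RESTful path like /provenance/42 to the matching handler."""
    _, resource, resource_id = path.split("/", 2)
    handler = HANDLERS.get(resource)
    if handler is None:
        raise KeyError(f"no handler registered for resource '{resource}'")
    return handler(resource_id)

print(dispatch("/provenance/42"))
```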

  10. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    NASA Astrophysics Data System (ADS)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., which was acquired by Google in 2004, KML was handed over to the Open Geospatial Consortium (OGC) in 2007. It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and the development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, have been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML with little or no knowledge of computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
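
    Part of KML's appeal is how little markup a useful document needs. The sketch below generates a single Placemark with Python's standard library; the placemark name and coordinates are illustrative.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)

# Build a minimal KML document with one Placemark.
kml = ET.Element(f"{{{KML_NS}}}kml")
doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Sample earthquake epicenter"
point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
# KML coordinate order is lon,lat[,altitude].
ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "-122.42,37.77,0"

# Any geobrowser that supports the OGC KML standard can open this file.
ET.ElementTree(kml).write("sample.kml", xml_declaration=True, encoding="UTF-8")
```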

  11. Hybrid Exploration Agent Platform and Sensor Web System

    NASA Technical Reports Server (NTRS)

    Stoffel, A. William; VanSteenberg, Michael E.

    2004-01-01

    A sensor web to collect the scientific data needed to further exploration is a major and efficient asset to any exploration effort. This is true not only for lunar and planetary environments, but also for interplanetary and liquid environments. Such a system would also have myriad direct commercial spin-off applications. The Hybrid Exploration Agent Platform and Sensor Web (HEAP-SW), like the ANTS concept, is a sensor web concept. The HEAP-SW is conceptually and practically a very different system. HEAP-SW is applicable to any environment and a huge range of exploration tasks. It is a very robust, low-cost, high-return solution to a complex problem. All of the technology for initial development and implementation is currently available. The HEAP Sensor Web, or HEAP-SW, consists of three major parts: the Hybrid Exploration Agent Platforms (HEAP), the Sensor Web (SW), and the immobile Data collection and Uplink units (DU). The HEAP-SW as a whole refers to any group of mobile agents or robots where each robot is a mobile data collection unit that spends most of its time acting in concert with all other robots, DUs in the web, and the HEAP-SW's overall Command and Control (CC) system. Each DU and robot is, however, capable of acting independently. The three parts of the HEAP-SW system are discussed in this paper. The goals of the HEAP-SW system are: 1) to maximize the amount of exploration-enhancing science data collected; 2) to minimize data loss due to system malfunctions; 3) to minimize or, possibly, eliminate the risk of total system failure; 4) to minimize the size, weight, and power requirements of each HEAP robot; 5) to minimize HEAP-SW system costs. The rest of this paper discusses how these goals are attained.

  12. Standards-based sensor interoperability and networking SensorWeb: an overview

    NASA Astrophysics Data System (ADS)

    Bolling, Sam

    2012-06-01

    The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial-standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.

  13. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and the BPEL execution engine runs it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and the influence of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
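
    The abstract-to-concrete instantiation step can be pictured as binding named workflow steps to catalogued service endpoints. The toy sketch below stands in for the BPEL generation; the step names, catalogue contents, and URLs are hypothetical, not the paper's actual model.

```python
# Abstract GPW model: ordered step types, no concrete services bound yet.
abstract_gpw = ["SOS:GetObservation", "WPS:FireClassification", "WMS:GetMap"]

# Endpoints a Catalogue Service node might return for each abstract step.
catalogue = {
    "SOS:GetObservation": "https://example.org/eo1/sos",
    "WPS:FireClassification": "https://example.org/wps/fire",
    "WMS:GetMap": "https://example.org/wms",
}

def instantiate(abstract_model):
    """Bind each abstract step to a concrete endpoint (stand-in for the
    model instantiation service that generates concrete BPEL)."""
    return [(step, catalogue[step]) for step in abstract_model]

for step, endpoint in instantiate(abstract_gpw):
    print(f"{step} -> {endpoint}")
```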

  14. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  15. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  16. A Common Metadata System for Marine Data Portals

    NASA Astrophysics Data System (ADS)

    Wosniok, C.; Breitbach, G.; Lehfeldt, R.

    2012-04-01

    Processing and allocation of marine datasets depend on the nature of the data, which result from field campaigns, continuous monitoring and numerical modeling. Two research and development projects in northern Germany manage different types of marine data. Due to different data characteristics and institutional frameworks, separate data portals are required. This paper describes the integration of distributed marine data in Germany. The Marine Data Infrastructure of Germany (MDI-DE) supports public authorities in the German coastal zone with the implementation of European directives like INSPIRE or the Marine Strategy Framework Directive. This is carried out by setting up standardized web services within a network of participating coastal agencies and by installing a common data portal (http://www.mdi-de.org), which integrates distributed marine data concerning coastal engineering, coastal water protection and nature conservation in an interoperable and harmonized manner for administrative and scientific purposes as well as for the information of the general public. The Coastal Observation System for Northern and Arctic Seas (COSYNA) aims at developing and testing analysis systems for the operational synoptic description of the environmental status of the North Sea and of Arctic coastal waters. This is done by establishing a network of monitoring facilities and providing their data in near-real-time. In-situ measurements with poles, ferry boxes, and buoys, together with remote sensing measurements and the assimilation of these data into simulation results, enable COSYNA to provide pre-operational 'products' that are beyond the present routinely applied techniques in observation and modelling. The data allocation in near-real-time requires thoroughly executed data validation, which is processed on the fly before data are passed on to the COSYNA portal (http://kofserver2.hzg.de/codm/). Both projects apply OGC standards such as the Web Map Service (WMS), Web Feature Service (WFS) and Sensor Observation Service (SOS), which ensures interoperability and extensibility. In addition, metadata, a crucial component for searching and finding information in large data infrastructures, are provided via the Catalogue Service for the Web (CS-W). MDI-DE and COSYNA rely on the metadata information system for marine metadata NOKIS, which reflects a metadata profile tailored for marine data according to the specifications of German coastal authorities. In spite of this common software base, interoperability between the two data collections requires constant alignment of the diverse data processed by the two portals. While monitoring data in the MDI-DE are currently rather campaign-based, COSYNA has to fit constantly evolving time series into metadata sets. With all data following the same metadata profile, we now reach full interoperability between the different data collections. The distributed marine information system provides options to search, find and visualise the harmonised results from continuous monitoring, field campaigns, numerical modeling and other data in one web client.
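
    A hedged example of the kind of SOS request such portals serve, using the SOS 2.0 key-value-pair binding; the endpoint, offering identifier, and observed property are placeholders, not the actual MDI-DE or COSYNA service configuration.

```python
import requests

# Placeholder SOS endpoint; MDI-DE and COSYNA each run their own services.
SOS_URL = "https://example.org/sos"

response = requests.get(SOS_URL, params={
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "WATER_TEMPERATURE",             # hypothetical offering id
    "observedProperty": "sea_water_temperature", # hypothetical property URI
    # Temporal filter over the observation's phenomenon time.
    "temporalFilter": "om:phenomenonTime,2012-01-01T00:00:00Z/2012-01-31T23:59:59Z",
}, timeout=60)
print(response.status_code)
```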

  17. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) that takes advantage of workflows of services provides an efficient, flexible and reliable way to respond to different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on SOA platforms to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow that acquires and processes remotely sensed data, detects fires and sends notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented, using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as a composition engine. The applicability of the proposed architecture is evaluated through a real-world fire event detection and notification use case. A GeoPortal client based on open-source software was developed to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
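
    A hedged sketch of invoking such a fire-detection process through the WPS 1.0.0 key-value-pair Execute binding; the endpoint, process identifier, and input names are hypothetical, not the paper's deployed service.

```python
import requests

WPS_URL = "https://example.org/wps"  # placeholder WPS endpoint

# WPS 1.0.0 KVP Execute: inputs are passed as key=value pairs separated by ";".
response = requests.get(WPS_URL, params={
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "MODISFireDetection",  # hypothetical process id
    "datainputs": "scene=MOD021KM.A2013252.hdf;threshold=330",
}, timeout=120)
print(response.status_code)
```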

  18. Spatial data standards meet meteorological data - pushing the boundaries

    NASA Astrophysics Data System (ADS)

    Wagemann, Julia; Siemen, Stephan; Lamy-Thepaut, Sylvie

    2017-04-01

    The data archive of the European Centre for Medium-Range Weather Forecasts (ECMWF) holds around 120 PB of data and is the world's largest archive of meteorological data. This information is of great value for many Earth Science disciplines, but the complexity of the data (up to five dimensions and different time axis domains) and its native data format GRIB, while being an efficient archive format, limit the overall data uptake, especially by users outside the MetOcean domain. ECMWF's MARS WebAPI is a very efficient and flexible system for expert users to access and retrieve meteorological data, though challenging for users outside the MetOcean domain. With the help of web-based standards for data access and processing, ECMWF wants to make more than 1 PB of meteorological and climate data more easily accessible to users across different Earth Science disciplines. As the climate data provider for the H2020 project EarthServer-2, ECMWF explores the feasibility of giving on-demand access to its MARS archive via the OGC standard interface Web Coverage Service (WCS). Despite the potential a WCS offers for climate and meteorological data, the standards-based modelling of meteorological and climate data entails many challenges and reveals the boundaries of the current Web Coverage Service 2.0 standard. Challenges range from valid semantic data models for meteorological data to optimal and efficient data structures for a scalable web service. The presentation reviews the applicability of the current Web Coverage Service 2.0 standard to meteorological and climate data and discusses the challenges that must be overcome in order to achieve real interoperability and to ensure the conformant sharing and exchange of meteorological data.
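
    For readers unfamiliar with WCS 2.0, the request pattern under discussion looks like the sketch below; the endpoint and coverage identifier are placeholders, while the repeated "subset" parameters are the standard WCS 2.0 axis-trimming mechanism (a time axis would be subset the same way).

```python
import requests

WCS_URL = "https://example.org/wcs"  # placeholder endpoint

# WCS 2.0 GetCoverage with trims along named axes via "subset" parameters.
response = requests.get(WCS_URL, params=[
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "temperature_2m"),   # hypothetical coverage id
    ("subset", "Lat(35.0,60.0)"),
    ("subset", "Long(-10.0,30.0)"),
    ("format", "application/netcdf"),
], timeout=120)
print(response.headers.get("Content-Type"))
```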

  19. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service for the Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, as well as the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and especially query on-demand data in the virtual community and retrieve it through data-related services that provide functions such as subsetting, reformatting and reprojection. This work facilitates the sharing and interoperation of geospatial resources in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also allows researchers to focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  20. Tropical Rainfall Measuring Mission (TRMM) Precipitation Data and Services for Research and Applications

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Teng, William; Kempler, Steven

    2012-01-01

    Precipitation is a critical component of the Earth's hydrological cycle. Launched on 27 November 1997, TRMM is a joint U.S.-Japan satellite mission to provide the first detailed and comprehensive data set of the four-dimensional distribution of rainfall and latent heating over vastly under-sampled tropical and subtropical oceans and continents (40°S - 40°N). Over the past 14 years, TRMM has been a major data source for meteorological, hydrological and other research and application activities around the world. The purpose of this short article is to inform readers that the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) provides TRMM archive and near-real-time precipitation data sets and services for research and applications. TRMM data consist of orbital data from TRMM instruments at the sensor's resolution, gridded data at a range of spatial and temporal resolutions, subsets, ground-based instrument data, and ancillary data. Data analysis, display, and delivery are facilitated by the following services: (1) Mirador (data search and access); (2) TOVAS (TRMM Online Visualization and Analysis System); (3) OPeNDAP (Open-source Project for a Network Data Access Protocol); (4) GrADS Data Server (GDS); and (5) Open Geospatial Consortium (OGC) Web Map Service (WMS) for the GIS community. Precipitation data application services are available to support a wide variety of applications around the world. Future plans include enhanced and new services to address data-related issues from the user community. Meanwhile, the GES DISC is preparing for the Global Precipitation Measurement (GPM) mission, which is scheduled for launch in 2014.

  1. Cloud2IR: Infrared thermography and environmental sensors integrated in an autonomous system for long-term monitoring of structures

    NASA Astrophysics Data System (ADS)

    Crinière, Antoine; Dumoulin, Jean; Mevel, Laurent; Andrade-Barroso, Guillermo

    2016-04-01

    Since late 2014, the Cloud2SM project has aimed to develop a robust information system able to ensure the long-term monitoring of civil engineering structures as well as to interface various sensors and data. Cloud2SM addresses three main goals: the management of distributed data and sensor networks, the asynchronous processing of the data over the network, and the local management of the sensors themselves [1]. Integrated into this project, Cloud2IR is an autonomous sensor system dedicated to the long-term monitoring of infrastructures. Past experiments have shown the need for, as well as the usefulness of, such a system [2]. Before Cloud2IR, an initially laboratory-oriented system was used, which required a heavy operating system [3]. Building on that system, Cloud2IR benefits from the experimental knowledge acquired to define a lighter architecture based on generic standards, more appropriate for autonomous operation in the field, which can later be included in a widely distributed architecture such as Cloud2SM. The sensor system can be divided into two parts. The sensor side is mainly composed of the various sensor drivers themselves, such as the infrared camera, the weather station or the pyranometers, and their different fixed configurations. In our case, as infrared cameras are slightly different from other kinds of sensors, the system additionally implements an RTSP server that can be used to set up the FOV as well as other measurement parameters. The second part can be seen as the data side, which is common to all sensors. It instantiates all the sensors through a generic interface and controls the data access loop (not the requesting). This side of the system is weakly coupled (see data coupling) with the sensor side. It can be seen as a general framework able to aggregate sensor data of any type or size and automatically encapsulate them in generic data formats such as HDF5, or in cloud data formats such as the OGC SWE standards. This part is also responsible for the acquisition scenario, the local storage management and the network management, through SFTP or SOAP for the OGC frames. The data side only needs an XML configuration file, and if a configuration change occurs the system is automatically restarted with the new values. Cloud2IR has been deployed in the field for several months at the SenseCity outdoor test bed in Marne-la-Vallée (France) [4]. The next step will be the full standardisation of the system and possibly the full separation between the sensor side and the data side, which can eventually be seen as an external framework. References: [1] A. Crinière, J. Dumoulin, L. Mevel, G. Andrade-Barosso, M. Simonin. The Cloud2SM Project. European Geosciences Union General Assembly (EGU2015), Apr 2015, Vienna, Austria. 2015. [2] J. Dumoulin, A. Crinière, and R. Averty. The detection and thermal characterization of the inner structure of the 'Musmeci' bridge deck by infrared thermography monitoring. Journal of Geophysics and Engineering, doi:10.1088/1742-2132/10/6/064003, Vol. 10, 2013. [3] J. Dumoulin, R. Averty. Development of an infrared system coupled with a weather station for real time atmospheric corrections using GPU computing: Application to bridge monitoring. In Proc. of 11th International Conference on Quantitative InfraRed Thermography, Naples, Italy, 2012. [4] F. Derkx, B. Lebental, T. Bourouina, Frédéric B., C. Cojocaru, et al. The Sense-City project. XVIIIth Symposium on Vibrations, Shocks and Noise, Jul 2012, France. 9p, 2012.
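
    The generic aggregation idea, encapsulating heterogeneous sensor records in HDF5, can be sketched with h5py as below; the file layout, dataset names, and attributes are illustrative, not Cloud2IR's actual schema.

```python
import numpy as np
import h5py

# Aggregate heterogeneous sensor records into one HDF5 file, one group per sensor.
with h5py.File("acquisition_2016-04-01.h5", "w") as f:
    ir = f.create_group("infrared_camera")
    # A stack of 16-bit thermal frames (random data as a stand-in).
    frames = ir.create_dataset("frames", data=np.random.randint(
        0, 2**16, size=(10, 240, 320), dtype=np.uint16))
    frames.attrs["units"] = "raw_counts"
    frames.attrs["acquisition_period_s"] = 60.0

    weather = f.create_group("weather_station")
    weather.create_dataset("air_temperature_C", data=np.array([12.3, 12.1]))
    weather.attrs["sampling"] = "1 sample per acquisition cycle"
```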

  2. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    NASA Astrophysics Data System (ADS)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High-resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and are available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and usable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use case of providing a web data service and supporting a derived-product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for efficiently handling LAS/LAZ-based point workflows, and native HDF5 libraries for handling point data kept in HDF5-based structures (e.g. NetCDF4, SPDlib [4]). Points stored in database tables (e.g. postgres-pointcloud [5]) will be considered as testing continues. Visualising and exploring massive point datasets in a web browser alongside multiple datasets has been demonstrated by the entwine-3D tiles project [6]. This is a powerful interface which enables users to investigate and select appropriate data, and is also being investigated as a potential front-end to a WPS-based point data service. In this work we show preliminary results for a WPS-based point data access system, in preparation for demonstration at FOSS4G 2017, Boston (http://2017.foss4g.org/). [1] http://nci.org.au/data-collections/nerdip/ [2] http://www.opengeospatial.org/standards/wps [3] http://www.pdal.io [4] http://www.spdlib.org/doku.php [5] https://github.com/pgpointcloud/pointcloud [6] http://cesium.entwine.io
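
    A short sketch of the kind of server-side PDAL pipeline such a WPS backend might execute, using PDAL's Python bindings; the file names and bounding box are placeholders, not NCI's actual workflow.

```python
import json
import pdal  # PDAL's Python bindings

# A PDAL pipeline: read LAZ, crop to the client's bounding box, write LAZ back.
pipeline_def = {
    "pipeline": [
        "input.laz",
        {"type": "filters.crop",
         "bounds": "([440000, 441000], [5930000, 5931000])"},
        {"type": "writers.las", "filename": "subset.laz",
         "compression": "laszip"},
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
count = pipeline.execute()  # returns the number of points processed
print(f"{count} points written to subset.laz")
```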

  3. Scalability Issues for Remote Sensing Infrastructure: A Case Study.

    PubMed

    Liu, Yang; Picard, Sean; Williamson, Carey

    2017-04-29

    For the past decade, a team of University of Calgary researchers has operated a large "sensor Web" to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system's memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure.

  4. Linked Data: what does it offer Earth Sciences?

    NASA Astrophysics Data System (ADS)

    Cox, Simon; Schade, Sven

    2010-05-01

    'Linked Data' is a current buzz-phrase promoting access to various forms of data on the internet. It starts from the two principles that have underpinned the architecture and scalability of the World Wide Web: 1. Universal Resource Identifiers - using the http protocol which is supported by the DNS system. 2. Hypertext - in which URIs of related resources are embedded within a document. Browsing is the key mode of interaction, with traversal of links between resources under control of the client. Linked Data also adds, or re-emphasizes: • Content negotiation - whereby the client uses http headers to tell the service what representation of a resource is acceptable, • Semantic Web principles - formal semantics for links, following the RDF data model and encoding, and • The 'mashup' effect - in which original and unexpected value may emerge from reuse of data, even if published in raw or unpolished form. Linked Data promotes typed links to all kinds of data, so is where the semantic web meets the 'deep web', i.e. resources which may be accessed using web protocols, but are in representations not indexed by search engines. Earth sciences are data rich, but with a strong legacy of specialized formats managed and processed by disconnected applications. However, most contemporary research problems require a cross-disciplinary approach, in which the heterogeneity resulting from that legacy is a significant challenge. In this context, Linked Data clearly has much to offer the earth sciences. But, there are some important questions to answer. What is a resource? Most earth science data is organized in arrays and databases. A subset useful for a particular study is usually identified by a parameterized query. The Linked Data paradigm emerged from the world of documents, and will often only resolve data-sets. It is impractical to create even nested navigation resources containing links to all potentially useful objects or subsets. From the viewpoint of human user interfaces, the browse metaphor, which has been such an important part of the success of the web, must be augmented with other interaction mechanisms, including query. What are the impacts on search and metadata? Hypertext provides links selected by the page provider. However, science should endeavor to be exhaustive in its use of data. Resource discovery through links must be supplemented by more systematic data discovery through search. Conversely, the crawlers that generate search indexes must be fed by resource providers (a) serving navigation pages with links to every dataset (b) adding enough 'metadata' (semantics) on each link to effectively populate the indexes. Linked Data makes this easier due to its integration with semantic web technologies, including structured vocabularies. What is the relation between structured data and Linked Data? Linked Data has focused on web-pages (primarily HTML) for human browsing, and RDF for semantics, assuming that other representations are opaque. However, this overlooks the wealth of XML data on the web, some of which is structured according to XML Schemas that provide semantics. Technical applications can use content-negotiation to get a structured representation, and exploit its semantics. Particularly relevant for earth sciences are data representations based on OGC Geography Markup Language (GML), such as GeoSciML, O&M and MOLES. GML was strongly influenced by RDF, and typed links are intrinsic: xlink:href plays the role that rdf:resource does in RDF representations. 
Services which expose GML-formatted resources (such as OGC Web Feature Service) are a prototype of Linked Data. Giving credit where it is due. Organizations investing in data collection may be reluctant to publish the raw data prior to completing an initial analysis. To encourage early data publication the system must provide suitable incentives, and citation analysis must recognize the increasing diversity of publication routes and forms. Linked Data makes it easier to include rich citation information when data is both published and used.
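
    The content-negotiation pattern described above takes only a few lines in practice; the URI below is a placeholder, and the media types are examples of representations a Linked Data server might offer.

```python
import requests

uri = "https://example.org/id/feature/42"  # placeholder Linked Data URI

# Ask for RDF; a Linked Data server inspects the Accept header and either
# returns that representation directly or redirects (e.g. 303) to it.
rdf = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=30)

# The same URI can yield HTML for a human, or GML for a technical client
# that wants to exploit the schema-level semantics.
gml = requests.get(uri, headers={"Accept": "application/gml+xml"}, timeout=30)

print(rdf.headers.get("Content-Type"), gml.headers.get("Content-Type"))
```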

  5. Off-great-circle paths in transequatorial propagation: 2. Nonmagnetic-field-aligned reflections

    NASA Astrophysics Data System (ADS)

    Tsunoda, Roland T.; Maruyama, Takashi; Tsugawa, Takuya; Yokoyama, Tatsuhiro; Ishii, Mamoru; Nguyen, Trang T.; Ogawa, Tadahiko; Nishioka, Michi

    2016-11-01

    There is considerable evidence that plasma structure in the nighttime equatorial F layer develops from large-scale wave structure (LSWS) in the bottomside F layer. However, crucial details of how this process proceeds, from LSWS to equatorial plasma bubbles (EPBs), remain to be sorted out. A major obstacle to success is the paucity of measurements that provide a space-time description of the bottomside F layer over a broad geographical region. The transequatorial propagation (TEP) experiment is one of the few methods that can do so. New findings using a TEP experiment between Shepparton (SHP), Australia, and Oarai (ORI), Japan, are presented in two companion papers. In Paper 1 (P1), (1) off-great-circle (OGC) paths are described in terms of discrete and diffuse types, (2) descriptions of OGC paths are generalized from a single-reflection to a multiple-reflection process, and (3) the discrete type is shown to be associated with an unstructured but distorted upwelling, whereas the diffuse type is shown to be associated with EPBs. In Paper 2 (P2), attention is placed on differences in east-west (EW) asymmetry, found between OGC paths from the SHP-ORI experiment and those from another near-identical TEP experiment. Differences are reconciled by allowing three distinct sources for the EW asymmetries: (1) reflection properties within an upwelling (see P1), (2) OGC paths that depend on the magnetic declination of the geomagnetic field (B), and (3) OGC paths supported by non-B-aligned reflectors at latitudes where the inclination of B is finite.

  6. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  7. 77 FR 49011 - Privacy Act of 1974; New System of Records, Office of General Counsel E-Discovery Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-15

    ... System of Records, Office of General Counsel E-Discovery Management System--Change in Final Effective... the OGC E-Discovery Management System until after the opportunity for further comment is provided to... intent to establish a new system of records for OGC's E-Discovery Management System (EDMS), a system...

  8. Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing.

    PubMed

    Jimenez-Molina, Angel; Retamal, Cristian; Lira, Hernan

    2018-02-03

    Knowledge of the mental workload induced by a Web page is essential for improving users' browsing experience. However, continuously assessing the mental workload during a browsing task is challenging. To address this issue, this paper leverages the correlation between stimuli and physiological responses, which are measured with high-frequency, non-invasive psychophysiological sensors during very short span windows. An experiment was conducted to identify levels of mental workload through the analysis of pupil dilation measured by an eye-tracking sensor. In addition, a method was developed to classify mental workload by appropriately combining different signals (electrodermal activity (EDA), electrocardiogram, photoplethysmography (PPG), electroencephalogram (EEG), temperature and pupil dilation) obtained with non-invasive psychophysiological sensors. The results show that the Web browsing task involves four levels of mental workload. Also, by combining all the sensors, the efficiency of the classification reaches 93.7%.
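
    A toy sketch of the multi-signal classification step with scikit-learn; the synthetic features, four-level labels, and the choice of a random forest are assumptions for illustration, since the abstract does not specify the exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# One row per short time window: features derived from EDA, ECG, PPG, EEG,
# temperature and pupil dilation (synthetic stand-in data).
X = rng.normal(size=(200, 6))
y = rng.integers(0, 4, size=200)  # four mental workload levels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy on synthetic data: {scores.mean():.3f}")
```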

  9. Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing

    PubMed Central

    Jimenez-Molina, Angel; Retamal, Cristian; Lira, Hernan

    2018-01-01

    Knowledge of the mental workload induced by a Web page is essential for improving users’ browsing experience. However, continuously assessing the mental workload during a browsing task is challenging. To address this issue, this paper leverages the correlation between stimuli and physiological responses, which are measured with high-frequency, non-invasive psychophysiological sensors during very short span windows. An experiment was conducted to identify levels of mental workload through the analysis of pupil dilation measured by an eye-tracking sensor. In addition, a method was developed to classify mental workload by appropriately combining different signals (electrodermal activity (EDA), electrocardiogram, photoplethysmography (PPG), electroencephalogram (EEG), temperature and pupil dilation) obtained with non-invasive psychophysiological sensors. The results show that the Web browsing task involves four levels of mental workload. Also, by combining all the sensors, the efficiency of the classification reaches 93.7%. PMID:29401688

  10. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs, intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  11. PAVICS: A platform for the Analysis and Visualization of Climate Science - adopting a workflow-based analysis method for dealing with a multitude of climate data sources

    NASA Astrophysics Data System (ADS)

    Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

    2017-12-01

    As the number of scientific studies and policy decisions requiring tailored climate information continues to increase, the demand for support from climate service centers to provide the latest information in the format most helpful to the end user is also on the rise. Ouranos, one such organization based in Montreal, has partnered with the Centre de recherche informatique de Montreal (CRIM) to develop a platform offering the climate data products that years of consultation have identified as most useful. The platform is built from modular components that target the various requirements of climate data analysis. The data components host and catalog NetCDF data as well as geographical and political delimitations. The analysis components are made available as atomic operations through the Web Processing Service (WPS) or as workflows, whereby the operations are chained through a simple JSON structure and executed on a distributed network of computing resources. The visualization components range from the Web Map Service (WMS) to a complete frontend for searching the data, launching workflows and interacting with maps of the results. Each component can easily be deployed and executed as an independent service through the use of Docker technology, and a proxy is available to regulate user workspaces and access permissions. PAVICS includes various components from Birdhouse, a collection of WPS packages initially developed by the German Climate Computing Center (DKRZ) and the Institut Pierre Simon Laplace (IPSL), and is designed to be highly interoperable with other WPS as well as many Open Geospatial Consortium (OGC) standards. Further connectivity is made with Earth System Grid Federation (ESGF) nodes, and local results are made searchable using the same API terminology. Other projects conducted by CRIM that integrate with PAVICS include the OGC Testbed 13 Innovation Program (IP) initiative, which will enhance advanced cloud capabilities and application-packaging deployment processes, as well as enable Earth Observation (EO) processes relevant to climate. As part of its experimental agenda, working implementations of scalable machine learning on big climate data with Spark and SciSpark were delivered.
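
    The workflow mechanism chains atomic WPS operations through a simple JSON structure. A minimal sketch of what such a chain and a single WPS call could look like, using OWSLib; the endpoint URL and process identifiers below are hypothetical placeholders, not PAVICS's actual API:

```python
# Illustrative sketch: a PAVICS-style workflow expressed as simple JSON,
# plus one WPS call with OWSLib. Endpoint and process names are hypothetical.
import json
from owslib.wps import WebProcessingService

workflow = {
    "name": "subset_then_average",
    "tasks": [
        {"process": "subset_bbox",
         "inputs": {"lon0": -80, "lon1": -60, "lat0": 40, "lat1": 60}},
        {"process": "temporal_mean", "inputs": {"variable": "tasmax"}},
    ],
}
print(json.dumps(workflow, indent=2))

wps = WebProcessingService("https://example.org/wps")  # hypothetical URL
execution = wps.execute(
    "subset_bbox",  # hypothetical process identifier
    inputs=[("lon0", "-80"), ("lon1", "-60"), ("lat0", "40"), ("lat1", "60")],
)
print(execution.status)
```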

  12. The Cloud2SM Project

    NASA Astrophysics Data System (ADS)

    Crinière, Antoine; Dumoulin, Jean; Mevel, Laurent; Andrade-Barosso, Guillermo; Simonin, Matthieu

    2015-04-01

    Over the past decades, the monitoring of civil engineering structures has become a major field of research and development in the domains of modelling and integrated instrumentation. This increasing interest can be attributed partly to the need to control the aging of such structures and partly to the need to optimize maintenance costs. From this standpoint, the Cloud2SM project (Cloud architecture design for Structural Monitoring with in-line Sensors and Models tasking) has been launched to develop a robust information system able to support the long-term monitoring of civil engineering structures as well as to interface various sensors and data. The specificity of this architecture is that it is based on the notion of data processing through physical or statistical models. Thus the data processing, whether physical or mathematical, can be seen here as a resource of the main architecture. The project can be divided into several items: - The sensors and their measurement processes: these items provide data to the main architecture and can embed storage or computational resources. Depending on onboard capacity and the amount of data generated, heavy and light sensors can be distinguished. - The storage resources: based on the cloud concept, this resource can store at least two types of data: raw data and processed data. - The computational resources: this item includes embedded "pseudo real-time" resources such as a dedicated computer cluster or other computational resources. - The models: used for the conversion of raw data into meaningful data. These resources inform the system of their needs and can be seen as independent blocks of the system. - The user interface: this item can be divided into various HMIs to support maintenance operations on the sensors or to present information to the user. - The demonstrators: the structures themselves. This project follows previous research work initiated in the European project ISTIMES [1]. It includes the infrared thermal monitoring of civil engineering structures [2-3] and/or the vibration monitoring of such structures [4-5]. The chosen architecture is based on OGC standards in order to ensure interoperability between the various measurement systems; this concept is extended to the notion of physical models. Last but not least, a main objective of this project is to explore the feasibility and reliability of deploying mathematical models and processing large amounts of data using the GPGPU capacity of a dedicated computational cluster, while studying how OGC standardization applies to these technical concepts. References: [1] M. Proto et al., "Transport Infrastructure surveillance and Monitoring by Electromagnetic Sensing: the ISTIMES project", Sensors 2010, 10(12), 10620-10639, doi:10.3390/s101210620, December 2010. [2] J. Dumoulin, A. Crinière, R. Averty, "Detection and thermal characterization of the inner structure of the 'Musmeci' bridge deck by infrared thermography monitoring", Journal of Geophysics and Engineering, Volume 10, Number 2, 17 pages, November 2013, IOP Science, doi:10.1088/1742-2132/10/6/064003. [3] J. Dumoulin and V. Boucher, "Infrared thermography system for transport infrastructures survey with inline local atmospheric parameter measurements and offline model for radiation attenuation evaluations", J. Appl. Remote Sens., 8(1), 084978 (2014), doi:10.1117/1.JRS.8.084978. [4] V. Le Cam, M. Doehler, M. Le Pen, L. Mevel, "Embedded modal analysis algorithms on the smart wireless sensor platform PEGASE", in Proc. 9th International Workshop on Structural Health Monitoring, Stanford, CA, USA, 2013. [5] M. Zghal, L. Mevel, P. Del Moral, "Modal parameter estimation using interacting Kalman filter", Mechanical Systems and Signal Processing, 2014.

  13. 75 FR 80808 - Proposed Consent Decree, Clean Air Act Citizen Suit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ... number EPA-HQ- OGC-2010-1060, online at http://www.regulations.gov (EPA's preferred method); by e-mail to... Center, EPA West, Room 3334, 1301 Constitution Ave., NW., Washington, DC, between 8:30 a.m. and 4:30 p.m... No. EPA-HQ-OGC-2010-1060) contains a copy of the proposed consent decree. The official public docket...

  14. Squamous cell carcinoma of the uterine cervix associated with osteoclast-like giant cells: A case report and review of the literature.

    PubMed

    Yu, Guohua; Lin, Chunhua; Wang, Wei; Han, Yekun; Qu, Guimei; Zhang, Tingguo

    2014-10-01

    Squamous cell carcinoma is a common malignant tumor of the uterine cervix. The present study reports the case of squamous cell carcinoma of the uterine cervix with osteoclast-like giant cells (OGCs) in an 84-year-old female who had suffered from irregular vaginal bleeding for one month. Colposcopy was performed and a cauliflower-like mass was identified in the front lip of the uterine cervix. Biopsy was then performed, and the tumor was found to be composed of epithelial cell nests of varying size. The neoplastic cells exhibited unclear boundaries and eosinophilic cytoplasm. Additionally, the nuclei were atypical and mitosis was observed. Among the epithelial nests, there were numerous OGCs with abundant eosinophilic cytoplasm, as well as multinucleation with bland nuclei. By immunohistochemical staining, the epithelial cells were positive for cytokeratin, while negative for CD68 and vimentin. By contrast, the immunophenotype of the OGCs was the exact opposite. Based on these histological characteristics, a diagnosis of squamous cell carcinoma of the uterine cervix associated with OGCs was made. Considering the age of the patient, radiotherapy was administered. The patient succumbed to brain metastasis of the tumor after eight months of follow-up.

  15. Neural Network Substorm Identification: Enabling TREx Sensor Web Modes

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Spanswick, E.; Arnason, K. M.; Donovan, E.; Liang, J.; Ahmad, S.; Jackel, B. J.

    2017-12-01

    Transition Region Explorer (TREx) is a ground-based sensor web of optical and radio instruments that is presently being deployed across central Canada. The project consists of an array of co-located blue-line, full-colour, and near-infrared all-sky imagers, imaging riometers, proton aurora spectrographs, and GNSS systems. A key goal of the TREx project is to create the world's first (artificially) intelligent sensor web for remote sensing of space weather. The sensor web will autonomously control and coordinate instrument operations in real time. To accomplish this, we will use real-time in-line analytics of TREx and other data to dynamically switch between operational modes. An operating mode could be, for example, to have a blue-line imager gather data at a cadence one or two orders of magnitude higher than in its `baseline' mode. The software decision to increase the imaging cadence would be made in response to an anticipated increase in auroral activity or other programmatic requirements. Our first test of TREx's sensor web technologies is to develop the capacity to autonomously alter the TREx operating mode prior to a substorm expansion phase onset. In this paper, we present our neural network analysis of historical optical and riometer data and our ability to predict an optical onset. We explore preliminary insights into using a neural network to pick out trends and features that it deems similar among substorms.

  16. Python Winding Itself Around Datacubes: How to Access Massive Multi-Dimensional Arrays in a Pythonic Way

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Misev, Dimitar; Baumann, Peter

    2017-04-01

    While Python has developed into the lingua franca of data science, there is often a paradigm break when accessing specialized tools. In particular, for one of the core data categories in science and engineering, massive multi-dimensional arrays, out-of-memory solutions typically employ their own, different models. We discuss this situation using the example of the scalable open-source array engine rasdaman ("raster data manager"), which offers access to and processing of Petascale multi-dimensional arrays through an SQL-style array query language, rasql. Such queries are executed in the server on a storage engine utilizing adaptive array partitioning and a processing engine implementing a "tile streaming" paradigm, allowing the processing of arrays massively larger than server RAM. The rasdaman QL has acted as a blueprint for the forthcoming ISO Array SQL and for the Open Geospatial Consortium (OGC) geo analytics language, the Web Coverage Processing Service, adopted in 2008. Not surprisingly, rasdaman is the OGC and INSPIRE Reference Implementation for their "Big Earth Data" standards suite. Recently, rasdaman has been augmented with a Python interface which allows transparent interaction with the database (credits go to Siddharth Shukla's Master's thesis at Jacobs University). Programmers do not need to know the rasdaman query language, as the operators are silently transformed, through lazy evaluation, into queries. Arrays delivered are likewise automatically transformed into their Python representation. In the talk, the rasdaman concept will be illustrated with the help of large-scale real-life examples of operational satellite image and weather data services, and sample Python code.
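
    The lazy-evaluation idea can be sketched in a few lines: operator calls build up an expression, and a rasql string is only rendered when the result is actually needed. This toy class is purely illustrative and is not the actual rasdaman Python interface:

```python
# Conceptual sketch of lazy query building (not the real rasdaman API):
# operations accumulate an expression and render rasql only on demand.
class Array:
    def __init__(self, expr):
        self.expr = expr

    def __add__(self, other):
        other = other.expr if isinstance(other, Array) else repr(other)
        return Array(f"({self.expr} + {other})")

    def avg(self):
        return Array(f"avg_cells({self.expr})")

    def to_rasql(self, collection="mr", alias="c"):
        # Rendered only here -- the "lazy evaluation" step.
        return f"select {self.expr} from {collection} as {alias}"

a = Array("c")
query = (a + 5).avg().to_rasql()
print(query)  # select avg_cells((c + 5)) from mr as c
```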

  17. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
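
    A spatial query against such a population service boils down to an HTTP request carrying a polygon and a year. A minimal sketch with the requests library; the endpoint URL and parameter names are hypothetical stand-ins for the actual PES interface:

```python
# Illustrative sketch: querying a population estimation service via REST
# with a user-defined polygon. URL and parameter names are hypothetical.
import requests

polygon = {
    "type": "Polygon",
    "coordinates": [[[-74.1, 40.6], [-73.7, 40.6],
                     [-73.7, 40.9], [-74.1, 40.9], [-74.1, 40.6]]],
}
resp = requests.get(
    "https://example.org/pes/v2/population",  # hypothetical endpoint
    params={"year": 2015, "polygon": str(polygon)},
    timeout=30,
)
print(resp.status_code, resp.text[:200])
```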

  18. A SensorML-based Metadata Model and Registry for Ocean Observatories: a Contribution from European Projects NeXOS and FixO3

    NASA Astrophysics Data System (ADS)

    Delory, E.; Jirka, S.

    2016-02-01

    Discovering sensors and observation data is important when enabling the exchange of oceanographic data between observatories and scientists that need the data sets for their work. To better support this discovery process, one task of the European project FixO3 (Fixed-point Open Ocean Observatories) addresses the question of which elements are needed for developing a better registry for sensors. This has resulted in four items which are addressed by the FixO3 project in cooperation with further European projects such as NeXOS (http://www.nexosproject.eu/). 1.) Metadata description format: To store and retrieve information about sensors and platforms, it is necessary to have a common approach for providing and encoding the metadata. For this purpose, the OGC Sensor Model Language (SensorML) 2.0 standard was selected. Especially the opportunity to distinguish between sensor types and instances offers new chances for a more efficient provision and maintenance of sensor metadata. 2.) Conversion of existing metadata into a SensorML 2.0 representation: In order to ensure a sustainable re-use of already provided metadata content (e.g. from the ESONET-FixO3 yellow pages), it is important to provide a mechanism capable of transforming these already available metadata sets into the new SensorML 2.0 structure. 3.) Metadata editor: To create descriptions of sensors and platforms, it is not possible to expect users to manually edit XML-based description files. Thus, a visual interface is necessary to help during metadata creation. We will outline a prototype of this editor, building upon the development of the ESONET sensor registry interface. 4.) Sensor Metadata Store: A server is needed for storing and querying the created sensor descriptions. For this purpose, different options exist, which will be discussed. In summary, we will present a set of different elements enabling sensor discovery, ranging from metadata formats, metadata conversion and editing to metadata storage. Furthermore, the current development status will be demonstrated.

  19. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the ka-Map project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
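
    Because the server exposes a standard WMS, any OGC-aware client can request map images directly. A minimal sketch using OWSLib; the server URL and layer name below are hypothetical:

```python
# Illustrative sketch: fetching a map image from an OGC-compliant WMS
# such as the one described above. URL and layer name are hypothetical.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/wms", version="1.1.1")  # hypothetical
img = wms.getmap(
    layers=["tops:gross_primary_production"],  # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-125.0, 32.0, -114.0, 42.0),
    size=(512, 512),
    format="image/png",
    transparent=True,
)
with open("gpp.png", "wb") as f:
    f.write(img.read())
```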

  20. Web-based Visualization and Query of semantically segmented multiresolution 3D Models in the Field of Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Auer, M.; Agugiaro, G.; Billen, N.; Loos, L.; Zipf, A.

    2014-05-01

    Many important Cultural Heritage sites have been studied over long periods of time with different technical equipment, methods and intentions by different researchers. This has led to huge amounts of heterogeneous "traditional" datasets and formats. The rising popularity of 3D models in the field of Cultural Heritage in recent years has brought additional data formats and makes it even more necessary to find solutions to manage, publish and study these data in an integrated way. The MayaArch3D project aims to realize such an integrative approach by establishing a web-based research platform that brings spatial and non-spatial databases together and provides visualization and analysis tools. Especially the 3D components of the platform use hierarchical segmentation concepts to structure the data and to perform queries on semantic entities. This paper presents a database schema that organizes not only segmented models but also different Levels of Detail and other representations of the same entity. It is implemented in a spatial database which allows the storage of georeferenced 3D data. This enables organization and queries by semantic, geometric and spatial properties. As a service for the delivery of the segmented models, a standardization candidate of the Open Geospatial Consortium (OGC), the Web 3D Service (W3DS), has been extended to cope with the new database schema and deliver a web-friendly format for WebGL rendering. Finally, a generic user interface is presented which uses the segments as a navigation metaphor to browse and query the semantic segmentation levels and retrieve information from an external database of the German Archaeological Institute (DAI).

  1. Siberian Earth System Science Cluster - A web-based Geoportal to provide user-friendly Earth Observation Products for supporting NEESPI scientists

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Gerlach, R.; Hese, S.; Schmullius, C.

    2012-04-01

    To provide earth observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. This spatial data infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access, data processing and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. The project region covers the entire Asian part of the Russian Federation, approximately between 58°E - 170°W and 48°N - 80°N. To provide discovery, access and analysis services, a webportal was published for searching and visualizing the available data. This webportal is based on current web technologies such as AJAX, the Drupal Content Management System as backend software, and a user-friendly interface with drag-and-drop and further mouse interactions. To offer a wide range of regularly updated earth observation products, several products from the MODIS sensor aboard the Aqua and Terra satellites are processed. A direct connection to NASA archive servers makes it possible to download MODIS Level 3 and 4 products and integrate them into the SIB-ESS-C infrastructure. These data are delivered in the Hierarchical Data Format (HDF). For visualization and further analysis, the data are reprojected, converted to GeoTIFF, and global products are clipped to the project area. All these steps are implemented as an automatic process chain that is executed whenever new MODIS data become available within the infrastructure; through the link to a MODIS catalogue system, the system receives new data daily. With the implemented analysis processes, time-series data can be analysed, for example to plot a trend or to compare different time series against one another. Scientists working in this area with MODIS data can use this service through the webportal instead of manually searching the NASA archive, processing the data and downloading it for further use.
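
    One step of such an automatic process chain, reprojecting a MODIS HDF subdataset to GeoTIFF and clipping it to a project area, can be sketched with the GDAL Python bindings; the subdataset path and bounds below are hypothetical examples:

```python
# Illustrative sketch: reproject an HDF subdataset to GeoTIFF and clip it
# to a project area with GDAL. Subdataset name and bounds are hypothetical.
from osgeo import gdal

gdal.UseExceptions()

src = ('HDF4_EOS:EOS_GRID:"MOD13A2.hdf":MODIS_Grid_16DAY_1km_VI:'
       '1 km 16 days NDVI')  # hypothetical MODIS subdataset
gdal.Warp(
    "ndvi_siberia.tif", src,
    dstSRS="EPSG:4326",
    outputBounds=(58.0, 48.0, 180.0, 80.0),  # eastern part of project area
    format="GTiff",
)
```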

  2. Data visualization in interactive maps and time series

    NASA Astrophysics Data System (ADS)

    Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe

    2014-05-01

    State-of-the-art data visualization has little in common with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, for creating custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data-Driven Documents JavaScript library (D3.js). This time-series application provides dynamic functionality such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations, arising from both human activities and natural processes, a work led by the Global Carbon Project.
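
    Extracting a point time series through the NCSS amounts to a single parameterized HTTP request. A minimal sketch with the requests library; the server URL, dataset path and variable name are hypothetical:

```python
# Illustrative sketch: point time-series extraction from a THREDDS Data
# Server via the NetCDF Subset Service. URL and variable are hypothetical.
import requests

resp = requests.get(
    "https://example.org/thredds/ncss/carbon/fluxes.nc",  # hypothetical
    params={
        "var": "co2_flux",
        "latitude": 48.5,
        "longitude": 2.3,
        "time_start": "2000-01-01T00:00:00Z",
        "time_end": "2010-12-31T00:00:00Z",
        "accept": "csv",
    },
    timeout=60,
)
print(resp.text.splitlines()[:5])  # header plus first data rows
```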

  3. The EarthServer Federation: State, Role, and Contribution to GEOSS

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Baumann, Peter

    2016-04-01

    The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. Its service interface being rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from open-source OpenLayers and QGIS over open-source NASA WorldWind to proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.
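
    A WCPS query illustrates how clients interact with such a datacube federation. A minimal sketch sending a ProcessCoverages request with the requests library; the endpoint and coverage name are example placeholders:

```python
# Illustrative sketch: a WCPS query extracting a time series at one point
# from a datacube. Endpoint URL and coverage name are hypothetical.
import requests

query = ('for $c in (AvgLandTemp) '
         'return encode($c[Lat(53.08), Long(8.80), '
         'ansi("2014-01":"2014-12")], "csv")')

resp = requests.get(
    "https://example.org/rasdaman/ows",  # hypothetical endpoint
    params={"service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": query},
    timeout=60,
)
print(resp.text[:200])
```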

  4. Design and implementation of PAVEMON: A GIS web-based pavement monitoring system based on large amounts of heterogeneous sensors data

    NASA Astrophysics Data System (ADS)

    Shahini Shamsabadi, Salar

    A web-based PAVEment MONitoring system, PAVEMON, is a GIS-oriented platform for accommodating, representing, and leveraging data from a multi-modal mobile sensor system. This sensor system consists of acoustic, optical, electromagnetic, and GPS sensors and is capable of producing as much as 1 Terabyte of data per day. Multi-channel raw sensor data (microphone, accelerometer, tire pressure sensor, video) and processed results (road profile, crack density, international roughness index, micro-texture depth, etc.) are outputs of this sensor system. By correlating the sensor measurements and positioning data collected in tight time synchronization, PAVEMON attaches a spatial component to all the datasets. These spatially indexed outputs are placed into an Oracle database which integrates seamlessly with PAVEMON's web-based system. The web-based system of PAVEMON consists of two major modules: 1) a GIS module for visualizing and spatially analyzing pavement condition information layers, and 2) a decision-support module for managing maintenance and repair (M&R) activities and predicting future budget needs. PAVEMON weaves together sensor data with third-party climate and traffic information from the National Oceanic and Atmospheric Administration (NOAA) and Long Term Pavement Performance (LTPP) databases for an organized, data-driven approach to pavement management activities. PAVEMON deals with heterogeneous and redundant observations by fusing them into jointly derived, higher-confidence results. A prominent example of the fusion algorithms developed within PAVEMON is one used for estimating overall pavement conditions in terms of ASTM's Pavement Condition Index (PCI). PAVEMON predicts PCI by undertaking a statistical fusion approach and selecting a subset of all the sensor measurements. Other fusion algorithms include noise-removal algorithms that remove false negatives in the sensor data, in addition to fusion algorithms developed for identifying features on the road. PAVEMON offers an ideal research and monitoring platform for rapid, intelligent and comprehensive evaluation of tomorrow's transportation infrastructure based on up-to-date data from heterogeneous sensor systems.
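
    Inverse-variance weighting is one simple, generic way to fuse redundant sensor-derived estimates into a single higher-confidence value, in the spirit of the PCI fusion described above; this sketch is a textbook technique, not PAVEMON's published algorithm:

```python
# Illustrative sketch: inverse-variance fusion of redundant estimates of a
# pavement condition index (PCI). Numbers are made-up example values.
import numpy as np

estimates = np.array([72.0, 68.5, 75.0])  # PCI from three sensor channels
variances = np.array([4.0, 9.0, 16.0])    # per-channel error variances

# Weight each channel by the inverse of its variance, then normalize.
weights = (1.0 / variances) / np.sum(1.0 / variances)
fused_pci = np.sum(weights * estimates)
print(round(fused_pci, 1))
```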

  5. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for different ocean in-situ observation programs (ARGO, OceanSITES, GOSUD, ...) and other European programs aiming at networking ocean in-situ observation data repositories (MyOcean, SeaDataNet, EMODnet). To capitalize on the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, generally speaking, for water-column observation repositories, Ifremer decided in 2010 to develop the oceanotron server. Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the varying nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OPeNDAP, ...), the server is designed to manage plugins: StorageUnits, which read specific data repository formats (netCDF/OceanSITES, RDBMS schema, ODV binary format), and FrontDesks, which receive external requests and return results via interoperable protocols (OGC/WMS, OGC/SOS, OPeNDAP). In between, a third type of plugin may be inserted: TransformationUnits, which enable ocean-business-related transformations of the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface). The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected together by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC Observations & Measurements and Unidata Common Data Model initiatives and is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This inner interoperability level makes it possible to capitalize on ocean business expertise in software development without being tied to specific data formats or protocols. Oceanotron is deployed at seven European data centres for marine in-situ observations within MyOcean. While additional extensions are still being developed, work is now being done, to promote new collaborative initiatives, on continuous and distributed integration (Jenkins, Maven), shared reference documentation (on Alfresco), and code and release dissemination (SourceForge, GitHub).
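
    The plugin contract can be sketched abstractly: StorageUnits read repositories, FrontDesks speak a protocol, and the two meet over a shared feature model. Oceanotron itself is implemented in Java; the following Python sketch only illustrates the pattern, with made-up class and data names:

```python
# Conceptual sketch of the oceanotron plugin pattern (Python for brevity;
# the real server is Java). Class names below are illustrative.
from abc import ABC, abstractmethod

class StorageUnit(ABC):
    @abstractmethod
    def read_profiles(self, query):
        """Return sampling features (e.g. vertical profiles) matching query."""

class FrontDesk(ABC):
    @abstractmethod
    def handle(self, request, storage: StorageUnit):
        """Translate a protocol request, fetch features, encode the response."""

class NetcdfOceanSitesStorage(StorageUnit):
    def read_profiles(self, query):
        # Stand-in for reading a netCDF/OceanSITES repository.
        return [{"station": "X1", "depth_m": [0, 10], "temp_C": [14.2, 13.8]}]

class SosFrontDesk(FrontDesk):
    def handle(self, request, storage):
        # Stand-in for an OGC/SOS GetObservation response.
        return {"GetObservation": storage.read_profiles(request)}

print(SosFrontDesk().handle({"bbox": None}, NetcdfOceanSitesStorage()))
```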

  6. TEODOOR, a blueprint for distributed terrestrial observation data infrastructures

    NASA Astrophysics Data System (ADS)

    Kunkel, Ralf; Sorg, Jürgen; Abbrent, Martin; Borg, Erik; Gasche, Rainer; Kolditz, Olaf; Neidl, Frank; Priesack, Eckart; Stender, Vivien

    2017-04-01

    TERENO (TERrestrial ENvironmental Observatories) is an initiative funded by the large research infrastructure program of the Helmholtz Association of Germany. Four observation platforms to facilitate the investigation of the consequences of global change for terrestrial ecosystems and their socioeconomic implications were implemented and equipped between 2007 and 2013. Data collection, however, is planned to continue for at least 30 years. TERENO provides series of system variables (e.g. precipitation, runoff, groundwater level, soil moisture, water vapor and trace gas fluxes) for the analysis and prognosis of global change consequences using integrated model systems, which will be used to derive efficient prevention, mitigation and adaptation strategies. Each platform is operated by a different Helmholtz institution, which maintains its local data infrastructure. Within the individual observatories, areas with intensive measurement programs have been established. Different sensors provide information on various physical parameters like soil moisture, temperature, groundwater levels or gas fluxes. Sensor data from more than 900 stations are collected automatically at sampling intervals ranging from 20 s to 2 h, summing up to about 2,500,000 data values per day. In addition, three weather radar devices create raster data 12 to 60 times per hour. The data are automatically imported into local relational database systems using a common data quality assessment framework that handles the processing and assessment of heterogeneous environmental observation data. Starting with the way data are imported into the data infrastructure, custom workflows have been developed. Data levels implying the underlying data processing, stages of quality assessment and data accessibility are defined. In order to facilitate the acquisition, provision, integration, management and exchange of heterogeneous geospatial resources within a scientific and non-scientific environment, the distributed spatial data infrastructure TEODOOR (TEreno Online Data RepOsitORry) has been built up. The individual observatories are connected via OGC-compliant web services, while the TERENO Data Discovery Portal (DDP) enables data discovery, visualization and data access. Currently, free access to data from more than 900 monitoring stations is provided.

  7. Twitter web-service for soft agent reporting in persistent surveillance systems

    NASA Astrophysics Data System (ADS)

    Rababaah, Haroun; Shirkhodaie, Amir

    2010-04-01

    Persistent surveillance is an intricate process requiring monitoring, gathering, processing, tracking, and characterization of many spatiotemporal events occurring concurrently. Data associated with events can be readily attained by networking of hard (physical) sensors. Sensors may have homogeneous or heterogeneous (hybrid) sensing modalities with different communication bandwidth requirements. Complementary to hard sensors are human observers or "soft sensors" who can report occurrences of evolving events via different communication devices (e.g., texting, cell phones, emails, instant messaging, etc.) to the command and control center. However, networking human observers in an ad-hoc way is a rather difficult task. In this paper, we present a Twitter web-service for soft agent reporting in persistent surveillance systems (called Web-STARS). The objective of this web-service is to rapidly aggregate multi-source human observations in hybrid sensor networks. With the availability of the Twitter social network, such a human networking concept can not only be realized for large-scale persistent surveillance systems (PSS), but can also be employed, with proper interfaces, to expedite rapid event reporting by human observers. The proposed technique is particularly suitable for large-scale persistent surveillance systems with distributed soft and hard sensor networks. The efficiency and effectiveness of the proposed technique are measured experimentally by conducting several simulated persistent surveillance scenarios. It is demonstrated that fusing information from hard and soft agents improves understanding of the common operating picture and enhances situational awareness.

  8. Transformation of HDF-EOS metadata from the ECS model to ISO 19115-based XML

    NASA Astrophysics Data System (ADS)

    Wei, Yaxing; Di, Liping; Zhao, Baohua; Liao, Guangxuan; Chen, Aijun

    2007-02-01

    Nowadays, geographic data, such as NASA's Earth Observing System (EOS) data, are playing an increasing role in many areas, including academic research, government decisions and even people's everyday lives. As the quantity of geographic data becomes increasingly large, a major problem is how to make full use of such data in a distributed, heterogeneous network environment. In order for a user to effectively discover and retrieve the specific information that is useful, geographic metadata should be described and managed properly. Fortunately, the emergence of XML and Web Services technologies greatly promotes information distribution across the Internet. The research effort discussed in this paper presents a method and its implementation for transforming Hierarchical Data Format (HDF)-EOS metadata from the NASA ECS model to ISO 19115-based XML, which will be managed by the Open Geospatial Consortium (OGC) Catalogue Services-Web Profile (CSW). Using XML and international standards rather than domain-specific models to describe the metadata of HDF-EOS data, and further using CSW to manage the metadata, allows metadata information to be searched and interchanged more widely and easily, thus promoting the sharing of HDF-EOS data.
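
    Such a model-to-model metadata transformation is typically implemented as an XSLT stylesheet applied to the source XML. A minimal sketch with lxml; the stylesheet and file names below are hypothetical:

```python
# Illustrative sketch: applying an XSLT stylesheet to transform ECS-model
# metadata into ISO 19115-based XML. File names are hypothetical.
from lxml import etree

transform = etree.XSLT(etree.parse("ecs_to_iso19115.xsl"))  # hypothetical
source = etree.parse("hdfeos_ecs_metadata.xml")             # hypothetical
result = transform(source)

with open("iso19115_metadata.xml", "wb") as f:
    f.write(etree.tostring(result, pretty_print=True))
```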

  9. PolarHub: A Global Hub for Polar Data Discovery

    NASA Astrophysics Data System (ADS)

    Li, W.

    2014-12-01

    This paper reports the outcome of an NSF project developing PolarHub, a large-scale web crawler that automatically discovers distributed polar datasets offered as OGC web services (OWS) in cyberspace. PolarHub is a machine robot; its goal is to visit as many webpages as possible to find those containing information about polar OWS, extract this information, and store it in the backend data repository. This is a very challenging task given the huge volume of webpages on the Web. Four unique features were introduced in PolarHub to make it distinctive from earlier crawler solutions: (1) multi-task, multi-user, multi-thread support for the crawling tasks; (2) extensive use of the thread-pool and Data Access Object (DAO) design patterns to separate persistent data storage from business logic and achieve high extensibility of the crawler tool; (3) a pattern-matching-based customizable crawling algorithm to support discovery of multiple types of geospatial web services; and (4) a universal and portable client-server communication mechanism combining server-push and client-pull strategies for enhanced asynchronous processing. A series of experiments was conducted to identify the impact of crawling parameters on overall system performance. The geographical distribution pattern of all PolarHub-identified services is also demonstrated. We expect this work to make a major contribution to the field of geospatial information retrieval and geospatial interoperability, to bridge the gap between data providers and data consumers, and to accelerate polar science by enhancing the accessibility and reusability of adequate polar data.
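
    The pattern-matching idea behind such a crawler can be sketched as scanning fetched pages for candidate URLs and probing them with an OGC GetCapabilities request. This simplified stand-in is not PolarHub's implementation:

```python
# Illustrative sketch: detect OGC service endpoints on a webpage by
# pattern matching plus a GetCapabilities probe. Not PolarHub's code.
import re
import requests

OWS_LINK = re.compile(r"https?://\S+?(?:wms|wfs|wcs|sos)\S*", re.IGNORECASE)

def find_ogc_services(page_url):
    html = requests.get(page_url, timeout=30).text
    services = []
    for url in set(OWS_LINK.findall(html)):
        probe = url.split("?")[0] + "?service=WMS&request=GetCapabilities"
        try:
            r = requests.get(probe, timeout=15)
            if b"Capabilities" in r.content[:4096]:  # crude validity check
                services.append(url)
        except requests.RequestException:
            pass  # unreachable candidates are simply skipped
    return services
```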

  10. GI-conf: A configuration tool for the GI-cat distributed catalog

    NASA Astrophysics Data System (ADS)

    Papeschi, F.; Boldrini, E.; Bigagli, L.; Mazzetti, P.

    2009-04-01

    In this work we present a configuration tool for GI-cat. In a Service-Oriented Architecture (SOA) framework, GI-cat implements a distributed catalog service providing advanced capabilities such as caching, brokering and mediation. GI-cat applies a distributed approach, distributing queries to the remote service providers of interest in an asynchronous style and notifying the caller of query status through an incremental feedback mechanism. Today, GI-cat functionalities are made available through two standard catalog interfaces: the OGC CSW ISO and CSW Core Application Profiles. Two other interfaces are under testing: the CIM and EO Extension Packages of the CSW ebRIM Application Profile. GI-cat is able to interface with a multiplicity of discovery and access services serving heterogeneous Earth and Space Science resources. They include international standards like the OGC Web Services (i.e. OGC CSW, WCS, WFS and WMS), as well as interoperability arrangements (i.e. community standards) such as UNIDATA THREDDS/OPeNDAP, SeaDataNet CDI (Common Data Index), GBIF (Global Biodiversity Information Facility) services, and SibESS-C infrastructure services. GI-conf implements a user-friendly configuration tool for GI-cat. This is a GUI application that employs a visual and very simple approach to configuring both the GI-cat publishing and distribution capabilities in a dynamic way. The tool allows the user to set one or more GI-cat configurations. Each configuration consists of: a) the catalog standard interfaces published by GI-cat; b) the resources (i.e. services/servers) to be accessed and mediated, i.e. federated. Simple icons are used for interfaces and resources, implementing a user-friendly visual approach. The main GI-conf functionalities are: • Interface and federated resource management: the user can set which interfaces are published and can add a new resource, or update or remove an already federated resource. • Multiple configuration management: multiple GI-cat configurations can be defined; each configuration identifies a set of published interfaces and a set of federated resources. Configurations can be edited, added, removed, exported, and even imported. • HTML report creation: an HTML report can be created, showing the currently active GI-cat configuration, including the resources being federated and the published interface endpoints. The configuration tool is shipped with GI-cat and can be used to configure the service after its installation is completed.

  11. Synergetic effect of Ti3+ and oxygen doping on enhancing photoelectrochemical and photocatalytic properties of TiO2/g-C3N4 heterojunctions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Kai; Huang, Zhenyu; Zeng, Xiaoqiao

    To improve the utilization of visible light and reduce photogenerated electron/hole recombination, Ti3+ self-doped TiO2/oxygen-doped graphitic carbon nitride (Ti3+-TiO2/O-g-C3N4) heterojunctions were prepared via hydrothermal treatment of a mixture of g-C3N4 and titanium oxohydride sol obtained from the reaction of TiH2 with H2O2. In this way, exfoliated O-g-C3N4 and Ti3+-TiO2 nanoparticles were obtained. Simultaneously, strong bonding was formed between Ti3+-TiO2 nanoparticles and exfoliated O-g-C3N4 during the hydrothermal process. Charge transfer and recombination processes were characterized by transient photocurrent responses, electrochemical impedance tests, and photoluminescence spectroscopy. The photocatalytic performance was investigated through a rhodamine B degradation test under an irradiation source based on a 30 W cold visible-light-emitting diode. The highest visible-light photoelectrochemical and photocatalytic activities were observed for the heterojunction with a 1:2 mass ratio of Ti3+-TiO2 to O-g-C3N4. The photodegradation reaction rate constant of this heterojunction is 0.0356 min⁻¹, which is 3.87 and 4.56 times higher than those of pristine Ti3+-TiO2 and pure g-C3N4, respectively. Here, the remarkably high photoelectrochemical and photocatalytic performances of the heterojunctions are mainly attributed to the synergetic effect of efficient photogenerated electron-hole separation, decreased electron transfer resistance from interfacial chemical hydroxy residue bonds, and oxidizing groups originating from Ti3+-TiO2 and O-g-C3N4.

  12. Synergetic Effect of Ti3+ and Oxygen Doping on Enhancing Photoelectrochemical and Photocatalytic Properties of TiO2/g-C3N4 Heterojunctions.

    PubMed

    Li, Kai; Huang, Zhenyu; Zeng, Xiaoqiao; Huang, Baibiao; Gao, Shanmin; Lu, Jun

    2017-04-05

    To improve the utilization of visible light and reduce photogenerated electron/hole recombination, Ti3+ self-doped TiO2/oxygen-doped graphitic carbon nitride (Ti3+-TiO2/O-g-C3N4) heterojunctions were prepared via hydrothermal treatment of a mixture of g-C3N4 and titanium oxohydride sol obtained from the reaction of TiH2 with H2O2. In this way, exfoliated O-g-C3N4 and Ti3+-TiO2 nanoparticles were obtained. Simultaneously, strong bonding was formed between Ti3+-TiO2 nanoparticles and exfoliated O-g-C3N4 during the hydrothermal process. Charge transfer and recombination processes were characterized by transient photocurrent responses, electrochemical impedance tests, and photoluminescence spectroscopy. The photocatalytic performance was investigated through a rhodamine B degradation test under an irradiation source based on a 30 W cold visible-light-emitting diode. The highest visible-light photoelectrochemical and photocatalytic activities were observed for the heterojunction with a 1:2 mass ratio of Ti3+-TiO2 to O-g-C3N4. The photodegradation reaction rate constant of this heterojunction is 0.0356 min⁻¹, which is 3.87 and 4.56 times higher than those of pristine Ti3+-TiO2 and pure g-C3N4, respectively. The remarkably high photoelectrochemical and photocatalytic performances of the heterojunctions are mainly attributed to the synergetic effect of efficient photogenerated electron-hole separation, decreased electron transfer resistance from interfacial chemical hydroxy residue bonds, and oxidizing groups originating from Ti3+-TiO2 and O-g-C3N4.

  13. Synergetic effect of Ti3+ and oxygen doping on enhancing photoelectrochemical and photocatalytic properties of TiO2/g-C3N4 heterojunctions

    DOE PAGES

    Li, Kai; Huang, Zhenyu; Zeng, Xiaoqiao; ...

    2017-03-07

    To improve the utilization of visible light and reduce photogenerated electron/hole recombination, Ti3+ self-doped TiO2/oxygen-doped graphitic carbon nitride (Ti3+-TiO2/O-g-C3N4) heterojunctions were prepared via hydrothermal treatment of a mixture of g-C3N4 and titanium oxohydride sol obtained from the reaction of TiH2 with H2O2. In this way, exfoliated O-g-C3N4 and Ti3+-TiO2 nanoparticles were obtained. Simultaneously, strong bonding was formed between Ti3+-TiO2 nanoparticles and exfoliated O-g-C3N4 during the hydrothermal process. Charge transfer and recombination processes were characterized by transient photocurrent responses, electrochemical impedance tests, and photoluminescence spectroscopy. The photocatalytic performance was investigated through a rhodamine B degradation test under an irradiation source based on a 30 W cold visible-light-emitting diode. The highest visible-light photoelectrochemical and photocatalytic activities were observed for the heterojunction with a 1:2 mass ratio of Ti3+-TiO2 to O-g-C3N4. The photodegradation reaction rate constant of this heterojunction is 0.0356 min⁻¹, which is 3.87 and 4.56 times higher than those of pristine Ti3+-TiO2 and pure g-C3N4, respectively. Here, the remarkably high photoelectrochemical and photocatalytic performances of the heterojunctions are mainly attributed to the synergetic effect of efficient photogenerated electron-hole separation, decreased electron transfer resistance from interfacial chemical hydroxy residue bonds, and oxidizing groups originating from Ti3+-TiO2 and O-g-C3N4.

  14. ESB-based Sensor Web integration for the prediction of electric power supply system vulnerability.

    PubMed

    Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja

    2013-08-15

    Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and to combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates the gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users by means of a specialized Web GIS application.

  15. ESB-Based Sensor Web Integration for the Prediction of Electric Power Supply System Vulnerability

    PubMed Central

    Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja

    2013-01-01

    Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and to combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates the gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users by means of a specialized Web GIS application. PMID:23955435

  16. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control.

    PubMed

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-04-09

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web.
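
    The sliding-window prediction scheme can be sketched with a Multilayer Perceptron on a synthetic PM2.5 series; the window length, horizon and pollution threshold below are hypothetical choices for illustration only:

```python
# Illustrative sketch: sliding-window MLP prediction of short-term PM2.5
# pollution situations, on synthetic data with made-up parameters.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
pm25 = np.abs(rng.normal(20, 10, size=2000))  # synthetic PM2.5 series

window, horizon = 12, 6  # e.g. 5-min samples: 1 h window, 30 min horizon
X = np.array([pm25[i:i + window]
              for i in range(len(pm25) - window - horizon)])
y = (pm25[window + horizon:] > 35).astype(int)  # hypothetical threshold

split = int(0.8 * len(X))
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X[:split], y[:split])
print("held-out accuracy:", clf.score(X[split:], y[split:]))
```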

  17. A Prototype Flood Early Warning SensorWeb System for Namibia

    NASA Astrophysics Data System (ADS)

    Sohlberg, R. A.; Mandl, D.; Frye, S. W.; Cappelaere, P. G.; Szarzynski, J.; Policelli, F.; van Langenhove, G.

    2010-12-01

    During the past two years, there have been extensive floods in Namibia, Africa, which have affected up to a quarter of the population. Through a collaboration between a group funded by NASA's Earth Science Technology Office (ESTO) that has been performing various SensorWeb prototyping activities for disasters, the Department of Hydrology in Namibia, and the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER), experiments were conducted on how to apply various satellite resources, integrated into a SensorWeb architecture along with in-situ sensors such as river gauges and rain gauges, to a flood early warning system. The SensorWeb includes a global flood model and a higher-resolution, basin-specific flood model. Furthermore, flood extent and status are monitored by optical and radar satellites and integrated with some automation. We have taken a practical approach to find out how to create a working system by selectively using the components that provide good results. The vision for the future is to combine this with the countrywide dwelling-unit database to create risk maps that provide specific warnings to houses within high-risk areas based on near-term predictions. This presentation will show some of the highlights of the effort thus far, plus our future plans.

  18. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control

    PubMed Central

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-01-01

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web. PMID:28397776

  19. Applying Semantic Web Services and Wireless Sensor Networks for System Integration

    NASA Astrophysics Data System (ADS)

    Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente

    In environments like factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled in a faster and lower-cost manner. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful descriptions of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues when applying Wireless Sensor Networks and Semantic Web Services to automation service integration.

  20. Geophysics in INSPIRE

    NASA Astrophysics Data System (ADS)

    Sőrés, László

    2013-04-01

    INSPIRE is a European directive to harmonize spatial data in Europe. Its aim is to establish a transparent, multidisciplinary network of environmental information by using international standards and OGC web services. The spatial data themes defined in the annexes of the directive cover 34 domains closely tied to the environment and spatial information. According to the INSPIRE roadmap, all data providers must set up discovery, viewing and download services and restructure their data stores to provide spatial data as defined by the underlying specifications by 1 December 2014. More than 3000 institutions are going to be involved in the process. During the data specification process, geophysics, an indispensable source of geo-information, was introduced into Annex II Geology. Within the Geology theme, Geophysics is divided into a core and an extended model. The core model contains specifications for legally binding data provisioning and is going to be part of the Implementation Rules of the INSPIRE directive. To minimize the workload of obligatory data transformations, the scope of the core model is very limited and simple. It covers the most essential geophysical feature types that are relevant in an economic and environmental context. To fully support the use cases identified by the stakeholders, the extended model was developed. It contains a wide range of spatial object types for geophysical measurements, processed and interpreted results, and wrapper classes to help data providers use the Observations and Measurements (O&M) standard for geophysical data exchange. Instead of introducing the traditional concept of "geophysical methods" at a high structural level, the data model classifies measurements and geophysical models based on their spatial characteristics. Measurements are classified as geophysical station (point), geophysical profile (curve) and geophysical swath (surface). Generic classes for processing results and interpretation models are curve model (1D), surface model (2D), and solid model (3D). Both measurements and models are derived from O&M sampling features that may be linked to sampling procedures and observation results. Geophysical products are the output of complex procedures and can be described precisely as chains of consecutive O&M observations. For describing geophysical processes and results, the data model supports both OGC standard XML encodings (SensorML, SWE, GML) and traditional industry standards (SPS, UKOOA, SEG formats). To control the scope of the model and to harmonize terminology, an initial set of extendable code lists was developed. The attempt to create a hierarchical SKOS vocabulary of terms for geophysical methods, resource types, processes, properties and technical parameters was partly based on the work done in the eContentPlus GEOMIND project. The result is far from complete, and the work must continue in the future.
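
    The spatial classification described above can be pictured as a small type hierarchy. The sketch below is illustrative only; the class and attribute names are assumptions, not the INSPIRE or O&M schema:

        # Measurements classified by spatial characteristic, all derived
        # from an O&M-style sampling feature. Names are hypothetical.
        from dataclasses import dataclass
        from typing import Tuple

        @dataclass
        class SamplingFeature:
            procedure: str          # link to the sampling procedure
            observed_property: str  # what the measurement determines

        @dataclass
        class GeophysicalStation(SamplingFeature):  # point-shaped
            location: Tuple[float, float]

        @dataclass
        class GeophysicalProfile(SamplingFeature):  # curve-shaped
            vertices: Tuple[Tuple[float, float], ...]

        @dataclass
        class GeophysicalSwath(SamplingFeature):    # surface-shaped
            boundary: Tuple[Tuple[float, float], ...]

        station = GeophysicalStation("VES sounding", "apparent_resistivity",
                                     (47.10, 19.50))
        print(station)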

  1. Bcl-2 is a novel interacting partner for the 2-oxoglutarate carrier and a key regulator of mitochondrial glutathione

    PubMed Central

    Wilkins, Heather M.; Marquardt, Kristin; Lash, Lawrence H.; Linseman, Daniel A.

    2011-01-01

    Despite making up only a minor fraction of the total cellular glutathione, recent studies indicate that the mitochondrial glutathione pool is essential for cell survival. Selective depletion of mitochondrial glutathione is sufficient to sensitize cells to mitochondrial oxidative stress (MOS) and intrinsic apoptosis. Glutathione is synthesized exclusively in the cytoplasm and must be actively transported into mitochondria. Therefore, regulation of mitochondrial glutathione transport is a key factor in maintaining the antioxidant status of mitochondria. Bcl-2 is resident in the outer mitochondrial membrane where it acts as a central regulator of the intrinsic apoptotic cascade. In addition, Bcl-2 displays an antioxidant-like function that has been linked experimentally to the regulation of cellular glutathione content. We have previously demonstrated a novel interaction between recombinant Bcl-2 and reduced glutathione (GSH) which was antagonized by either Bcl-2 homology-3 domain (BH3) mimetics or a BH3-only protein, recombinant Bim. These previous findings prompted us to investigate if this novel Bcl-2/GSH interaction might play a role in regulating mitochondrial glutathione transport. Incubation of primary cultures of cerebellar granule neurons (CGNs) with the BH3 mimetic, HA14-1, induced MOS and caused specific depletion of the mitochondrial glutathione pool. Bcl-2 was co-immunoprecipitated with GSH following chemical cross-linking in CGNs and this Bcl-2/GSH interaction was antagonized by pre-incubation with HA14-1. Moreover, both HA14-1 and recombinant Bim inhibited GSH transport into isolated rat brain mitochondria. To further investigate a possible link between Bcl-2 function and mitochondrial glutathione transport, we next examined if Bcl-2 associated with the 2-oxoglutarate carrier (OGC), an inner mitochondrial membrane protein known to transport glutathione in liver and kidney. Following co-transfection of CHO cells, Bcl-2 was co-immunoprecipitated with OGC and this novel interaction was significantly enhanced by glutathione monoethylester (GSH-MEE). Similarly, recombinant Bcl-2 interacted with recombinant OGC in the presence of GSH. Bcl-2 and OGC co-transfection in CHO cells significantly increased the mitochondrial glutathione pool. Finally, the ability of Bcl-2 to protect CHO cells from apoptosis induced by hydrogen peroxide was significantly attenuated by the OGC inhibitor phenylsuccinate. These data suggest that GSH binding by Bcl-2 enhances its affinity for the OGC. Bcl-2 and OGC appear to act in a coordinated manner to increase the mitochondrial glutathione pool and enhance resistance of cells to oxidative stress. We conclude that regulation of mitochondrial glutathione transport is a principal mechanism by which Bcl-2 suppresses MOS. PMID:22115789

  2. Providing a virtual tour of a glacial watershed

    NASA Astrophysics Data System (ADS)

    Berner, L.; Habermann, M.; Hood, E.; Fatland, R.; Heavner, M.; Knuth, E.

    2007-12-01

    SEAMONSTER, a NASA-funded sensor web project, is the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research. SEAMONSTER leverages existing open-source software and sensor web technologies and is intended to act as a sensor web testbed, an educational tool, a scientific resource, and a public resource. The primary focus area of the initial SEAMONSTER deployment is the Lemon Creek watershed, which includes the Lemon Creek Glacier studied as part of the 1957-58 IPY. This presentation describes our first-year efforts to maximize the education and public outreach activities of SEAMONSTER. During the first summer, 37 sensors were deployed throughout two partially glaciated watersheds, facilitating data acquisition in temperate rain forest, alpine, lacustrine, and glacial environments. Understanding these environments is important for public understanding of climate change. These environments are geographically isolated, limiting public access to, and understanding of, such locales. In an effort to inform the general public and primary educators about the basic processes occurring in these unique natural systems, we are developing an interactive website. This web portal will supplement and enhance environmental science primary education by providing educators and students with interactive access to basic information from the glaciological, hydrological, and meteorological systems we are studying. In addition, we are developing an interactive virtual tour of the Lemon Glacier and its watershed. This effort will use Google Earth as a means of real-time data visualization and will take advantage of time-lapse movies, photographs, maps, and satellite imagery to promote an understanding of these unique natural systems and the role of sensor webs in education.

  3. Sensor-Web Operations Explorer

    NASA Technical Reports Server (NTRS)

    Meemong, Lee; Miller, Charles; Bowman, Kevin; Weidner, Richard

    2008-01-01

    Understanding the atmospheric state and its impact on air quality requires observations of trace gases, aerosols, clouds, and physical parameters across temporal and spatial scales that range from minutes to days and from meters to more than 10,000 kilometers. Observations include continuous local monitoring for particle formation; field campaigns for emissions, local transport, and chemistry; and periodic global measurements for continental transport and chemistry. Understanding requires a global data assimilation framework capable of hierarchical coupling, dynamic integration of chemical data and atmospheric models, and feedback loops between models and observations. The objective of the sensor-web system is to observe trace gases, aerosols, clouds, and physical parameters; to this end, an integrated observation infrastructure composed of space-borne, air-borne, and in-situ sensors will be simulated based on the measurement physics of each sensor. The objective of sensor-web operation is to optimally plan for multiple heterogeneous sensors; sampling strategies will be explored and their science impact analyzed based on comprehensive modeling of atmospheric phenomena, including convection, transport, and chemical processes. Topics include system architecture, software architecture, hardware architecture, process flow, technology infusion, challenges, and future directions.

  4. A web-based system for home monitoring of patients with Parkinson's disease using wearable sensors.

    PubMed

    Chen, Bor-Rong; Patel, Shyamal; Buckley, Thomas; Rednic, Ramona; McClure, Douglas J; Shih, Ludy; Tarsy, Daniel; Welsh, Matt; Bonato, Paolo

    2011-03-01

    This letter introduces MercuryLive, a platform enabling home monitoring of patients with Parkinson's disease (PD) using wearable sensors. MercuryLive consists of three tiers: a resource-aware data collection engine that relies upon wearable sensors, web services for live streaming and storage of sensor data, and a web-based graphical user interface client with video conferencing capability. In addition, the platform can analyze sensor (i.e., accelerometer) data to reliably estimate clinical scores capturing the severity of tremor, bradykinesia, and dyskinesia. Test results showed an average data latency of less than 400 ms and a video latency of about 200 ms at a frame rate of about 13 frames/s when 800 kb/s of bandwidth were available and 40% video compression was used; uploading the data features required 1 min of extra time following a 10 min interactive session. These results indicate that the proposed platform is suitable for monitoring patients with PD to facilitate the titration of medications in the late stages of the disease.

  5. TRMM Precipitation Application Examples Using Data Services at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, D.; Teng, W.; Kempler, S.; Greene, M.

    2012-01-01

    Data services to support precipitation applications are important for maximizing the societal benefits of the NASA TRMM (Tropical Rainfall Measuring Mission) and the future GPM (Global Precipitation Measurement) mission. TRMM application examples using data services at the NASA GES DISC, including samples from users around the world, will be presented in this poster. Precipitation applications often require near-real-time support. The GES DISC provides such support through: 1) near-real-time precipitation products through TOVAS; 2) maps of current conditions for monitoring precipitation and its anomalies around the world; 3) a user-friendly tool (TOVAS) to analyze and visualize near-real-time and historical precipitation products; and 4) the GES DISC Hurricane Portal, which provides near-real-time monitoring services for the Atlantic basin. Since the launch of TRMM, the GES DISC has developed data services to support precipitation applications around the world. In addition to the near-real-time services, other services include: 1) the user-friendly TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/); 2) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at the GES DISC, designed to be fast and easy to learn; 3) data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/), which provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; and 4) the Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml), an interface that allows the use of data and enables clients to build customized maps with data coming from a different network.
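
    As a sketch of the OPeNDAP access pattern mentioned above, the snippet below opens a remote dataset with netCDF4-python and subsets a single variable on the server side. The dataset URL and variable name are placeholders, not a real GES DISC endpoint:

        # Reading a precipitation variable through OPeNDAP; only the
        # requested slice is transferred over the network. Requires a
        # netCDF4 build with OPeNDAP support.
        from netCDF4 import Dataset

        url = "https://example.gov/opendap/TRMM_3B42_daily.nc"  # hypothetical
        ds = Dataset(url)                                # opens remote dataset
        precip = ds.variables["precipitation"][0, :, :]  # server-side subset
        print(precip.shape)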

  6. Establishing a National 3d Geo-Data Model for Building Data Compliant to Citygml: Case of Turkey

    NASA Astrophysics Data System (ADS)

    Ates Aydar, S.; Stoter, J.; Ledoux, H.; Demir Ozbek, E.; Yomralioglu, T.

    2016-06-01

    This paper presents the generation of the 3D national building geo-data model of Turkey, which is compatible with the international OGC CityGML encoding standard. We prepare an Application Domain Extension (ADE) named CityGML-TRKBIS.BI, produced by extending the existing thematic modules of CityGML according to TRKBIS needs. All thematic data groups in the TRKBIS geo-data model have been remodelled in order to generate the national large-scale 3D geo-data model for Turkey. Specific attention has been paid to data groups that have a different class structure from the related CityGML data themes, such as the building data model. The current 2D geo-information model for the building data theme of Turkey (TRKBIS.BI) was established based on the INSPIRE specifications for buildings (Core 2D and Extended 2D profiles), ISO/TC 211 standards and OGC web services. The new version of TRKBIS.BI, established according to the semantic and geometric rules of CityGML, will represent 2D, 2.5D and 3D objects. After a short overview of the generic approach, this paper describes the extension of the CityGML building data theme to TRKBIS.BI in several steps. First, the building models of both standards were compared with respect to their data structure, classes and attributes. Second, the CityGML building model was extended with respect to TRKBIS needs and the CityGML-TRKBIS Building ADE was established in UML. This study provides new insights into 3D applications in Turkey. The generated 3D geo-data model for the building thematic class will be used as a common exchange format that meets 2D, 2.5D and 3D implementation needs at the national level.

  7. GeoSciML and EarthResourceML Update, 2012

    NASA Astrophysics Data System (ADS)

    Richard, S. M.; Commission for the Management and Application of Geoscience Information Interoperability Working Group

    2012-12-01

    CGI Interoperability Working Group activities during 2012 include deployment of services using the GeoSciML-Portrayal schema, addition of new vocabularies to support properties added in version 3.0, improvements to server software for deploying services, introduction of EarthResourceML v.2 for mineral resources, and collaboration with the IUSS on a markup language for soils information. GeoSciML and EarthResourceML have been used as the basis for the INSPIRE Geology and Mineral Resources specifications respectively. GeoSciML-Portrayal is an OGC GML simple-feature application schema for presentation of geologic map unit, contact, and shear displacement structure (fault and ductile shear zone) descriptions in web map services. Use of standard vocabularies for geologic age and lithology enables map services using shared legends to achieve visual harmonization of maps provided by different services. New vocabularies have been added to the collection of CGI vocabularies provided to support interoperable GeoSciML services, and can be accessed through http://resource.geosciml.org. Concept URIs can be dereferenced to obtain SKOS rdf or html representations using the SISSVoc vocabulary service. New releases of the FOSS GeoServer application greatly improve support for complex XML feature schemas like GeoSciML, and the ArcGIS for INSPIRE extension implements similar complex feature support for ArcGIS Server. These improved server implementations greatly facilitate deploying GeoSciML services. EarthResourceML v2 adds features for information related to mining activities. SoilML provides an interchange format for soil material, soil profile, and terrain information. Work is underway to add GeoSciML to the portfolio of Open Geospatial Consortium (OGC) specifications.
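
    The concept-URI dereferencing described above can be exercised with plain HTTP content negotiation. A hedged sketch; the concept path below is illustrative and may not correspond to an actual vocabulary entry:

        # Ask a CGI vocabulary URI for a SKOS RDF/XML representation.
        import requests

        uri = "http://resource.geosciml.org/classifier/cgi/lithology/granite"  # assumed path
        resp = requests.get(uri, headers={"Accept": "application/rdf+xml"})
        print(resp.status_code, resp.headers.get("Content-Type"))
        print(resp.text[:300])  # beginning of the SKOS description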

  8. Scalability Issues for Remote Sensing Infrastructure: A Case Study

    PubMed Central

    Liu, Yang; Picard, Sean; Williamson, Carey

    2017-01-01

    For the past decade, a team of University of Calgary researchers has operated a large “sensor Web” to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system’s memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure. PMID:28468262

  9. Observations to information

    NASA Astrophysics Data System (ADS)

    Cox, S. J.

    2013-12-01

    Observations provide the fundamental constraint on natural science interpretations. Earth science observations originate in many contexts, including in-situ field observations and monitoring, various modes of remote sensing and geophysics, and sampling for ex-situ (laboratory) analysis, as well as numerical modelling and simulation, which also provide estimates of parameter values. Most investigations require a combination of these, often sourced from multiple initiatives and archives, so data discovery and re-organization can be a significant project burden. The Observations and Measurements (O&M) information model was developed to provide a common vocabulary that can be applied to all these cases, and thus provide a basis for cross-initiative and cross-domain interoperability. O&M was designed in the context of the standards for geographic information from OGC and ISO. It provides a complementary viewpoint to the well-known feature (object-oriented) and coverage (property field) views, but prioritizes the property determination process. Nevertheless, use of O&M implies the existence of well-defined feature types. In disciplines such as geology and the ecosystem sciences, the primary complexity is in their model of the world, for which the description of each item requires access to diverse observation sets. On the other hand, geophysics and earth observation work with simpler underlying information items, but in larger quantities over multiple spatio-temporal dimensions, acquired using complex sensor systems. Multiple transformations between the three viewpoints are involved in the data flows of most investigations, from collection through analysis to information and story. The O&M model classifies observations from a provider viewpoint, in terms of the sensor or procedure involved, and from a consumer viewpoint, in terms of the property being reported and the feature with which it is associated. These concerns carry different weights in different applications. Communities generating data using ships, satellites and aircraft habitually classify observations by the source platform and mission, as this implies a rich set of metadata to the cognoscenti. However, integrators are more likely to focus on the phenomenon being observed, together with the location of the features carrying it. In this context sensor information informs quality evaluation, a secondary consideration following data discovery. The observation model is specialized by constraining facets, such as observed property, sensor or procedure, to be taken from a specific set or vocabulary. Such vocabularies are typically developed on a project or community basis, but data fusion depends on them being widely accessible and comparable with related vocabularies. Better still if they are transparently governed, trusted and stable enough to encourage re-use. Semantic web technologies support distribution of rigorously constructed vocabularies through standard interfaces, with standard mechanisms for asserting or inferring proximity and other relationships. Interoperability of observation data in the future is likely to depend on the development of a viable ecosystem of these secondary resources.
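
    The two discovery viewpoints can be made concrete with a toy observation record. A minimal sketch; the field names follow O&M terminology loosely and are not a schema:

        # One observation, discoverable either by procedure (provider view)
        # or by observed property / feature of interest (consumer view).
        from dataclasses import dataclass

        @dataclass
        class Observation:
            procedure: str            # provider view: sensor or method
            observed_property: str    # consumer view: what was measured
            feature_of_interest: str  # consumer view: what it applies to
            result: float

        catalogue = [
            Observation("ship-CTD-cast", "sea_water_temperature", "station-42", 12.7),
            Observation("satellite-radiometer", "sea_surface_temperature", "cell-7", 13.1),
        ]
        # Provider-oriented discovery (by platform/procedure):
        print([o for o in catalogue if o.procedure.startswith("ship")])
        # Consumer-oriented discovery (by phenomenon):
        print([o for o in catalogue if "temperature" in o.observed_property])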

  10. Web processing service for climate impact and extreme weather event analyses. Flyingpigeon (Version 1.0)

    NASA Astrophysics Data System (ADS)

    Hempelmann, Nils; Ehbrecht, Carsten; Alvarez-Castro, Carmen; Brockmann, Patrick; Falk, Wolfgang; Hoffmann, Jörg; Kindermann, Stephan; Koziol, Ben; Nangini, Cathy; Radanovics, Sabine; Vautard, Robert; Yiou, Pascal

    2018-01-01

    Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files "at home" with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the processes currently implemented in flyingpigeon, covering commonly used operations (preprocessing steps, spatial subsets at continent, country or region level, and climate indices) as well as methods for specific climate data analyses (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
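
    A WPS like flyingpigeon can be explored from Python with OWSLib's WPS client. A hedged sketch; the service URL is a placeholder for a real deployment:

        # List the processes advertised by a WPS endpoint.
        from owslib.wps import WebProcessingService

        wps = WebProcessingService("http://localhost:8093/wps")  # hypothetical endpoint
        wps.getcapabilities()
        for process in wps.processes:
            print(process.identifier, "-", process.title)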

  11. Providing Geographic Datasets as Linked Data in Sdi

    NASA Astrophysics Data System (ADS)

    Hietanen, E.; Lehto, L.; Latvala, P.

    2016-06-01

    In this study, a prototype service providing data from a Web Feature Service (WFS) as linked data is implemented. First, persistent and unique Uniform Resource Identifiers (URIs) are created for all spatial objects in the dataset. The objects are available from those URIs in the Resource Description Framework (RDF) data format. Next, a Web Ontology Language (OWL) ontology is created to describe the dataset's information content using the Open Geospatial Consortium's (OGC) GeoSPARQL vocabulary. The existing data model is modified to take into account the linked data principles. The implemented service produces an HTTP response dynamically. The data for the response is first fetched from the existing WFS. Then the Geography Markup Language (GML) output of the WFS is transformed on-the-fly to the RDF format. Content negotiation is used to serve the data in different RDF serialization formats. This solution facilitates the use of a dataset in different applications without replicating the whole dataset. In addition, individual spatial objects in the dataset can be referred to with URIs. Furthermore, the needed information content of the objects can easily be extracted from the RDF serializations available from those URIs. A solution for linking data objects to the dataset URI is also introduced using the Vocabulary of Interlinked Datasets (VoID). The dataset is divided into subsets and each subset is given its own persistent and unique URI. This enables the whole dataset to be explored with a web browser and all individual objects to be indexed by search engines.
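
    From a client's perspective, consuming one of these object URIs is a single dereference. A minimal sketch, with a hypothetical object URI and an assumed Turtle serialization:

        # Fetch and parse the RDF description of a single spatial object.
        import rdflib

        g = rdflib.Graph()
        g.parse("http://example.fi/feature/12345", format="turtle")  # assumed URI
        for s, p, o in g:
            print(s, p, o)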

  12. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Peng; Gong, Jianya; Di, Liping

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
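
    Querying a CSW catalogue of the kind described above is straightforward with OWSLib. A hedged sketch; the catalogue URL and search term are placeholders:

        # Keyword discovery against an OGC CSW endpoint.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("http://example.org/csw")  # hypothetical
        query = PropertyIsLike("csw:AnyText", "%land cover%")
        csw.getrecords2(constraints=[query], maxrecords=5)
        for record in csw.records.values():
            print(record.title)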

  13. A Web Service-Based Framework Model for People-Centric Sensing Applications Applied to Social Networking

    PubMed Central

    Nunes, David; Tran, Thanh-Dien; Raposo, Duarte; Pinto, André; Gomes, André; Silva, Jorge Sá

    2012-01-01

    As the Internet evolved, social networks (such as Facebook) have bloomed and brought together an astonishing number of users. Mashing up mobile phones and sensors with these social environments enables the creation of people-centric sensing systems which have great potential for expanding our current social networking usage. However, such systems also have many associated technical challenges, such as privacy concerns, activity detection mechanisms or intermittent connectivity, as well as limitations due to the heterogeneity of sensor nodes and networks. Considering the openness of the Web 2.0, good technical solutions for these cases consist of frameworks that expose sensing data and functionalities as common Web-Services. This paper presents our RESTful Web Service-based model for people-centric sensing frameworks, which uses sensors and mobile phones to detect users’ activities and locations, sharing this information amongst the user’s friends within a social networking site. We also present some screenshot results of our experimental prototype. PMID:22438732
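
    The RESTful flavour of such a framework can be sketched as a pair of endpoints for reporting and querying a user's activity. Everything below (routes, payload shape) is an assumption for illustration, not the authors' API:

        # Minimal people-centric sensing endpoint with Flask.
        from flask import Flask, jsonify, request

        app = Flask(__name__)
        activities = {}  # user -> last reported activity/location

        @app.route("/users/<user>/activity", methods=["PUT"])
        def update_activity(user):
            # e.g. body: {"activity": "walking", "location": [40.2, -8.4]}
            activities[user] = request.get_json()
            return jsonify(status="ok")

        @app.route("/users/<user>/activity", methods=["GET"])
        def get_activity(user):
            return jsonify(activities.get(user, {}))

        if __name__ == "__main__":
            app.run()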

  14. A Web Service-based framework model for people-centric sensing applications applied to social networking.

    PubMed

    Nunes, David; Tran, Thanh-Dien; Raposo, Duarte; Pinto, André; Gomes, André; Silva, Jorge Sá

    2012-01-01

    As the Internet evolved, social networks (such as Facebook) have bloomed and brought together an astonishing number of users. Mashing up mobile phones and sensors with these social environments enables the creation of people-centric sensing systems which have great potential for expanding our current social networking usage. However, such systems also have many associated technical challenges, such as privacy concerns, activity detection mechanisms or intermittent connectivity, as well as limitations due to the heterogeneity of sensor nodes and networks. Considering the openness of the Web 2.0, good technical solutions for these cases consist of frameworks that expose sensing data and functionalities as common Web-Services. This paper presents our RESTful Web Service-based model for people-centric sensing frameworks, which uses sensors and mobile phones to detect users' activities and locations, sharing this information amongst the user's friends within a social networking site. We also present some screenshot results of our experimental prototype.

  15. Knowledge Management in Sensor Enabled Online Services

    NASA Astrophysics Data System (ADS)

    Smyth, Dominick; Cappellari, Paolo; Roantree, Mark

    The Future Internet has as its vision the development of improved features and usability for services, applications and content. In many cases, services can be provided automatically through the use of monitors or sensors. This means web-generated sensor data becomes available not only to the companies that own the sensors but also to the domain users who generate the data and to the information and knowledge workers who harvest the output. The goal is to improve the service through better use of the information the service provides. Applications and services range from climate, traffic and health to sports event monitoring. In this paper, we present the WSW system, which harvests web sensor data to provide additional and, in some cases, more accurate information using an analysis of both live and warehoused information.

  16. Experiences integrating autonomous components and legacy systems into tsunami early warning systems

    NASA Astrophysics Data System (ADS)

    Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.

    2012-04-01

    Fostered by and embedded in the general development of Information and Communication Technology (ICT), the evolution of Tsunami Early Warning Systems (TEWS) shows a significant development from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data includes not only sensors but also other components and systems offering services, such as the delivery of feasible simulations used for forecasting in an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. those providing seismic and sea level data, and the utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.

  17. CB-EMIS CELL PHONE CLIENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lurie, Gordon

    2007-01-02

    The cell phone software allows any Java-enabled cell phone to view sensor and meteorological data via an Internet connection using a secure connection to the CB-EMIS web service. Users with appropriate privileges can monitor the state of the sensors and perform simple maintenance tasks remotely. All sensitive data is downloaded from the web service on demand rather than stored on the handset, thus protecting sensitive data in the event a cell phone is lost.

  18. Expanding Access and Usage of NASA Near Real-Time Imagery and Data

    NASA Astrophysics Data System (ADS)

    Cechini, M.; Murphy, K. J.; Boller, R. A.; Schmaltz, J. E.; Thompson, C. K.; Huang, T.; McGann, J. M.; Ilavajhala, S.; Alarcon, C.; Roberts, J. T.

    2013-12-01

    In late 2009, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near real-time data products from a variety of Earth Observing System (EOS) instruments. Since that time, NASA's Earth Observing System Data and Information System (EOSDIS) developed the Global Imagery Browse Services (GIBS) to provide highly responsive, scalable, and expandable imagery services that distribute near real-time imagery in an intuitive and geo-referenced format. The GIBS imagery services provide access through standards-based protocols such as the Open Geospatial Consortium (OGC) Web Map Tile Service (WMTS) and standard mapping file formats such as the Keyhole Markup Language (KML). Leveraging these standard mechanisms opens NASA near real-time imagery to a broad landscape of mapping libraries supporting mobile applications. By integrating easily with mobile application development libraries, GIBS makes it possible for NASA imagery to become a reliable and valuable source for end-user applications. Recently, EOSDIS has taken steps to integrate near real-time metadata products into the EOS ClearingHOuse (ECHO) metadata repository. Registration of near real-time metadata allows for near real-time data discovery through ECHO clients. In keeping with near real-time data processing requirements, the ECHO ingest model allows for low-latency metadata insertion and updates. Combined with the ECHO repository, the fast visual access provided by GIBS imagery can now be linked directly back to the source data file(s). Through the use of discovery standards such as OpenSearch, desktop and mobile applications can connect users to more than just an image. As data services, such as the OGC Web Coverage Service, become more prevalent within the EOSDIS system, applications may even be able to connect users from imagery to data values. In addition, the full-resolution GIBS imagery provides visual context for other GIS data and tools. The NASA near real-time imagery covers a broad set of Earth science disciplines. By leveraging the ECHO and GIBS services, these data can become a visual context within which other GIS activities are performed. The focus of this presentation is to discuss the GIBS imagery and ECHO metadata services facilitating near real-time discovery and usage. Existing synergies and future possibilities will also be discussed. The NASA Worldview demonstration client will be used to show an existing application combining the ECHO and GIBS services.
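
    Because GIBS tiles are served over standard WMTS, a single tile can be fetched with a plain HTTP request. A hedged sketch; the layer name, date, tile matrix set and tile indices below are illustrative, and the exact URL template should be taken from the GIBS documentation:

        # Fetch one GIBS WMTS tile via the RESTful access pattern.
        import requests

        url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
               "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
               "2013-08-01/250m/3/2/4.jpg")  # assumed layer/date/tile values
        tile = requests.get(url)
        with open("tile.jpg", "wb") as f:
            f.write(tile.content)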

  19. The BCube Crawler: Web Scale Data and Service Discovery for EarthCube.

    NASA Astrophysics Data System (ADS)

    Lopez, L. A.; Khalsa, S. J. S.; Duerr, R.; Tayachow, A.; Mingo, E.

    2014-12-01

    The BCube Crawler, a core component of the NSF-funded BCube project, is researching and applying big data technologies to find and characterize different types of web services, catalogue interfaces, and data feeds, such as ESIP OpenSearch, OGC W*S, THREDDS, and OAI-PMH, that describe or provide access to scientific datasets. Given the scale of the Internet, which challenges even large search providers such as Google, the BCube plan for discovering these web-accessible services is to subdivide the problem into three smaller, more tractable issues: first, to discover likely sites where relevant data and data services might be found; second, to deeply crawl the discovered sites to find any data and services present; and lastly, to leverage semantic technologies to characterize the services and data found, filtering out everything but those relevant to the geosciences. To address the first two challenges, BCube uses an adapted version of Apache Nutch (from which Hadoop originated), a web-scale crawler, and Amazon's Elastic MapReduce service for flexibility and cost effectiveness. For characterization of the services found, BCube is examining existing web service ontologies for their applicability to our needs and will re-use and/or extend these in order to query for services with specific well-defined characteristics of scientific datasets, such as the use of geospatial namespaces. The original proposal for the crawler won a grant from Amazon's academic program, which allowed us to become operational; we successfully tested the BCube Crawler at web scale, obtaining a corpus sizeable enough to enable work on characterizing the services and data found. There is still plenty of work to be done: performing "smart crawls" by managing the frontier, developing and enhancing our scoring algorithms, and fully implementing the semantic characterization technologies. We describe the current status of the project, our successes and the issues encountered. The final goal of the BCube Crawler project is to provide relevant data services to other projects on the EarthCube stack and third-party partners so they can be brokered and used by a wider scientific community.

  20. Real-time Data Access to First Responders: A VORB application

    NASA Astrophysics Data System (ADS)

    Lu, S.; Kim, J. B.; Bryant, P.; Foley, S.; Vernon, F.; Rajasekar, A.; Meier, S.

    2006-12-01

    Getting information to first responders is not an easy task. The sensors that provide the information are diverse in format and come from many disciplines. They are also distributed by location, transmit data at different frequencies, and are managed and owned by autonomous administrative entities. Pulling such data in real time requires a very robust sensor network with reliable data transport and buffering capabilities. Moreover, the system should be extensible and scalable in the number and types of sensors. ROADNet is a real-time sensor network project at UCSD gathering diverse environmental data in real time or near real time. VORB (Virtual Object Ring Buffer) is the middleware used in ROADNet, offering simple, uniform and scalable real-time data management for discovering (through metadata), accessing and archiving real-time data and data streams. A recent development in VORB, a web API, offers quick and simple integration of real-time data with web applications. In this poster, we discuss one application developed as part of ROADNet. SMER (Santa Margarita Ecological Reserve) is located in interior Southern California, a region prone to catastrophic wildfires each summer and fall. To provide data during emergencies, we have applied the VORB framework to develop a web-based application providing access to diverse sensor data, including weather data, heat sensor information, and images from cameras. Wildfire fighters have access to real-time data about weather and heat conditions in the area and can view pictures taken from cameras at multiple points in the Reserve to pinpoint problem areas. Moreover, they can browse archived images and sensor data from earlier times to provide a comparison framework. To show the scalability of the system, we have expanded the sensor network under consideration to other areas in Southern California, including sensors accessible by the Los Angeles County Fire Department (LACOFD) and those available through the High Performance Wireless Research and Education Network (HPWREN). The poster will discuss the system architecture and components, the types of sensors being used, and usage scenarios. The system is currently operational through the SMER web site.

  1. Web access and dissemination of Andalusian coastal erosion rates: viewers and standard/filtered map services.

    NASA Astrophysics Data System (ADS)

    Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro

    2017-04-01

    The accessibility of environmental information via web viewers using map services (OGC or proprietary) has become more frequent, since new information sources (orthophotos, LIDAR, GPS) are highly detailed and thus generate great volumes of data which can barely be disseminated using either analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are concerned about the need to facilitate access to research results and to improve communication about natural hazards to citizens and stakeholders. This information, if adequately disseminated, is crucial in decision-making processes and risk management approaches, and can help increase social awareness of environmental issues (particularly climate change impacts). To address this, two strategies for the wide dissemination and communication of the results achieved in the calculation of beach erosion for the 640 km of the Andalusian coast (South Spain) using web viewer technology are presented. Each is oriented to different end users and thus based on different methodologies. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a national research project on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) that are made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed to be used by the general public (citizens, politicians, etc.), combining a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS), all displayed within a user-friendly interface. Further, the use of WMS services (implemented on GeoServer) provides detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent data), which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended for technicians and specialists in the field, includes a geoviewer with an innovative profile (visualisation of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is implemented to guarantee all the functionality of the basic geoviewer. The viewer has been further improved by i) the generation of services on request through the application of a filter in ECQL (Extended Common Query Language), using the vendor parameter CQL_FILTER from GeoServer; these dynamic filters allow the final user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the generation of real-time cartography; and ii) the use of the layer's WFS service, with which the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g. mean rates, length of eroded coast) and interactive graphs (via the Highcharts.js library) which help in the interpretation of beach erosion rates (representing trends and bar diagrams, among others). As a result, two approaches for communicating scientific results to different audiences, based on web technology with a complete dataset of geo-information, services and functionalities, are implemented. The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to, and interpretation of, the information.
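
    The filtered-service idea can be illustrated with a WMS GetMap request that carries GeoServer's CQL_FILTER vendor parameter. A hedged sketch; the endpoint, layer name, attribute names and filter values are placeholders:

        # Request a map of only those transects matching a CQL filter.
        import requests

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "coast:erosion_rates",   # hypothetical layer
            "styles": "", "srs": "EPSG:4326",
            "bbox": "-7.6,36.0,-1.6,37.6",
            "width": "800", "height": "400", "format": "image/png",
            "CQL_FILTER": "period='1956-2011' AND rate < -0.5",  # assumed attributes
        }
        r = requests.get("http://example.org/geoserver/wms", params=params)
        with open("eroding_beaches.png", "wb") as f:
            f.write(r.content)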

  2. Virtual Sensors in a Web 2.0 Digital Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D. J.; Marini, L.; Kooper, R.; Rodriguez, A.; Myers, J. D.

    2008-12-01

    The lack of rainfall data in many watersheds is one of the major barriers to modeling and studying many environmental and hydrological processes and to supporting decision making. There are simply not enough rain gauges on the ground. To overcome this data scarcity, a Web 2.0 digital watershed was developed at NCSA (National Center for Supercomputing Applications), where users can point and click on a web-based Google Maps interface and create new precipitation virtual sensors at any location within the coverage region of a NEXRAD station. A set of scientific workflows performs spatial, temporal and thematic transformations on the near-real-time NEXRAD Level II data. Such workflows can be triggered by users' actions and generate either rainfall-rate or rainfall-accumulation data streams at a user-specified time interval. We will discuss the underlying components of this digital watershed, which consists of a semantic content management middleware, a semantically enhanced streaming data toolkit, virtual sensor management functionality, and a RESTful (REpresentational State Transfer) web service that can trigger workflow execution. This loosely coupled architecture presents a generic framework for constructing a Web 2.0 style digital watershed. An implementation of this architecture in the Upper Illinois River Basin will be presented. We will also discuss the implications of the virtual sensor concept for the broader environmental observatory community and how this concept will help us move towards a participatory digital watershed.

  3. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    PubMed

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structural Health Monitoring is a topic of great interest for port structures due to the ageing of structures and the limitations of existing evaluation methods. This paper presents a cloud computing-based stability evaluation platform for a pier-type port structure using Fiber Bragg Grating (FBG) sensors, in a system consisting of FBG strain sensors, FBG displacement gauges, FBG angle meters, a gateway, and a cloud computing-based web server. The sensors were installed on core components of the structure and measurements were taken to evaluate it. The measured values were transmitted to the web server via the gateway, where all data were analyzed and visualized to evaluate the structure based on the safety evaluation index (SEI). The stability evaluation platform for pier-type port structures enables efficient monitoring of the structures, which can be carried out easily anytime and anywhere by converging new technologies such as cloud computing and FBG sensors. In addition, the platform has been successfully implemented at “Maryang Harbor”, situated in Maryang-myeon, Korea, to test its durability.

  4. Method and system of measuring ultrasonic signals in the plane of a moving web

    DOEpatents

    Hall, Maclin S.; Jackson, Theodore G.; Wink, Wilmer A.; Knerr, Christopher

    1996-01-01

    An improved system for measuring the velocity of ultrasonic signals within the plane of moving web-like materials, such as paper, paperboard and the like. In addition to measuring the velocity of ultrasonic signals in the plane of the web in the machine direction, MD, and a cross direction, CD, generally perpendicular to the direction of the traveling web, one embodiment of the system in accordance with the present invention is also adapted to provide an on-line indication of the polar specific stiffness of the moving web. In another embodiment of the invention, the velocity of ultrasonic signals in the plane of the web is measured by way of a plurality of ultrasonic transducers carried by synchronously driven wheels or cylinders, thus eliminating undue transducer wear due to any speed differences between the transducers and the web. In order to provide a relatively constant contact force between the transducers and the web, the transducers are mounted in sensor housings which include a spring for biasing the transducer radially outwardly. The sensor housings are adapted to be easily and conveniently mounted to the carrier to provide a relatively constant contact force between the transducers and the moving web.

  5. Method and system of measuring ultrasonic signals in the plane of a moving web

    DOEpatents

    Hall, M.S.; Jackson, T.G.; Wink, W.A.; Knerr, C.

    1996-02-27

    An improved system for measuring the velocity of ultrasonic signals within the plane of moving web-like materials, such as paper, paperboard and the like, is disclosed. In addition to measuring the velocity of ultrasonic signals in the plane of the web in the machine direction, MD, and a cross direction, CD, generally perpendicular to the direction of the traveling web, one embodiment of the system in accordance with the present invention is also adapted to provide an on-line indication of the polar specific stiffness of the moving web. In another embodiment of the invention, the velocity of ultrasonic signals in the plane of the web is measured by way of a plurality of ultrasonic transducers carried by synchronously driven wheels or cylinders, thus eliminating undue transducer wear due to any speed differences between the transducers and the web. In order to provide a relatively constant contact force between the transducers and the web, the transducers are mounted in sensor housings which include a spring for biasing the transducer radially outwardly. The sensor housings are adapted to be easily and conveniently mounted to the carrier to provide a relatively constant contact force between the transducers and the moving web. 37 figs.

  6. Enviro-Net: From Networks of Ground-Based Sensor Systems to a Web Platform for Sensor Data Management

    PubMed Central

    Pastorello, Gilberto Z.; Sanchez-Azofeifa, G. Arturo; Nascimento, Mario A.

    2011-01-01

    Ecosystem monitoring is essential to properly understand ecosystem development and the effects of events, both climatological and anthropogenic in nature. The amount of data used in these assessments is increasing at very high rates. This is due to the increasing availability of sensing systems and the development of new techniques to analyze sensor data. The Enviro-Net Project encompasses several such sensor system deployments across five countries in the Americas. These deployments use a few different ground-based sensor systems, installed at different heights, monitoring the conditions in tropical dry forests over long periods of time. This paper presents our experience in deploying and maintaining these systems, retrieving and pre-processing the data, and describes the web portal developed to help with data management, visualization and analysis. PMID:22163965

  7. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of and access to the datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionality before accessing them. In response, we are developing the Permafrost Information System PerSys, conceptualized as an open-access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open-access PANGAEA data repository.

  8. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in Space and Time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originated in TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure required to establish interoperability with respect to the geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and maybe additionally temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels that capture the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (MLs) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.

  9. Web-Based Tools for Data Visualization and Decision Support for South Asia

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate the deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes them an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated for many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of the emerging standards for sharing water information across the web using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs), so that tools and data from other projects can both consume and build on the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
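
    The point-extraction step described above can be sketched with the netCDF4 Python library as follows; the file, variable and coordinate names are hypothetical placeholders:

      import numpy as np
      from netCDF4 import Dataset, num2date

      # Open a time-varying netCDF file (hypothetical example file).
      ds = Dataset("soil_moisture_south_asia.nc")
      lats = ds.variables["lat"][:]
      lons = ds.variables["lon"][:]
      times = ds.variables["time"]

      # Index of the grid cell nearest to the point of interest (e.g. Kathmandu).
      i = int(np.abs(lats - 27.7).argmin())
      j = int(np.abs(lons - 85.3).argmin())

      # Slice the full time axis at that cell: dimensions are (time, lat, lon).
      series = ds.variables["soil_moisture"][:, i, j]
      dates = num2date(times[:], units=times.units)
      for d, v in zip(dates[:5], series[:5]):
          print(d, float(v))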

  10. SensorDB: a virtual laboratory for the integration, visualization and analysis of varied biological sensor data.

    PubMed

    Salehi, Ali; Jimenez-Berni, Jose; Deery, David M; Palmer, Doug; Holland, Edward; Rozas-Larraondo, Pablo; Chapman, Scott C; Georgakopoulos, Dimitrios; Furbank, Robert T

    2015-01-01

    To our knowledge, there is no software or database solution that supports large volumes of biological time series sensor data efficiently and enables data visualization and analysis in real time. Existing solutions for managing data typically use unstructured file systems or relational databases. These systems are not designed to provide instantaneous responses to user queries. Furthermore, they do not support the rapid data analysis and visualization needed for interactive experiments. In large-scale experiments, this behaviour slows research discovery and discourages the widespread sharing and reuse of data that could otherwise inform critical decisions in a timely manner and encourage effective collaboration between groups. In this paper we present SensorDB, a web-based virtual laboratory that can manage large volumes of biological time series sensor data while supporting rapid data queries and real-time user interaction. SensorDB is sensor agnostic and uses web-based, state-of-the-art cloud and storage technologies to efficiently gather, analyse and visualize data. Collaboration and data sharing between different agencies and groups is thereby facilitated. SensorDB is available online at http://sensordb.csiro.au.

  11. MaNIDA: Integration of marine expedition information, data and publications: Data Portal of German Marine Research

    NASA Astrophysics Data System (ADS)

    Koppe, Roland; Scientific MaNIDA-Team

    2013-04-01

    The Marine Network for Integrated Data Access (MaNIDA) aims to build a sustainable e-infrastructure to support discovery and re-use of marine data from distinct data providers in Germany (see related abstracts in session ESSI 1.2). In order to provide users integrated access to and retrieval of expedition or cruise metadata, data, services and publications, as well as relationships among the various objects, we are developing (web) applications based on state-of-the-art technologies: the Data Portal of German Marine Research. Since the distributed content providers in the German network have distinct objectives and mandates for storing digital objects (e.g. long-term data preservation, near real time data, publication repositories), we have to cope with heterogeneous metadata in terms of syntax and semantics, data types and formats, as well as access solutions. We have defined a set of core metadata elements which are common to our content providers and therefore useful for discovery and for building relationships among objects. Existing catalogues for various types of vocabularies are being used to assure the mapping to community-wide terms. We distinguish between expedition metadata and continuously harvestable metadata objects from distinct data providers. • Existing expedition metadata from distinct sources is integrated and validated in order to create an expedition metadata catalogue which is used as the authoritative source for expedition-related content. The web application allows browsing by e.g. research vessel and date, exploring expeditions and research gaps by tracklines, and viewing expedition details (begin/end, ports, platforms, chief scientists, events, etc.). Expedition-related objects from harvesting are also dynamically associated with expedition information and presented to the user. Hence we will provide web services exposing detailed expedition information. • Other harvestable content is separated into four categories: archived data and data products, near real time data, publications and reports. Reports are a special case of publication, describing cruise planning, cruise reports or popular reports on expeditions, and are orthogonal to e.g. peer-reviewed articles. Each object's metadata contains at least: identifier(s) e.g. doi/hdl, title, author(s), date, expedition(s), platform(s) e.g. research vessel Polarstern. Furthermore, project(s), parameter(s), device(s) and e.g. geographic coverage are of interest. An international gazetteer resolves geographic coverage to region names, which are annotated to the object metadata. Information is homogeneously presented to the user, independent of the underlying format, but adaptable to specific disciplines, e.g. bathymetry. Data access and dissemination information is also available to the user as data download links or web services (e.g. WFS, WMS). Based on relationship metadata we dynamically build graphs of objects to support the user in finding possibly relevant associated objects. Technically, metadata is based on ISO/OGC standards or provider specifications. Metadata is harvested via OAI-PMH or OGC CSW and indexed with Apache Lucene. This enables powerful full-text search, geographic and temporal search, as well as faceting. In this presentation we will illustrate the architecture and the current implementation of our integrated approach.
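
    Harvesting via OGC CSW, as described above, can be sketched with the OWSLib client library; the catalogue endpoint and search term below are hypothetical, not the actual MaNIDA provider configuration:

      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      # Hypothetical CSW endpoint of one content provider -- illustrative only.
      csw = CatalogueServiceWeb("https://example.pangaea.de/csw")

      # Full-text-style constraint against the csw:AnyText queryable.
      query = PropertyIsLike("csw:AnyText", "%Polarstern%")
      csw.getrecords2(constraints=[query], maxrecords=10)

      for rec_id, rec in csw.records.items():
          print(rec_id, rec.title)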

  12. Developing a Metadata Infrastructure to facilitate a data-driven science gateway and to provide INSPIRE/GEMINI compliance for CLIPC

    NASA Astrophysics Data System (ADS)

    Mihajlovski, Andrej; Plieger, Maarten; Som de Cerff, Wim; Page, Christian

    2016-04-01

    The CLIPC project is developing a portal to provide a single point of access for scientific information on climate change. This is made possible through the Copernicus Earth Observation Programme for Europe, which will deliver a new generation of environmental measurements of climate quality. The data about the physical environment which is used to inform climate change policy and adaptation measures comes from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses (syntheses of all available observations constrained with numerical weather prediction systems). These data categories are managed by different communities: CLIPC will provide a single point of access for the whole range of data. The CLIPC portal will provide a number of indicators showing impacts on specific sectors, generated using a range of factors selected through structured expert consultation. It will also, as part of the transformation services, allow users to explore the consequences of using different combinations of driving factors which they consider to be of particular relevance to their work or life. The portal will provide information on the scientific quality and pitfalls of such transformations to prevent misleading usage of the results. The CLIPC project will develop an end-to-end processing chain (indicator tool kit), from comprehensive information on the climate state through to highly aggregated decision-relevant products. Indicators of climate change and climate change impact will be provided, and a tool kit to update and post-process the collection of indicators will be integrated into the portal. The CLIPC portal has a distributed architecture, making use of OGC services provided by e.g. climate4impact.eu and CEDA. CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses; 2. A climate impact tool kit to evaluate, rank and aggregate indicators. Key is the availability of standardized metadata describing indicator data and services. This will enable standardization and interoperability between the different distributed services of CLIPC. To disseminate CLIPC indicator data, transformed data products enabling impact assessments, and climate change impact indicators, a standardized metadata infrastructure is provided. The challenge is that the compliance of existing metadata with INSPIRE ISO standards and GEMINI standards needs to be extended further to allow the web portal to be generated from the available metadata blueprint. The information provided in the headers of netCDF files available through multiple catalogues allows us to generate ISO-compliant metadata, which is in turn used to generate web-based interface content, as well as OGC-compliant web services such as WCS and WMS for the front end and WPS interactions for the scientific users to combine and generate new datasets. The goal of the metadata infrastructure is to provide a blueprint for creating a data-driven science portal, generated from the underlying GIS data, web services and processing infrastructure. In the presentation we will present the results and lessons learned.
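
    The netCDF-header-to-ISO-metadata step can be sketched as follows with the netCDF4 Python library; the file name and the selection of harvested attributes are illustrative assumptions:

      from netCDF4 import Dataset

      # Hypothetical indicator file -- illustrative only.
      ds = Dataset("tasmax_indicator_eur11.nc")

      # ncattrs() lists the global attributes present in the file header;
      # these map naturally onto ISO 19115 citation and lineage elements.
      record = {
          attr: getattr(ds, attr)
          for attr in ds.ncattrs()
          if attr in ("title", "institution", "source", "history", "references")
      }
      record["variables"] = list(ds.variables)

      print(record)  # raw material for an ISO/INSPIRE metadata template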

  13. The EarthServer Geology Service: web coverage services for geosciences

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2014-05-01

    The EarthServer FP7 project is implementing web coverage services using the OGC WCS and WCPS standards for a range of earth science domains: cryospheric, atmospheric, oceanographic, planetary, and geological. BGS is providing the geological service (http://earthserver.bgs.ac.uk/). Geoscience has used remotely sensed data from satellites and aircraft for a considerable time, but other areas of the geosciences are less familiar with the use of coverage data. This is rapidly changing with the development of new sensor networks and the move from geological maps to geological spatial models. The BGS geology service is designed initially to address two coverage data use cases and three levels of data access restriction. Databases of remotely sensed data are typically very large and commonly held offline, making it time-consuming for users to assess and then download data. The service is designed to allow the spatial selection, editing and display of Landsat and aerial photographic imagery, including band selection and contrast stretching. This enables users to rapidly view data, assess its usefulness for their purposes, and then enhance and download it if it is suitable. At present the service contains six-band Landsat 7 (Blue, Green, Red, NIR 1, NIR 2, MIR) and three-band false colour aerial photography (NIR, green, blue), totalling around 1 TB. Increasingly, 3D spatial models are being produced in place of traditional geological maps. Models make explicit the spatial information that is implicit on maps and thus are seen as a better way of delivering geosciences information to non-geoscientists. However, web delivery of models, including the provision of suitable visualisation clients, has proved more challenging than delivering maps. The EarthServer geology service is delivering 35 surfaces as coverages, comprising the modelled superficial deposits of the Glasgow area. These can be viewed using a 3D web client developed in the EarthServer project by Fraunhofer. As well as remotely sensed imagery and 3D models, the geology service is also delivering DTM coverages which can be viewed in the 3D client in conjunction with both imagery and models. The service is accessible through a web GUI which allows the imagery to be viewed against a range of background maps and DTMs, and in the 3D client; spatial selection to be carried out graphically; the results of image enhancement to be displayed; and selected data to be downloaded. The GUI also provides access to the Glasgow model in the 3D client, as well as tutorial material. In the final year of the project it is intended to increase the volume of data to 20 TB and enhance the WCPS processing, including depth and thickness querying of 3D models. We have also investigated the use of GeoSciML, developed to describe and interchange the information on geological maps, to describe model surface coverages. EarthServer is developing a combined WCPS and XQuery query language, and we will investigate applying this to the GeoSciML-described surfaces to answer questions such as 'find all units with a predominant sand lithology within 25 m of the surface'.
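
    A WCPS query of the kind mentioned above can be sent as an ordinary HTTP request to a server implementing the OGC WCS Processing Extension. The endpoint, coverage name and key-value binding below are assumptions for illustration only:

      from urllib.parse import urlencode
      from urllib.request import urlopen

      # Hypothetical endpoint and coverage name -- illustrative only.
      BASE = "https://example.bgs.ac.uk/rasdaman/ows"
      wcps = 'for c in (glasgow_dtm) return encode(c, "png")'

      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "ProcessCoverages",
          "query": wcps,
      }
      with urlopen(BASE + "?" + urlencode(params)) as resp:
          with open("dtm.png", "wb") as f:
              f.write(resp.read())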

  14. Home monitoring of patients with Parkinson's disease via wearable technology and a web-based application.

    PubMed

    Patel, Shyamal; Chen, Bor-Rong; Buckley, Thomas; Rednic, Ramona; McClure, Doug; Tarsy, Daniel; Shih, Ludy; Dy, Jennifer; Welsh, Matt; Bonato, Paolo

    2010-01-01

    Objective long-term health monitoring can improve the clinical management of several medical conditions, ranging from cardiopulmonary diseases to motor disorders. In this paper, we present our work toward the development of a home-monitoring system. The system is currently used to monitor patients with Parkinson's disease who experience severe motor fluctuations. Monitoring is achieved using wireless wearable sensors whose data are relayed to a remote clinical site via a web-based application. The work presented herein shows that wearable sensors combined with a web-based application provide reliable quantitative information that can be used for clinical decision making.

  15. The Climate-G Portal: a Grid-Enabled Scientific Gateway for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Grid portals are web gateways aimed at concealing the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous and seamless access to heterogeneous and geographically spread resources (i.e. storage, computational facilities, services, sensors, networks, databases). In effect, they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways can revolutionize the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that can be provided by such a web-based environment. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) for access to the entire infrastructure. The Climate-G Portal has to face important and critical challenges as well as satisfy and address key requirements. In the following, the most relevant ones are presented and discussed. Transparency: the portal has to provide transparent access to the underlying infrastructure, preventing users from dealing with low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit portal functionalities. A wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment. A security infrastructure (based on X509v3 digital certificates) is strongly needed. Pervasiveness and ubiquity: access to the system must be pervasive and ubiquitous, which follows naturally from the web-based approach. Usability and simplicity: the portal has to provide simple, high-level and user-friendly interfaces to ease access to and exploitation of the entire system. Coexistence of general-purpose and domain-oriented services: along with general-purpose services (file transfer, job submission, etc.), the portal has to provide domain-based services and functionalities. Subsetting of data, visualization of 2D maps around a virtual globe, and delivery of maps through OGC-compliant interfaces (i.e. Web Map Service - WMS) are just some examples. Since April 2009, about 70 users (85% coming from the climate change community) have been granted access to the portal. A key challenge of this work is the idea of providing users with an integrated working environment, that is, a place where scientists can find huge amounts of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure and advanced monitoring interfaces.

  16. RESTful-based heterogeneous geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed Sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it, respectively. Then, a scenario for monitoring nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
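
    Since workflow resources are described and managed via Atom, a client can enumerate them with any feed parser. A minimal sketch with the Python feedparser library, assuming a hypothetical Atom feed URL for the workflow registry:

      import feedparser

      # Hypothetical Atom feed listing workflow resources -- illustrative only.
      feed = feedparser.parse("https://example.org/workflows/atom")

      for entry in feed.entries:
          # Each entry stands for one workflow resource; the linked document
          # would hold its XPDL or BPEL definition.
          print(entry.title, entry.link)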

  17. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    NASA Astrophysics Data System (ADS)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of 'Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time, accessing historical and forecast data to support scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored in distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurers in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  18. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from Autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
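
    The spatial "clip-and-ship" selection can equally be driven programmatically through the WFS interface. A sketch with the OWSLib Python library; the endpoint and layer name are hypothetical stand-ins for the GLIMS services:

      from owslib.wfs import WebFeatureService

      # Hypothetical GLIMS WFS endpoint and layer name -- illustrative only.
      wfs = WebFeatureService("https://example.nsidc.org/glims/wfs", version="1.1.0")

      # Request only the glacier outlines intersecting a bbox over the Alps
      # (minx, miny, maxx, maxy in lon/lat).
      resp = wfs.getfeature(
          typename="glims:glacier_outlines",
          bbox=(6.0, 45.5, 11.0, 47.5),
      )
      with open("alps_glaciers.gml", "wb") as f:
          f.write(resp.read())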

  19. A SOA broker solution for standard discovery and access services: the GI-cat framework

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico

    2010-05-01

    GI-cat's ideal users are data providers or service providers within the geoscience community. The former have their data already available through an access service (e.g. an OGC Web Service) and want it published through a standard catalog service in a seamless way. The latter want to deploy a catalog broker and let users query and access different geospatial resources through one or more standard interfaces and Application Profiles (AP) (e.g. OGC CSW ISO AP, CSW ebRIM/EO AP, etc.). GI-cat implements a broker component (i.e. a middleware service) which carries out distribution and mediation functionalities among well-adopted catalog interfaces and data access protocols. GI-cat also publishes different discovery interfaces: the OGC CSW ISO and ebRIM Application Profiles (the latter coming with support for the EO and CIM extension packages) and two different OpenSearch interfaces developed in order to explore Web 2.0 possibilities. An extended interface is also available to exploit all available GI-cat features, such as interruptible incremental queries and query feedback. Interoperability tests performed in the context of different projects have also pointed out the importance of enforcing compatibility with existing and widespread tools of the open source community (e.g. the GeoNetwork and Deegree catalogs), which was then achieved. Based on a service-oriented framework of modular components, GI-cat can effectively be customized and tailored to support different deployment scenarios. In addition to the distribution functionality, a harvesting approach has recently been tested, allowing the user to switch between a distributed and a local search, thus giving more possibilities to support different deployment scenarios. A configurator tool is available to enable effective high-level configuration of the broker service. A specific geobrowser was also developed for demonstrating the advanced GI-cat functionalities. This client, called GI-go, is an example of the possible applications which may be built on top of the GI-cat broker component. GI-go allows discovering and browsing the available datasets, retrieving and evaluating their descriptions and performing distributed queries according to any combination of the following criteria: geographic area, temporal interval, topic of interest (free-text and/or keyword selection are allowed) and data source (i.e. where, when, what, who). The result set of a query (e.g. dataset metadata) is then displayed in an incremental way, leveraging the asynchronous interaction approach implemented by GI-cat. This feature allows the user to access intermediate query results. Query interruption and feedback features are also provided to the user. Alternatively, the user may perform a browsing task by selecting a catalog resource from the current configuration and navigating through its aggregated and/or leaf datasets. In both cases dataset metadata, expressed according to ISO 19139 (and also Dublin Core and ebRIM if available), are displayed for download, along with a resource portrayal and actual data access (when this is meaningful and possible). The GI-cat distributed catalog service has been successfully deployed and tested in the framework of different projects and initiatives, including the SeaDataNet FP6 project, GEOSS IP3 (Interoperability Process Pilot Project), GEOSS AIP-2 (Architectural Implementation Project - Phase 2), FP7 GENESI-DR, CNR GIIDA, FP7 EUROGEOSS and the ESA HMA project.
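
    The OpenSearch interface lends itself to very lightweight clients. The sketch below queries a hypothetical GI-cat OpenSearch endpoint and lists result titles from the Atom response; the endpoint and the parameter names (st = search terms, si = start index, ct = count) are assumptions for illustration:

      import xml.etree.ElementTree as ET
      from urllib.parse import urlencode
      from urllib.request import urlopen

      # Hypothetical GI-cat OpenSearch endpoint -- illustrative only.
      BASE = "https://example.essi-lab.eu/gi-cat/opensearch"
      params = {"st": "temperature", "si": "1", "ct": "10"}

      with urlopen(BASE + "?" + urlencode(params)) as resp:
          tree = ET.parse(resp)

      ATOM = "{http://www.w3.org/2005/Atom}"
      for entry in tree.iter(ATOM + "entry"):
          print(entry.findtext(ATOM + "title"))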

  20. Sensor node for remote monitoring of waterborne disease-causing bacteria.

    PubMed

    Kim, Kyukwang; Myung, Hyun

    2015-05-05

    A sensor node for sampling water and checking for the presence of harmful bacteria such as E. coli in water sources was developed in this research. A chromogenic enzyme substrate assay method was used to easily detect coliform bacteria by monitoring the color change of the sampled water mixed with a reagent. Live webcam image streaming from the Wi-Fi-connected sensor node to the web browser of the end user shows the water color changes in real time. The liquid can be manipulated through the web-based user interface and can also be observed via the webcam feed. The image streaming and web console servers run on an embedded processor with an expansion board. The UART channel of the expansion board is connected to an external Arduino board and a motor driver to control self-priming water pumps to sample the water, mix the reagent, and remove the water sample after the test is completed. The sensor node can repeat water testing until the test reagent is depleted. The authors anticipate that the use of the sensor node developed in this research can decrease the cost and labor required for testing samples in a factory environment and for checking the water quality of local water sources in developing countries.
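
    The UART control path described above could be driven from the embedded processor roughly as follows. This pyserial sketch assumes a hypothetical single-character command protocol on the Arduino side; port, baud rate and commands are all assumptions:

      import time
      import serial

      # Port, baud rate and command bytes are assumptions -- illustrative only.
      ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

      ser.write(b"S")   # hypothetical command: pump a fresh water sample
      time.sleep(30)    # allow the self-priming pump to fill the test chamber
      ser.write(b"R")   # hypothetical command: inject the chromogenic reagent
      # ... the webcam stream monitors the color change from here on ...
      ser.write(b"D")   # hypothetical command: drain the chamber after the test
      ser.close()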

  1. Exploring U.S. Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users), or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, and no online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open-standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain Technology (developed by George Mason University). This system provides capabilities for online geospatial crop information access, query and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables online exploration of US cropland without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.

  2. Data Integration Support for Data Served in the OPeNDAP and OGC Environments

    NASA Technical Reports Server (NTRS)

    McDonald, Kenneth R.; Wharton, Stephen W. (Technical Monitor)

    2006-01-01

    NASA is coordinating a technology development project to construct a gateway between system components built upon the Open-source Project for a Network Data Access Protocol (OPeNDAP) and those made available via interfaces specified by the Open Geospatial Consortium (OGC). This project is funded through the Advanced Collaborative Connections for Earth-Sun System Science (ACCESS) Program and is a NASA contribution to the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS). The motivation for the project is the set of data integration needs that have been expressed by the Coordinated Enhanced Observing Period (CEOP), an international program that is addressing the study of the global water cycle. CEOP is assembling a large collection of in situ and satellite data and model results from a wide variety of sources covering 35 sites around the globe. The data are provided by systems based on either the OPeNDAP or OGC protocols, but the research community desires access to the full range of data and associated services from a single client. This presentation will discuss the current status of the OPeNDAP/OGC Gateway Project. The project is building upon an early prototype that illustrated the feasibility of such a gateway and which was demonstrated to the CEOP science community. In its first year as an ACCESS project, the effort has focused on the design of the catalog and data services that will be provided by the gateway and the mappings between the metadata and services provided in the two environments.
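
    On the OPeNDAP side of such a gateway, server-side subsetting without bulk download is the central capability. A sketch using the netCDF4 Python library, which can open remote OPeNDAP URLs directly when built with DAP support; the dataset URL, variable name and dimensionality are hypothetical:

      from netCDF4 import Dataset

      # Hypothetical OPeNDAP dataset URL -- illustrative only.
      URL = "https://example.gov/opendap/ceop/surface_temp.nc"

      ds = Dataset(URL)                      # no bulk download happens here
      temp = ds.variables["temperature"]     # lazy handle to the remote variable
      subset = temp[0:10, 100:110, 200:210]  # only this slab crosses the network
      print(subset.shape)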

  3. Web-based Data Exploration, Exploitation and Visualization Tools for Satellite Sensor VIS/IR Calibration Applications

    NASA Astrophysics Data System (ADS)

    Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.

    2016-12-01

    The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R language for statistical computing and visualization, with a web interface allowing the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth-reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the earth target, the atmospheric and cloud conditions or scene type, and the angular conditions when obtaining sensor radiance pairs. The SBAF will need to be customized for each inter-calibration target and sensor pair. The advantages of having a community open source tool are: 1) only one archive of SCIAMACHY, Hyperion, and IASI datasets needs to be maintained, which is on the order of 50 TB; 2) the framework allows easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as PW; 3) web tool or SBAF algorithm improvements and suggestions, when incorporated, benefit the community at large; 4) the customization effort is on the user rather than on the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases to highlight the value added by these datasets.

  4. A spatial information crawler for OpenGIS WFS

    NASA Astrophysics Data System (ADS)

    Jiang, Jun; Yang, Chong-jun; Ren, Ying-chao

    2008-10-01

    The growth of the internet makes it non-trivial to search for accurate information efficiently. Topical crawlers, which target a certain area, are attracting more and more attention because they can help people find what they need. Furthermore, with the OpenGIS WFS (Web Feature Service) Specification developed by the OGC (Open GIS Consortium), more and more geospatial data providers adopt this protocol to publish their data on the internet. In this case, a crawler aimed at WFS servers can help people find geospatial data from WFS servers. In this paper, we propose a prototype system of a WFS crawler based on the OpenGIS WFS Specification. The crawler architecture, working principles, and the detailed function of each component are introduced. This crawler is capable of discovering WFS servers dynamically, and of saving and updating the service contents of the servers. The data collected by the crawler can be supplied to a geospatial data search engine as its data source.
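
    The core test such a crawler must apply to each candidate URL is whether it answers a WFS GetCapabilities request. A minimal probe in Python under that assumption, with a hypothetical candidate URL:

      from urllib.parse import urlencode
      from urllib.request import urlopen

      def is_wfs_server(base_url, timeout=10):
          """Send a WFS GetCapabilities request and check whether the response
          looks like a WFS capabilities document."""
          params = {"service": "WFS", "request": "GetCapabilities"}
          try:
              with urlopen(base_url + "?" + urlencode(params), timeout=timeout) as resp:
                  head = resp.read(4096).decode("utf-8", errors="replace")
          except OSError:
              return False
          return "WFS_Capabilities" in head

      # Hypothetical candidate URL taken from the crawl frontier.
      print(is_wfs_server("https://example.org/geoserver/wfs"))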

  5. Geospatial Brokering - Challenges and Future Directions

    NASA Astrophysics Data System (ADS)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into the future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.

  6. Useful Sensor Web Capabilities to Enable Progressive Mission Autonomy

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2007-01-01

    This viewgraph presentation reviews using the Sensor Web capabilities as an enabling technology to allow for progressive autonomy of NASA space missions. The presentation reviews technical challenges for future missions, and some of the capabilities that exist to meet those challenges. To establish the ability of the technology to meet the challenges, experiments were conducted on three missions: Earth Observing 1 (EO-1), Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) and Space Technology 5 (ST-5). These experiments are reviewed.

  7. In-plane ultrasonic velocity measurement of longitudinal and shear waves in the machine direction with transducers in rotating wheels

    DOEpatents

    Hall, M.S.; Jackson, T.G.; Knerr, C.

    1998-02-17

    An improved system for measuring the velocity of ultrasonic signals within the plane of moving web-like materials, such as paper, paperboard and the like. In addition to velocity measurements of ultrasonic signals in the plane of the web in the MD and CD, one embodiment of the system in accordance with the present invention is also adapted to provide on-line indication of the polar specific stiffness of the moving web. In another embodiment of the invention, the velocity of ultrasonic signals in the plane of the web is measured by way of a plurality of ultrasonic transducers carried by synchronously driven wheels or cylinders, thus eliminating undue transducer wear due to any speed differences between the transducers and the web. In order to provide a relatively constant contact force between the transducers and the web, the transducers are mounted in sensor housings which include a spring for biasing the transducer radially outwardly. The sensor housings are adapted to be easily and conveniently mounted to the carrier to provide a relatively constant contact force between the transducers and the moving web. 37 figs.

  8. In-plane ultrasonic velocity measurement of longitudinal and shear waves in the machine direction with transducers in rotating wheels

    DOEpatents

    Hall, Maclin S.; Jackson, Theodore G.; Knerr, Christopher

    1998-02-17

    An improved system for measuring the velocity of ultrasonic signals within the plane of moving web-like materials, such as paper, paperboard and the like. In addition to velocity measurements of ultrasonic signals in the plane of the web in the MD and CD, one embodiment of the system in accordance with the present invention is also adapted to provide on-line indication of the polar specific stiffness of the moving web. In another embodiment of the invention, the velocity of ultrasonic signals in the plane of the web is measured by way of a plurality of ultrasonic transducers carried by synchronously driven wheels or cylinders, thus eliminating undue transducer wear due to any speed differences between the transducers and the web. In order to provide a relatively constant contact force between the transducers and the web, the transducers are mounted in sensor housings which include a spring for biasing the transducer radially outwardly. The sensor housings are adapted to be easily and conveniently mounted to the carrier to provide a relatively constant contact force between the transducers and the moving web.

  9. An intelligent tool for activity data collection.

    PubMed

    Sarkar, A M Jehad

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data not only for evaluating their performance but also for training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.

  10. The Global Sensor Web: A Platform for Citizen Science (Invited)

    NASA Astrophysics Data System (ADS)

    Simons, A. L.

    2013-12-01

    The Global Sensor Web (GSW) is an effort to provide an infrastructure for collecting, sharing and visualizing sensor data from around the world. Over the past three years the GSW has been developed and tested as a standardized platform for citizen science. The most developed of the citizen science projects built on the GSW has been the Distributed Electronic Cosmic-ray Observatory (DECO), an Android application designed to harness a global network of mobile devices to detect the origin and behavior of cosmic radiation. Other projects which can readily be built on top of the GSW as a platform are also discussed. (Figure: a cosmic-ray track candidate captured on a cell phone camera.)

  11. Sensor web enables rapid response to volcanic activity

    USGS Publications Warehouse

    Davies, Ashley G.; Chien, Steve; Wright, Robert; Miklius, Asta; Kyle, Philip R.; Welsh, Matt; Johnson, Jeffrey B.; Tran, Daniel; Schaffer, Steven R.; Sherwood, Robert

    2006-01-01

    Rapid response to the onset of volcanic activity allows for the early assessment of hazard and risk [Tilling, 1989]. Data from remote volcanoes and volcanoes in countries with poor communication infrastructure can only be obtained via remote sensing [Harris et al., 2000]. By linking notifications of activity from ground-based and space-based systems, these volcanoes can be monitored when they erupt. Over the last 18 months, NASA's Jet Propulsion Laboratory (JPL) has implemented a Volcano Sensor Web (VSW) in which data from ground-based and space-based sensors that detect current volcanic activity are used to automatically trigger the NASA Earth Observing 1 (EO-1) spacecraft to make high-spatial-resolution observations of these volcanoes.

  12. Automatic Control of Silicon Melt Level

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Stickel, W. B.

    1982-01-01

    A new circuit, when combined with a melt-replenishment system and melt-level sensor, offers continuous closed-loop automatic control of melt level during web growth. Installed on a silicon-web furnace, the circuit controls melt level to within 0.1 mm for as long as 8 hours. The circuit affords a greater area growth rate and higher web quality; automatic melt-level control also allows semiautomatic growth of web over long periods, which can greatly reduce costs.

  13. A Forest Fire Sensor Web Concept with UAVSAR

    NASA Astrophysics Data System (ADS)

    Lou, Y.; Chien, S.; Clark, D.; Doubleday, J.; Muellerschoen, R.; Zheng, Y.

    2008-12-01

    We developed a forest fire sensor web concept with a UAVSAR-based smart sensor and onboard automated response capability that allows us to monitor fire progression based on coarse initial information provided by an external source. This autonomous disturbance detection and monitoring system combines the unique capabilities of imaging radar with high-throughput onboard processing technology and onboard automated response capability based on specific science algorithms. In this forest fire sensor web scenario, a fire is initially located by MODIS/RapidFire or a ground-based fire observer. This information is transmitted to the UAVSAR onboard automated response system (CASPER). CASPER generates a flight plan to cover the alerted fire area and executes the flight plan. The onboard processor generates a fuel load map from raw radar data which, used together with wind and elevation information, predicts the likely fire progression. CASPER then autonomously alters the flight plan to track the fire progression, providing this information to the fire fighting team on the ground. We can also relay the precise fire location to other remote sensing assets with autonomous response capability, such as the hyper-spectral imager on Earth Observing-1 (EO-1), to acquire fire data.

  14. Pervasive sensing

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2000-11-01

    The coordinated exploitation of modern communication, micro-sensor and computer technologies makes it possible to give global reach to our senses. Web-cameras for vision, web-microphones for hearing and web-'noses' for smelling, plus the abilities to sense many factors we cannot ordinarily perceive, are either available or will be soon. Applications include (1) determination of weather and environmental conditions on dense grids or over large areas, (2) monitoring of energy usage in buildings, (3) sensing the condition of hardware in electrical power distribution and information systems, (4) improving process control and other manufacturing, (5) development of intelligent terrestrial, marine, aeronautical and space transportation systems, (6) managing the continuum of routine security monitoring, diverse crises and military actions, and (7) medicine, notably the monitoring of the physiology and living conditions of individuals. Some of the emerging capabilities, such as the ability to measure remotely the conditions inside of people in real time, raise interesting social concerns centered on privacy issues. Methods for sensor data fusion and designs for human-computer interfaces are both crucial for the full realization of the potential of pervasive sensing. Computer-generated virtual reality, augmented with real-time sensor data, should be an effective means for presenting information from distributed sensors.

  15. A Novel Petri Nets-Based Modeling Method for the Interaction between the Sensor and the Geographic Environment in Emerging Sensor Networks

    PubMed Central

    Zhang, Feng; Xu, Yuetong; Chou, Jarong

    2016-01-01

    Sensor device services in Emerging Sensor Networks (ESNs) are an extension of traditional Web services. Through the sensor network, the service of a sensor device can communicate directly with entities in the geographic environment, and may even impact a geographic entity directly. The interaction between sensor devices in ESNs and the geographic environment is very complex, and modeling this interaction is a challenging problem. This paper proposes a novel Petri Net-based modeling method for the interaction between the sensor device and the geographic environment. Sensor device services in ESNs are more easily affected by the geographic environment than traditional Web services. Therefore, response time, fault-tolerance and resource consumption become important factors in the performance of the whole sensor application system. Thus, this paper classifies IoT services as Sensing services and Controlling services according to the interaction between the IoT service and the geographic entity, and classifies GIS services as data services and processing services. The paper then designs and analyzes a service algebra and a Colored Petri Net model to model geo-features, IoT services, GIS services and the interaction process between the sensor and the geographic environment. Finally, the modeling process is illustrated with examples. PMID:27681730
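
    To make the formalism concrete, the sketch below implements a plain place/transition net with multiset markings and fires a toy sensing-service interaction. It is a generic illustration under simplified assumptions, not the paper's colored-net service algebra:

      from collections import Counter

      class PetriNet:
          """Plain place/transition net; a marking is a multiset of tokens."""

          def __init__(self, transitions):
              # transitions: name -> (tokens consumed, tokens produced)
              self.transitions = transitions

          def enabled(self, marking, name):
              need, _ = self.transitions[name]
              return all(marking[p] >= n for p, n in need.items())

          def fire(self, marking, name):
              assert self.enabled(marking, name), name + " is not enabled"
              need, out = self.transitions[name]
              m = Counter(marking)
              m.subtract(need)
              m.update(out)
              return m

      # Toy interaction: a sensing service produces an observation, which a
      # processing step consumes to update the state of a geographic entity.
      net = PetriNet({
          "observe": ({"sensor_ready": 1}, {"observation": 1, "sensor_ready": 1}),
          "update_entity": ({"observation": 1}, {"entity_updated": 1}),
      })
      m = Counter({"sensor_ready": 1})
      m = net.fire(m, "observe")
      m = net.fire(m, "update_entity")
      print(dict(+m))  # unary + drops zero counts: sensor_ready and entity_updated remain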

  16. The application of geography markup language (GML) to the geological sciences

    NASA Astrophysics Data System (ADS)

    Lake, Ron

    2005-11-01

    GML 3.0 became an adopted specification of the Open Geospatial Consortium (OGC) in January 2003, and is rapidly emerging as the world standard for the encoding, transport and storage of all forms of geographic information. This paper looks at the application of GML to one of the more challenging areas of automated geography, namely the geological sciences. Specific features of GML of interest to geologists are discussed and then illustrated through a series of geological case studies. We conclude the paper with a discussion of anticipated geological web services that GML will enable. GML is written in XML and makes use of XML Schema for extensibility. It can be used both to represent or model geographic objects and to transport them across the Internet. In this way it serves as the foundation for all manner of geographic web services. Unlike vertical application grammars such as LandXML, GML was intended to define geographic application languages, and hence is applicable to any geographic domain including forestry, environmental sciences, geology and oceanography. This paper provides a review of the basic features of GML that are fundamental to the geological sciences including geometry, coverages, observations, reference systems and temporality. These constructs are then employed in a series of simple geological case studies including structural geological description, surficial geology, representation of geological time scales, mineral occurrences, geohazards and geochemical reconnaissance.
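
    As a flavour of GML encoding in practice, the sketch below serializes a mineral-occurrence point using Python's standard XML tooling; the feature-type element name is a hypothetical application-schema construct, not a GML-defined element:

      import xml.etree.ElementTree as ET

      GML = "http://www.opengis.net/gml"
      ET.register_namespace("gml", GML)

      # Hypothetical application-schema feature type with a GML 3 point geometry.
      occurrence = ET.Element("MineralOccurrence")
      point = ET.SubElement(occurrence, f"{{{GML}}}Point", srsName="EPSG:4326")
      pos = ET.SubElement(point, f"{{{GML}}}pos")
      pos.text = "49.5 -117.3"  # latitude longitude, per the declared CRS

      print(ET.tostring(occurrence, encoding="unicode"))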

  17. Harvesting data from advanced technologies.

    DOT National Transportation Integrated Search

    2014-11-01

    Data streams are emerging everywhere, such as Web logs, Web page click streams, sensor data streams, and credit card transaction flows. Different from traditional data sets, data streams are sequentially generated and arrive one by one rather than b...

  18. Electronic sensor and actuator webs for large-area complex geometry cardiac mapping and therapy

    PubMed Central

    Kim, Dae-Hyeong; Ghaffari, Roozbeh; Lu, Nanshu; Wang, Shuodao; Lee, Stephen P.; Keum, Hohyun; D’Angelo, Robert; Klinker, Lauren; Su, Yewang; Lu, Chaofeng; Kim, Yun-Soung; Ameen, Abid; Li, Yuhang; Zhang, Yihui; de Graff, Bassel; Hsu, Yung-Yu; Liu, ZhuangJian; Ruskin, Jeremy; Xu, Lizhi; Lu, Chi; Omenetto, Fiorenzo G.; Huang, Yonggang; Mansour, Moussa; Slepian, Marvin J.; Rogers, John A.

    2012-01-01

    Curved surfaces, complex geometries, and time-dynamic deformations of the heart create challenges in establishing intimate, nonconstraining interfaces between cardiac structures and medical devices or surgical tools, particularly over large areas. We constructed large area designs for diagnostic and therapeutic stretchable sensor and actuator webs that conformally wrap the epicardium, establishing robust contact without sutures, mechanical fixtures, tapes, or surgical adhesives. These multifunctional web devices exploit open, mesh layouts and mount on thin, bio-resorbable sheets of silk to facilitate handling in a way that yields, after dissolution, exceptionally low mechanical moduli and thicknesses. In vivo studies in rabbit and pig animal models demonstrate the effectiveness of these device webs for measuring and spatially mapping temperature, electrophysiological signals, strain, and physical contact in sheet and balloon-based systems that also have the potential to deliver energy to perform localized tissue ablation. PMID:23150574

  19. Secure Sensor Semantic Web and Information Fusion

    DTIC Science & Technology

    2014-06-25

    data acquired and transmitted by wireless sensor networks (WSNs). In a WSN, due to a need for robustness of monitoring and low cost of the nodes... [Fragmentary citations from the record: S. Ozdemir and Y. Xiao, "Secure data aggregation in wireless sensor networks: A comprehensive overview"; ..., Elisa Bertino, and Somesh Jha, "Secure data aggregation technique for wireless sensor networks in the presence of collusion attacks" (to appear).]

  20. Dissemination of satellite-based river discharge and flood data

    NASA Astrophysics Data System (ADS)

    Kettner, A. J.; Brakenridge, G. R.; van Praag, E.; de Groeve, T.; Slayback, D. A.; Cohen, S.

    2014-12-01

    In collaboration with the NASA Goddard Space Flight Center and the European Commission Joint Research Centre, the Dartmouth Flood Observatory (DFO) daily measures and distributes: 1) river discharges, and 2) near real-time flood extents with global coverage. Satellite-based passive microwave sensors and hydrological modeling are utilized to establish 'remote-sensing based discharge stations', and the observed time series cover 1998 to the present. The advantages over in-situ gauged discharges are: a) easy access to remote locations or locations isolated for political reasons, b) relatively low maintenance costs for a continuous observational record, and c) the capability to obtain measurements during floods, hazardous conditions that often impair or destroy in-situ stations. Two MODIS instruments aboard the NASA Terra and Aqua satellites provide global flood extent coverage at a spatial resolution of 250 m. Cloud cover hampers flood extent detection; therefore we ingest 6 images (the Terra and Aqua images of each day, for three days), in combination with a cloud shadow filter, to provide daily global flood extent updates. The Flood Observatory has always made it a high priority to visualize and share its data and products through its website. Recent collaborative efforts with e.g. GeoSUR have enhanced the accessibility of DFO data. A web map service has been implemented to automatically disseminate geo-referenced flood extent products into client-side GIS software. For example, for Latin America and the Caribbean region, the GeoSUR portal now displays current flood extent maps, which can be integrated and visualized with other relevant geographical data. Furthermore, the flood state of satellite-observed river discharge sites is displayed through the portal as well. Additional efforts include implementing Open Geospatial Consortium (OGC) standards to incorporate Water Markup Language (WaterML) data exchange mechanisms to further facilitate the distribution of the satellite-gauged river discharge time series.

  1. A Service Oriented Architecture to Enable Sensor Webs

    NASA Technical Reports Server (NTRS)

    Sohlberg, Rob; Frye, Stu; Cappelaere, Pat; Ungar, Steve; Ames, Troy; Chien, Steve

    2006-01-01

    This viewgraph presentation reviews the development of a Service Oriented Architecture to assist in lowering the cost of new Earth Science products. This architecture will enable rapid and cost-effective reconfiguration of new sensors.

  2. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    NASA Astrophysics Data System (ADS)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed in ITT VIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for developing the typical components of a web mapping application's graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as the basis for presenting cartographic information over the Web, and the set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library for GUI development is based on the GeoExt library, which combines the ExtJS framework and OpenLayers software. Based on this framework, an information-computational system for the complex analysis of large georeferenced data archives was developed. Structured environmental datasets now available for processing include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already in use in scientific research; in particular, it was recently used to analyze climate change in Siberia and its regional impact. The framework presented allows rapid development of Web-GIS systems for geophysical data analysis, providing specialists involved in multidisciplinary research projects with reliable and practical instruments for the complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.

  3. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, and population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the method used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, data source description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion The designed HERXML proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
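
    The abstract does not reproduce the schema itself, but a toy HERXML-like instance can illustrate the three representation layers it describes. All element names below are illustrative guesses, not the published schema:

    ```python
    # Illustrative construction of an HERXML-like document with the three
    # representation layers described above (semantic, geometric,
    # cartographical). Element names are guesses for illustration only;
    # the published HERXML schema may differ.
    import xml.etree.ElementTree as ET

    record = ET.Element("HealthRecord")

    semantic = ET.SubElement(record, "Semantic")
    ET.SubElement(semantic, "ActivityDescription").text = "Asthma education program"
    ET.SubElement(semantic, "DataSource").text = "New Brunswick Lung Association"
    ET.SubElement(semantic, "StatisticalMethod").text = "Age-standardized rate"

    geometric = ET.SubElement(record, "Geometric")
    point = ET.SubElement(geometric, "Point", srsName="EPSG:4326")
    point.text = "-66.64 45.96"  # lon lat (Fredericton, NB)

    carto = ET.SubElement(record, "Cartographic")
    ET.SubElement(carto, "Symbolizer", fill="#cc0000", size="8")

    print(ET.tostring(record, encoding="unicode"))
    ```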

  4. Global Test Range: Toward Airborne Sensor Webs

    NASA Technical Reports Server (NTRS)

    Mace, Thomas H.; Freudinger, Larry; DelFrate, John H.

    2008-01-01

    This viewgraph presentation reviews the planned global sensor network that will monitor the Earth's climate and resources using airborne sensor systems. The vision is an intelligent, affordable Earth Observation System. Global Test Range is a lab developing trustworthy services for airborne instruments - a specialized Internet Service Provider. There is discussion of several current and planned missions.

  5. Net-Centric Sensors and Data Sources (N-CSDS) GEODSS Sidecar

    NASA Astrophysics Data System (ADS)

    Richmond, D.

    2012-09-01

    Vast amounts of space situational sensor data are collected each day on closed, legacy systems. Massachusetts Institute of Technology Lincoln Laboratory (MIT/LL) developed a Net-Centric approach to expose this data under the Extended Space Sensors Architecture (ESSA) Advanced Concept Technology Demonstration (ACTD). The Net-Centric Sensors and Data Sources (N-CSDS) Ground-based Electro-Optical Deep Space Surveillance (GEODSS) Sidecar is the next generation, moving the ESSA ACTD engineering tools to an operational baseline. The N-CSDS GEODSS Sidecar high-level architecture will be presented, highlighting the features that support deployment at multiple diverse sensor sites. Other key items that will be covered include: 1) the Web browser interface for performing searches of historical data; 2) the capabilities of the deployed Web services, with example service requests and responses; 3) example data and potential user applications; 4) specifics regarding the process for gaining access to the N-CSDS GEODSS sensor data in near real time; and 5) current status and future deployment plans (including plans for deployment to the Maui GEODSS site).

  6. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid response observations from high resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate and image with high resolution instruments phenomena such as wildfires, volcanoes, floods and ice breakup. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software runs both in flight and on the ground, works in concert to operate all of the required assets cohesively, and includes model-based artificial intelligence components.
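
    The tasking pattern described, a coarse detection triggering a high-resolution follow-up observation, can be sketched abstractly. Everything below is an illustrative simplification with made-up names, not the EO-1 flight or ground software:

    ```python
    # Illustrative sketch of the sensor-web rapid-response pattern described
    # above: a coarse global sensor triggers a high-resolution follow-up
    # observation request. A simplification, not the EO-1 software.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        lat: float
        lon: float
        phenomenon: str   # e.g. "wildfire", "volcano", "flood"
        confidence: float

    def plan_followup(detection: Detection, min_confidence: float = 0.8):
        """Return a high-resolution observation request, or None."""
        if detection.confidence < min_confidence:
            return None
        return {
            "target": (detection.lat, detection.lon),
            "instrument": "hyperspectral-imager",   # placeholder asset name
            "priority": "urgent" if detection.phenomenon == "wildfire" else "normal",
        }

    # A coarse detection from an in-situ or wide-swath sensor...
    det = Detection(lat=19.42, lon=-155.29, phenomenon="volcano", confidence=0.93)
    request = plan_followup(det)
    print(request)  # would be handed to the planning/scheduling software
    ```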

  7. Coupling flood forecasting and social media crowdsourcing

    NASA Astrophysics Data System (ADS)

    Kalas, Milan; Kliment, Tomas; Salamon, Peter

    2016-04-01

    Social and mainstream media monitoring is more and more recognized as a valuable source of information in disaster management and response. Information on ongoing disasters can be detected in a very short time, and social media can bring additional information to traditional data feeds (ground and remote observation schemes). Probably the biggest attempt to use social media in crisis management was the activation of the Digital Humanitarian Network by the United Nations Office for the Coordination of Humanitarian Affairs in response to Typhoon Yolanda. This network of volunteers performed rapid needs and damage assessment by tagging reports posted to social media; the tagged reports were then used by machine learning classifiers as a training set to automatically identify tweets referring to both urgent needs and offers of help. In this work we will present the potential of coupling a social media streaming and news monitoring application (GlobalFloodNews - www.globalfloodsystem.com) with a flood forecasting system (www.globalfloods.eu) and a geo-catalogue of OGC services (WMS, WFS, WCS, etc.) discovered via the Google search engine, to provide a full suite of information to crisis management centers as fast as possible. In GlobalFloodNews we use advanced filtering of the real-time Twitter stream, where the relevant information is automatically extracted using natural language and signal processing techniques. The keyword filters are adjusted and optimized automatically using machine learning algorithms as new reports are added to the system. In order to refine the search results, the forecasting system will trigger an event-based search of the social media and of OGC services relevant for crisis response (population distribution, critical infrastructure, hospitals, etc.). The current version of the system makes use of the USHAHIDI Crowdmap platform, which is designed to easily crowdsource information using multiple channels, including SMS, email, Twitter and the web. We want to show the potential of monitoring floods at the global scale.
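
    A toy version of the keyword-filtering step over a stream of reports. The real system adds natural language processing, signal processing and machine-learned filter updates; this shows only the skeleton of the idea, with invented example messages:

    ```python
    # Toy skeleton of keyword-based filtering of a social media stream for
    # flood-relevant reports. The actual system described above uses NLP
    # and automatically optimized filters; this is only the basic idea.
    FLOOD_KEYWORDS = {"flood", "flooding", "inundation", "river overflow"}

    def is_flood_related(text: str) -> bool:
        lowered = text.lower()
        return any(kw in lowered for kw in FLOOD_KEYWORDS)

    stream = [
        "Severe flooding reported along the riverbank near the old bridge",
        "Great weather for a picnic today!",
        "Road closed: river overflow at km 23",
    ]

    relevant = [msg for msg in stream if is_flood_related(msg)]
    print(relevant)  # candidate reports for event-based crisis search
    ```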

  8. EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Rossi, A. P.

    2016-12-01

    The unprecedented increase of imagery, in-situ measurements, and simulation data produced by Earth (and Planetary) Science observation missions bears a rich, yet unleveraged, potential for gaining insight by integrating such diverse datasets and transforming scientific questions into actual queries to data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer has advanced scalable array database technology into 150+ TB services. Currently, Petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106 [2] Hogan, P. (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. [3] Baumann, P., et al. (2014) In Proc. 10th ICDM, 194-201. [4] Dumitru, A., et al. (2014) In Proc. ACM SIGMOD Workshop on Data Analytics in the Cloud (DanaC'2014), 1-4.
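
    A flavour of what "queries to data, formulated in a standardized way" looks like in practice: a WCPS (Web Coverage Processing Service) query sent to a rasdaman-style endpoint. The server URL, coverage name, and axis labels below are placeholders; the query syntax itself is standard WCPS:

    ```python
    # Sending an OGC WCPS query to a rasdaman-style endpoint to slice a
    # spatio-temporal datacube. Endpoint URL, coverage name and axis names
    # are placeholders; the query language is standard WCPS.
    import requests

    ENDPOINT = "https://example.org/rasdaman/ows"  # placeholder endpoint

    # One month, one bounding box, encoded as PNG.
    wcps_query = """
    for c in (AvgLandTemperature)
    return encode(
        c[Lat(30:45), Long(-10:20), ansi("2015-07")],
        "image/png")
    """

    resp = requests.get(ENDPOINT, params={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps_query,
    })
    resp.raise_for_status()
    with open("slice.png", "wb") as f:
        f.write(resp.content)
    ```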

  9. Climate Data Service in the FP7 EarthServer Project

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Grazia Veratelli, Maria

    2013-04-01

    EarthServer is a European Framework Program project that aims at developing and demonstrating the usability of open standards (OGC and W3C) in the management of multi-source, any-size, multi-dimensional spatio-temporal data - in short: "Big Earth Data Analytics". In order to demonstrate the feasibility of the approach, six thematic Lighthouse Applications (Cryospheric Science, Airborne Science, Atmospheric/Climate Science, Geology, Oceanography, and Planetary Science), each with 100+ TB, are implemented. The scope of the Atmospheric/Climate lighthouse application (Climate Data Service) is to implement a system containing global to regional 2D / 3D / 4D datasets retrieved from satellite observations, numerical modelling, and in-situ observations. The Climate Data Service contains atmospheric profiles of temperature / humidity, aerosol content, AOT, and cloud properties provided by entities such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the Austrian meteorological service (Zentralanstalt für Meteorologie und Geodynamik - ZAMG), the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), and Sweden's Meteorological and Hydrological Institute (Sveriges Meteorologiska och Hydrologiska Institut - SMHI). The system, through an easy-to-use web application, permits users to browse the loaded data, visualize their temporal evolution at a specific point through 2D graphs of a single field, compare different fields at the same point (e.g. temperatures from different models and satellite observations), and visualize maps of specific fields superimposed on high-resolution background maps. All data access and display operations are performed by means of OGC standard operations, namely WMS, WCS and WCPS. The EarthServer project has just started the second year of its three-year development plan: at present the system contains subsets of the final database, with the scope of demonstrating the I/O modules and visualization tools. At the end of the project all datasets will be available to the users.

  10. Brokering technologies to realize the hydrology scenario in NSF BCube

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico; Easton, Zachary; Fuka, Daniel; Pearlman, Jay; Nativi, Stefano

    2015-04-01

    In the National Science Foundation (NSF) BCube project, an international team composed of cyberinfrastructure experts, geoscientists, social scientists and educators is working together to explore the use of brokering technologies, initially focusing on four domains: hydrology, oceans, polar, and weather. In the hydrology domain, environmental models are fundamental to understanding the behaviour of hydrological systems. A specific model usually requires datasets coming from different disciplines for its initialization (e.g. elevation models from Earth observation, weather data from atmospheric sciences, etc.). Scientific datasets are usually available on heterogeneous publishing services, such as inventory and access services (e.g. OGC Web Coverage Service, THREDDS Data Server, etc.). Indeed, datasets are published according to different protocols; moreover, they usually come in different formats, resolutions, and Coordinate Reference Systems (CRSs): in short, different grid environments, depending on the original data and the publishing service's processing capabilities. Scientists can thus be impeded by the burden of discovering, accessing, and normalizing the desired datasets to the grid environment required by the model. These technological tasks of course divert scientists from their main scientific goals. The GI-axe brokering framework has been tried out in a hydrology scenario where scientists needed to compare a particular hydrological model with two different input datasets (digital elevation models): the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) dataset, v.2, and the Shuttle Radar Topography Mission (SRTM) dataset, v.3. These datasets were published by means of Hyrax server technology, which can provide NetCDF files at their original resolution and CRS. Scientists had their model running on ArcGIS, so the main goal was to import the datasets using the available ArcPy library, with EPSG:4326 as the reference system and a common resolution grid, so that model outputs could be compared. ArcPy, however, is able to access only GeoTIFF datasets that are published by an OGC Web Coverage Service (WCS). The GI-axe broker was therefore deployed between the client application and the data providers. It was configured to broker the two different Hyrax service endpoints and republish the data content through a WCS interface for use by the ArcPy library. Finally, scientists were able to easily run the model and to concentrate on comparing the different results obtained with each input dataset. The use of a third-party broker to perform such technological tasks has also been shown to have the potential advantage of increasing the repeatability of a study among different researchers.
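
    The client-side half of this chain, fetching a brokered DEM tile as GeoTIFF over WCS, can be sketched in plain Python with OWSLib. The endpoint URL, coverage identifier, and grid resolution below are placeholders, not the actual GI-axe deployment:

    ```python
    # Fetching a brokered DEM tile as GeoTIFF over OGC WCS, mirroring the
    # access pattern described above. Endpoint and coverage id are
    # placeholders for the broker's republished WCS interface.
    from owslib.wcs import WebCoverageService

    wcs = WebCoverageService("https://example.org/broker/wcs",  # placeholder
                             version="1.0.0")

    response = wcs.getCoverage(
        identifier="SRTM_v3",                 # placeholder coverage name
        bbox=(-78.0, 37.0, -77.0, 38.0),      # lon/lat box in EPSG:4326
        crs="EPSG:4326",
        format="GeoTIFF",
        resx=0.000833, resy=0.000833,         # ~3 arc-second grid (assumed)
    )

    with open("dem.tif", "wb") as f:
        f.write(response.read())
    ```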

  11. Prototyping a Web-of-Energy Architecture for Smart Integration of Sensor Networks in Smart Grids Domain.

    PubMed

    Caballero, Víctor; Vernet, David; Zaballos, Agustín; Corral, Guiomar

    2018-01-30

    Sensor networks and the Internet of Things have driven the evolution of traditional electric power distribution networks towards a new paradigm referred to as the Smart Grid. However, the different elements that compose the Information and Communication Technologies (ICTs) layer of a Smart Grid are usually conceived as isolated systems, which typically results in rigid hardware architectures that are hard to interoperate, manage, and adapt to new situations. If the Smart Grid paradigm is to be presented as a solution to the demand for distributed and intelligent energy management systems, it is necessary to deploy innovative IT infrastructures to support these smart functions. One of the main issues of Smart Grids is the heterogeneity of the communication protocols used by the smart sensor devices that compose them. The use of the concept of the Web of Things is proposed in this work to tackle this problem. More specifically, the implementation of a Smart Grid's Web of Things, coined the Web of Energy, is introduced. The purpose of this paper is to propose the use of the Web of Energy, by means of the Actor Model paradigm, to address the latent deployment and management limitations of Smart Grids. Smart Grid designers can use the Actor Model as a design model for an infrastructure that supports the intelligent functions demanded and is capable of grouping and converting the heterogeneity of traditional infrastructures into the homogeneity feature of the Web of Things. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  12. Prototyping a Web-of-Energy Architecture for Smart Integration of Sensor Networks in Smart Grids Domain

    PubMed Central

    Vernet, David; Corral, Guiomar

    2018-01-01

    Sensor networks and the Internet of Things have driven the evolution of traditional electric power distribution networks towards a new paradigm referred to as the Smart Grid. However, the different elements that compose the Information and Communication Technologies (ICTs) layer of a Smart Grid are usually conceived as isolated systems, which typically results in rigid hardware architectures that are hard to interoperate, manage, and adapt to new situations. If the Smart Grid paradigm is to be presented as a solution to the demand for distributed and intelligent energy management systems, it is necessary to deploy innovative IT infrastructures to support these smart functions. One of the main issues of Smart Grids is the heterogeneity of the communication protocols used by the smart sensor devices that compose them. The use of the concept of the Web of Things is proposed in this work to tackle this problem. More specifically, the implementation of a Smart Grid's Web of Things, coined the Web of Energy, is introduced. The purpose of this paper is to propose the use of the Web of Energy, by means of the Actor Model paradigm, to address the latent deployment and management limitations of Smart Grids. Smart Grid designers can use the Actor Model as a design model for an infrastructure that supports the intelligent functions demanded and is capable of grouping and converting the heterogeneity of traditional infrastructures into the homogeneity feature of the Web of Things. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction. PMID:29385748
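
    A minimal sketch of the Actor Model idea applied here: each heterogeneous device gets an actor that translates its native protocol into a homogeneous message consumed from a mailbox loop. This is illustrative only, with invented payloads, and not the authors' implementation:

    ```python
    # Minimal actor-style sketch: per-device actors normalize heterogeneous
    # sensor protocols into homogeneous Web-of-Energy-style messages.
    # Illustrative only; not the paper's actual implementation.
    import asyncio

    async def device_actor(name, inbox, outbox, translate):
        """Consume native-protocol readings, emit normalized messages."""
        while True:
            raw = await inbox.get()
            await outbox.put({"device": name, **translate(raw)})

    async def main():
        outbox = asyncio.Queue()
        modbus_in, mqtt_in = asyncio.Queue(), asyncio.Queue()

        # Each translator hides one native protocol behind a common schema.
        actors = [
            asyncio.create_task(device_actor(
                "meter-1", modbus_in, outbox, lambda r: {"kW": r[0] / 10})),
            asyncio.create_task(device_actor(
                "sensor-2", mqtt_in, outbox, lambda r: {"kW": float(r["power"])})),
        ]

        await modbus_in.put((4200,))           # fake Modbus register read
        await mqtt_in.put({"power": "3.7"})    # fake MQTT payload

        for _ in range(2):
            print(await outbox.get())          # homogeneous messages

    asyncio.run(main())
    ```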

  13. Objectively Optimized Observation Direction System Providing Situational Awareness for a Sensor Web

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Lary, D. J.

    2010-12-01

    There is great utility in having a flexible and automated objective observation direction system for the decadal survey missions and beyond. Such a system allows us to optimize the observations made by a suite of sensors to address specific goals, from long-term monitoring to rapid response. We have developed such a prototype using a network of communicating software elements to control a heterogeneous network of sensor systems, which can have multiple modes and flexible viewing geometries. Our system makes sensor systems intelligent and situationally aware. Together they form a sensor web of multiple sensors working together, capable of automated target selection, i.e., the sensors "know" where they are, what they are able to observe, and which targets they should observe with what priority. The system is implemented in three components. The first component is a Sensor Web simulator, which describes the capabilities and locations of each sensor as a function of time, whether they are orbital, sub-orbital, or ground based. The simulator has been implemented using AGI's Satellite Tool Kit (STK). STK makes it easy to analyze and visualize optimal solutions for complex space scenarios, to perform complex analysis of land, sea, air and space assets, and to share results in one integrated solution. The second component is a target scheduler, implemented with STK Scheduler. STK Scheduler is powered by a scheduling engine that finds better solutions in a shorter amount of time than traditional heuristic algorithms. The global search algorithm within this engine is based on neural network technology capable of finding solutions to larger and more complex problems and maximizing the value of limited resources. The third component is a modeling and data assimilation system. It provides situational awareness by supplying the time evolution of the uncertainty and information content metrics that are used to tell us what we need to observe and the priority we should give to the observations. A prototype of this component was implemented with AutoChem. AutoChem is NASA release software constituting an automatic code generation, symbolic differentiation, analysis, documentation, and web site creation tool for atmospheric chemical modeling and data assimilation. Its model is explicit and uses an adaptive time-step, error-monitoring time integration scheme for stiff systems of equations. AutoChem was the first model ever to have the facility to perform 4D-Var data assimilation and Kalman filtering. The project developed a control system with three main accomplishments. First, fully multivariate observational and theoretical information with associated uncertainties was combined using a full Kalman filter data assimilation system. Second, an optimal distribution of the computations and of data queries was achieved by utilizing high performance computers, load balancing, and a set of automatically mirrored databases. Third, inter-instrument bias correction was performed using machine learning. The PI for this project was Dr. David Lary of the UMBC Joint Center for Earth Systems Technology at NASA/Goddard Space Flight Center.
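
    The role of the assimilation component, supplying the time evolution of uncertainty that drives observation priority, can be illustrated with a scalar Kalman filter update. This is a textbook sketch, not AutoChem's full 4D-Var/Kalman machinery:

    ```python
    # Scalar Kalman filter update, illustrating how assimilation supplies
    # the evolving uncertainty used to prioritize observations. Textbook
    # sketch only; AutoChem's machinery is far more elaborate.
    def kalman_update(x, P, z, R):
        """One measurement update: state x, variance P, observation z, noise R."""
        K = P / (P + R)           # Kalman gain
        x_new = x + K * (z - x)   # corrected state
        P_new = (1 - K) * P       # reduced uncertainty
        return x_new, P_new

    x, P = 50.0, 25.0             # prior estimate and variance
    for z in (52.0, 49.5, 51.0):  # successive sensor observations
        x, P = kalman_update(x, P, z, R=4.0)
        print(f"state={x:.2f}  variance={P:.2f}")

    # Targets where the variance P stays large are exactly the ones an
    # observation direction system would prioritize for new measurements.
    ```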

  14. High-sensitivity acoustic sensors from nanofibre webs.

    PubMed

    Lang, Chenhong; Fang, Jian; Shao, Hao; Ding, Xin; Lin, Tong

    2016-03-23

    Considerable interest has been devoted to converting mechanical energy into electricity using polymer nanofibres. In particular, piezoelectric nanofibres produced by electrospinning have shown remarkable mechanical energy-to-electricity conversion ability. However, there is little data on the acoustic-to-electric conversion of electrospun nanofibres. Here we show that electrospun piezoelectric nanofibre webs have a strong acoustic-to-electric conversion ability. Using poly(vinylidene fluoride) as a model polymer and a sensor device that transfers sound directly to the nanofibre layer, we show that the sensor devices can detect low-frequency sound with a sensitivity as high as 266 mV Pa⁻¹. They can precisely distinguish sound waves in the low to middle frequency region. These features make them especially suitable for noise detection. Our nanofibre device has more than five times the sensitivity of a commercial piezoelectric poly(vinylidene fluoride) film device. Electrospun piezoelectric nanofibres may be useful for developing high-performance acoustic sensors.

  15. High-sensitivity acoustic sensors from nanofibre webs

    PubMed Central

    Lang, Chenhong; Fang, Jian; Shao, Hao; Ding, Xin; Lin, Tong

    2016-01-01

    Considerable interest has been devoted to converting mechanical energy into electricity using polymer nanofibres. In particular, piezoelectric nanofibres produced by electrospinning have shown remarkable mechanical energy-to-electricity conversion ability. However, there is little data on the acoustic-to-electric conversion of electrospun nanofibres. Here we show that electrospun piezoelectric nanofibre webs have a strong acoustic-to-electric conversion ability. Using poly(vinylidene fluoride) as a model polymer and a sensor device that transfers sound directly to the nanofibre layer, we show that the sensor devices can detect low-frequency sound with a sensitivity as high as 266 mV Pa−1. They can precisely distinguish sound waves in the low to middle frequency region. These features make them especially suitable for noise detection. Our nanofibre device has more than five times the sensitivity of a commercial piezoelectric poly(vinylidene fluoride) film device. Electrospun piezoelectric nanofibres may be useful for developing high-performance acoustic sensors. PMID:27005010

  16. LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns

    NASA Astrophysics Data System (ADS)

    Netzel, P.; Stepinski, T.

    2012-12-01

    The amount of satellite-based spatial data is continuously increasing, making the development of efficient data search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified, rather than raw, imagery. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) of land cover patterns contained within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. at a resolution of 30 meters/pixel and depicts 16 land cover classes. The size of NLCD2006 is about 16 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source software. Its main components are GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs searches using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. A scene's pattern is encapsulated by a 2D histogram of the classes and sizes of single-class clumps. Pattern similarity is based on the notion of mutual information. The resulting similarity map can be viewed and navigated in a web browser, or downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
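
    The pattern signature described, a 2D histogram over land-cover classes and clump-size bins, lends itself to a mutual-information computation in a few lines of NumPy. This is a generic formulation of the measure for illustration, not the LandEx source code:

    ```python
    # Mutual information computed from a 2-D histogram (land-cover class
    # vs. clump-size bin), the kind of pattern signature described above.
    # Generic NumPy formulation for illustration; not the LandEx source.
    import numpy as np

    def mutual_information(hist2d):
        """MI (in bits) between the two axes of a joint 2-D histogram."""
        p_xy = hist2d / hist2d.sum()            # joint distribution
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over size bins
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over classes
        nz = p_xy > 0                           # avoid log(0)
        return float((p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])).sum())

    # Toy signature: rows = 4 land-cover classes, cols = 3 clump-size bins.
    hist = np.array([[120,  30,  5],
                     [ 10,  80, 40],
                     [  0,  15, 60],
                     [ 25,   5,  0]], dtype=float)

    print(f"MI = {mutual_information(hist):.3f} bits")
    ```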

  17. Data Publishing and Sharing Via the THREDDS Data Repository

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.

    2007-12-01

    The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component to these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fasion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregrated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client initiated, on demand, location transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP. Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.

  18. Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of the WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.

  19. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and the results therefore sometimes fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, and a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamentally "open" principles and aligned with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new: the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted in our everyday use of technology.

  20. Catch the A-Train from the NASA GIBS/Worldview Platform

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Alarcon, C.; Baynes, K.; Boller, R. A.; Cechini, M. F.; De Cesare, C.; De Luca, A. P.; Gunnoe, T.; King, B. A.; King, J.; Pressley, N. N.; Roberts, J. T.; Rodriguez, J.; Thompson, C. K.; Wong, M. M.

    2016-12-01

    The satellites and instruments of the Afternoon Constellation ("A-Train") are providing an unprecedented combination of nearly simultaneous measurements. One of the challenges for researchers and applications users is to sift through these combinations to find particular sets of data that correspond to their interests. Visualization of the data is one way to explore these combinations. NASA's Worldview tool is designed to do just that - to interactively browse full-resolution satellite imagery. Worldview (https://worldview.earthdata.nasa.gov/) is web-based and developed using open libraries and standards (OpenLayers, JavaScript, CSS, HTML) for cross-platform compatibility. It addresses growing user demands for access to full-resolution imagery by providing a responsive, interactive interface with global coverage and no artificial boundaries. In addition to science data imagery, Worldview provides ancillary datasets such as coastlines and borders, socio-economic layers, and satellite orbit tracks. Worldview interacts with the Earthdata Search Client to provide download of the data files associated with the imagery being viewed. The imagery used by Worldview is provided by NASA's Global Imagery Browse Services (GIBS - https://earthdata.nasa.gov/gibs), which provide highly responsive, highly scalable imagery services. Requests are made via the OGC Web Map Tile Service (WMTS) standard. In addition to Worldview, other clients can be developed using a variety of web-based libraries, desktop and mobile app libraries, and GDAL script-based access. GIBS currently includes more than 106 science data sets from seven instruments aboard three of the A-Train satellites, and new data sets are being added as part of the President's Big Earth Data Initiative (BEDI). Efforts are underway to include new imagery types, such as vectors and curtains, in Worldview/GIBS, which will be used to visualize additional A-Train science parameters.
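
    A single GIBS tile can be fetched with a plain WMTS REST request. The URL template below follows the pattern GIBS documents, but treat the layer name, date, tile matrix set and tile indices as illustrative values:

    ```python
    # Fetching one tile from NASA GIBS via its WMTS REST interface. The URL
    # template follows GIBS' documented pattern; treat the layer name, date,
    # tile matrix set and tile indices here as illustrative values.
    import requests

    layer = "MODIS_Terra_CorrectedReflectance_TrueColor"
    date = "2016-08-01"
    z, row, col = 2, 1, 2   # tile matrix level / row / column

    url = (
        "https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
        f"{layer}/default/{date}/250m/{z}/{row}/{col}.jpg"
    )

    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    with open("gibs_tile.jpg", "wb") as f:
        f.write(resp.content)
    ```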

  1. A New Data Management System for Biological and Chemical Oceanography

    NASA Astrophysics Data System (ADS)

    Groman, R. C.; Chandler, C.; Allison, D.; Glover, D. M.; Wiebe, P. H.

    2007-12-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created to serve PIs, principally funded by NSF, who conduct marine chemical and ecological research. The new office is dedicated to providing open access to data and information developed in the course of scientific research on short and intermediate time frames. The data management system developed in support of the U.S. JGOFS and U.S. GLOBEC programs is being modified to support the larger scope of the BCO-DMO effort, which ultimately includes providing a way to exchange data with other data systems. The open access system is based on a philosophy of data stewardship, support for existing and evolving data standards, and use of public domain software. The DMO staff work closely with originating PIs to manage data gathered as part of their individual programs. In the new BCO-DMO data system, project and data set metadata records designed to support re-use of the data are stored in a relational database (MySQL), and the data are stored in, or made accessible by, the JGOFS/GLOBEC object-oriented, relational data management system. Data access will be provided via any standard Web browser client user interface through a GIS application (the Open Source, OGC-compliant MapServer), a directory listing from the data holdings catalog, or a custom search engine that facilitates data discovery. In an effort to maximize data system interoperability, data will also be available via Web Services, and data set descriptions will be generated to comply with a variety of metadata content standards. The office is located at the Woods Hole Oceanographic Institution and web access is via http://www.bco-dmo.org.

  2. Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2016-12-01

    We are extending climate analytics-as-a-service, including: (1) a high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets, using advanced technologies like Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files; (2) a Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections, accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; and (3) an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements to the Reanalysis Ensemble Service, including the following: - a CDSlib Python library that supports full temporal, spatial, and grid-based resolution services - a new reanalysis collections reference model to enable operator design and implementation - an enhanced library of sample queries to demonstrate and develop use case scenarios - extended operators that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, and trend, climatology, and anomaly computations - full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses - a prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management - prototyped uncertainty quantification services that combine ensemble products with comparative observational products - convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extraction of trends, climatologies, and anomalies - the ability to compute and visualize multiple-reanalysis intercomparisons

  3. Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.

  4. Decentralized coordinated control of elastic web winding systems without tension sensor.

    PubMed

    Hou, Hailiang; Nian, Xiaohong; Chen, Jie; Xiao, Dengfeng

    2018-06-26

    In elastic web winding systems, precise regulation of web tension in each span is critical to ensure final product quality and to achieve low cost by reducing the occurrence of web breaks or folds. Generally, web winding systems use load cells or swing rolls as tension sensors, which add cost, reduce system reliability and increase the difficulty of control. In this paper, a decentralized coordinated control scheme with tension observers is designed for a three-motor web-winding system. First, two tension observers are proposed to estimate the unwinding and winding tension. The designed observers consider the essential dynamics, radius, and inertia variation effects and require only modest computational effort. Then, using the estimated tensions as feedback signals, a robust decentralized coordinated controller is adopted to reduce the interaction between subsystems. Asymptotic stability of the observer error dynamics and of the closed-loop winding system is demonstrated via Lyapunov stability theory. The observer gains and the controller gains can be obtained by solving matrix inequalities. Finally, simulations and experiments are performed on a paper winding setup to test the performance of the designed observers and the observer-based DCC method, respectively. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
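
    The observer idea, reconstructing an unmeasured quantity from measured outputs through a model plus output-error correction, can be shown with a generic discrete-time Luenberger observer. This is a textbook sketch with an invented toy plant; the paper's tension observers and LMI-derived gains are specific to the winding dynamics:

    ```python
    # Generic discrete-time Luenberger observer: estimate an unmeasured
    # state from measured outputs via model prediction plus output-error
    # correction. Textbook sketch; not the paper's tension observers.
    import numpy as np

    # Toy 2-state plant: x = [measured speed, unmeasured tension-like state].
    A = np.array([[0.95, 0.10],
                  [0.00, 0.90]])
    C = np.array([[1.0, 0.0]])      # only the first state is measured
    L = np.array([[0.5], [0.3]])    # observer gain (chosen for illustration)

    x  = np.array([[1.0], [2.0]])   # true state
    xh = np.zeros((2, 1))           # observer estimate

    for _ in range(20):
        y = C @ x                          # sensor measurement
        xh = A @ xh + L @ (y - C @ xh)     # predict + correct
        x = A @ x                          # plant evolves

    print("true:", x.ravel(), " estimate:", xh.ravel())
    ```

    The gain L is chosen here so that the eigenvalues of A - LC lie inside the unit circle, which makes the estimation error decay; the paper obtains such gains systematically by solving matrix inequalities.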

  5. An in situ mediator-free route to fabricate Cu2O/g-C3N4 type-II heterojunctions for enhanced visible-light photocatalytic H2 generation

    NASA Astrophysics Data System (ADS)

    Ji, Cong; Yin, Su-Na; Sun, Shasha; Yang, Shengyang

    2018-03-01

    Cu2O nanoparticle-doped g-C3N4 is synthesized via an in situ method and investigated in detail by IR techniques, X-ray diffraction, X-ray photoelectron spectroscopy, transmission electron microscopy, ultraviolet-visible diffuse reflection spectroscopy, and photoluminescence spectroscopy. The as-prepared Cu2O/g-C3N4 hybrids demonstrate enhanced photocatalytic activity toward hydrogen generation compared to pure bulk g-C3N4; a study of the effect of Cu2O content on the rate of visible-light photocatalytic hydrogen evolution reveals that the optimal hydrogen evolution rate can reach 33.2 μmol h-1 g-1, about 4 times that of pure g-C3N4. The enhanced photocatalytic activity can be attributed to the improved separation and transfer of photogenerated electron-hole pairs at the intimate interface between g-C3N4 and Cu2O. A possible photocatalytic mechanism of the Cu2O/g-C3N4 composite is also discussed. The mediator-free in situ chemical doping strategy developed in this work will contribute to the development of other multicomponent photocatalysts.

  6. An Intelligent Tool for Activity Data Collection

    PubMed Central

    Jehad Sarkar, A. M.

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data, not only for evaluating their performance but also for training the systems to function better. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying testbeds. It can be used as an inexpensive alternative technique for collecting human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets. PMID:22163832

  7. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1980-01-01

    A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor, comprising a laser/sensor system, was operated, performed well, and met the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of sheet cost to variations in capital equipment cost and dendrite recycling was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, during, and at the end of a run from a replenished melt show comparable efficiencies.

  8. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web application for the display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and Perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS - the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine-grained" data access is performed by back-end services that may utilize JDBC for database access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or Perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.

  9. Vision-Based Sensor for Early Detection of Periodical Defects in Web Materials

    PubMed Central

    Bulnes, Francisco G.; Usamentiaga, Rubén; García, Daniel F.; Molleda, Julio

    2012-01-01

    During the production of web materials such as plastic, textiles or metal, where rolls are involved in the production process, periodically generated defects may occur. If one of these rolls has some kind of flaw, it can generate a defect on the material surface each time it completes a full turn. This can cause the generation of a large number of surface defects, greatly degrading product quality. For this reason, it is necessary to have a system that can detect these situations as soon as possible. This paper presents a vision-based sensor for the early detection of this kind of defect. It can be adapted for use in the inspection of any web material, even when the input data are very noisy. To assess its performance, the sensor system was used to detect periodical defects in hot steel strips. A total of 36 strips produced in the ArcelorMittal Avilés factory were used for this purpose: 18 to determine the optimal configuration of the proposed sensor using a full-factorial experimental design, and the other 18 to verify the validity of the results. The results were then compared with those provided by a commercial system used worldwide, showing a clear improvement. PMID:23112629

  10. Spider-web inspired multi-resolution graphene tactile sensor.

    PubMed

    Liu, Lu; Huang, Yu; Li, Fengyu; Ma, Ying; Li, Wenbo; Su, Meng; Qian, Xin; Ren, Wanjie; Tang, Kanglai; Song, Yanlin

    2018-05-08

    Multi-dimensional accurate response and smooth signal transmission are critical challenges in the advancement of multi-resolution recognition and complex environment analysis. Inspired by the structure-activity relationship between the discrepant microstructures of the spiral and radial threads in a spider web, we designed and printed graphene with porous and densely-packed microstructures and integrated them into a multi-resolution graphene tactile sensor. The three-dimensional (3D) porous graphene structure provides multi-dimensional deformation responses. The laminar densely-packed graphene structure contributes excellent conductivity with flexible stability. The spider-web inspired printed pattern enables orientational and locational kinesis tracking. The multi-structure construction with homo-graphene material can integrate discrepant electronic properties with remarkable flexibility, which will attract enormous attention for electronic skin, wearable devices and human-machine interactions.

  11. Citizen Sensors for SHM: Towards a Crowdsourcing Platform

    PubMed Central

    Ozer, Ekin; Feng, Maria Q.; Feng, Dongming

    2015-01-01

    This paper presents an innovative structural health monitoring (SHM) platform in terms of how it integrates smartphone sensors, the web, and crowdsourcing. The ubiquity of smartphones has provided an opportunity to create low-cost sensor networks for SHM. Crowdsourcing has given rise to citizen initiatives becoming a vast source of inexpensive, valuable but heterogeneous data. Previously, the authors have investigated the reliability of smartphone accelerometers for vibration-based SHM. This paper takes a step further to integrate mobile sensing and web-based computing into a prospective crowdsourcing-based SHM platform. An iOS application was developed to enable citizens to measure structural vibration with their smartphones and upload the data to a server. A web-based platform was developed to collect and process the data automatically and to store the processed data, such as the modal properties of the structure, for long-term SHM purposes. Finally, the integrated mobile and web-based platforms were tested by collecting low-amplitude ambient vibration data from a bridge structure. Possible sources of citizen-related uncertainty were investigated, including phone location, coupling conditions, and sampling duration. The field test results showed that vibration data acquired by smartphones operated by citizens without expertise are useful for identifying structural modal properties with high accuracy. This platform can be further developed into an automated, smart, sustainable, cost-free system for long-term monitoring of the structural integrity of spatially distributed urban infrastructure. Citizen Sensors for SHM will be a novel participatory sensing platform in the way it offers hybrid solutions to transitional crowdsourcing parameters. PMID:26102490
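
    As a generic illustration of the server-side processing such a platform automates (not the authors' actual pipeline), the sketch below estimates a structure's dominant modal frequency from an uploaded acceleration record via a spectral peak; the sampling rate and synthetic signal are assumptions.

    ```python
    # Generic example: pick the dominant modal frequency from an acceleration
    # record as the largest spectral peak, ignoring the DC component.
    import numpy as np

    def dominant_frequency(acceleration, fs_hz):
        """Return the frequency (Hz) of the largest spectral peak, skipping DC."""
        windowed = acceleration * np.hanning(len(acceleration))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(acceleration), d=1.0 / fs_hz)
        peak = np.argmax(spectrum[1:]) + 1  # skip the 0 Hz bin
        return freqs[peak]

    # Synthetic ambient vibration: a 2.1 Hz mode in noise, sampled at 100 Hz.
    fs = 100.0
    t = np.arange(0, 60, 1.0 / fs)
    record = 0.1 * np.sin(2 * np.pi * 2.1 * t) + 0.05 * np.random.randn(t.size)
    print(round(dominant_frequency(record, fs), 2))  # ~2.1
    ```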

  12. On-Board Mining in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Tanner, S.; Conover, H.; Graves, S.; Ramachandran, R.; Rushing, J.

    2004-12-01

    On-board data mining can contribute to many research and engineering applications, including natural hazard detection and prediction, intelligent sensor control, and the generation of customized data products for direct distribution to users. The ability to mine sensor data in real time can also be a critical component of autonomous operations, supporting deep space missions, unmanned aerial and ground-based vehicles (UAVs, UGVs), and a wide range of sensor meshes, webs and grids. On-board processing is expected to play a significant role in the next generation of NASA, Homeland Security, Department of Defense and civilian programs, providing for greater flexibility and versatility in measurements of physical systems. In addition, the use of UAV and UGV systems is increasing in military, emergency response and industrial applications. As research into the autonomy of these vehicles progresses, especially in fleet or web configurations, the applicability of on-board data mining is expected to increase significantly. Data mining in real time on board sensor platforms presents unique challenges. Most notably, the data to be mined is a continuous stream, rather than a fixed store such as a database. This means that the data mining algorithms must be modified to make only a single pass through the data. In addition, the on-board environment requires real time processing with limited computing resources, thus the algorithms must use fixed and relatively small amounts of processing time and memory. The University of Alabama in Huntsville is developing an innovative processing framework for the on-board data and information environment. The Environment for On-Board Processing (EVE) and the Adaptive On-board Data Processing (AODP) projects serve as proofs-of-concept of advanced information systems for remote sensing platforms. The EVE real-time processing infrastructure will upload, schedule and control the execution of processing plans on board remote sensors. These plans provide capabilities for autonomous data mining, classification and feature extraction using both streaming and buffered data sources. A ground-based testbed provides a heterogeneous, embedded hardware and software environment representing both space-based and ground-based sensor platforms, including wireless sensor mesh architectures. The AODP project explores the EVE concepts in the world of sensor-networks, including ad-hoc networks of small sensor platforms.
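
    The single-pass, fixed-memory constraint described above can be illustrated with a classic streaming statistic. The sketch below uses Welford's online algorithm to maintain the mean and variance of a sensor stream without buffering past samples; it is a generic example, not code from the EVE or AODP projects.

    ```python
    # Welford's online algorithm: a single pass over an unbounded sensor
    # stream using constant memory, as on-board mining requires.
    class StreamingStats:
        """Single-pass mean/variance over a continuous sensor stream."""
        def __init__(self):
            self.n = 0
            self.mean = 0.0
            self.m2 = 0.0  # running sum of squared deviations from the mean

        def update(self, x: float) -> None:
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

        @property
        def variance(self) -> float:
            return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    stats = StreamingStats()
    for reading in [21.0, 21.4, 20.9, 35.0]:  # last value could flag an anomaly
        stats.update(reading)
    print(stats.mean, stats.variance)
    ```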

  13. A FPGA embedded web server for remote monitoring and control of smart sensors networks.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2013-12-27

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor runs the μClinux operating system to support a Boa web server that serves dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a fully functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as the Internet interface, and another to control the network. This protocol is widely used to connect smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although it can easily be replaced by any other protocol because of the inherent characteristics of FPGA-based technology.
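
    As a rough illustration of the CGI mechanism described (not the authors' implementation, which runs on μClinux), the sketch below shows the shape of a CGI script a Boa-style web server could invoke to render a dynamic page of sensor readings; the read_sensor() helper and the /proc path standing in for the TTP/A network are hypothetical.

    ```python
    #!/usr/bin/env python3
    # Sketch of a CGI script producing a dynamic page of sensor readings.
    def read_sensor(node_id: int) -> float:
        # Hypothetical: a real system would query a TTP/A slave node here.
        with open(f"/proc/ttpa/node{node_id}", "r") as f:
            return float(f.read())

    # CGI requires the HTTP headers, then a blank line, then the body.
    print("Content-Type: text/html\r\n\r\n", end="")
    print("<html><body><h1>Smart Sensor Network</h1><ul>")
    for node in range(1, 4):
        try:
            value = read_sensor(node)
            print(f"<li>Node {node}: {value:.2f}</li>")
        except (OSError, ValueError):
            print(f"<li>Node {node}: unavailable</li>")
    print("</ul></body></html>")
    ```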

  14. A FPGA Embedded Web Server for Remote Monitoring and Control of Smart Sensors Networks

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Pérez, Fernando; Hernández, David; García, Enrique

    2014-01-01

    This article describes the implementation of a web server using an embedded Altera NIOS II IP core, a general-purpose and configurable RISC processor which is embedded in a Cyclone FPGA. The processor runs the μClinux operating system to support a Boa web server that serves dynamic pages using the Common Gateway Interface (CGI). The FPGA is configured to act as the master node of a network, and also to control and monitor a network of smart sensors or instruments. In order to develop a fully functional system, the FPGA also includes an implementation of the time-triggered protocol (TTP/A). Thus, the implemented master node has two interfaces: the web server, which acts as the Internet interface, and another to control the network. This protocol is widely used to connect smart sensors, actuators and microsystems in embedded real-time systems in different application domains, e.g., industrial, automotive, domotic, etc., although it can easily be replaced by any other protocol because of the inherent characteristics of FPGA-based technology. PMID:24379047

  15. Lightweight monitoring and control system for coal mine safety using REST style.

    PubMed

    Cheng, Bo; Cheng, Xin; Chen, Junliang

    2015-01-01

    The complex environment of a coal mine requires the underground environment, devices and miners to be constantly monitored to ensure safe coal production. However, existing coal mines do not meet these coverage requirements because blind spots occur when using a wired network. In this paper, we develop a lightweight, Web-based remote monitoring and control platform in the REST style that uses a wireless sensor network (WSN) to collect temperature, humidity and methane concentration data from sensor nodes in a coal mine. This platform also collects information on personnel positions inside the mine. We implement a RESTful application programming interface (API) that provides access to underground sensors and instruments through the Web, such that underground coal mine physical devices can be easily interfaced to remote monitoring and control applications. We also implement three different scenarios for Web-based, lightweight remote monitoring and control of coal mine safety and measure and analyze the system performance. Finally, we present the conclusions from this study and discuss future work. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
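
    A minimal sketch of the kind of RESTful API the paper describes is shown below, exposing underground sensor readings as web resources. It is built with Flask purely for illustration; the resource paths and the in-memory data source are assumptions, not the authors' implementation.

    ```python
    # Illustrative REST-style API exposing mine sensor readings as resources.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # In-memory stand-in for the latest readings gathered from the WSN.
    latest = {
        "temperature_c": 24.3,
        "humidity_pct": 78.0,
        "methane_pct": 0.4,
    }

    @app.route("/mine/sensors", methods=["GET"])
    def list_sensors():
        # REST style: the collection of observed phenomena as one resource.
        return jsonify(sorted(latest.keys()))

    @app.route("/mine/sensors/<name>", methods=["GET"])
    def get_sensor(name):
        if name not in latest:
            return jsonify({"error": "unknown sensor"}), 404
        return jsonify({"sensor": name, "value": latest[name]})

    if __name__ == "__main__":
        app.run(port=8080)
    ```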

  16. Webcam classification using simple features

    NASA Astrophysics Data System (ADS)

    Pramoun, Thitiporn; Choe, Jeehyun; Li, He; Chen, Qingshuang; Amornraksa, Thumrongrat; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    Thousands of sensors are connected to the Internet, and many of these sensors are cameras. The "Internet of Things" will contain many "things" that are image sensors. This vast network of distributed cameras (i.e., web cams) will continue to grow exponentially. In this paper we examine simple methods to classify an image from a web cam as "indoor/outdoor" and as having "people/no people" based on simple features. We use four types of image features to classify an image as indoor/outdoor: color, edge, line, and text. To classify an image as having people/no people we use HOG and texture features. The features are weighted based on their significance and combined. A support vector machine is used for classification. Our system with feature weighting and feature combination yields 95.5% accuracy.
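
    The weighting-and-combination step followed by an SVM can be sketched as follows. The feature values, weights and tiny training set are placeholders chosen for illustration, not the features or weights tuned in the paper.

    ```python
    # Sketch: scale each feature group by a significance weight, concatenate,
    # and classify with a support vector machine.
    import numpy as np
    from sklearn.svm import SVC

    def combine(features: dict, weights: dict) -> np.ndarray:
        """Weight each feature group and concatenate into one vector."""
        return np.concatenate(
            [weights[k] * np.asarray(v) for k, v in sorted(features.items())])

    # Illustrative weights and training images (0 = outdoor, 1 = indoor).
    weights = {"color": 0.4, "edge": 0.3, "line": 0.2, "text": 0.1}
    X = np.array([
        combine({"color": [0.9, 0.1], "edge": [0.2], "line": [0.1], "text": [0.0]}, weights),
        combine({"color": [0.8, 0.2], "edge": [0.3], "line": [0.2], "text": [0.1]}, weights),
        combine({"color": [0.1, 0.9], "edge": [0.7], "line": [0.8], "text": [0.6]}, weights),
        combine({"color": [0.2, 0.8], "edge": [0.8], "line": [0.7], "text": [0.5]}, weights),
    ])
    y = [0, 0, 1, 1]

    clf = SVC(kernel="linear").fit(X, y)
    test = combine({"color": [0.85, 0.15], "edge": [0.25],
                    "line": [0.15], "text": [0.05]}, weights)
    print(clf.predict([test]))  # -> [0], i.e. outdoor
    ```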

  17. Namibian Flood Early Warning SensorWeb Pilot

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Policelli, Fritz; Frye, Stuart; Cappelare, Pat; Langenhove, Guido Van; Szarzynski, Joerg; Sohlberg, Rob

    2010-01-01

    The major goal of the Namibia SensorWeb Pilot Project is a scientifically sound, operational trans-boundary flood management decision support system for the Southern African region that provides useful flood and waterborne disease forecasting tools for local decision makers. The Pilot Project was established under the auspices of the Namibian Ministry of Agriculture, Water and Forestry (MAWF), Department of Water Affairs, and the Committee on Earth Observing Satellites (CEOS) Working Group on Information Systems and Services (WGISS), and is moderated by the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER). The effort consists of identifying and prototyping technology that enables the rapid gathering and dissemination of both space-based and ground sensor data and data products for the purpose of flood disaster management and water-borne disease management.

  18. A Wearable Wireless Sensor Network for Indoor Smart Environment Monitoring in Safety Applications

    PubMed Central

    Antolín, Diego; Medrano, Nicolás; Calvo, Belén; Pérez, Francisco

    2017-01-01

    This paper presents the implementation of a wearable wireless sensor network aimed at monitoring harmful gases in industrial environments. The proposed solution is based on a customized wearable sensor node using a low-power, low-rate wireless personal area network (LR-WPAN) communications protocol, which as a first approach measures CO2 concentration, and employs different low-power strategies for appropriate energy handling, which is essential to achieving long battery life. These wearable nodes are connected to a deployed static network, and a web-based application allows data storage, remote control and monitoring of the complete network. The result is a complete and versatile remote web application with a locally implemented decision-making system, which allows early detection of hazardous situations for exposed workers. PMID:28216556

  19. A Wearable Wireless Sensor Network for Indoor Smart Environment Monitoring in Safety Applications.

    PubMed

    Antolín, Diego; Medrano, Nicolás; Calvo, Belén; Pérez, Francisco

    2017-02-14

    This paper presents the implementation of a wearable wireless sensor network aimed at monitoring harmful gases in industrial environments. The proposed solution is based on a customized wearable sensor node using a low-power, low-rate wireless personal area network (LR-WPAN) communications protocol, which as a first approach measures CO₂ concentration, and employs different low-power strategies for appropriate energy handling, which is essential to achieving long battery life. These wearable nodes are connected to a deployed static network, and a web-based application allows data storage, remote control and monitoring of the complete network. The result is a complete and versatile remote web application with a locally implemented decision-making system, which allows early detection of hazardous situations for exposed workers.

  20. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, in addition, uncertainty information is included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for the visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Previously conducted usability studies indicated that a differentiation between expert (in statistics or mapping) and non-expert users is useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps provide a simpler visualisation, separating value and uncertainty maps, suitable for non-experts and for a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, and sliders to move interactively through time, space and uncertainty (thresholds).
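
    The data preparation behind the "adjacent maps" mode can be sketched simply: an ensemble of realisations is reduced to a value layer and an uncertainty layer that can then be rendered side by side. The array shapes and example values below are illustrative assumptions, not UncertWeb code.

    ```python
    # Reduce an ensemble of realisations to a value map and an uncertainty map
    # suitable for side-by-side ("adjacent maps") rendering.
    import numpy as np

    def adjacent_layers(realisations: np.ndarray):
        """realisations: (n_samples, rows, cols) stack of equally likely grids."""
        value_layer = realisations.mean(axis=0)        # expected value per cell
        uncertainty_layer = realisations.std(axis=0)   # spread per cell
        return value_layer, uncertainty_layer

    # Example: 100 realisations of a 2x2 raster, e.g. an interpolated field.
    rng = np.random.default_rng(42)
    samples = rng.normal(loc=[[10, 12], [14, 16]], scale=[[1, 2], [1, 3]],
                         size=(100, 2, 2))
    value, uncertainty = adjacent_layers(samples)
    print(np.round(value, 1))        # approx. [[10, 12], [14, 16]]
    print(np.round(uncertainty, 1))  # approx. [[ 1,  2], [ 1,  3]]
    ```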
